I’ve described before how I publish real-time location data to a browser. One piece of context I provide is a “tail” showing the recent location history of each detected device. There, the history covered a minute or so, enough to give a sense of motion and activity. Another important kind of context is the ability to see recent patterns of behavior. That requires more history, on the order of an hour or more, which can be a large amount of data. To handle this case efficiently, I set up a circular buffer in Redis representing a fixed time window. Clients can grab the whole window as needed (for example, at startup) or just the portion needed to catch up since the last time the view was open. On the JavaScript side, we use a DataView to minimize the transmission size.
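As a rough sketch of the mechanics (the slot parameters and record layout below are assumptions for illustration, not the post’s actual wire format), the buffer maps timestamps onto a fixed ring of slots, and each sample travels as a compact binary record that a client-side DataView can decode:

```javascript
// Sketch of a fixed-time-window circular buffer plus the compact binary
// record format. SLOT_SECONDS, WINDOW_SLOTS, and the 16-byte layout are
// hypothetical choices, not the post's real parameters.
const SLOT_SECONDS = 10;   // resolution of one buffer slot (assumed)
const WINDOW_SLOTS = 360;  // 360 * 10s = a one-hour window (assumed)

// Map a Unix timestamp (seconds) onto a slot in the circular buffer;
// old slots are simply overwritten as the window wraps around.
function slotFor(ts) {
  return Math.floor(ts / SLOT_SECONDS) % WINDOW_SLOTS;
}

// Pack one location sample into 16 bytes: timestamp (float64),
// then x and y coordinates (float32 each).
function packSample(ts, x, y) {
  const buf = new ArrayBuffer(16);
  const view = new DataView(buf);
  view.setFloat64(0, ts);
  view.setFloat32(8, x);
  view.setFloat32(12, y);
  return buf;
}

// The client decodes the same bytes with another DataView.
function unpackSample(buf) {
  const view = new DataView(buf);
  return { ts: view.getFloat64(0), x: view.getFloat32(8), y: view.getFloat32(12) };
}
```

The binary form keeps an hour of samples small on the wire, and the slot arithmetic means a client that already has part of the window only needs to request the slots written since it last looked.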
Client-server hexgrids
I like using hexgrids to summarize spatial statistics. Doing this in R is easy with stat_binhex from ggplot2, and there are also client-side binning examples using D3. What I want, though, is statistics aggregated on the server and simple rendering of those results on the client. To support this use case, I created a Rails model to define the grid and JavaScript to render the results in D3.
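A minimal sketch of the server-side binning step, assuming pointy-top hexagons of radius r (the helper names are hypothetical, and this is plain JavaScript rather than the post’s Rails model). A hex lattice is the union of two rectangular grids offset by half a cell, so the nearest hex center is just the closer of two candidates:

```javascript
// Assign a point to the nearest center of a pointy-top hex lattice of
// circumradius r. Centers lie on two interleaved rectangular grids:
// one at (i*dx, j*dy) and one shifted by (dx/2, dy/2).
function hexbinCenter(x, y, r) {
  const dx = r * Math.sqrt(3); // horizontal spacing between centers
  const dy = 3 * r;            // vertical spacing within one sub-grid

  // Candidate on the unshifted grid.
  const ax = Math.round(x / dx) * dx;
  const ay = Math.round(y / dy) * dy;

  // Candidate on the grid shifted by half a cell in both directions.
  const bx = (Math.round(x / dx - 0.5) + 0.5) * dx;
  const by = (Math.round(y / dy - 0.5) + 0.5) * dy;

  const da = (x - ax) ** 2 + (y - ay) ** 2;
  const db = (x - bx) ** 2 + (y - by) ** 2;
  return da <= db ? [ax, ay] : [bx, by];
}

// Server-side aggregation: count points per hex, keyed by center.
function binCounts(points, r) {
  const counts = new Map();
  for (const [x, y] of points) {
    const [cx, cy] = hexbinCenter(x, y, r);
    const key = cx.toFixed(3) + ',' + cy.toFixed(3);
    counts.set(key, (counts.get(key) || 0) + 1);
  }
  return counts;
}
```

With the aggregation done like this on the server, the client only receives one (center, count) pair per occupied hex instead of the raw points.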
Real-time location visualization with D3
Mike Bostock is amazing. I use his D3 library for most of my web visualization work. The documentation is fantastic, the code is clean and concise, and he’s created a great little ecosystem of examples on the D3 site and on bl.ocks.org.
I’ve been working on visualizing indoor location activity and one of the views is a live view of individual device locations. To provide some additional context, the view also provides the recent history path (or “tail”) for each device. Getting the path animation correct took a little trial and error. This post describes how I got points and paths to update and animate smoothly using D3.
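The D3 enter/update/exit animation is the subject of the post itself, but the “tail” bookkeeping behind it can be sketched in plain JavaScript (the helper names and the one-minute window here are illustrative assumptions):

```javascript
// Each device keeps a bounded list of recent positions; the rendered
// path is built from that list. In the real view the path data would
// be handed to D3, but the core logic is independent of it.
const TAIL_MS = 60 * 1000; // keep roughly a minute of history (assumed)

// Append a new sample { t, x, y } and drop points that have aged out.
function pushSample(tail, sample, now) {
  tail.push(sample);
  while (tail.length && now - tail[0].t > TAIL_MS) tail.shift();
  return tail;
}

// Build the SVG path data ("d" attribute) for the tail,
// much as d3.svg.line() would.
function tailPath(tail) {
  return tail.map((p, i) => (i === 0 ? 'M' : 'L') + p.x + ',' + p.y).join('');
}
```

On each location update the view pushes the new sample, recomputes the path string, and lets a D3 transition animate between the old and new geometry.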
Server-side Redis scripting
I really like Redis. Lately, I’ve been working on real-time spatial processing systems, and Redis’ combination of distributed data structures and message queues fits that workload perfectly. I also evaluated Storm, but it didn’t fit as well: Storm looks good when you have fixed processing pipelines, whereas my processing topology changes based on client-defined filters and preferences. With Redis I give up the automatic scaling and management that Storm provides, but I get a richer set of primitives to build on and better support for dynamic pipelines.
One of the intriguing features in the beta version of Redis is the ability to define Lua scripts that run on the server. I’ll explore some of the scripts I wrote and share what I see as the benefits and challenges of using them.
WebSockets with Netty
I first used Netty to build a load-test driver that simulated thousands of mobile clients accessing REST services and responding to real-time push notifications over XMPP. These days, I’m using WebSockets more than XMPP, and I’ve revisited Netty to push updates asynchronously to multiple clients. I’ll show how I configured the Netty pipeline to support this scenario.