madisonrightnow.com is a collection of near-real-time information about a metropolitan area. Traffic, weather, parking lot usage, and loads of web cam images are displayed on one page. The data come from a wide variety of sources: images come from cameras, weather data comes from a web API, and parking lot utilization comes from good ol' fashioned page scraping. Originally, all of these transactions were triggered when a user requested the page. It is a challenge to get near-real-time information to the client from so many sources without a lot of pre-processing on the server end, and without the client having to open connections to a myriad of hosts for images and other data. I found that asking the client to load all of this made the page too slow.
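To give a flavor of the page-scraping side, here is a minimal sketch of pulling parking lot utilization out of a garage-status page. The markup and lot names are hypothetical; a real scraper would match whatever the city's parking site actually emits.

```python
import re

def parse_parking_utilization(html):
    """Pull lot names and open-stall counts out of a garage-status page.

    The <td class="lot">/<td class="open"> markup here is a made-up
    example -- the real site's HTML would dictate the pattern.
    """
    pattern = re.compile(
        r'<td class="lot">(?P<lot>[^<]+)</td>\s*'
        r'<td class="open">(?P<open>\d+)</td>'
    )
    return {m.group("lot"): int(m.group("open"))
            for m in pattern.finditer(html)}

# Canned sample standing in for a fetched page.
sample = """
<tr><td class="lot">State Street Campus</td><td class="open">143</td></tr>
<tr><td class="lot">Overture Center</td><td class="open">57</td></tr>
"""
print(parse_parking_utilization(sample))
```

Scraping like this is fragile (any markup change breaks the pattern), which is one more reason not to do it on every page view.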
The solution is to build a server-side caching mechanism, so all the data and images are ready to go when the page is hit, and every transfer happens between madisonrightnow.com and the client without waiting on anybody else. The data served can be a little older, but that is much better than asking the client to open connections to dozens of hosts and wait for every one to fully reply before the page can render. Under the caching scheme I developed, all data on the page are loaded from my server, which gives the user a smooth experience when expanding UI panels after the page renders.
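The scheme above can be sketched as a background refresher plus an in-memory cache: only the refresher ever waits on upstream hosts, while page hits read local memory. Everything below is illustrative; the source names, fetcher functions, and refresh interval are assumptions, not the site's actual implementation.

```python
import threading
import time

# Hypothetical fetchers -- stand-ins for the real camera, weather-API,
# and page-scraping calls. Each returns the freshest payload it can get.
SOURCES = {
    "weather": lambda: {"temp_f": 41, "sky": "overcast"},
    "parking": lambda: {"State Street Campus": 143},
}

class Cache:
    """Refresh every source on a fixed interval so page hits never wait
    on an upstream host; stale-but-instant beats fresh-but-slow."""

    def __init__(self, interval_s=60):
        self.interval_s = interval_s
        self._data = {}
        self._lock = threading.Lock()

    def refresh(self):
        for name, fetch in SOURCES.items():
            try:
                payload = fetch()   # only the refresher blocks on upstreams
            except Exception:
                continue            # keep the last good value on failure
            with self._lock:
                self._data[name] = payload

    def snapshot(self):
        with self._lock:
            return dict(self._data)  # page handler reads only local memory

    def start(self):
        def loop():
            while True:
                self.refresh()
                time.sleep(self.interval_s)
        threading.Thread(target=loop, daemon=True).start()

cache = Cache()
cache.refresh()   # prime the cache once before serving any pages
print(cache.snapshot())
```

Swallowing fetch errors and keeping the last good value is a deliberate choice here: a dead parking-lot scraper should degrade one panel, not block the whole page.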