Another example of a web dashboard built using R comes from John Oram, a scientist at the San Francisco Estuary Institute. SFEI collects and monitors environmental data from the waters and wetlands of the Bay Area. By sampling the waters, sediments and fish of the San Francisco Bay and testing for toxins, pollutants and other hazards, they monitor the health of the environment for Bay Area residents.
Making sense of such data can be a challenge: how do you translate scattered point-samples of pollutants, taken over a long period of time, into an overall picture of environmental impact? John Oram has met this challenge in two ways: by linking the point data to a Google Map of the region, so you can see the individual measurements in context; and by using kriging analysis and contour charts in R to extrapolate those samples across the entire Bay. With his Web Query Tool you can choose whether to review sediment, water, or fish data, select the time period over which the samples were collected, and pick a specific toxin or pollutant to analyze. Here's a screenshot showing the distribution of all PCBs in the sediments of the Bay based on samples from 2002-2007, but you can try it yourself and produce maps for other pollutants.
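To give a flavor of the kriging-and-contour approach, here's a minimal sketch in R using the gstat and sp packages (not necessarily the tools John used). The data are simulated point samples standing in for real PCB measurements; in the real application the sample locations and concentrations would come from SFEI's monitoring database.

```r
library(sp)     # spatial data classes
library(gstat)  # variograms and kriging

# Simulated point samples: PCB concentrations at scattered locations
set.seed(1)
samples <- data.frame(x = runif(60), y = runif(60),
                      pcb = rlnorm(60, meanlog = 2))
coordinates(samples) <- ~ x + y

# Fit a spherical variogram model to the empirical variogram
v     <- variogram(pcb ~ 1, samples)
v_fit <- fit.variogram(v, vgm(psill = 1, model = "Sph",
                              range = 0.3, nugget = 0.1))

# Regular prediction grid covering the region of interest
grid <- expand.grid(x = seq(0, 1, by = 0.02),
                    y = seq(0, 1, by = 0.02))
coordinates(grid) <- ~ x + y
gridded(grid) <- TRUE

# Ordinary kriging: extrapolate the point samples to the whole grid
kriged <- krige(pcb ~ 1, samples, grid, model = v_fit)

# Plot the interpolated surface with a level plot
spplot(kriged["var1.pred"], main = "Kriged PCB concentration (simulated)")
```

The same interpolated grid could then be overlaid on a base map, which is essentially what the Web Query Tool does for the Bay.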
Amazingly, John created this application on essentially zero budget: it uses only free tools, including Google Maps, Rapache, and of course R. John said during the meeting that he couldn't have created this application without R: not only is it free, but the advanced statistical tools that analyze the data, and even help select the sampling locations, simply aren't available anywhere except in R.
As for the future, John is looking for other agencies that may be interested in using the application he's built to provide environmental analysis to everyone through a Web-based interface. I can certainly imagine the day when your local surf-safety bulletin or fishing recommendations come from a tool like John's.