UC Davis' W.M. Keck Center for Active Visualization in the Earth Sciences (KeckCAVES), together with the UC Davis Tahoe Environmental Research Center, Lawrence Hall of Science, and ECHO Lake Aquarium and Science Center, is involved in an NSF-funded project on informal science education for freshwater lake and watershed science. As part of this project, we are primarily developing 3D visualization applications to teach earth science concepts, but we also built a hands-on exhibit combining a real sandbox with virtual topography and water, created using a closed loop of a Microsoft Kinect 3D camera, powerful simulation and visualization software, and a data projector. The resulting augmented reality (AR) sandbox allows users to create topography models by shaping real sand, which is then augmented in real time with an elevation color map, topographic contour lines, and simulated water. The system teaches geographic, geologic, and hydrologic concepts such as how to read a topographic map, the meaning of contour lines, watersheds, catchment areas, levees, etc.
This project was inspired by a video created by a group of Czech researchers, who demonstrated an early prototype of an AR sandbox featuring elevation color mapping and a limited form of fluid flow. There is an even earlier project, Project Mimicry, of which we only learned later; it, too, appears to be in an early testing phase.
|Figure 1: The Augmented Reality Sandbox in its natural habitat. Left: Sandbox unit when turned off. The Kinect 3D camera and the digital projector are suspended above the sandbox proper from the pole attached to the back. Right: Sandbox table when turned on, showing a mountain with a crater lake, surrounded by several lower lakes.|
The goal of this project was to develop a real-time integrated augmented reality system that lets users physically create topography models, which are scanned into a computer in real time and used as the basis for a variety of graphics effects and simulations. The final product is intended to be self-contained enough to serve as a hands-on exhibit in science museums with little supervision.
Our AR Sandbox prototype was designed and built by project specialist Peter Gold of the UC Davis Department of Geology. The driving software is based on the Vrui VR development toolkit and the Kinect 3D video processing framework, and is available for download under the GNU General Public License.
Raw depth frames arrive from the Kinect camera at 30 frames per second and are fed into a per-pixel statistical evaluation filter with a configurable buffer size (currently defaulting to 30 frames, corresponding to a one-second delay). This filter serves a triple purpose: it filters out moving objects such as hands or tools, reduces the noise inherent in the Kinect's depth data stream, and fills in missing data in the depth stream. The resulting topographic surface is then rendered from the point of view of the data projector suspended above the sandbox, with the effect that the projected topography exactly matches the real sand topography. The software uses a combination of several GLSL shaders to color the surface by elevation using customizable color maps (the default color map currently used was provided by M. Burak Yikilmaz, a post-doctoral researcher in the UC Davis Department of Geology), and to add real-time topographic contour lines.
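The statistical filtering idea can be sketched as follows. This is a hypothetical Python/NumPy illustration of the approach described above, not the actual C++/shader implementation; the class name, the half-window validity requirement, and the variance threshold are assumptions made for the sketch:

```python
import numpy as np

class DepthFilter:
    """Per-pixel statistical filter over a sliding window of depth frames.

    Hypothetical sketch: average stable samples, reject pixels whose
    recent samples vary too much (moving hands or tools), and retain
    the last stable value where data is missing.
    """

    def __init__(self, height, width, window=30, max_variance=4.0):
        self.window = window          # 30 frames ~ 1 second at 30 fps
        self.max_variance = max_variance
        self.buffer = np.zeros((window, height, width), dtype=np.float32)
        self.count = 0
        self.stable = np.zeros((height, width), dtype=np.float32)

    def add_frame(self, depth):
        """Insert a raw depth frame (0 marks invalid pixels) and return
        the filtered topographic surface."""
        self.buffer[self.count % self.window] = depth
        self.count += 1
        frames = self.buffer[:min(self.count, self.window)]

        valid = frames > 0            # the Kinect reports 0 for "no data"
        n = valid.sum(axis=0)
        total = np.where(valid, frames, 0).sum(axis=0)
        mean = np.divide(total, n, out=np.zeros_like(total), where=n > 0)
        sq = np.where(valid, (frames - mean) ** 2, 0).sum(axis=0)
        var = np.divide(sq, n, out=np.full_like(sq, np.inf), where=n > 0)

        # Update only pixels with enough valid, low-variance samples;
        # elsewhere keep the previous stable value (fills holes,
        # ignores transient objects such as hands).
        settled = (n >= frames.shape[0] // 2) & (var <= self.max_variance)
        self.stable = np.where(settled, mean, self.stable)
        return self.stable
```

The real system does this per pixel on the GPU; the sketch only shows why a one-second buffer simultaneously suppresses noise, rejects moving objects, and bridges dropouts.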
At the same time, a water flow simulation based on the Saint-Venant set of shallow water equations, a depth-integrated version of the Navier-Stokes equations governing fluid flow, is run in the background using another set of GLSL shaders. The simulation is an explicit second-order accurate time evolution of this hyperbolic system of partial differential equations, using the virtual sand surface as boundary conditions. The implementation follows the paper "A Second-Order Well-Balanced Positivity Preserving Central-Upwind Scheme for the Saint-Venant System" by A. Kurganov and G. Petrova, using a simple viscosity term, open boundary conditions at the edges of the grid, and a second-order strong stability-preserving Runge-Kutta temporal integration step. The simulation is run such that the water flows at real-world speed assuming a 1:100 scale factor, unless turbulence in the flow forces too many integration steps for the driving graphics card (currently an Nvidia GeForce 780) to handle.
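For reference, the Saint-Venant system in one spatial dimension (the sandbox uses the two-dimensional version) can be written in terms of water height $h$, flow velocity $u$, gravitational acceleration $g$, and bottom topography $B(x)$ as:

```latex
\begin{align}
  \frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} &= 0, \\
  \frac{\partial (hu)}{\partial t}
    + \frac{\partial}{\partial x}\!\left( h u^2 + \tfrac{1}{2} g h^2 \right)
    &= -g h \frac{\partial B}{\partial x}.
\end{align}
```

The first equation expresses conservation of mass, the second conservation of momentum; the source term on the right couples the water to the sand surface, which is why the scheme's well-balanced property (exactly preserving "lake at rest" steady states) matters for this application.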
The AR Sandbox is slowly gaining traction, with many unaffiliated users building their own versions (see pictures or movies of a select few on the External Installations page). To support the budding AR Sandbox user community, we now have an official AR Sandbox support forum on the Lake Visualization 3D web site where users can help each other or provide feedback to us.
The AR Sandbox support forum also contains detailed step-by-step software installation instructions aimed at new Linux users, and a video showing the entire process.
On 01/23/2016, Joseph Kinyon, GIS manager with the Sonoma Land Trust, presented a portable AR Sandbox based on his own design in a segment on The New Screen Savers, a netcast on Leo Laporte's TWiT.tv network. I was invited along, and was able to provide a sneak peek at new features in the upcoming 2.0 release of the SARndbox software. Both the entire show (1 hour 25 minutes) and the AR Sandbox segment alone (17 minutes) can be viewed on YouTube.
In April 2016, the AR Sandbox went on a grand road show of sorts, making stops at the White House Water Summit, the USA Science and Engineering Festival, and the Coalition for National Science Funding's CNSF Exhibition & Reception, before reaching its final destination as a permanent classroom education tool at Howard University's Middle School of Mathematics and Science.
There is now a new web site dedicated to the AR Sandbox, ARSandbox.org.