Interactive Volume Visualization

Direct volume rendering (DVR) is a technique for visualizing 3D scalar data that reveals the structure of an entire data set by simulating the appearance of translucent gel. For the special case of volume data defined on rectilinear grids, 3D graphics hardware can be employed to implement a volume renderer that renders large data sets at interactive rates. A simple accelerated renderer represents a data volume as stacks of texture-mapped rectangles, each stack aligned with one major axis of the data volume, whereas a more advanced technique uses the 3D texture mapping capabilities of newer-generation graphics hardware to evaluate the classic ray casting volume rendering formula in a ray-parallel fashion. The latter method achieves results that are very close in quality to those of a software implementation of ray casting, yet renders volume data of about 512 x 512 x 512 voxels in size several times per second.
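
As a concrete illustration of the slice-stack approach, the following sketch shows how a stack of pre-generated 2D texture slices could be composited back-to-front with alpha blending in classic fixed-function OpenGL. It is not taken from the project's source code; the function name, the parameters, and the assumption of one RGBA texture per data slice along the z axis of a unit cube are purely illustrative, and a valid OpenGL context is assumed to exist.

    #include <GL/gl.h>
    #include <vector>

    /* Composite one axis-aligned slice stack back-to-front with alpha blending.
       sliceTextures[i] is assumed to hold an RGBA texture for data slice i;
       the slices span the unit cube [0,1]^3 along the z axis.
       (Illustrative sketch, not the project's actual class interface.) */
    void renderAxisAlignedSlices(const std::vector<GLuint>& sliceTextures, float viewDirZ)
    {
        glEnable(GL_TEXTURE_2D);
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); /* standard "over" compositing */
        glDepthMask(GL_FALSE); /* translucent slices should not write to the depth buffer */

        int n = int(sliceTextures.size());
        for (int i = 0; i < n; ++i)
        {
            /* Back-to-front: if the view direction's z component is positive,
               the slice with the largest z is farthest from the viewer. */
            int s = (viewDirZ > 0.0f) ? (n - 1 - i) : i;
            float z = (n > 1) ? float(s) / float(n - 1) : 0.5f;

            glBindTexture(GL_TEXTURE_2D, sliceTextures[s]);
            glBegin(GL_QUADS);
            glTexCoord2f(0.0f, 0.0f); glVertex3f(0.0f, 0.0f, z);
            glTexCoord2f(1.0f, 0.0f); glVertex3f(1.0f, 0.0f, z);
            glTexCoord2f(1.0f, 1.0f); glVertex3f(1.0f, 1.0f, z);
            glTexCoord2f(0.0f, 1.0f); glVertex3f(0.0f, 1.0f, z);
            glEnd();
        }

        glDepthMask(GL_TRUE);
        glDisable(GL_BLEND);
        glDisable(GL_TEXTURE_2D);
    }

The 3D texture method works analogously, but draws view-aligned polygons clipped against the volume's bounding box and samples a single 3D texture at each fragment, which avoids the popping artifacts that occur when an axis-aligned stack has to be switched as the viewpoint rotates.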

Figure 1: Direct volume rendering of the "electron density" of a C-60 Buckminsterfullerene molecule. The "electron density" was generated by placing radial Gaussian density functions at the atom positions of a theoretical Buckyball, and sampling the resulting function at 256 x 256 x 256 sites. The color map used here highlights the "outer shell" of the molecule as a layer of cyan, and shows the atom nuclei and covalent bonds in solid orange.

Project Goals

The main project goal was to implement a set of reusable and extensible volume rendering classes; the functionality implemented so far is described under Project Status below.

A tangential side project of the volume visualization project is to use better reconstruction filters when sampling the source 3D data (see Sampling Theory 101). Texture-based volume rendering on graphics hardware offers only trilinear interpolation of 3D data, which results in relatively low image quality even when rendering many texture-mapped slices. One way around this limitation is to supersample a given 3D data set with a high-quality reconstruction filter in a pre-processing step, and then render the higher-resolution data set using trilinear interpolation. Surprisingly, this simple approach yields noticeable quality improvements while not necessarily increasing rendering time, as can be seen in the comparison screenshots.
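
To make the pre-processing step concrete, the sketch below shows one-dimensional Catmull-Rom (cubic) reconstruction used to upsample a single scanline; for a 3D data set the same filter would be applied separably along the x, y, and z axes in turn. The code is illustrative and not part of the project's source; the function names and the choice of Catmull-Rom as the "high-quality" filter are assumptions made for the example.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    /* Catmull-Rom cubic interpolation between p1 and p2, with p0 and p3 as
       the outer neighbors; t is in [0,1]. Reconstructs smoother values than
       linear interpolation between p1 and p2 alone. */
    float catmullRom(float p0, float p1, float p2, float p3, float t)
    {
        return p1 + 0.5f * t * (p2 - p0
                + t * (2.0f * p0 - 5.0f * p1 + 4.0f * p2 - p3
                + t * (3.0f * (p1 - p2) + p3 - p0)));
    }

    /* Upsample one scanline by an integer factor using Catmull-Rom
       reconstruction; samples beyond the ends are clamped to the edge. */
    std::vector<float> upsampleScanline(const std::vector<float>& in, int factor)
    {
        int n = int(in.size());
        std::vector<float> out;
        out.reserve(in.size() * factor);
        for (int i = 0; i < n; ++i)
            for (int k = 0; k < factor; ++k)
            {
                float t = float(k) / float(factor);
                int i0 = std::max(i - 1, 0);
                int i2 = std::min(i + 1, n - 1);
                int i3 = std::min(i + 2, n - 1);
                out.push_back(catmullRom(in[i0], in[i], in[i2], in[i3], t));
            }
        return out;
    }

    int main()
    {
        std::vector<float> coarse = {0.0f, 1.0f, 0.0f, 1.0f};
        std::vector<float> fine = upsampleScanline(coarse, 4); /* 4x supersampling */
        for (float v : fine) std::printf("%.3f ", v);
        std::printf("\n");
        return 0;
    }

Once the data set has been supersampled this way, the renderer itself does not change; it simply uploads the larger volume and relies on the hardware's trilinear interpolation between the now more closely spaced samples.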

Project Status

The current (as of 07/14/2004) version of the volume rendering classes offers a base class that encapsulates the memory management and slice generation algorithms for 2D and 3D texture mapping and can render volume data with a greyscale color ramp. An additional child class applies one-dimensional transfer functions (color maps) to volume data to create non-illuminated renderings such as the one shown in Figure 1. As of 08/08/2006, the volume renderer classes and a test program based on the Vrui VR development toolkit are licensed under the GPL and available for download.
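
For readers unfamiliar with transfer functions: in the simplest scheme, each scalar voxel value is used as an index into a one-dimensional RGBA lookup table before the data is uploaded as a texture. The sketch below illustrates that lookup on the CPU; the struct and function names are hypothetical and are not the interface of the classes described above.

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    /* One RGBA color map entry, 8 bits per channel. */
    struct RGBA { std::uint8_t r, g, b, a; };

    /* Apply a 256-entry one-dimensional transfer function (color map) to
       8-bit scalar voxel data, producing RGBA voxels ready to be uploaded
       as a 2D or 3D texture. (Hypothetical helper, for illustration only.) */
    std::vector<RGBA> applyTransferFunction(const std::vector<std::uint8_t>& scalars,
                                            const RGBA (&colorMap)[256])
    {
        std::vector<RGBA> rgba(scalars.size());
        for (std::size_t i = 0; i < scalars.size(); ++i)
            rgba[i] = colorMap[scalars[i]]; /* plain table lookup per voxel */
        return rgba;
    }

Performing the lookup on the CPU like this means the RGBA texture must be regenerated whenever the color map changes; doing the lookup in graphics hardware avoids that cost, but the simple scheme above is the easiest to follow.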

Pages In This Section

Screen Shots
Example images created by the interactive volume renderer.
Supersampling
Comparisons between normal and supersampled volume rendered images.
Download
Contains the complete source code for the texture-based volume rendering classes and an example program based on the Vrui VR development toolkit.