3D neuro-anatomical images and other volumetric data sets are important in many scientific and biomedical fields. Since such data sets can be extremely large, a scalable compression method is critical for storing, processing, and transmitting them. To achieve high compression rates, most existing volume compression methods are lossy, which is usually unacceptable in biomedical applications. Our near-lossless or lossless compression algorithm uses a Hilbert traversal to produce a data stream from the original image. This stream exhibits relatively slow image-context change, which helps the subsequent DPCM prediction reduce the source entropy. An extremely fast linear DPCM is used; the prediction error is further encoded with a Huffman code. To provide efficient data access, the source image is divided into blocks and indexed by an octree data structure. The Huffman coding overhead is effectively reduced using a novel binning algorithm. Our compression method is designed for perfo...
Rongkai Zhao, Michael Gabriel, Geneva G. Belford
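The following is a minimal sketch of the DPCM-plus-Huffman stage described above, assuming the Hilbert traversal has already linearized the volume into a one-dimensional voxel stream. The function names, the toy ramp data, and the pure-Python Huffman construction are illustrative assumptions, not the authors' implementation.

    # Sketch (not the paper's implementation): first-order linear DPCM on an
    # already-linearized voxel stream, followed by Huffman coding of the
    # prediction residuals. The Hilbert traversal is assumed to have produced
    # `stream`; a 1-D ramp array stands in for it here.
    import heapq
    from collections import Counter
    from itertools import count
    import numpy as np

    def dpcm_residuals(stream):
        """Predict each sample from its predecessor; return prediction errors."""
        stream = np.asarray(stream, dtype=np.int32)
        residuals = np.empty_like(stream)
        residuals[0] = stream[0]                # first sample is stored verbatim
        residuals[1:] = stream[1:] - stream[:-1]
        return residuals

    def huffman_code(symbols):
        """Build a Huffman code {symbol: bitstring} from a sequence of symbols."""
        freq = Counter(symbols)
        tiebreak = count()                      # unique tiebreaker so dicts are never compared
        heap = [(f, next(tiebreak), {s: ""}) for s, f in freq.items()]
        heapq.heapify(heap)
        if len(heap) == 1:                      # degenerate single-symbol stream
            (_, _, table), = heap
            return {s: "0" for s in table}
        while len(heap) > 1:
            f1, _, t1 = heapq.heappop(heap)
            f2, _, t2 = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in t1.items()}
            merged.update({s: "1" + c for s, c in t2.items()})
            heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
        return heap[0][2]

    # Toy usage: a slowly varying ramp mimics the slow context change that the
    # Hilbert ordering is intended to preserve, so residuals cluster near zero.
    stream = np.arange(0, 256, dtype=np.int32) // 4
    residuals = dpcm_residuals(stream)
    table = huffman_code(residuals.tolist())
    coded_bits = sum(len(table[r]) for r in residuals.tolist())
    print(f"raw: {8 * len(stream)} bits, coded: {coded_bits} bits")

Because the decorrelated residuals concentrate around zero, the Huffman table stays small; the binning step mentioned in the abstract would further reduce the table (side-information) overhead, which this sketch does not model.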