In this paper, we analyze the complexity and entropy of four data compression algorithms: LZW, Huffman, fixed-length code (FLC), and Huffman applied after fixed-length code (HFLC). We test these algorithms on files of different sizes and conclude that LZW performs best on every compression measure we tested, especially on large files, followed by Huffman, HFLC, and FLC, respectively. Data compression remains an important research topic with many applications. We therefore suggest continuing research in this field, either by combining two techniques to obtain a better one, or by using another source mapping (Hamming), such as embedding a linear array into a hypercube, together with proven techniques such as Huffman coding.
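To make the comparison concrete, below is a minimal LZW compression sketch in Python. It is our own illustration rather than the paper's implementation, and it omits practical details such as code-width management, dictionary resets, and the matching decompressor.

```python
# Minimal LZW compression sketch (illustrative only; not the paper's code).
# Output codes 0-255 denote single bytes; new phrases get codes from 256 up.
def lzw_compress(data: bytes) -> list[int]:
    # Initialize the dictionary with all single-byte sequences.
    dictionary = {bytes([i]): i for i in range(256)}
    next_code = 256
    current = b""
    output = []
    for byte in data:
        candidate = current + bytes([byte])
        if candidate in dictionary:
            current = candidate                  # extend the current match
        else:
            output.append(dictionary[current])   # emit code for longest match
            dictionary[candidate] = next_code    # learn the new phrase
            next_code += 1
            current = bytes([byte])
    if current:
        output.append(dictionary[current])
    return output

# Repetitive input compresses well, consistent with the paper's finding
# that LZW does best on large (hence more redundant) files.
codes = lzw_compress(b"TOBEORNOTTOBEORTOBEORNOT")
print(f"{len(codes)} codes emitted for 24 input bytes")
```

Because the dictionary is rebuilt identically on the decoder side, no code table needs to be transmitted, which is part of why LZW scales well with file size.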
High-resolution terrain elevation and imagery data place a heavy burden on interactive 3D terrain visualization, chiefly in data storage, scheduling and transmission, and real-time rendering. This paper designs a high-performance terrain-data compression method based on the lifting wavelet transform and parallel hybrid entropy coding, and combines it with GPU (Graphics Processing Unit) ray-casting to visualize large-scale 3D terrain. First, a wavelet-transform model of multi-resolution terrain tiles is built to map their refinement and simplification operations. Next, multi-resolution quadtrees for the gridded Digital Elevation Model (DEM) and the surface texture are constructed via the lifting wavelet transform, and the quantized sparse wavelet coefficients are compressed with a hybrid entropy coder that combines parallel run-length coding with parallel variable-length Huffman coding; the compressed data are then organized into a multi-sequence, layer-progressive bitstream for real-time decompression and rendering. The lifting wavelet transform and hybrid entropy coding are implemented on the GPU with CUDA (Compute Unified Device Architecture). Experiments show that the method outperforms comparable approaches on the combined metrics of compression ratio, signal-to-noise ratio, and encoding/decoding throughput, and that the high frame rate of real-time rendering meets the requirements of interactive visualization.
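As a concrete illustration of the lifting scheme underlying this method, below is a minimal one-level Haar lifting transform in Python. The choice of the Haar wavelet, the function names, and the CPU/NumPy setting are assumptions made for illustration; the paper's actual wavelet filters and CUDA kernels are not reproduced here.

```python
import numpy as np

# One-level forward Haar lifting: split into even/odd samples, then
# predict (details) and update (approximation). Assumes even-length input.
def haar_lift_forward(signal: np.ndarray):
    even, odd = signal[0::2].astype(float), signal[1::2].astype(float)
    detail = odd - even            # predict: details are prediction errors
    approx = even + detail / 2.0   # update: approximation keeps the local mean
    return approx, detail

# Exact inverse: undo the update, then the predict, then interleave.
def haar_lift_inverse(approx: np.ndarray, detail: np.ndarray) -> np.ndarray:
    even = approx - detail / 2.0
    odd = detail + even
    out = np.empty(even.size + odd.size)
    out[0::2], out[1::2] = even, odd
    return out

# Round trip on a toy elevation profile: smooth terrain yields small,
# near-zero detail coefficients, which quantize to sparse data well
# suited to run-length plus Huffman entropy coding.
dem_row = np.array([10.0, 10.5, 11.0, 11.2, 11.1, 11.0, 10.8, 10.7])
a, d = haar_lift_forward(dem_row)
assert np.allclose(haar_lift_inverse(a, d), dem_row)
print("details:", d)
```

Applying such a step recursively to the approximation band yields the multi-resolution quadtree levels described in the abstract, with most of the signal energy concentrated in a few coefficients.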