Funding: This work was supported in part by the National Natural Science Foundation of China under Grants 71571091 and 71771112, the State Key Laboratory of Synthetical Automation for Process Industries Fundamental Research Funds under Grant PAL-N201801, and the Excellent Talent Training Project of University of Science and Technology Liaoning under Grant 2019RC05.
Abstract: With the advent of the information security era, it is necessary to guarantee the privacy, accuracy, and dependable transfer of images. This study presents a new approach to the encryption and compression of color images, predicated on 2D compressed sensing (CS) and a hyperchaotic system. First, an optimized Arnold scrambling algorithm is applied to the original color images to ensure strong security. Second, the processed images are concurrently encrypted and compressed using 2D CS; here, chaotic sequences replace traditional random measurement matrices to increase the system's security. Third, the processed images are re-encrypted using a combination of permutation and diffusion algorithms. In addition, the 2D projected gradient with embedding decryption (2DPG-ED) algorithm is used to reconstruct the images. Compared with the traditional reconstruction algorithm, the 2DPG-ED algorithm improves security, reduces computational complexity, and offers better robustness. The experimental results and performance analysis indicate that the algorithm withstands malicious attacks and that the method is effective.
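The paper's optimized Arnold scrambling is not spelled out in the abstract; as a rough sketch of the classical Arnold cat-map permutation it builds on (the parameters `a`, `b`, and the iteration count are illustrative, not the paper's), one square channel could be scrambled as follows:

```python
import numpy as np

def arnold_scramble(channel, iterations=5, a=1, b=1):
    """Permute the pixels of one square channel with the generalized Arnold
    cat map (x, y) -> ((x + b*y) mod N, (a*x + (a*b + 1)*y) mod N).
    The map is area-preserving, so it can be inverted to descramble."""
    n = channel.shape[0]
    assert channel.shape[0] == channel.shape[1], "cat map needs a square image"
    out = channel.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + b * y) % n, (a * x + (a * b + 1) * y) % n] = out[x, y]
        out = scrambled
    return out
```

Applying the same map to each RGB channel and keeping the iteration count as part of the key gives the permutation stage; the 2D CS measurement and diffusion stages described above would follow.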
Funding: the National Natural Science Foundation of China (No. 11803036) and the Climbing Program of Changchun University (No. ZKP202114).
Abstract: Multispectral image compression and encryption algorithms commonly suffer from low compression efficiency, lack of synchronization between the compression and encryption processes, and degradation of the intrinsic image structure. A novel approach is proposed to address these issues. First, a chaotic sequence is generated using the three-dimensional Lorenz chaotic map and XORed with each spectral band of the multispectral image to complete the initial encryption. Then, a two-dimensional lifting 9/7 wavelet transform is applied to the processed image. Next, a key-sensitive Arnold scrambling technique is employed on the resulting low-frequency image, which effectively eliminates spatial redundancy in the multispectral image while strengthening the encryption. To further optimize the compression and encryption processes, fast Tucker decomposition is applied to the wavelet sub-band tensor, removing both spectral redundancy and residual spatial redundancy. Finally, the core tensor and mode matrices obtained from the decomposition are entropy encoded, and real-time chaotic encryption is applied during encoding, effectively integrating compression and encryption. The results show that the proposed algorithm is suitable for applications with high requirements for both compression and encryption, and it provides valuable insights for the development of compression and encryption in the multispectral field.
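Only the first stage, the band-wise XOR with a Lorenz-derived keystream, is sketched below; the initial state, step size, and the quantization of the trajectory into bytes are assumptions, not the paper's key schedule:

```python
import numpy as np

def lorenz_keystream(n_bytes, x0=0.1, y0=0.2, z0=0.3,
                     sigma=10.0, rho=28.0, beta=8.0 / 3.0, dt=0.002):
    """Integrate the Lorenz system with a plain Euler step and fold the
    x-trajectory into a byte keystream."""
    x, y, z = x0, y0, z0
    stream = np.empty(n_bytes, dtype=np.uint8)
    for i in range(n_bytes):
        dx, dy, dz = sigma * (y - x), x * (rho - z) - y, x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        stream[i] = int(abs(x) * 1e6) % 256   # quantize the chaotic state
    return stream

def xor_encrypt_bands(cube):
    """XOR every spectral band of a (bands, H, W) uint8 cube with the keystream."""
    flat = cube.reshape(-1)
    return (flat ^ lorenz_keystream(flat.size)).reshape(cube.shape)
```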
Funding: funded by the University of Jeddah, Saudi Arabia, under Grant No. UJ-20-043-DR.
Abstract: This paper presents a novel method that combines wavelets with particle swarm optimization (PSO) for medical image compression. The method uses PSO to overcome the wavelet discontinuity that occurs when images are compressed by thresholding. It transforms images into subband details and approximations using a modified Haar wavelet (MHW) and then applies a threshold to each subband. PSO is used to select the particle assigned to the subband threshold values: nine particle positions, each assigned to threshold values, represent the population, and every particle updates its position according to the global best position (gbest, over all detail subbands) and its local best position (pbest, for a single subband). A fitness criterion terminates PSO when the difference between two successive local best (pbest) values is smaller than a prescribed value. The experiments are applied to five medical images of different types, including MRI, CT, and X-ray. Results show that the proposed algorithm compresses medical images more effectively than other existing wavelet techniques in terms of peak signal-to-noise ratio (PSNR) and compression ratio (CR).
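Since the paper's exact update rule and fitness are not given in the abstract, the following is only a generic PSO sketch for choosing per-subband thresholds; the inertia and acceleration constants, the search range, and the `fitness` callback are all assumptions:

```python
import numpy as np

def pso_thresholds(fitness, n_bands=3, n_particles=9, iters=50, tol=1e-4):
    """Minimal PSO over per-subband thresholds. `fitness` maps a threshold
    vector to a score to maximize (e.g. PSNR at a target compression ratio).
    Nine particles and the stopping tolerance mirror the paper's setup only
    loosely; the constants 0.7 and 1.5 are illustrative."""
    rng = np.random.default_rng(0)
    pos = rng.uniform(0.0, 50.0, size=(n_particles, n_bands))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmax(pbest_val)].copy()
    last_best = -np.inf
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, None)
        vals = np.array([fitness(p) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)].copy()
        if abs(pbest_val.max() - last_best) < tol:   # stop once pbest stalls
            break
        last_best = pbest_val.max()
    return gbest
```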
Funding: Supported by the Special Fund for Scientific Research of the Shaanxi Education Department (No. 2010JK463) and the Shaanxi Natural Science Foundation (2011JE012).
Abstract: [Objective] The aim was to propose a new image compression technique so that images can be stored in a smaller space and transmitted at a lower bit rate while guaranteeing image quality in the rape crop monitoring system in the Qinling Mountains. [Method] The color image was divided into brightness images of the three primary colors, followed by sub-image division and DCT processing. The transform-domain coefficients were then quantized and compressed with Huffman coding. Finally, decompression was conducted through the inverse process and the decompressed images were compared with the originals. [Result] The simulation results show that at a compression ratio of 11.9723:1 for the color image of rape crops, the differences between the decompressed and source images are indistinguishable to the naked eye; at a ratio as high as 53.5656:1, the PSNR remained above 30 dB, the coding efficiency exceeded 0.78, and the redundancy was below 0.22. [Conclusion] The results indicate that the proposed color image compression technique achieves a high compression ratio while preserving good image quality. The encoding quality and the decompressed images fully met the storage and transmission requirements of the rape crop monitoring system in the Qinling Mountains.
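As a loose illustration of the transform-and-quantize step described above (the 8x8 block size and the flat quantization step are assumptions; the paper's Huffman stage and quantization table are omitted):

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix."""
    k = np.arange(n)
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

def blockwise_dct_quantize(channel, q=20):
    """Split one brightness channel into 8x8 blocks, transform each block with
    the 2-D DCT, and quantize with a flat step `q`; a real coder would use a
    perceptual quantization table followed by Huffman coding."""
    h, w = (channel.shape[0] // 8) * 8, (channel.shape[1] // 8) * 8
    img = channel[:h, :w].astype(float) - 128.0
    c = dct_matrix(8)
    out = np.empty((h, w), dtype=np.int32)
    for i in range(0, h, 8):
        for j in range(0, w, 8):
            coeff = c @ img[i:i + 8, j:j + 8] @ c.T
            out[i:i + 8, j:j + 8] = np.round(coeff / q).astype(np.int32)
    return out
```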
Funding: Project 60571049 supported by the National Natural Science Foundation of China.
Abstract: By investigating the limitations of existing wavelet-tree-based image compression methods, we propose a novel wavelet fractal image compression method. Briefly, initial error tolerances are assigned according to the importance of the wavelet coefficients in each frequency subband, with higher-frequency subbands given larger initial errors. The sizes of the sublevel blocks and super blocks are then adapted to these initial errors, and the matching sizes between sublevel blocks and super blocks are adjusted according to the permitted errors and compression rates. Systematic analyses and experimental results demonstrate that the proposed method provides satisfactory performance, with a clearly higher compression ratio and encoding speed and without reducing the SNR or the quality of the decoded images. Simulation results show that our method is superior to traditional wavelet-tree-based fractal image compression methods.
Abstract: Many classical vector quantization (VQ) encoding algorithms for image compression that obtain the globally optimal solution have computational complexity O(N). A pure quantum VQ encoding algorithm with a success probability near 100% has been proposed that performs approximately 45√N operations. In this paper, a hybrid quantum VQ encoding algorithm combining the classical method and the quantum algorithm is presented. Its number of operations is less than √N for most images, making it more efficient than the pure quantum algorithm.
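For reference, the classical O(N) step that both quantum schemes accelerate is the exhaustive nearest-codeword search; a minimal sketch (the codebook and block size below are illustrative):

```python
import numpy as np

def vq_encode_block(block, codebook):
    """Full-search VQ encoding of one image block: return the index of the
    codeword with the smallest Euclidean distance. Scanning all N codewords
    is the O(N) cost that the quantum/hybrid search reduces."""
    diffs = codebook - block.reshape(1, -1)            # (N, k) residuals
    return int(np.argmin(np.einsum('ij,ij->i', diffs, diffs)))

# usage with an illustrative 256-word codebook of 4x4 blocks
rng = np.random.default_rng(1)
codebook = rng.integers(0, 256, size=(256, 16)).astype(float)
block = rng.integers(0, 256, size=(4, 4)).astype(float)
print(vq_encode_block(block, codebook))
```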
Funding: This project was supported by the National Natural Science Foundation of China (60532060), the Hainan Education Bureau Research Project (Hjkj200602), and the Hainan Natural Science Foundation (80551).
Abstract: A nonlinear data analysis algorithm, empirical data decomposition (EDD), is proposed, which can perform adaptive analysis of observed data. The analysis filter, which is not a linear constant-coefficient filter, is determined automatically by the observed data and can implement multi-resolution analysis in the same way as the wavelet transform. The algorithm is suitable for analyzing non-stationary data and can effectively remove the correlation in the observed data. The paper then discusses the application of EDD to image compression, presents a two-dimensional data decomposition framework, and makes some modifications to the contexts used by Embedded Block Coding with Optimized Truncation (EBCOT). Simulation results show that EDD is better suited to compressing non-stationary image data.
Funding: Supported by the National Natural Science Foundation of China (No. 60572100) and by the Royal Society (U.K.) International Joint Projects 2006/R3-Cost Share with NSFC (No. 60711130233).
Abstract: A novel Bacterial Foraging Algorithm (BFA)-based neural network is presented for image compression. To improve the quality of the decompressed images, the BFA concepts of reproduction, elimination, and dispersal are introduced into the neural network training for the first time. Extensive experiments on standard test images show that the proposed method significantly improves the quality of the reconstructed images.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 60573172 and 60973152), the Superior University Doctor Subject Special Scientific Research Foundation of China (Grant No. 20070141014), and the Natural Science Foundation of Liaoning Province of China (Grant No. 20082165).
Abstract: This paper uses spatial texture correlation and an intelligent classification algorithm (ICA) search strategy to speed up the encoding process and improve the bit rate of fractal image compression. Texture is one of the most important properties for representing an image; entropy and maximum entry from co-occurrence matrices are used to represent the texture features of an image. For a range block, the candidate domain blocks of neighbouring range blocks with similar texture features can be searched, and domain blocks with similar texture features are also searched during the ICA search process. Experiments show that, compared with some typical methods, the proposed algorithm significantly speeds up encoding and achieves a higher compression ratio with only a slight reduction in the quality of the reconstructed image; compared with a spatial correlation scheme, it requires much less encoding time while the compression ratio and reconstructed image quality are almost the same.
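A rough sketch of the two co-occurrence features named above (entropy and maximum entry) for one block, assuming an 8-level quantization and a fixed horizontal offset:

```python
import numpy as np

def glcm_features(block, levels=8):
    """Build a grey-level co-occurrence matrix for the offset (0, 1) and
    return its entropy and maximum entry, two texture features used to
    group blocks with similar texture."""
    q = (block.astype(float) * levels / 256.0).astype(int).clip(0, levels - 1)
    glcm = np.zeros((levels, levels))
    for i in range(q.shape[0]):
        for j in range(q.shape[1] - 1):
            glcm[q[i, j], q[i, j + 1]] += 1
    p = glcm / glcm.sum()
    nz = p[p > 0]
    entropy = float(-(nz * np.log2(nz)).sum())
    return entropy, float(p.max())
```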
Funding: Supported by the National Natural Science Foundation of China (69983005).
Abstract: The paper presents a class of nonlinear adaptive wavelet transforms for lossless image compression. In the update step of the lifting scheme, different operators are chosen according to the local gradient of the original image. A nonlinear morphological predictor follows the adaptive update lifting, producing fewer large wavelet coefficients near edges and thereby reducing the coding cost. The nonlinear adaptive wavelet transforms also allow perfect reconstruction without any overhead cost. Experimental results show that the adaptively transformed images have lower entropy than in the non-adaptive case, indicating great potential for lossless image compression.
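The adaptive update operators are the paper's contribution and are not reproduced here; for orientation, one level of the standard (non-adaptive) integer 5/3 lifting transform, with periodic boundary handling as a simplifying assumption, looks like this:

```python
import numpy as np

def lifting_53_forward(signal):
    """One level of the integer 5/3 lifting transform on a 1-D signal of even
    length: split into even/odd samples, predict the odd samples from their
    even neighbours, then update the even samples. The adaptive scheme in the
    paper would switch the update operator based on the local gradient."""
    x = np.asarray(signal, dtype=np.int64)
    even, odd = x[0::2].copy(), x[1::2].copy()
    # predict: detail = odd sample minus the floored mean of its even neighbours
    d = odd - ((even + np.roll(even, -1)) >> 1)
    # update: approximation = even sample plus a rounded combination of details
    s = even + ((np.roll(d, 1) + d + 2) >> 2)
    return s, d
```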
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 61173183, 60973152, and 60573172), the Special Scientific Research Fund for the Doctoral Program of Higher Education of China (Grant No. 20070141014), and the Natural Science Foundation of Liaoning Province, China (Grant No. 20082165).
Abstract: A new method using plane fitting to decide whether a domain block is sufficiently similar to a given range block is proposed in this paper. First, three coefficients are computed to describe each range and domain block. Then, the best match for every range block is obtained by analysing the relation between their coefficients. Experimental results show that the proposed method shortens the encoding time markedly while the quality of the retrieved image remains acceptable. In the decoding step, a simple line fitting on block boundaries is used to reduce blocking effects. At the same time, the proposed method also achieves a high compression ratio.
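The three block descriptors are not defined in the abstract; one plausible reading, offered only as an assumption, is the least-squares plane z ≈ a·x + b·y + c fitted to the block's intensities:

```python
import numpy as np

def fit_plane(block):
    """Least-squares fit of z = a*x + b*y + c to a block's pixel intensities;
    the coefficients (a, b, c) summarize its gradient and mean level and can
    be compared cheaply between range and domain blocks."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, block.astype(float).ravel(), rcond=None)
    return coeffs  # (a, b, c)
```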
Abstract: In this paper, image compression and decompression based on fractal theory are implemented on a personal computer. The algorithm is effective, as the reconstructed image is similar to the original. In the algorithm, the formulas for contrast scaling and luminance shift are simplified, and the Hausdorff distance is replaced by the Euclidean distance, which reduces the computational load. A formula for the compression ratio is derived for an ideal situation, from which one can analyse how different factors influence the image compression ratio.
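The paper's simplified formulas are not given in the abstract; for context, the usual least-squares choice of the contrast scaling s and luminance shift o that they simplify is sketched below:

```python
import numpy as np

def contrast_luminance(domain, rng_block):
    """Least-squares contrast scaling s and luminance shift o so that
    s * domain + o approximates the range block; the fit error (here a plain
    Euclidean/MSE measure) decides whether the pair is accepted."""
    d = domain.astype(float).ravel()
    r = rng_block.astype(float).ravel()
    n = d.size
    denom = n * (d * d).sum() - d.sum() ** 2
    s = 0.0 if denom == 0 else (n * (d * r).sum() - d.sum() * r.sum()) / denom
    o = (r.sum() - s * d.sum()) / n
    err = float(((s * d + o - r) ** 2).mean())
    return s, o, err
```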
Funding: Partially supported by the National Natural Science Foundation of China (No. 60572100), the Foundation of the State Key Laboratory of Networking and Switching Technology (China), and the Science Foundation of Shenzhen City (200408).
Abstract: In this letter, a new Linde-Buzo-Gray (LBG)-based image compression method using the Discrete Cosine Transform (DCT) and Vector Quantization (VQ) is proposed. A gray-level image is first decomposed into blocks, and each block is then encoded by a 2D DCT coding scheme, which reduces the dimension of the vectors used as input to a generalized VQ scheme. The introduction of the DCT step therefore reduces the encoding time of the generalized VQ. The experimental results demonstrate the efficiency of the proposed method.
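A compact sketch of the LBG (k-means style) codebook update that underlies the VQ stage; the codebook size and iteration count below are illustrative:

```python
import numpy as np

def lbg_train(vectors, codebook_size=64, iters=20, seed=0):
    """Train a VQ codebook with the basic LBG iteration: assign each training
    vector to its nearest codeword, then move each codeword to the centroid
    of its cell. `vectors` is an (M, k) array of (possibly DCT-reduced) blocks."""
    rng = np.random.default_rng(seed)
    codebook = vectors[rng.choice(len(vectors), codebook_size, replace=False)].astype(float)
    for _ in range(iters):
        # squared distances between every vector and every codeword
        d2 = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        for c in range(codebook_size):
            members = vectors[labels == c]
            if len(members):
                codebook[c] = members.mean(axis=0)
    return codebook
```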
Funding: This work was supported by the National Natural Science Foundation of China under Grant No. 60472048.
Abstract: Synthetic aperture radar (SAR) images are corrupted by multiplicative speckle noise, which limits the performance of classical coder/decoder algorithms in the spatial domain. The relatively new multiwavelet transform can possess desirable features simultaneously, such as orthogonality and symmetry, which scalar wavelets cannot. In this paper we propose a compression scheme that incorporates speckle noise reduction within the multiwavelet framework. Compared with the classical set partitioning in hierarchical trees (SPIHT) algorithm, our method achieves a favorable peak signal-to-noise ratio (PSNR) and superior speckle noise reduction.
Abstract: A new image compression algorithm is proposed based on local visual activity classification and the investigation of the histograms of small non-overlapping blocks of the differential and angle images. Histograms of the differential blocks are classified according to their visual activity as unimodal, bimodal, or multimodal. According to the histogram shape of a differential block and the mean angle of the same block, an optimized quantization table with special coding is applied, exploiting the local visual activity within the block. Considerable improvements in compression ratio and visual output over the DCT compression algorithm are obtained.
Abstract: The amount of image data generated in multimedia applications is ever increasing, and image compression plays a vital role in such applications. The ultimate aim of image compression is to reduce storage space without degrading image quality; compression is required whenever huge volumes of data must be stored or transmitted. A New Edge-Directed Interpolation (NEDI)-based lifting Discrete Wavelet Transform (DWT) scheme with a modified Set Partitioning In Hierarchical Trees (MSPIHT) algorithm is proposed in this paper. The NEDI algorithm gives good visual quality, particularly at edges, and the main objective of this paper is to preserve edges while performing image compression, which is a challenging task. NEDI with lifting DWT concentrates 99.18% of the energy in the low-frequency range, 1.07% higher than the 5/3 wavelet decomposition and 0.94% higher than the traditional DWT. Combining NEDI-based lifting DWT with the MSPIHT algorithm gives a higher peak signal-to-noise ratio (PSNR), a lower mean square error (MSE), and hence better image quality. The experimental results show that the proposed method gives a better PSNR (39.40 dB at 0.9 bpp without arithmetic coding) and a minimum MSE of 7.4.
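For reference, the two quality measures quoted above can be computed as follows (a minimal sketch for 8-bit images):

```python
import numpy as np

def mse_psnr(original, reconstructed, peak=255.0):
    """Mean square error and PSNR = 10*log10(peak^2 / MSE) between an
    original 8-bit image and its reconstruction."""
    err = original.astype(float) - reconstructed.astype(float)
    mse = float((err ** 2).mean())
    psnr = float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
    return mse, psnr
```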
Abstract: In this paper, three techniques for losslessly compressing classified satellite cloud images are presented: line run coding, quadtree DF (depth-first) representation, and H coding. Of these, the first two were devised by others and the third by ourselves. A comparison of their compression rates is given at the end of the paper. Further application of these image compression techniques to satellite data and other meteorological data looks promising.
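As a loose illustration of the simplest of the three schemes, a run-length ("line run") encoder for one row of class labels might look like this:

```python
def run_length_encode(row):
    """Encode one row of a classified image as (label, run length) pairs;
    long runs of identical class labels are what make these images highly
    compressible without any distortion."""
    runs = []
    current, length = row[0], 1
    for value in row[1:]:
        if value == current:
            length += 1
        else:
            runs.append((current, length))
            current, length = value, 1
    runs.append((current, length))
    return runs

# usage: a row with three cloud classes
print(run_length_encode([0, 0, 0, 2, 2, 1, 1, 1, 1]))  # [(0, 3), (2, 2), (1, 4)]
```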
Abstract: Images need to be stored or transmitted in an efficient form. In this work, a new idea is proposed in which we take advantage of the redundancy across a group of images by compressing them together instead of compressing each image by itself. In the proposed technique, a classification process first divides the input images into groups using existing measures such as the L1 and L2 norms and color histograms. All images that belong to the same group are then compressed by dividing them into sub-images of equal size and saving references to these sub-images in a codebook. When extracting the distinct sub-images, we use the mean squared error (MSE) for comparison and three blurring methods (simple, middle, and majority blurring) to increase the compression ratio. Experiments show that varying the blurring values and MSE thresholds improves the compression results for a group of images compared with the JPEG and PNG compressors.
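A rough sketch of the grouping step only; the histogram bin count and the L2 distance threshold are assumptions, not the paper's settings:

```python
import numpy as np

def color_histogram(image, bins=16):
    """Normalized per-channel color histogram of an (H, W, 3) uint8 image."""
    hists = [np.histogram(image[..., c], bins=bins, range=(0, 256))[0]
             for c in range(3)]
    h = np.concatenate(hists).astype(float)
    return h / h.sum()

def group_images(images, threshold=0.05):
    """Greedy grouping: an image joins the first group whose representative
    histogram is within `threshold` in L2 distance, otherwise starts a group."""
    groups, reps = [], []
    for img in images:
        h = color_histogram(img)
        for gi, rep in enumerate(reps):
            if np.linalg.norm(h - rep) < threshold:
                groups[gi].append(img)
                break
        else:
            groups.append([img])
            reps.append(h)
    return groups
```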
Abstract: When an image decomposed with bi-orthogonal wavelet bases is reconstructed, some information is lost at the four edges of the image and artificial discontinuities are introduced. We use symmetric extension to solve this problem. We only consider the case of two-band filter banks, but the results can be applied to M-band filter banks. There are only two types of symmetric extension in the analysis phase, whole-sample symmetry (WS) and half-sample symmetry (HS), while there are four types in the synthesis phase: WS, HS, whole-sample anti-symmetry (WA), and half-sample anti-symmetry (HA). The exact type is selected according to the image length and the filter length, and we show how to do this. In this way the image can be perfectly reconstructed without any edge effects. Finally, simulation results are reported.
Key words: edge effect; image compression; wavelet; biorthogonal bases; symmetric extension.
Funding: Supported by the National 863 Project (20021111901010).
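For intuition, the two analysis-side extensions correspond to numpy's two reflection padding modes; a small sketch (the extension length of 3 is arbitrary):

```python
import numpy as np

x = np.array([1, 2, 3, 4, 5])

# half-sample symmetry (HS): the boundary sample is repeated, and the mirror
# axis falls halfway between samples
hs = np.pad(x, 3, mode='symmetric')

# whole-sample symmetry (WS): the boundary sample itself is the mirror axis
# and is not repeated
ws = np.pad(x, 3, mode='reflect')

print(hs)  # [3 2 1 1 2 3 4 5 5 4 3]
print(ws)  # [4 3 2 1 2 3 4 5 4 3 2]
```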
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 60573172 and 60973152), the Superior University Doctor Subject Special Scientific Research Foundation of China (Grant No. 20070141014), and the Natural Science Foundation of Liaoning Province of China (Grant No. 20082165).
Abstract: This paper proposes an efficient lossless image compression scheme for still images based on adaptive arithmetic coding. Combining an adaptive probability model with predictive coding, the algorithm increases the compression rate while ensuring the quality of the decoded image. An adaptive model for each encoded image block dynamically estimates the probabilities of the symbols in that block, and the decoder can accurately recover each encoded image block from the codebook information. Adopting adaptive arithmetic coding for image compression greatly improves the compression rate, and the results show that it is an effective compression technique.
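The arithmetic coder itself is too long to reproduce here, but a minimal sketch of the adaptive probability model that drives it (an order-0 frequency count, offered as an assumption about the general idea rather than the paper's exact model) is:

```python
class AdaptiveModel:
    """Order-0 adaptive probability model: symbol frequencies start uniform
    and are incremented as each symbol is coded, so the probability estimate
    tracks the statistics of the current image block."""

    def __init__(self, num_symbols=256):
        self.counts = [1] * num_symbols     # Laplace-style initialization

    def probability(self, symbol):
        return self.counts[symbol] / sum(self.counts)

    def update(self, symbol):
        self.counts[symbol] += 1

# usage: the estimate adapts toward the block's actual symbol distribution
model = AdaptiveModel(4)
for s in [0, 0, 0, 3]:
    model.update(s)
print(round(model.probability(0), 3))  # 0.5
```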