In this study, the micro-failure process and failure mechanism of a typical brittle rock under uniaxial compression are investigated via continuous real-time measurement of wave velocities. The experimental results indicate that the evolution of wave velocities becomes progressively anisotropic under uniaxial loading due to the direction-dependent development of micro-damage. A wave velocity model accounting for the anisotropic evolution of internal cracks is proposed to accurately describe the variation of wave velocities during uniaxial compression testing. Based on this model, the effective elastic parameters are inferred with a transversely isotropic constitutive model, and the evolution of crack density is inverted using a self-consistent damage model. It is found that the propagation of axial cracks dominates the failure process of brittle rock under uniaxial loading, and that oblique shear cracks develop once a macrocrack appears.
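As a rough illustration of how a velocity drop maps to directional damage, the sketch below applies the common scalar relation D = 1 - (v/v0)^2 (not the paper's self-consistent model) to hypothetical axial and lateral velocity histories:

```python
import numpy as np

# Illustrative only: a standard scalar damage estimate from wave-velocity
# drop, D = 1 - (v / v0)**2, applied separately to axial and lateral
# P-wave velocities to expose the direction-dependent (anisotropic)
# damage growth described above. All velocity values are hypothetical.
v0 = 4500.0                                          # initial P-wave velocity, m/s
v_axial = np.array([4500.0, 4480.0, 4430.0, 4300.0])    # along loading axis
v_lateral = np.array([4500.0, 4350.0, 4050.0, 3600.0])  # perpendicular to it

D_axial = 1.0 - (v_axial / v0) ** 2
D_lateral = 1.0 - (v_lateral / v0) ** 2

# Lateral velocities fall faster because axial cracks open normal to the
# lateral ray path, so the inferred lateral damage dominates.
print(D_axial.round(3))
print(D_lateral.round(3))
```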
By investigating the limitations of existing wavelet-tree-based image compression methods, we propose a novel wavelet fractal image compression method in this paper. Briefly, initial error tolerances are assigned according to the importance of the wavelet coefficients in each frequency subband: higher-frequency subbands are given larger initial errors. The sizes of the sublevel blocks and super blocks are then adjusted according to these initial errors, and the matching between sublevel blocks and super blocks is adapted to the permitted errors and compression rates. Systematic analyses are performed, and the experimental results demonstrate that the proposed method provides satisfactory performance, with a clearly increased compression ratio and encoding speed without reducing the SNR or the quality of the decoded images. Simulation results show that our method is superior to traditional wavelet-tree-based fractal image compression methods.
The main problems in three-dimensional gravity inversion are the non-uniqueness of the solutions and the high computational cost of large data sets. To reduce the computational cost, we propose a new sorting method that suppresses fluctuations and high-frequency content in the sensitivity matrix before the wavelet transform is applied. Consequently, the sparsity and compression ratio of the sensitivity matrix are improved, as is the accuracy of the forward modeling. Furthermore, memory requirements are reduced and the forward modeling is accelerated compared with uncompressed forward modeling. The forward modeling results suggest that the compression ratio of the sensitivity matrix can exceed 300. In addition, multiscale inversion based on the wavelet transform is applied to gravity inversion. By decomposing the gravity inversion into subproblems at different scales, the non-uniqueness and stability of the inversion are improved because multiscale data are considered. Finally, we apply conventional focusing inversion and multiscale inversion to simulated and measured data to demonstrate the effectiveness of the proposed gravity inversion method.
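A minimal numpy sketch of the core idea: reordering data so the sensitivity values vary smoothly makes the wavelet representation much sparser under a fixed threshold. The data, one-level Haar transform, and threshold are all illustrative stand-ins:

```python
import numpy as np

# Sketch (synthetic data): a smooth, sorted row compresses far better
# under a wavelet transform than the same row in scattered order,
# because sorting removes the high-frequency fluctuations.
def haar_1d(x):
    """One-level orthonormal Haar transform of an even-length signal."""
    x = np.asarray(x, dtype=float)
    s = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation half
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail half
    return np.concatenate([s, d])

rng = np.random.default_rng(0)
row = rng.standard_normal(1024) ** 2         # positive "sensitivities"

unsorted_coeffs = haar_1d(row)
sorted_coeffs = haar_1d(np.sort(row))        # sorting smooths the row

tol = 1e-2                                   # illustrative threshold
nnz_unsorted = int(np.sum(np.abs(unsorted_coeffs) > tol))
nnz_sorted = int(np.sum(np.abs(sorted_coeffs) > tol))

# Sorting concentrates energy in the approximation half, so far fewer
# coefficients survive the threshold and the compression ratio rises.
print(nnz_unsorted, nnz_sorted)
```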
The paper presents a class of nonlinear adaptive wavelet transforms for lossless image compression. In the update step of the lifting scheme, different operators are chosen according to the local gradient of the original image. A nonlinear morphological predictor follows the adaptive update lifting, yielding fewer large wavelet coefficients near edges and thus reducing the coding cost. The nonlinear adaptive wavelet transforms also allow perfect reconstruction without any overhead cost. Experimental results show that the adaptively transformed images have lower entropy than those of the non-adaptive case and that the approach has great potential for lossless image compression.
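The lifting structure into which such gradient-dependent operators are substituted can be sketched as follows. This uses the plain (non-adaptive) LeGall 5/3 predict and update steps with simple symmetric boundary handling, and verifies that lifting is invertible step by step:

```python
import numpy as np

# Sketch of one lifting step (predict + update) for the LeGall 5/3
# wavelet -- the structure into which the paper's gradient-dependent
# update operators would be substituted. Plain operators are used here.
def lift_forward(x):
    x = np.asarray(x, dtype=float)
    even, odd = x[0::2].copy(), x[1::2].copy()
    # Predict: detail = odd sample minus average of neighbouring evens.
    d = odd - 0.5 * (even + np.roll(even, -1))
    d[-1] = odd[-1] - even[-1]              # symmetric boundary
    # Update: smooth = even sample plus a quarter of adjacent details.
    s = even + 0.25 * (np.roll(d, 1) + d)
    s[0] = even[0] + 0.5 * d[0]             # symmetric boundary
    return s, d

def lift_inverse(s, d):
    # Undo the steps in reverse order with the signs flipped.
    even = s - 0.25 * (np.roll(d, 1) + d)
    even[0] = s[0] - 0.5 * d[0]
    odd = d + 0.5 * (even + np.roll(even, -1))
    odd[-1] = d[-1] + even[-1]
    x = np.empty(2 * len(s))
    x[0::2], x[1::2] = even, odd
    return x

x = np.arange(16, dtype=float) ** 1.5
s, d = lift_forward(x)
x_rec = lift_inverse(s, d)
print(np.allclose(x, x_rec))   # lifting is invertible step by step
```

Because every lifting step is undone exactly by its mirror image, any choice of predict/update operator, including a locally adaptive one, still reconstructs perfectly, which is why no side information is needed.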
Due to the particular nature of seismic data, lossless compression algorithms must be used in some cases. In this paper, a lossless compression algorithm based on the integer wavelet transform is studied. Compared with traditional algorithms, it achieves a better compression ratio. The CDF(2, n) biorthogonal wavelet family leads to a better compression ratio than other CDF families, SWE, and CRF, owing to its capability of canceling data redundancies and capturing data characteristics. The CDF(2, n) family is therefore suitable as the wavelet function for lossless compression of seismic data.
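A hedged sketch of one integer-to-integer lifting step for CDF(2, 2) (the LeGall 5/3 transform): rounding inside each lifting step keeps every value integral yet remains exactly invertible, which is the property lossless compression relies on.

```python
import numpy as np

# Integer-to-integer CDF(2,2) lifting sketch. The rounded predict and
# update are recomputed identically on the inverse side, so the
# transform maps integers to integers and is exactly reversible.
def int_lift_forward(x):
    x = np.asarray(x, dtype=np.int64)
    even, odd = x[0::2].copy(), x[1::2].copy()
    ev_next = np.append(even[1:], even[-1])      # symmetric edge
    d = odd - ((even + ev_next) >> 1)            # rounded predict
    d_prev = np.append(d[0], d[:-1])
    s = even + ((d_prev + d + 2) >> 2)           # rounded update
    return s, d

def int_lift_inverse(s, d):
    d_prev = np.append(d[0], d[:-1])
    even = s - ((d_prev + d + 2) >> 2)
    ev_next = np.append(even[1:], even[-1])
    odd = d + ((even + ev_next) >> 1)
    x = np.empty(2 * len(s), dtype=np.int64)
    x[0::2], x[1::2] = even, odd
    return x

x = np.array([3, 7, 1, 8, 2, 9, 4, 6], dtype=np.int64)
s, d = int_lift_forward(x)
print(int_lift_inverse(s, d))   # exactly recovers x: lossless
```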
DEM data are an important component of spatial databases in GIS. Their volume is so large that compression is necessary. The wavelet transform has many advantages and has become a trend in data compression. Considering the simplicity and high efficiency required of the compression system, an integer wavelet transform is applied to DEM data, and a simple, highly efficient coding algorithm is introduced. Experiments on a variety of DEMs are carried out, and some useful rules are presented at the end of this paper.
This paper presents a novel method combining wavelets with particle swarm optimization (PSO) for medical image compression. Our method uses PSO to overcome the wavelet discontinuity that occurs when compressing images by thresholding. It transforms images into subband details and approximations using a modified Haar wavelet (MHW) and then applies a threshold. PSO is applied to select the particle assigned to the threshold values of the subbands. Nine positions assigned to particle values are used to represent the population. Each particle updates its position depending on the global best position (gbest, over all detail subbands) and the local best position (pbest, within a subband). A fitness criterion terminates the PSO when the difference between two successive local best (pbest) values is smaller than a prescribed value. The experiments are applied to five different medical images of MRI, CT, and X-ray types. The results show that the proposed algorithm is preferable for compressing medical images compared with other existing wavelet techniques in terms of peak signal-to-noise ratio (PSNR) and compression ratio (CR).
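To illustrate the role PSO plays here, the following sketch uses a nine-particle swarm to pick a threshold that minimizes a hypothetical error-versus-rate fitness. The fitness weights, coefficients, and PSO constants are stand-ins, not the paper's:

```python
import numpy as np

# Illustrative nine-particle PSO threshold search (not the paper's
# exact fitness): trade reconstruction error against the fraction of
# retained detail coefficients. All constants are hypothetical.
rng = np.random.default_rng(1)
coeffs = rng.standard_normal(512)            # stand-in detail subband

def fitness(t):
    t = abs(t)
    kept = np.where(np.abs(coeffs) > t, coeffs, 0.0)
    mse = np.mean((coeffs - kept) ** 2)      # thresholding error
    rate = np.mean(kept != 0)                # fraction of kept coeffs
    return mse + 0.5 * rate                  # hypothetical trade-off

n, iters = 9, 60
pos = rng.uniform(0.0, 3.0, n)               # particle positions
vel = np.zeros(n)
pbest = pos.copy()
pbest_f = np.array([fitness(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)]

for _ in range(iters):
    r1, r2 = rng.random(n), rng.random(n)
    # Standard PSO velocity rule: inertia + cognitive + social pulls.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([fitness(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)]

print(round(float(abs(gbest)), 3))           # selected threshold
```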
In this paper, the second-generation wavelet transform is applied to lossless image coding, exploiting its characteristic of reversible integer wavelet transformation. The second-generation wavelet transform provides a higher compression ratio than Huffman coding while, unlike the first-generation wavelet transform, reconstructing the image without loss. The experimental results show that the second-generation wavelet transform achieves excellent performance in medical image compression coding.
A new adaptive wavelet packet algorithm based on the discrete cosine harmonic wavelet transform (DCHWT), denoted DCAHWP, is proposed. It is realized through the DCHWT, which exploits the good properties of the DCT, viz., energy compaction (low leakage), frequency resolution, and computational simplicity due to its real nature, compared with the DFT and its harmonic wavelet version. Hence the proposed wavelet packet is advantageous in both performance and computational efficiency compared with the existing DFT harmonic wavelet packet. Furthermore, the new DCAHWP enjoys the desirable properties of a harmonic wavelet transform over the time-domain WT, viz., built-in decimation without any explicit antialiasing filtering, and easy interpolation by mere concatenation of different scales in the frequency (DCT) domain, without any image-rejection filter and without the laborious delay compensation otherwise required. The compression achieved by the proposed DCAHWP is also much better than that of the adaptive WP based on the Daubechies-2 wavelet (DBAWP). For a compression factor (CF) of 1/8, the ratio of the percentage error energy of DCAHWP to that of DBAWP is about 1/8 and 1/5 for the considered 1-D signal and speech signal, respectively. Its compression performance is better than that of the DCHWT for both 1-D and 2-D signals. The improvement is more significant for signals with abrupt changes or images with rapid variations (textures). For a compression factor of 1/8, the ratio of the percentage error energy of DCAHWP to that of the DCHWT is about 1/3 and 1/2 for the considered 1-D signal and speech signal, respectively. For a considered image this factor is 2/3, and for a textural image in particular it is 1/5.
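The energy-compaction advantage of the DCT over the DFT, which the DCHWT builds on, can be demonstrated directly. The percentage-error-energy measure below mirrors the one quoted above; the smooth test signal is an assumption:

```python
import numpy as np

# Sketch: for a smooth, non-periodic real signal the DCT packs energy
# into fewer coefficients than the DFT (which suffers edge leakage),
# and the DCT is real-valued. Signal and keep-count are illustrative.
def dct_ii(x):
    """Orthonormal DCT-II via its explicit transform matrix."""
    n = len(x)
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    C = np.cos(np.pi * (2 * m + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    C[0] /= np.sqrt(2.0)
    return C @ x

def error_energy(coeffs, keep):
    """% error energy when all but the `keep` largest coeffs are zeroed."""
    mag = np.abs(coeffs)
    small = np.sort(mag)[: len(mag) - keep]
    return 100.0 * np.sum(small ** 2) / np.sum(mag ** 2)

n = 256
t = np.linspace(0.0, 1.0, n)
x = np.exp(-3 * t) + 0.3 * t ** 2        # smooth test signal

keep = n // 8                            # compression factor 1/8
e_dct = error_energy(dct_ii(x), keep)
e_dft = error_energy(np.fft.fft(x), keep)
print(e_dct < e_dft)                     # DCT compacts energy better
```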
In this paper, the embedded zerotree wavelet (EZW) method and Huffman coding are proposed to compress infrared (IR) spectra. We found that this technique is much better than others at coding wavelet coefficients efficiently, because zerotree quantization is an effective way of exploiting the self-similarity of wavelet coefficients across resolutions.
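A minimal Huffman coder of the kind applied after EZW can be sketched with a heap of partial codebooks. The symbol stream below is only a stand-in for EZW output symbols:

```python
import heapq
from collections import Counter

# Minimal Huffman coder: build a prefix code from symbol frequencies,
# then encode the stream. Heap entries carry {symbol: code-so-far}
# dictionaries that grow a prefix bit at every merge.
def huffman_code(symbols):
    freq = Counter(symbols)
    if len(freq) == 1:                   # degenerate one-symbol case
        return {next(iter(freq)): "0"}
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)                    # unique tiebreaker for pushes
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

stream = list("ZZZZZZZZPPPPNNTZ")        # toy stand-in symbol stream
code = huffman_code(stream)
encoded = "".join(code[s] for s in stream)
print(len(encoded), "bits vs", 2 * len(stream), "for fixed 2-bit codes")
```

The skewed frequencies typical of zerotree output (mostly zerotree symbols) are exactly what makes the variable-length code pay off.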
When an image decomposed with biorthogonal wavelet bases is reconstructed, some information is lost at the four edges of the image and artificial discontinuities are introduced. We use a method called symmetric extension to solve this problem. We consider only the case of two-band filter banks, but the results can be applied to M-band filter banks. There are only two types of symmetric extension in the analysis phase, namely whole-sample symmetry (WS) and half-sample symmetry (HS), while there are four types in the synthesis phase, namely WS, HS, whole-sample anti-symmetry (WA), and half-sample anti-symmetry (HA). The exact type can be selected according to the image length and the filter length, and we show how to do so. In this way the image can be perfectly reconstructed without any edge effects. Finally, simulation results are reported. Keywords: edge effect, image compression, wavelet, biorthogonal bases, symmetric extension. CLC number: TP 37. Foundation item: Supported by the National 863 Project (20021111901010). Biography: Yu Sheng-sheng (1944-), male, Professor; research directions: multimedia information processing, SAN.
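The two analysis-side extension types correspond exactly to NumPy's `reflect` and `symmetric` padding modes, which makes them easy to demonstrate: WS mirrors about the edge sample itself (no repeat), while HS mirrors between samples (edge repeated).

```python
import numpy as np

# Whole-sample symmetry (WS) vs half-sample symmetry (HS) extension of
# a signal edge, as used before filtering to avoid edge artifacts.
x = np.array([1, 2, 3, 4])

ws = np.pad(x, 2, mode="reflect")    # WS: mirror about the edge sample
hs = np.pad(x, 2, mode="symmetric")  # HS: mirror between samples

print(ws)   # edge sample not repeated
print(hs)   # edge sample repeated
```

Matching the extension type to the parities of the signal length and filter length is what lets the synthesis bank cancel the extension exactly, giving reconstruction free of edge effects.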
In this paper a square wavelet thresholding method is proposed and evaluated against the classical wavelet thresholding methods (soft and hard). The main contribution of this work is to design and implement a new wavelet thresholding method, evaluate it against the classical methods, and search for the optimal mother wavelet among the wide families, with a suitable decomposition level, followed by the proposed thresholding method. The optimized method is used to shrink the wavelet coefficients and yield an adequately compressed pressure signal prior to transmission. For the comparative evaluation, the proposed procedure is used to compress a synthetic signal and obtain optimal results by minimizing the signal memory size and its transmission bandwidth. Various performance indices exist for evaluating signal compression; the best-known measures are NMSE, ESNR, and PDR. The results showed the dominance of the square wavelet thresholding method over the other methods under these measures, supporting the adoption of the proposed thresholding method for 1-D signal compression in future research.
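For reference, the classical hard and soft rules look like this. The "square" rule below is only a hypothetical garrote-like placeholder showing where a new shrinkage rule slots in; the paper's actual square rule is not reproduced here.

```python
import numpy as np

# Classical thresholding rules plus a hypothetical "square" shrinkage
# (garrote-like, shrinking by t**2/|c|) -- NOT the paper's definition,
# included only to show the shape of a custom rule.
def hard_threshold(c, t):
    return np.where(np.abs(c) > t, c, 0.0)

def soft_threshold(c, t):
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def square_threshold(c, t):
    # Hypothetical: keep large coeffs, shrinking each by t**2 / c.
    safe = np.where(c == 0, 1.0, c)          # avoid divide-by-zero
    return np.where(np.abs(c) > t, c - t ** 2 / safe, 0.0)

c = np.array([-3.0, -0.5, 0.2, 1.0, 4.0])
t = 0.8
print(hard_threshold(c, t))
print(soft_threshold(c, t))
print(square_threshold(c, t))
```

The garrote-like shape sits between hard and soft: small survivors are shrunk noticeably, large ones barely at all.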
We study an approach to the integer wavelet transform for lossless compression of medical images in a medical picture archiving and communication system (PACS). A reversible integer wavelet transform is generated by the lifting scheme; it has features similar to those of the corresponding biorthogonal wavelet transform. Experimental results of the method based on the integer wavelet transform show good performance and great potential in medical image compression.
As volume sizes increase, it is necessary to develop a highly efficient compression algorithm suitable for progressive refinement between the data server and the browsing client. For three-dimensional large-volume data, an efficient hierarchical algorithm based on wavelet compression is presented that uses intra-band dependencies of wavelet coefficients. First, after applying blockwise hierarchical wavelet decomposition to the volume data, a block significance map is obtained, using one bit to indicate the significance or insignificance of each block. Second, a coefficient block is subdivided into eight sub-blocks if it contains any significant coefficient, and the process is repeated, resulting in an incomplete octree; one bit indicates significance or insignificance, and only significant coefficients are stored in the data stream. Finally, the significant coefficients are quantized and compressed by arithmetic coding. The experimental results show that the proposed algorithm achieves good compression ratios and is suited to random access of data blocks. The results also show that the algorithm can be applied to progressive transmission of 3D volume data.
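The first stage, the one-bit block significance map, can be sketched in a few lines. 2-D blocks are used here for brevity (the paper's octree uses 3-D blocks), and the threshold, block size, and data are illustrative:

```python
import numpy as np

# Sketch of a block significance map: one bit per block marking whether
# any coefficient in it exceeds the threshold. Only significant blocks
# would be subdivided further and have their coefficients stored.
def significance_map(coeffs, block, thresh):
    h, w = coeffs.shape
    bmap = np.zeros((h // block, w // block), dtype=bool)
    for i in range(0, h, block):
        for j in range(0, w, block):
            bmap[i // block, j // block] = (
                np.abs(coeffs[i:i + block, j:j + block]).max() > thresh
            )
    return bmap

rng = np.random.default_rng(2)
coeffs = rng.standard_normal((16, 16)) * 0.1   # mostly insignificant
coeffs[0:4, 0:4] *= 50.0                       # one energetic corner block

bmap = significance_map(coeffs, 4, 1.0)
print(int(bmap.sum()), "of", bmap.size, "blocks significant")
```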
The amount of aggregated data has been expanding at an exponential rate in recent years. Various sources are responsible for this tremendous growth, including social media, video camera footage, wireless and wired sensor network measurements, stock market and other financial transaction data, and supermarket transaction data. Such data may be high-dimensional and big in Volume, Value, Velocity, Variety, and Veracity. Hence one of the crucial challenges is the storage, processing, and extraction of relevant information from the data. In the special case of image data, image compression techniques may be employed to reduce the dimension and volume of the data so that it is convenient to process and analyze. In this work, we examine a proof-of-concept multiresolution analytics approach that uses wavelet transforms, a popular mathematical and analytical framework for signal processing and representation, and study its application to compressing image data in wireless sensor networks. The proposed approach consists of applying a wavelet transform, threshold detection, quantization and data encoding, and finally the inverse transform. The work focuses on multiresolution analysis with wavelet transforms, comparing three wavelets at five decomposition levels. Simulation results are provided to demonstrate the effectiveness of the methodology.
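The pipeline described above (transform, threshold, quantize, inverse transform) can be sketched end to end on a 1-D signal with a Haar transform. The step sizes are illustrative and no entropy coder is included:

```python
import numpy as np

# End-to-end sketch: Haar transform -> threshold -> uniform quantization
# -> inverse transform, on a stand-in "sensor image row".
def haar_fwd(x):
    s = (x[0::2] + x[1::2]) / 2.0   # averages (approximation)
    d = (x[0::2] - x[1::2]) / 2.0   # half-differences (detail)
    return s, d

def haar_inv(s, d):
    x = np.empty(2 * len(s))
    x[0::2], x[1::2] = s + d, s - d
    return x

t = np.linspace(0.0, 2.0 * np.pi, 256)
x = np.sin(t) + 0.05 * np.sin(20 * t)   # stand-in sensor signal

s, d = haar_fwd(x)
d[np.abs(d) < 0.02] = 0.0               # threshold small details
q = 0.01
d = np.round(d / q) * q                 # uniform quantization
x_rec = haar_inv(s, d)

rmse = np.sqrt(np.mean((x - x_rec) ** 2))
print(rmse < 0.05)                      # small reconstruction error
```

In the sensor-network setting, the quantized coefficients (mostly zeros after thresholding) are what would be entropy-coded and transmitted.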
In this paper, a new mesh-based algorithm is applied to motion estimation and compensation in the wavelet domain. The first major contribution of this work is the introduction of a new active-mesh-based method for motion estimation and compensation. The proposed algorithm is based on mesh energy minimization with novel sets of energy functions whose features improve the accuracy of the motion estimation and compensation. We employ the proposed motion estimation algorithm in two ways for video compression: in the first approach it is applied to motion estimation between consecutive frames, and in the second it is applied to motion estimation and compensation within the wavelet subbands. The experimental results reveal that incorporating active-mesh-based motion-compensated temporal filtering into the wavelet subbands significantly improves the rate-distortion performance of the video compression. We also use a new wavelet coder, based on a retained-energy criterion, for coding the 3D volume of coefficients; this coder gives the maximum retained energy in all subbands. The proposed algorithm was tested on several video sequences, and the results showed that using the proposed active mesh method for motion compensation, implemented in the subbands, yields a significant improvement in PSNR performance.
In this paper, the image quality of two types of compression methods, wavelet based and seam-carving based, is investigated. A metric is introduced to compare image quality under the wavelet and seam carving schemes. Meyer, Coiflet-2, and JPEG2000 wavelets are used as the wavelet-based methods. A Hausdorff-distance-based metric (HDM) is proposed and used to compare the two compression methods instead of model-based or correspondence-based matching techniques, because there is no pairing of points between the two sets being compared. Moreover, an entropy-based metric (EM) or a peak signal-to-noise-ratio-based metric (PSNRM) cannot be used to compare the two schemes, as seam carving tends to deform objects. Wavelet-compressed images at different compression percentages were analyzed with the HDM and EM, and it was observed that the HDM tracks the EM/PSNRM for wavelet-based compression. The HDM was then used to compare wavelet-compressed and seam-carved images at different compression percentages. The initial results suggest that the HDM is the better metric for comparing wavelet-based and seam-carved images.
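The quantity underlying the HDM is the symmetric Hausdorff distance between two point sets, sketched here with brute-force pairwise distances. In practice the sets would be feature or edge points extracted from the images being compared:

```python
import numpy as np

# Symmetric Hausdorff distance: the largest distance from any point in
# one set to its nearest neighbour in the other. No point pairing is
# required, which is why it suits deformed (seam-carved) content.
def hausdorff(A, B):
    # Pairwise distance matrix, shape (len(A), len(B)).
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return max(D.min(axis=1).max(), D.min(axis=0).max())

A = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
B = np.array([[0.0, 0.1], [1.0, 0.0], [3.0, 0.0]])
print(hausdorff(A, B))   # 2.0: the outlier (3, 0) dominates
```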
Building on an explanation of the wavelet fractal compression method, the significance of introducing wavelet decomposition into the conventional fractal compression method is investigated in depth from both theoretical and practical points of view. The results of this study can serve as valuable guidelines for exploiting the wavelet transform to develop more effective image compression algorithms.
To use residual redundancy to reduce errors induced by fading channels, and to decrease the complexity of the field model describing the probability structure of that redundancy, a simplified statistical model of residual redundancy and a low-complexity joint source-channel decoding (JSCD) algorithm are proposed. The complicated residual redundancy in wavelet-compressed images is decomposed into several independent 1-D probability check equations composed of Markov chains, and it is regarded as a natural channel code with a structure similar to that of a low-density parity-check (LDPC) code. A parallel sum-product (SP), iterative JSCD algorithm is proposed. Simulation results show that the proposed algorithm makes full use of residual redundancy in different directions to correct errors, improves the peak signal-to-noise ratio (PSNR) of the reconstructed image, and reduces the complexity and delay of JSCD. Its performance is more robust than that of the traditional separate coding system with arithmetic coding at the same data rate.
基金Projects(41502283,41772309)supported by the National Natural Science Foundation of ChinaProject(2017YFC1501302)supported by the National Key Research and Development Program of ChinaProject(2017ACA102)supported by the Major Program of Technological Innovation of Hubei Province,China。
文摘In this study,the micro-failure process and failure mechanism of a typical brittle rock under uniaxial compression are investigated via continuous real-time measurement of wave velocities.The experimental results indicate that the evolutions of wave velocities became progressively anisotropic under uniaxial loading due to the direction-dependent development of micro-damage.A wave velocity model considering the inner anisotropic crack evolution is proposed to accurately describe the variations of wave velocities during uniaxial compression testing.Based on which,the effective elastic parameters are inferred by a transverse isotropic constitutive model,and the evolutions of the crack density are inversed using a self-consistent damage model.It is found that the propagation of axial cracks dominates the failure process of brittle rock under uniaxial loading and oblique shear cracks develop with the appearance of macrocrack.
基金Project 60571049 supported by the National Natural Science Foundation of China
文摘By investigating the limitation of existing wavelet tree based image compression methods, we propose a novel wavelet fractal image compression method in this paper. Briefly, the initial errors are appointed given the different levels of importance accorded the frequency sublevel band wavelet coefficients. Higher frequency sublevel bands would lead to larger initial errors. As a result, the sizes of sublevel blocks and super blocks would be changed according to the initial errors. The matching sizes between sublevel blocks and super blocks would be changed according to the permitted errors and compression rates. Systematic analyses are performed and the experimental results demonstrate that the proposed method provides a satisfactory performance with a clearly increasing rate of compression and speed of encoding without reducing SNR and the quality of decoded images. Simulation results show that our method is superior to the traditional wavelet tree based methods of fractal image compression.
基金This work was supported by the Key National Research Project of China (Nos. 2017YFC0601900 and 2016YFC0303100) and the Key Program of National Natural Science Foundation of China (Nos. 41530320 and 41774125).
文摘The main problems in three-dimensional gravity inversion are the non-uniqueness of the solutions and the high computational cost of large data sets. To minimize the high computational cost, we propose a new sorting method to reduce fluctuations and the high frequency of the sensitivity matrix prior to applying the wavelet transform. Consequently, the sparsity and compression ratio of the sensitivity matrix are improved as well as the accuracy of the forward modeling. Furthermore, memory storage requirements are reduced and the forward modeling is accelerated compared with uncompressed forward modeling. The forward modeling results suggest that the compression ratio of the sensitivity matrix can be more than 300. Furthermore, multiscale inversion based on the wavelet transform is applied to gravity inversion. By decomposing the gravity inversion into subproblems of different scales, the non-uniqueness and stability of the gravity inversion are improved as multiscale data are considered. Finally, we applied conventional focusing inversion and multiscale inversion on simulated and measured data to demonstrate the effectiveness of the proposed gravity inversion method.
基金Supported by the National Natural Science Foundation of China (69983005)
文摘The paper presents a class of nonlinear adaptive wavelet transforms for lossless image compression. In update step of the lifting the different operators are chosen by the local gradient of original image. A nonlinear morphological predictor follows the update adaptive lifting to result in fewer large wavelet coefficients near edges for reducing coding. The nonlinear adaptive wavelet transforms can also allow perfect reconstruction without any overhead cost. Experiment results are given to show lower entropy of the adaptive transformed images than those of the non-adaptive case and great applicable potentiality in lossless image compresslon.
文摘Due to the particularity of the seismic data, they must be treated by lossless compression algorithm in some cases. In the paper, based on the integer wavelet transform, the lossless compression algorithm is studied. Comparing with the traditional algorithm, it can better improve the compression rate. CDF (2, n) biorthogonal wavelet family can lead to better compression ratio than other CDF family, SWE and CRF, which is owe to its capability in can- celing data redundancies and focusing data characteristics. CDF (2, n) family is suitable as the wavelet function of the lossless compression seismic data.
文摘DEM data is an important component of spatial database in GIS. The data volume is so huge that compression is necessary. Wavelet transform has many advantages and has become a trend in data compression. Considering the simplicity and high efficiency of the compression system, integer wavelet transform is applied to DEM and a simple coding algorithm with high efficiency is introduced. Experiments on a variety of DEM are carried out and some useful rules are presented at the end of this paper.
基金funded by the University of Jeddah,Saudi Arabia,under Grant No.UJ-20-043-DR。
文摘This paper presents a novel method utilizing wavelets with particle swarm optimization(PSO)for medical image compression.Our method utilizes PSO to overcome the wavelets discontinuity which occurs when compressing images using thresholding.It transfers images into subband details and approximations using a modified Haar wavelet(MHW),and then applies a threshold.PSO is applied for selecting a particle assigned to the threshold values for the subbands.Nine positions assigned to particles values are used to represent population.Every particle updates its position depending on the global best position(gbest)(for all details subband)and local best position(pbest)(for a subband).The fitness value is developed to terminate PSO when the difference between two local best(pbest)successors is smaller than a prescribe value.The experiments are applied on five different medical image types,i.e.,MRI,CT,and X-ray.Results show that the proposed algorithm can be more preferably to compress medical images than other existing wavelets techniques from peak signal to noise ratio(PSNR)and compression ratio(CR)points of views.
基金Supported by the National Natural Science Foundation of China!( 6 9875 0 0 9)
文摘In this paper, the second generation wavelet transform is applied to image lossless coding, according to its characteristic of reversible integer wavelet transform. The second generation wavelet transform can provide higher compression ratio than Huffman coding while it reconstructs image without loss compared with the first generation wavelet transform. The experimental results show that the se cond generation wavelet transform can obtain excellent performance in medical image compression coding.
文摘A new adaptive Packet algorithm based on Discrete Cosine harmonic wavelet transform (DCHWT), (DCAHWP) has been proposed. This is realized by the Discrete Cosine Harmonic Wavelet transform (DCHTWT) which exploits the good properties of DCT viz., energy compaction (low leakage), frequency resolution and computational simplicity due its real nature, compared to those of DFT and its harmonic wavelet version. Hence the proposed wavelet packet is advantageous both in terms of performance and computational efficiency compared to those of existing DFT harmonic wavelet packet. Further, the new DCAHWP also enjoys the desirable properties of a Harmonic wavelet transform over the time domain WT, viz., built in decimation without any explicit antialiasing filtering and easy interpolation by mere concatenation of different scales in frequency (DCT) domain with out any image rejection filter and with out laborious delay compensation required. Further, the compression by the proposed DCAHWP is much better compared to that by adaptive WP based on Daubechies-2 wavelet (DBAWP). For a compression factor (CF) of 1/8, the ratio of the percentage error energy by proposed DCAHWP to that by DBAWP is about 1/8 and 1/5 for considered 1-D signal and speech signal, respectively. Its compression performance is better than that of DCHWT, both for 1-D and 2-D signals. The improvement is more significant for signals with abrupt changes or images with rapid variations (textures). For compression factor of 1/8, the ratio of the percentage error energy by DCAHWP to that by DCHWT, is about 1/3 and 1/2, for the considered 1-D signal and speech signal, respectively. This factor for an image considered is 2/3 and in particular for a textural image it is 1/5.
基金supported by the National Natural Science Foundmion of China(No.29877016).
文摘In this paper the embedded zerotree wavelet (EZW) method and Huffman coding are proposed to compress infrared (IR) spectra. We found that this technique is much better than others in terms of efficiently coding wavelet coefficients because the zerotree quantization is an effective way of exploiting the self-similarities of wavelet coefficients at various resolutions.
Abstract: When an image decomposed with biorthogonal wavelet bases is reconstructed, some information is lost at the four edges of the image, and artificial discontinuities are introduced. We use a method called symmetric extension to solve this problem. We consider only the case of two-band filter banks, but the results can be applied to M-band filter banks. There are only two types of symmetric extension in the analysis phase, namely whole-sample symmetry (WS) and half-sample symmetry (HS), while there are four types in the synthesis phase: WS, HS, whole-sample anti-symmetry (WA), and half-sample anti-symmetry (HA). The exact type can be selected according to the image length and the filter length, and we show how to do so. In this way the image can be perfectly reconstructed without any edge effects. Finally, simulation results are reported. Key words: edge effect; image compression; wavelet; biorthogonal bases; symmetric extension. CLC number: TP 37. Foundation item: Supported by the National 863 Project (No. 20021111901010). Biography: Yu Sheng-sheng (1944-), male, Professor; research direction: multimedia information processing, SAN.
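The distinction between whole-sample and half-sample symmetry is just whether the edge sample itself sits on the mirror axis (WS, not repeated) or the axis falls between samples (HS, edge sample repeated). A minimal 1-D sketch, assuming `n` extension samples per side (function names are illustrative, not from the paper):

```python
# Hedged sketch of the two analysis-phase symmetric extensions.

def extend_ws(x, n):
    """Whole-sample symmetry: mirror about the edge sample, which is
    not repeated, e.g. [1,2,3,4] -> [3,2, 1,2,3,4, 3,2] for n=2."""
    left = [x[i] for i in range(n, 0, -1)]        # x[n], ..., x[1]
    right = [x[-2 - i] for i in range(n)]         # x[-2], x[-3], ...
    return left + list(x) + right

def extend_hs(x, n):
    """Half-sample symmetry: mirror axis falls between samples, so the
    edge sample is repeated, e.g. [1,2,3,4] -> [2,1, 1,2,3,4, 4,3]."""
    left = [x[i] for i in range(n - 1, -1, -1)]   # x[n-1], ..., x[0]
    right = [x[-1 - i] for i in range(n)]         # x[-1], x[-2], ...
    return left + list(x) + right

print(extend_ws([1, 2, 3, 4], 2))  # [3, 2, 1, 2, 3, 4, 3, 2]
print(extend_hs([1, 2, 3, 4], 2))  # [2, 1, 1, 2, 3, 4, 4, 3]
```

Filtering the extended signal instead of the raw one is what removes the artificial discontinuities at the borders; the matching synthesis-phase extension (WS/HS/WA/HA) then makes the reconstruction exact.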
Abstract: In this paper, a square wavelet thresholding method is proposed and evaluated against the classical wavelet thresholding methods (soft and hard). The main contribution of this work is to design and implement a new wavelet thresholding method, evaluate it against the classical methods, and thereby search for the optimal mother wavelet among the wide wavelet families, with a suitable decomposition level, followed by the best thresholding method among those considered. The optimized method is used to shrink the wavelet coefficients and yield an adequately compressed pressure signal prior to transmission. In the comparative evaluation, the proposed procedure is used to compress a synthetic signal, minimizing the signal's memory size and its transmission bandwidth. Among the various performance indices available for signal compression, the best-known measures are NMSE, ESNR, and PDR. The obtained results show that the square wavelet thresholding method dominates the other methods under these measures, which supports adopting the proposed thresholding method for 1-D signal compression in future research.
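For reference, the two classical baselines the paper compares against work as follows: hard thresholding zeroes coefficients below the threshold and keeps the rest unchanged, while soft thresholding additionally shrinks the survivors toward zero by the threshold amount. A minimal sketch of these two baselines (the paper's "square" rule itself is not specified here, so it is not reproduced):

```python
# Hedged sketch of classical hard and soft wavelet thresholding.

def hard_threshold(coeffs, t):
    """Keep coefficients with |c| > t unchanged; zero the rest."""
    return [c if abs(c) > t else 0.0 for c in coeffs]

def soft_threshold(coeffs, t):
    """Shrink surviving coefficients toward zero by t (soft shrinkage)."""
    return [(abs(c) - t) * (1 if c > 0 else -1) if abs(c) > t else 0.0
            for c in coeffs]

d = [3.0, -1.0, 0.5, 2.0]
print(hard_threshold(d, 1.5))  # [3.0, 0.0, 0.0, 2.0]
print(soft_threshold(d, 1.5))  # [1.5, 0.0, 0.0, 0.5]
```

Compression follows from the fact that the zeroed coefficients need not be stored; the choice of rule trades reconstruction error (NMSE/ESNR/PDR) against the number of retained coefficients.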
Abstract: We study an integer wavelet transform approach to lossless compression of medical images in a medical picture archiving and communication system (PACS). A reversible integer wavelet transform is generated by the lifting scheme; it has features similar to those of the corresponding biorthogonal wavelet transform. Experimental results of the integer-wavelet-based method show good performance and great potential for application in medical image compression.
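The key property of lifting-based integer transforms is exact reversibility despite the rounding in each lifting step, which is what makes lossless compression possible. A minimal sketch using the integer Haar (S) transform, the simplest lifting example (this is an illustration of the lifting idea, not the paper's specific filter):

```python
# Hedged sketch: one-level integer Haar (S) transform via lifting.
# Exactly reversible on integers because the floor in the forward
# step is undone by the identical floor in the inverse step.

def forward_haar(x):
    """Split x (even length, ints) into approximation s and detail d."""
    s, d = [], []
    for i in range(0, len(x), 2):
        di = x[i + 1] - x[i]          # predict step: difference
        si = x[i] + (di >> 1)         # update step: mean, floored
        s.append(si)
        d.append(di)
    return s, d

def inverse_haar(s, d):
    """Perfectly reconstruct the original integer samples."""
    x = []
    for si, di in zip(s, d):
        x0 = si - (di >> 1)           # undo the update step
        x1 = x0 + di                  # undo the predict step
        x += [x0, x1]
    return x

x = [5, 3, 2, 8]
print(inverse_haar(*forward_haar(x)) == x)  # True: lossless round trip
```

For medical images the same scheme is applied separably along rows and columns and cascaded over several levels; compression then comes from entropy-coding the (mostly small) integer detail coefficients.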
Funding: Supported by the Natural Science Foundation of China (No. 60373061).
Abstract: As volume sizes increase, it is necessary to develop a highly efficient compression algorithm suitable for progressive refinement between the data server and the browsing client. For large three-dimensional volume data, an efficient hierarchical algorithm based on wavelet compression is presented that exploits intra-band dependencies of wavelet coefficients. First, after applying blockwise hierarchical wavelet decomposition to the volume data, a block significance map is obtained, using one bit to indicate the significance or insignificance of each block. Second, a coefficient block is subdivided into eight sub-blocks if it contains any significant coefficient, and the process is repeated, resulting in an incomplete octree; one bit indicates significance or insignificance, and only significant coefficients are stored in the data stream. Finally, the significant coefficients are quantized and compressed by arithmetic coding. Experimental results show that the proposed algorithm achieves good compression ratios and is well suited for random access to data blocks; they also show that the algorithm can be applied to progressive transmission of 3D volume data.
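The incomplete-octree idea can be illustrated in one dimension, where blocks split into two halves instead of eight octants: emit one significance bit per block, recurse only into significant blocks, and store coefficients only at significant leaves. A hedged 1-D analogue (the real algorithm splits 3-D blocks into eight sub-blocks and arithmetic-codes the output):

```python
# Hedged 1-D sketch of the significance-driven block subdivision.
# Insignificant blocks cost a single 0 bit regardless of size.

def encode_blocks(coeffs, lo, hi, thresh, out_bits, out_vals, min_size=2):
    significant = any(abs(c) >= thresh for c in coeffs[lo:hi])
    out_bits.append(1 if significant else 0)
    if not significant:
        return                        # whole block pruned with one bit
    if hi - lo <= min_size:
        out_vals.extend(coeffs[lo:hi])  # significant leaf: store values
        return
    mid = (lo + hi) // 2              # 1-D stand-in for octant split
    encode_blocks(coeffs, lo, mid, thresh, out_bits, out_vals, min_size)
    encode_blocks(coeffs, mid, hi, thresh, out_bits, out_vals, min_size)

bits, vals = [], []
encode_blocks([0, 0, 5, 0, 0, 0, 0, 0], 0, 8, 1, bits, vals)
print(bits, vals)  # [1, 1, 0, 1, 0] [5, 0]
```

Because each subtree is self-contained, a client can decode any block independently, which is what enables the random access and progressive transmission the abstract describes.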
Abstract: The aggregation of data has been expanding at an exponential rate in recent years, driven by many generating sources: social media, video camera footage, wireless and wired sensor network measurements, stock market and other financial transaction data, supermarket transaction data, and so on. Such data may be high-dimensional and big in volume, value, velocity, variety, and veracity, so a crucial challenge is the storage, processing, and extraction of relevant information. In the special case of image data, image compression techniques can reduce the dimension and volume of the data so that it is convenient to process and analyze. In this work, we examine a proof-of-concept multiresolution analytics approach that uses wavelet transforms, a popular mathematical and analytical framework for signal processing and representation, and study its application to compressing image data in wireless sensor networks. The proposed approach consists of applying a wavelet transform, threshold detection, quantization, and data encoding, and ultimately applying the inverse transform. The work focuses on multiresolution analysis with wavelet transforms, comparing three wavelets at five decomposition levels. Simulation results demonstrate the effectiveness of the methodology.
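The transform-threshold-reconstruct pipeline described above can be sketched end to end with the simplest possible wavelet, a one-level Haar transform (a minimal stand-in for the three wavelets and five levels compared in the paper; function names are illustrative):

```python
# Hedged sketch of the compression pipeline: wavelet transform,
# thresholding of detail coefficients, then inverse transform.

def haar_fwd(x):
    """One-level Haar: pairwise averages a and differences d."""
    a = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    d = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return a, d

def haar_inv(a, d):
    """Inverse one-level Haar transform."""
    out = []
    for ai, di in zip(a, d):
        out += [ai + di, ai - di]
    return out

def compress(x, thresh):
    """Zero out small detail coefficients; the zeros compress well."""
    a, d = haar_fwd(x)
    d = [0.0 if abs(c) < thresh else c for c in d]
    return a, d

x = [4.0, 4.0, 6.0, 2.0]
print(haar_inv(*compress(x, 1.0)))  # lossless here: [4.0, 4.0, 6.0, 2.0]
print(haar_inv(*compress(x, 3.0)))  # lossy: [4.0, 4.0, 4.0, 4.0]
```

In a sensor-network setting the thresholded and quantized coefficients, rather than raw pixels, are what gets transmitted, which is where the bandwidth saving comes from.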
Abstract: In this paper, a new mesh-based algorithm is applied for motion estimation and compensation in the wavelet domain. The first major contribution of this work is the introduction of a new active-mesh-based method for motion estimation and compensation. The proposed algorithm is based on mesh energy minimization with novel sets of energy functions, whose properties improve the accuracy of the motion estimation and compensation algorithm. We employ the proposed motion estimation algorithm in two different ways for video compression. In the first approach, the algorithm is employed for motion estimation between consecutive frames. In the second approach, it is applied for motion estimation and compensation within the wavelet sub-bands. The experimental results reveal that incorporating active-mesh-based motion-compensated temporal filtering into the wavelet sub-bands significantly improves the rate-distortion performance of the video compression. We also use a new wavelet coder for coding the 3D volume of coefficients based on a retained-energy criterion; this coder gives the maximum retained energy in all sub-bands. The proposed algorithm was tested on several video sequences, and the results show that using the proposed active mesh method for motion compensation, implemented in the sub-bands, yields a significant improvement in PSNR performance.
Abstract: In this paper, the image quality of two types of compression methods, wavelet based and seam-carving based, is investigated. A metric is introduced to compare image quality under the wavelet and seam carving schemes. Meyer, Coiflet 2, and JPEG2000 wavelets are used as the wavelet-based methods. A Hausdorff-distance-based metric (HDM) is proposed and used for the comparison of the two compression methods instead of model-based or correspondence-based matching techniques, because there is no pairing of points between the two sets being compared. In addition, an entropy-based metric (EM) or peak signal-to-noise-ratio-based metric (PSNRM) cannot be used to compare the two schemes, as seam carving tends to deform objects. Wavelet-compressed images at different compression percentages were analyzed with HDM and EM, and it was observed that HDM tracks EM/PSNRM for wavelet-based compression. HDM was then used to compare the wavelet-compressed and seam-carved images at different compression percentages. The initial results show that HDM is the better metric for comparing wavelet-based and seam-carved images.
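The reason the Hausdorff distance suits this comparison is that it measures the mismatch between two point sets as a whole, with no point-to-point correspondence required: it is the largest distance from any point in one set to its nearest neighbor in the other, symmetrized. A minimal sketch for 2-D point sets (the paper would apply this to feature points extracted from the images):

```python
# Hedged sketch of the (symmetric) Hausdorff distance between
# two finite 2-D point sets, with no pairing of points needed.
import math

def hausdorff(A, B):
    def directed(P, Q):
        # worst-case nearest-neighbor distance from P into Q
        return max(min(math.hypot(p[0] - q[0], p[1] - q[1]) for q in Q)
                   for p in P)
    return max(directed(A, B), directed(B, A))

A = [(0, 0), (1, 0)]
B = [(0, 0), (3, 0)]
print(hausdorff(A, B))  # 2.0: the point (3,0) is 2 away from nearest in A
```

Note the asymmetry of the directed distances (here 1.0 from A to B but 2.0 from B to A), which is why both directions are taken and the maximum kept; this brute-force version is O(|A||B|), fine for small feature sets.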
Funding: Supported by the National Natural Science Foundation of China (No. 69774030) and the Foundation for University Key Teachers of the Ministry of Education.
Abstract: Based on an explanation of the wavelet fractal compression method, the significance of introducing wavelet decomposition into the conventional fractal compression method is investigated in depth from both theoretical and practical points of view. The results of the study can serve as valuable guidelines for exploiting the wavelet transform to develop more effective image compression algorithms.
Abstract: To exploit residual redundancy in reducing the errors induced by fading channels, and to decrease the complexity of the field model describing the probability structure of that redundancy, a simplified statistical model of residual redundancy and a low-complexity joint source-channel decoding (JSCD) algorithm are proposed. The complicated residual redundancy in wavelet-compressed images is decomposed into several independent 1-D probability check equations composed of Markov chains, and is regarded as a natural channel code with a structure similar to that of a low-density parity-check (LDPC) code. A parallel sum-product (SP) iterative JSCD algorithm is proposed. Simulation results show that the proposed JSCD algorithm makes full use of the residual redundancy in different directions to correct errors, improves the peak signal-to-noise ratio (PSNR) of the reconstructed image, and reduces the complexity and delay of JSCD. The performance of JSCD is also more robust than that of the traditional separate coding system with arithmetic coding at the same data rate.