Funding: Supported by the National Natural Science Foundation of China (Nos. 90304003, 60573112, and 60272056) and the Foundation Project of China (No. A1320061262).
Abstract: A novel Joint Source and Channel Decoding (JSCD) scheme for Variable Length Codes (VLCs) concatenated with turbo codes, using a new super-trellis decoding algorithm, is presented in this letter. The basic idea of the decoding algorithm is that source a priori information, in the form of bit-transition probabilities corresponding to the VLC tree, can be derived directly from sub-state transitions in the new composite-state super-trellis. A Maximum Likelihood (ML) decoding algorithm for VLC sequence estimation based on the proposed super-trellis is also described. Simulation results show that the new iterative decoding scheme achieves a clear coding gain, especially for Reversible Variable Length Codes (RVLCs), compared with classical separate turbo decoding and with previous joint decoding that does not consider source statistics.
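The source a priori information mentioned above can be illustrated with a small sketch: for a prefix-free VLC, the probability that the next bit is 0 at any position in the code tree follows from the probability mass of the subtree below that position. The codebook and symbol probabilities below are illustrative, not taken from the paper.

```python
# Hypothetical sketch: deriving bit-transition priors from a VLC tree.
# Codewords and symbol probabilities are invented for illustration.
codebook = {"a": "0", "b": "10", "c": "11"}
probs = {"a": 0.5, "b": 0.3, "c": 0.2}

def subtree_mass(prefix):
    """Total probability of symbols whose codeword starts with prefix."""
    return sum(p for s, p in probs.items() if codebook[s].startswith(prefix))

def bit_transition_prior(prefix):
    """P(next bit = 0 | current position in the VLC tree)."""
    mass = subtree_mass(prefix)
    return subtree_mass(prefix + "0") / mass if mass else 0.5

print(bit_transition_prior(""))   # P(first bit = 0) = 0.5
print(bit_transition_prior("1"))  # P(bit = 0 | prefix "1") = 0.3 / 0.5 = 0.6
```

In a super-trellis decoder, priors of this kind weight the branch metrics of the sub-state transitions instead of assuming equiprobable bits.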
Funding: Sponsored by the ETRI System Semiconductor Industry Promotion Center, Human Resource Development Project for SoC Convergence.
Abstract: This paper proposes an efficient H.264/AVC entropy decoder. It requires no ROM/RAM in the fabrication process, which decreases fabrication cost and increases operation speed. This was achieved by optimizing the lookup tables and internal buffers, which significantly improves area, speed, and power. The proposed entropy decoder also avoids an embedded processor for bitstream manipulation, further improving area, speed, and power. Its gate count and maximum operating frequency are 77,515 gates and 175 MHz, respectively, in a 0.18 μm fabrication process. The proposed entropy decoder needs 2,303 cycles on average to decode one macroblock, so it can run at 28 MHz to meet the real-time processing requirement for CIF-format video decoding in mobile applications.
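The 28 MHz figure can be sanity-checked with back-of-the-envelope arithmetic, assuming CIF resolution (352x288), 30 frames per second, and the reported average of 2,303 cycles per macroblock:

```python
# Cycle budget check for CIF real-time decoding (assumed 30 fps).
mb_per_frame = (352 // 16) * (288 // 16)   # 22 * 18 = 396 macroblocks per frame
cycles_per_second = mb_per_frame * 30 * 2303

print(mb_per_frame)              # 396
print(cycles_per_second / 1e6)   # ~27.4 Mcycles/s, just under a 28 MHz clock
```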
Funding: Project supported by the Applied Materials Shanghai Research and Development Foundation (Grant No. 08700741000) and the Foundation of Shanghai Municipal Education Commission (Grant No. 2006AZ068).
Abstract: This paper presents an efficient, power-optimized VLSI architecture for the context-based adaptive variable length coding (CAVLC) decoder in the H.264/advanced video coding (AVC) standard. In the proposed design, exploiting the regularity of the codewords, a first-one detector is used to overcome the low efficiency and high power dissipation of the traditional table-searching method. Considering the correlation of the data used in decoding run_before, arithmetic operations are combined with a finite state machine (FSM), which achieves higher decoding efficiency. Following the CAVLC decoding flow, clock gating is employed at both the module level and the register level, reducing the overall dynamic power dissipation by 43%. The proposed design can decode every syntax element in one clock cycle. Synthesized under a 100 MHz clock constraint, the design costs 11,300 gates in a 0.25 μm CMOS technology, which meets the demand for real-time decoding in the H.264/AVC standard.
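The first-one detector idea can be sketched in a few lines: instead of searching a codeword table entry by entry, the position of the first 1 bit selects the code group directly. For example, the H.264 level_prefix element is a unary code (N zeros followed by a 1 maps to the value N), so a first-one detector decodes it immediately. The bit patterns below are illustrative.

```python
# Sketch of a first-one detector replacing sequential table search.
def first_one_position(bits):
    """Number of leading zeros before the first 1 (the detector output)."""
    for i, b in enumerate(bits):
        if b == 1:
            return i
    return len(bits)  # no 1 found in the window

def decode_level_prefix(bits):
    """level_prefix is unary: N zeros then a 1 decodes to the value N."""
    return first_one_position(bits)

print(decode_level_prefix([0, 0, 0, 1]))  # 3
```

In hardware this is a priority encoder, which is why it is both faster and lower-power than iterating through a lookup table.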
Abstract: In this paper, we present a Joint Source-Channel Decoding (JSCD) algorithm for Low-Density Parity-Check (LDPC) codes, obtained by modifying the Sum-Product Algorithm (SPA) to account for the source redundancy that results from neighbouring Huffman-coded bits. Simulations demonstrate that, in the presence of source redundancy, the proposed algorithm outperforms the Separate Source and Channel Decoding (SSCD) algorithm.
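A minimal sketch of the underlying idea (not the paper's exact update rule): at each variable node of the sum-product decoder, a source a priori log-likelihood ratio, derived from the bit statistics the Huffman code induces, is added to the channel LLR, so a weak channel observation can be tipped by a strong source prior.

```python
import math

# Illustrative combination of channel and source information at a
# variable node; the source prior P(bit = 0) is assumed known.
def total_llr(channel_llr, source_p0):
    """Add the source a priori LLR to the channel LLR."""
    source_llr = math.log(source_p0 / (1.0 - source_p0))
    return channel_llr + source_llr

# A weak channel vote for 1 (negative LLR) is overturned by a strong
# source prior toward 0:
print(total_llr(-0.2, 0.8) > 0)  # True
```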
Funding: Supported by the Fundamental Research Fund for the Central Universities (NS2013016).
Abstract: This paper provides a direct and fast acquisition algorithm for the civil long (CL) codes in the L2 civil (L2C) signal. The proposed algorithm simultaneously reduces the number of fast Fourier transform (FFT) correlations through a hyper-code technique and the number of points in every FFT correlation by using an averaging correlation method. To validate the acquisition performance, the paper applies the algorithm to real L2C signals collected by the Global Positioning System (GPS) L2C intermediate-frequency (IF) signal sampler, SIS100L2C. The acquisition results show that the proposed modified algorithm can acquire the code phase accurately with less computation, and that its acquisition performance is better than that of the single hyper-code method.
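The averaging correlation idea can be sketched with toy data: summing groups of R consecutive samples shortens both sequences before correlation, so each FFT (replaced here by a naive circular correlation for clarity) operates on fewer points while the correlation peak still reveals the code phase, at a resolution of R samples.

```python
# Toy sketch of averaging correlation; the code and delay are invented.
def average_blocks(x, r):
    """Average every r consecutive samples into one point."""
    return [sum(x[i:i + r]) / r for i in range(0, len(x), r)]

def circular_correlation(a, b):
    n = len(a)
    return [sum(a[i] * b[(i + k) % n] for i in range(n)) for k in range(n)]

code = [1, 1, -1, -1, 1, -1, 1, -1]
received = code[4:] + code[:4]   # code circularly delayed by 4 samples
r = 2                            # averaging factor

corr = circular_correlation(average_blocks(received, r),
                            average_blocks(code, r))
print(corr.index(max(corr)))     # 2 averaged points = 4 samples of delay
```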
Abstract: [Objective] This paper aimed to provide a new method for genetic data clustering by analyzing the clustering performance of a genetic-data clustering algorithm based on minimum coding length. [Method] Genetic data clustering was treated as high-dimensional mixed-data clustering. After preprocessing, the dimensionality of the genetic data was reduced by principal component analysis, after which the data presented a Gaussian-like distribution. Such data could be clustered effectively through lossy data compression, which groups the genes using a simple clustering algorithm; the algorithm achieves its best clustering result when the length of the code needed to encode the clustered genes reaches its minimum. This algorithm and traditional clustering algorithms were applied to genetic data from yeast and Arabidopsis, and the effectiveness of the algorithm was verified through internal clustering evaluation and functional evaluation. [Result] The clustering performance of the new algorithm was superior to that of traditional clustering algorithms, and it also avoided the problems of subjectively determining the number of clusters and sensitivity to the initial cluster centers. [Conclusion] This study provides a new clustering method for genetic data.
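A hedged sketch of a lossy coding-length criterion in the spirit of minimum-coding-length clustering (the exact formula and constants below are illustrative, not the paper's): the number of bits needed to code a cluster up to distortion eps grows with the cluster's scatter, so the partition minimizing total coding length groups nearby points together.

```python
import math

# Coding length of a set of 2-D points under allowed distortion eps,
# using a log-det of the scatter matrix (illustrative formula).
def coding_length(points, eps=0.1):
    m, n = len(points), 2
    mean = [sum(p[i] for p in points) / m for i in range(n)]
    s = [[sum((p[i] - mean[i]) * (p[j] - mean[j]) for p in points) / m
          for j in range(n)] for i in range(n)]
    c = n / (m * eps * eps)
    # det(I + c*S) for the 2x2 case, computed directly.
    det = (1 + c * s[0][0]) * (1 + c * s[1][1]) - (c * s[0][1]) * (c * s[1][0])
    return (m + n) / 2.0 * math.log2(det)

tight = [(0, 0), (0.1, 0), (0, 0.1), (0.1, 0.1)]  # compact cluster
loose = [(0, 0), (5, 0), (0, 5), (5, 5)]          # widely scattered points
print(coding_length(tight) < coding_length(loose))  # True: less scatter, fewer bits
```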
Funding: This work is supported in part by the First Batch of Youth Innovation Fund Projects in 2020 under Grant No. 3502Z202006012 and by the Experimental Teaching Reform Project of National Huaqiao University under Grant No. SY2019L013.
Abstract: Web pages contain many redundancies, especially in dynamic HTML multimedia objects. This paper proposes a novel method that exploits the commonly used image elements on web pages. Owing to the variety of image formats, the complexity of image contents, and their position information, secret message bits can be encoded and embedded in these complex redundancies. Together with a specific covering code called average run-length coding, the embedding distortion can be kept at a low level, and the resulting capacity outperforms traditional content-based image steganography, which modifies the image data itself and causes real image-quality degradation. Our experimental results demonstrate that the proposed method has limited processing latency and high embedding capacity. Moreover, the method has low algorithmic complexity and less image-quality distortion than existing steganography methods.
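Why a covering code cuts the number of cover modifications can be shown with the simplest matrix embedding, which stands in here for the paper's average run-length coding (whose exact construction differs): with a (1, 3, 2) scheme, 2 message bits are carried by 3 cover bits while changing at most 1 of them.

```python
# Illustrative (1,3,2) matrix embedding with parity-check rows (0,1,1),(1,0,1).
def syndrome(bits):
    """2-bit syndrome of 3 cover bits; this is the extracted message."""
    return (bits[1] ^ bits[2], bits[0] ^ bits[2])

def embed(cover, msg):
    """Flip at most one of 3 cover bits so the syndrome equals msg."""
    cover = list(cover)
    s = syndrome(cover)
    want = (msg[0] ^ s[0], msg[1] ^ s[1])          # required syndrome change
    pos = {(0, 1): 0, (1, 0): 1, (1, 1): 2}        # column -> bit to flip
    if want != (0, 0):
        cover[pos[want]] ^= 1
    return cover

cover = [1, 0, 1]
msg = (1, 1)
stego = embed(cover, msg)
print(syndrome(stego) == msg)                          # message extracted
print(sum(a != b for a, b in zip(cover, stego)) <= 1)  # at most one change
```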
Abstract: Encryption techniques ensure the security of data during transmission. However, in most cases they increase the length of the data and thus the cost. When data must be transmitted over an insecure, bandwidth-constrained channel, it is customary to compress the data first and then encrypt it. In this paper, a novel algorithm, compression with encryption and compression (CEC), is proposed to secure and compress the data. The algorithm first compresses the data to reduce its length; the compressed data is then encrypted and further compressed using a new encryption algorithm, without compromising the compression efficiency or the information security. The CEC algorithm provides a higher compression ratio and enhanced data security, as well as confidentiality and authentication between two communicating systems.
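The compress-then-encrypt ordering can be sketched minimally. The CEC algorithm itself is not reproduced here; the XOR keystream below is a toy stand-in for a real cipher, used only to show the pipeline and the round trip.

```python
import zlib

# Toy stream "cipher" (NOT secure) standing in for a real encryption step.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"to be or not to be, that is the question. " * 20
key = b"illustrative-key"

compressed = zlib.compress(plaintext)     # compress first: redundancy is intact
ciphertext = xor_cipher(compressed, key)  # then encrypt the short result

# Round trip: decrypt, then decompress.
recovered = zlib.decompress(xor_cipher(ciphertext, key))
print(recovered == plaintext)             # True
print(len(compressed) < len(plaintext))   # True: compression paid off
```

Encrypting first would largely destroy the redundancy that the compressor relies on, which is why compression precedes encryption in such pipelines.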