Abstract: In this paper, we present a Joint Source-Channel Decoding (JSCD) algorithm for Low-Density Parity Check (LDPC) codes by modifying the Sum-Product Algorithm (SPA) to account for the source redundancy that results from the neighbouring Huffman coded bits. Simulations demonstrate that in the presence of source redundancy, the proposed algorithm gives better performance than the Separate Source and Channel Decoding (SSCD) algorithm.
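As a hedged illustration of the idea behind the abstract above, the sketch below shows one common way a source prior can be folded into the channel LLRs before running the SPA. The BPSK/AWGN model and all function names are assumptions for the example, not details taken from the paper:

```python
import numpy as np

def channel_llr(y, noise_var):
    # LLR of a BPSK symbol (bit 0 -> +1, bit 1 -> -1) over AWGN.
    return 2.0 * y / noise_var

def joint_llr(y, noise_var, p_zero):
    # Add a source-prior term to the channel LLR. p_zero is an
    # estimate of P(bit = 0) derived from source statistics
    # (e.g. the neighbouring Huffman-coded bits); the modified
    # LLRs then feed the standard SPA unchanged.
    source_term = np.log(p_zero / (1.0 - p_zero))
    return channel_llr(y, noise_var) + source_term
```

With no source redundancy (p_zero = 0.5) the prior term vanishes and the decoder reduces to ordinary channel decoding.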
Abstract: With over 10 million points of genetic variation from person to person, every individual’s genome is unique and provides a highly reliable form of identification. This is because the genetic code is specific to each individual and does not change over time. Genetic information has been used to identify individuals in a variety of contexts, such as criminal investigations, paternity tests, and medical research. In this study, each individual’s genetic makeup has been formatted to create a secure, unique code that incorporates various elements, such as species, gender, and the genetic identification code itself. The combinations of markers required for this code have been derived from common single nucleotide polymorphisms (SNPs), points of variation found in the human genome. The final output is a 24-digit numerical code, with each digit taking one of three possible values. The custom code can then be utilized to create various modes of identification on a decentralized blockchain network, as well as personalized services and products that offer users a novel way to uniquely identify themselves in ways that were not possible before.
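The 24-digit code described above, with three possible values per digit, yields a space of 3^24 ≈ 2.8 × 10^11 distinct identifiers. A minimal sketch of such an encoding follows; the genotype labels and their digit mapping are illustrative assumptions, not the paper's actual scheme:

```python
# Each of 24 SNP markers takes one of three genotype states
# (e.g. homozygous-reference, heterozygous, homozygous-alternate),
# mapped here to the digits 0/1/2.
GENOTYPES = {"AA": 0, "Aa": 1, "aa": 2}

def genotypes_to_code(genotypes):
    # Map a list of 24 genotype labels to a 24-digit string.
    if len(genotypes) != 24:
        raise ValueError("expected 24 markers")
    return "".join(str(GENOTYPES[g]) for g in genotypes)

# Size of the resulting code space: 3**24 == 282429536481.
CODE_SPACE = 3 ** 24
```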
Funding: Project (No. 60172030) partially supported by the National Natural Science Foundation of China
Abstract: Reversible variable length codes (RVLCs) have received much attention due to their excellent error-resilience capabilities. In this paper, a novel construction algorithm for symmetrical RVLCs is proposed which is independent of the Huffman code. The proposed algorithm's codeword assignment is based only on symbol occurrence probability. It has many advantages over available symmetrical construction algorithms, including easier realization and better code performance. In addition, the proposed algorithm simplifies the codeword selection mechanism dramatically.
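A symmetrical RVLC consists of palindromic codewords, which makes the bitstream decodable both forwards and backwards. The helpers below check the two properties such a codebook needs; the example codebook is illustrative only, not taken from the paper:

```python
def is_symmetric(cw):
    # A symmetrical codeword reads the same in both directions.
    return cw == cw[::-1]

def is_prefix_free(codebook):
    # No codeword may be a prefix of another (instantaneous
    # decoding); for palindromic codewords this also implies
    # the suffix-free property needed for backward decoding.
    for a in codebook:
        for b in codebook:
            if a != b and b.startswith(a):
                return False
    return True

# A small illustrative symmetrical RVLC:
codebook = ["0", "101", "111", "1001"]
```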
Abstract: This paper proposes a high-throughput finite state entropy (FSE) coding algorithm implemented on a System on Chip (SoC). The proposed encoder and decoder are compared with a typical hardware Huffman coding (HC) implementation in four respects: compression ratio, speed, resource consumption, and power consumption; the results show that the proposed hardware FSE encoder and decoder have significant advantages. The hardware FSE (hFSE) architecture is implemented across the SoC's processing system and its programmable logic (PL), which are connected via the Advanced eXtensible Interface 4 (AXI4) bus. Algorithm tests show that FSE achieves better compression ratios on non-uniformly distributed data and large data volumes. The encoder and decoder designed in this paper have been implemented on the programmable logic, including a configurable buffer module that writes the bitstream, configured as single or double bytes, into block random access memory (BRAM) of 8-bit width and 4096 depth, or 16-bit width and 2048 depth. The proposed FSE hardware architecture provides a high-throughput, low-power, low-resource hardware implementation for real-time compression applications.
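The claim above, that FSE gains most on non-uniformly distributed data, follows from the Shannon entropy bound: the more skewed the symbol distribution, the fewer bits per symbol an entropy coder can approach. A small sketch of that bound (in Python, rather than the paper's hardware setting):

```python
import math
from collections import Counter

def shannon_entropy(data):
    # Bits per symbol under an i.i.d. model of the input: the
    # lower bound any entropy coder (FSE, Huffman, ...) approaches.
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

uniform = b"abcd" * 256            # 2.0 bits/symbol: little to gain
skewed = b"a" * 900 + b"b" * 100   # far below 1 bit/symbol
```

On the skewed input Huffman coding is stuck at 1 bit per symbol (whole-bit codewords), while FSE's fractional-bit states can approach the entropy, which is where the compression-ratio advantage reported above comes from.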
Abstract: Data compression plays a key role in optimizing the use of memory storage space and reducing latency in data transmission. In this paper, we are interested in lossless compression techniques because their performance is exploited alongside lossy compression techniques for images and videos, generally in a mixed approach. To achieve our objective, which is to study the performance of lossless compression methods, we first carried out a literature review, a summary of which enabled us to select the most relevant methods, namely: arithmetic coding, LZW, Tunstall's algorithm, RLE, BWT, Huffman coding and Shannon-Fano. Secondly, we designed a purposive text dataset with a repeating pattern in order to test the behavior and effectiveness of the selected compression techniques. Thirdly, we designed the compression algorithms and developed the programs (scripts) in Matlab in order to test their performance. Finally, following the tests conducted on relevant data that we constructed according to a deliberate model, the results show that these methods are very satisfactory, ranked in order of performance as follows: LZW, arithmetic coding, Tunstall algorithm, and BWT + RLE. Likewise, it appears that, on the one hand, the performance of certain techniques relative to others is strongly linked to the sequencing and/or recurrence of symbols that make up the message, and on the other hand, to the cumulative time of encoding and decoding.
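As one concrete instance of the techniques compared above, here is a minimal RLE encoder/decoder pair; it is a Python sketch for illustration, whereas the paper's implementations were Matlab scripts:

```python
def rle_encode(s):
    # Collapse each run of a repeated symbol into a (symbol, count)
    # pair; effective exactly when the input has long runs, which
    # matches the repeating-pattern dataset described above.
    out = []
    i = 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:
            j += 1
        out.append((s[i], j - i))
        i = j
    return out

def rle_decode(pairs):
    # Expand (symbol, count) pairs back to the original string.
    return "".join(ch * n for ch, n in pairs)
```

On run-free input RLE expands the data (one pair per symbol), which illustrates the abstract's point that relative performance depends strongly on the sequencing and recurrence of symbols.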
Funding: Shanghai Science & Technology Development Foundation (022512065) and Shanghai Construction Technology Development Foundation (A0206101).
Abstract: In the seismic safety evaluation (SSE) for key projects, the probability-consistent spectrum (PCS), usually obtained from probabilistic seismic hazard analysis (PSHA), is not consistent with the design response spectrum given by the Code for Seismic Design of Buildings (GB50011-2001). Sometimes, there may be a remarkable difference between them. If the PCS is lower than the corresponding code design response spectrum (CDS), the seismic fortification criterion for key projects would be lower than that for general industrial and civil buildings. In this paper, the relation between PCS and CDS is discussed using an idealized simple potential seismic source. The results show that in most areas influenced mainly by potential sources of epicentral earthquakes and regional earthquakes, PCS is generally lower than CDS in the long periods. We point out that the long-period response spectra of the code should be further studied and combined with the probabilistic method of seismic zoning as much as possible. Because of the uncertainties in SSE, it should be prudent to use the long-period response spectra given by SSE for key projects when they are lower than CDS.
Abstract: Long non-coding RNAs (lncRNAs) are non-protein or low-protein coding transcripts that contain more than 200 nucleotides. Representing a large share of the cell’s transcriptional output, they demonstrate functional attributes such as tissue-specific expression, determination of cell fate, controlled expression, RNA processing and editing, dosage compensation, genomic imprinting, and conserved evolutionary traits. These long non-coding variants are well associated with the pathogenicity of various diseases, including neurological disorders such as Alzheimer’s disease, schizophrenia, Huntington’s disease, and Parkinson’s disease. Neurological disorders are widespread, so knowing their underlying mechanisms becomes crucial. The lncRNAs take part in pathogenesis through a plethora of mechanisms, acting as decoys, scaffolds, miRNA sequestrators, and histone modifiers, and through transcriptional interference. Detailed knowledge of the roles of lncRNAs can help in using them as novel biomarkers for therapeutic purposes. Here, in this review, we discuss the regulation and functional roles of lncRNAs in eight neurological diseases and psychiatric disorders, and the mechanisms by which they act. With these, we try to establish their roles as potential markers and viable diagnostic tools in these disorders.
Abstract: In Systems Biology, system identification, which infers regulatory networks in genetic systems and metabolic pathways from experimentally observed time-course data, is one of the hottest issues. An efficient numerical optimization algorithm to estimate more than 100 real-coded parameters should be developed for this purpose. A new real-coded genetic algorithm (RCGA), the combination of AREX (adaptive real-coded ensemble crossover) with JGG (just generation gap), has been applied to the inference of genetic interactions involving more than 100 parameters, using experimentally observed time-course data. Compared with a conventional RCGA, the combination of UNDX (unimodal normal distribution crossover) with MGG (minimal generation gap), the new algorithm has shown its superiority by improving early convergence in the first stage of the search and suppressing evolutionary stagnation in the last stage.
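To make the JGG replacement scheme mentioned above concrete, the sketch below implements its generation-gap loop on a toy objective. Note that the crossover used here is a simple Dirichlet-weighted blend standing in for AREX (which adaptively adjusts its sampling region); all names and parameter values are assumptions for illustration, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    # Toy objective standing in for the model-fitting error.
    return float(np.sum(x ** 2))

def blend_crossover(parents):
    # Stand-in for AREX: a random convex blend of the parents.
    w = rng.dirichlet(np.ones(len(parents)))
    return w @ parents

def jgg_step(pop, n_parents, n_children, crossover, f):
    # JGG (just generation gap): remove n_parents individuals from
    # the population, breed n_children from them, and return only
    # the best n_parents children to the vacated slots.
    idx = rng.choice(len(pop), size=n_parents, replace=False)
    parents = pop[idx]
    children = np.array([crossover(parents) for _ in range(n_children)])
    best = children[np.argsort([f(c) for c in children])[:n_parents]]
    pop[idx] = best
    return pop
```

A run might look like `pop = rng.normal(size=(50, 100))` followed by repeated `jgg_step(pop, 3, 10, blend_crossover, sphere)` calls; the mean fitness of the population decreases as the loop iterates.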
Abstract: In the present communication, we have obtained the optimum probability distribution with which messages should be delivered so that the average redundancy of the source is minimized. Here, we have considered the case of various generalized mean codeword lengths. Moreover, the upper bound on these codeword lengths has been found for the case of Huffman encoding.
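The redundancy minimized above is the gap between the average codeword length and the source entropy. For the Huffman case mentioned at the end, this gap can be computed directly; the sketch below uses the standard binary Huffman construction (arithmetic mean length), not the paper's generalized mean lengths:

```python
import heapq
import math
from itertools import count

def huffman_lengths(probs):
    # Codeword length assigned to each symbol by binary Huffman
    # coding: repeatedly merge the two least-probable subtrees,
    # adding one bit of depth to every symbol inside them.
    tiebreak = count()
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tiebreak), s1 + s2))
    return lengths

def redundancy(probs):
    # Average codeword length minus the source entropy, in
    # bits/symbol; zero exactly when the distribution is dyadic.
    L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
    H = -sum(p * math.log2(p) for p in probs if p > 0)
    return L - H
```

For a dyadic source such as (1/2, 1/4, 1/8, 1/8) the redundancy is exactly zero, which is the sense in which an optimum delivery distribution minimizes the average redundancy.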