Abstract: We propose a novel, lossless compression algorithm, based on the 2D Discrete Fast Fourier Transform, to approximate the Algorithmic (Kolmogorov) Complexity of Elementary Cellular Automata. Fast Fourier transforms are widely used in image compression, but their lossy nature excludes them as viable candidates for Kolmogorov complexity approximations. For the first time, we present a way to adapt Fourier transforms for lossless image compression. The proposed method has a very strong Pearson correlation with existing complexity metrics, and we further establish its consistency as a complexity metric by confirming that its measurements always lie between the complexity of nothingness and the complexity of randomness (the lower and upper limits of complexity, respectively). Surprisingly, many of the other methods tested fail this simple sanity check. A final symmetry-based test also demonstrates our method's superiority over existing lossless compression metrics. All complexity metrics tested, as well as the code used to generate and augment the original dataset, can be found in our GitHub repository, ECA complexity metrics.
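To make the compression-based reading of algorithmic complexity concrete, the minimal sketch below evolves an elementary cellular automaton and scores its space-time diagram by its losslessly compressed size. It uses a generic compressor (zlib) purely as a stand-in for the paper's FFT-based coder; the rule numbers, grid size, and function names are illustrative assumptions, not details taken from the paper.

```python
import zlib
import numpy as np

def eca_spacetime(rule: int, width: int = 128, steps: int = 128) -> np.ndarray:
    """Evolve an elementary cellular automaton (Wolfram rule) from a single seed cell."""
    table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
    row = np.zeros(width, dtype=np.uint8)
    row[width // 2] = 1
    grid = [row]
    for _ in range(steps - 1):
        left, right = np.roll(row, 1), np.roll(row, -1)   # periodic boundary
        row = table[4 * left + 2 * row + right]           # 3-cell neighbourhood -> next state
        grid.append(row)
    return np.array(grid)

def compression_complexity(grid: np.ndarray) -> float:
    """Approximate complexity as (losslessly compressed size) / (raw bitmap size)."""
    raw = np.packbits(grid).tobytes()
    return len(zlib.compress(raw, 9)) / len(raw)

if __name__ == "__main__":
    for rule in (0, 90, 30):   # nothingness, structured, random-looking
        score = compression_complexity(eca_spacetime(rule))
        print(f"rule {rule:3d}: complexity score ~ {score:.3f}")
```

Under this scheme, rule 0 (nothingness) and rule 30 (random-looking) should bracket the score of a structured rule such as rule 90, which is precisely the kind of sanity check the abstract describes.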
Abstract: Analyzing the average-case complexity of algorithms is a very practical but very difficult problem in computer science. In the past few years we have demonstrated that Kolmogorov complexity is an important tool for analyzing the average-case complexity of algorithms, and we have developed the incompressibility method. In this paper, several simple examples are used to further demonstrate the power and simplicity of this method. We prove bounds on the average-case number of stacks (queues) required for sorting by sequential or parallel Queuesort or Stacksort.
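The abstract leaves the sorting model implicit; one common way to make "the number of queues required" concrete is the greedy dispatcher sketched below, which sends each input element to the first parallel queue whose last element does not exceed it, so every queue holds a non-decreasing run and the sorted output can be recovered by merging. This is an illustrative formalization under that assumption, not necessarily the exact model analyzed in the paper.

```python
import random

def queues_needed(sequence):
    """Greedily dispatch elements onto parallel queues so each queue receives
    a non-decreasing run; return how many queues were opened."""
    tails = []                      # last element sent to each queue
    for x in sequence:
        for i, t in enumerate(tails):
            if t <= x:              # queue i can still accept x
                tails[i] = x
                break
        else:
            tails.append(x)         # no queue fits: open a new one
    return len(tails)

if __name__ == "__main__":
    random.seed(0)
    n, trials = 2_000, 10
    counts = [queues_needed(random.sample(range(n), n)) for _ in range(trials)]
    print(f"average queues for random permutations of length {n}: "
          f"{sum(counts) / trials:.1f}")
```

For random permutations this count grows roughly like the square root of n, and quantities of exactly this kind (average resource usage over random inputs) are what the incompressibility method is used to bound.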
Funding: International Joint Research Program from the Ministry of Science and Technology of China (grant number 20070667); Education Commission of Chongqing of China (grant number KJ081209).
Abstract: The aim of this study is to identify the functions and states of the brain according to the values of a complexity measure of EEG signals. EEG signals from 30 normal samples and 30 patient samples are collected. After preprocessing the raw data, a computational program for the complexity measure is implemented and the complexity measures of all samples are calculated. The mean value and standard error of the complexity measure are 0.33 and 0.10 for the control group, and 0.53 and 0.08 for the normal group. At a significance level of 0.05, the confidence interval of the population mean of the complexity measure is (0.2871, 0.3652) for the control group and (0.4944, 0.5552) for the normal group. The statistical results show that the normal samples and patient samples can be clearly distinguished by the value of the measure. In clinical medicine, the results can serve as a reference for evaluating the function or state of the brain, diagnosing disease, and monitoring rehabilitation progress.
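The interval arithmetic quoted above can be reproduced approximately with a standard two-sided t confidence interval for a mean, as sketched below. Reading the quoted "standard error" values as sample standard deviations is an assumption on our part; the resulting intervals come out close to, but not identical with, the published ones, so the original computation may use a slightly different estimator or rounding.

```python
import math
from scipy import stats

def mean_confidence_interval(mean, sd, n, alpha=0.05):
    """Two-sided t confidence interval for the population mean, given the
    sample mean, sample standard deviation and sample size."""
    sem = sd / math.sqrt(n)                              # standard error of the mean
    margin = stats.t.ppf(1 - alpha / 2, df=n - 1) * sem
    return mean - margin, mean + margin

if __name__ == "__main__":
    # Group statistics quoted in the abstract (n = 30 per group); treating
    # the quoted values as mean and standard deviation is an assumption.
    for name, mean, sd in [("control", 0.33, 0.10), ("normal", 0.53, 0.08)]:
        lo, hi = mean_confidence_interval(mean, sd, n=30)
        print(f"{name} group 95% CI: ({lo:.4f}, {hi:.4f})")
```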
Funding: Supported by the National Natural Science Foundation of China under Grant No. 60973104, the National Basic Research 973 Program of China under Grant No. 2007CB311003, and the IRCI Project from IDRC, Canada.
Abstract: Fast-changing knowledge on the Internet can be acquired more efficiently with the help of automatic document summarization and updating techniques. This paper describes a novel approach for multi-document update summarization. The best summary is defined to be the one which has the minimum information distance to the entire document set. The best update summary has the minimum conditional information distance to a document cluster, given that a prior document cluster has already been read. Experiments on the DUC/TAC 2007 to 2009 datasets (http://duc.nist.gov/, http://www.nist.gov/tac/) show that our method correlates closely with the human summaries and outperforms other systems such as LexRank in many categories under the ROUGE evaluation criterion.
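Information distance is uncomputable, so practical systems approximate it with a compressor. The sketch below uses the normalized compression distance (NCD) with zlib to pick, from a set of candidate summaries, the one closest to the whole document set; it is a toy illustration of the selection criterion, not the paper's actual system.

```python
import zlib

def csize(text: str) -> int:
    """Compressed size in bytes, used as a stand-in for Kolmogorov complexity."""
    return len(zlib.compress(text.encode("utf-8"), 9))

def ncd(x: str, y: str) -> float:
    """Normalized compression distance: a practical proxy for information distance."""
    cx, cy, cxy = csize(x), csize(y), csize(x + " " + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

def pick_summary(candidates, documents):
    """Choose the candidate summary with the smallest NCD to the document set."""
    corpus = " ".join(documents)
    return min(candidates, key=lambda s: ncd(s, corpus))

if __name__ == "__main__":
    docs = ["the quick brown fox jumps over the lazy dog",
            "a quick brown fox leaps over a sleepy dog"]
    candidates = ["a quick brown fox jumps over a lazy dog",
                  "an unrelated sentence about relational databases"]
    print(pick_summary(candidates, docs))
```

In the update setting the criterion becomes a conditional information distance given the previously read cluster, which this toy sketch does not model.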
Abstract: Most traditional artificial intelligence (AI) systems of the past decades are either very limited, or based on heuristics, or both. The new millennium, however, has brought substantial progress in the field of theoretically optimal and practically feasible algorithms for prediction, search, inductive inference based on Occam's razor, problem solving, decision making, and reinforcement learning in environments of a very general type. Since inductive inference is at the heart of all inductive sciences, some of the results are relevant not only for AI and computer science but also for physics, provoking nontraditional predictions based on Zuse's thesis of the computer-generated universe. We first briefly review the history of AI since Gödel's 1931 paper, then discuss recent post-2000 approaches that are currently transforming general AI research into a formal science.