Journal Articles
4 articles found
1. Novel Lossless Compression Method Based on the Fourier Transform to Approximate the Kolmogorov Complexity of Elementary Cellular Automata
Author: Mohammed Terry-Jack. Journal of Software Engineering and Applications, 2022, No. 10, pp. 359-383 (25 pages).
We propose a novel, lossless compression algorithm, based on the 2D Discrete Fast Fourier Transform, to approximate the Algorithmic (Kolmogorov) Complexity of Elementary Cellular Automata. Fast Fourier transforms are widely used in image compression, but their lossy nature excludes them as viable candidates for Kolmogorov complexity approximations. For the first time, we present a way to adapt Fourier transforms for lossless image compression. The proposed method has a very strong Pearson correlation to existing complexity metrics, and we further establish its consistency as a complexity metric by confirming that its measurements never exceed the complexity of nothingness and randomness (representing the lower and upper limits of complexity). Surprisingly, many of the other methods tested fail this simple sanity check. A final symmetry-based test also demonstrates our method's superiority over existing lossless compression metrics. All complexity metrics tested, as well as the code used to generate and augment the original dataset, can be found in our GitHub repository: ECA complexity metrics.
Keywords: Fast Fourier Transform; Lossless Compression; Elementary Cellular Automata; Algorithmic Information Theory; Kolmogorov Complexity
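The pipeline the abstract describes (run an elementary cellular automaton, losslessly encode its spacetime diagram, and read the encoded size as a proxy for Kolmogorov complexity) can be illustrated with a minimal sketch. As hedges: zlib stands in for the paper's 2D-FFT-based lossless compressor, which is not reproduced here, and the rule numbers, grid size, and function names are illustrative choices rather than the authors' code.

```python
# Minimal sketch: approximate the Kolmogorov complexity of an ECA spacetime
# diagram by the size of a lossless encoding. zlib is a stand-in for the
# paper's 2D-FFT-based lossless compressor; rule/width/steps are arbitrary.
import zlib
import numpy as np

def eca_spacetime(rule: int, width: int = 64, steps: int = 64) -> np.ndarray:
    """Run an elementary CA from a single centred 1-cell; return the 2D history."""
    table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
    row = np.zeros(width, dtype=np.uint8)
    row[width // 2] = 1
    history = [row.copy()]
    for _ in range(steps - 1):
        left, right = np.roll(row, 1), np.roll(row, -1)   # periodic boundary
        row = table[(left << 2) | (row << 1) | right]     # 3-bit neighbourhood index
        history.append(row.copy())
    return np.array(history)

def compression_complexity(grid: np.ndarray) -> int:
    """Approximate K(grid) by the byte length of a losslessly compressed bit-packing."""
    return len(zlib.compress(np.packbits(grid).tobytes(), 9))

if __name__ == "__main__":
    for rule in (0, 90, 30):   # empty, nested, and chaotic behaviour
        print(rule, compression_complexity(eca_spacetime(rule)))
```

Under any such stand-in compressor, the near-empty evolution of rule 0 should sit close to the "nothingness" lower bound and rule 30 close to the "randomness" upper bound, which is the kind of sanity check the abstract refers to.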
2. Average-Case Analysis of Algorithms Using Kolmogorov Complexity
Authors: 姜涛, 李明. Journal of Computer Science & Technology (SCIE, EI, CSCD), 2000, No. 5, pp. 402-408 (7 pages).
Analyzing the average-case complexity of algorithms is a very practical but very difficult problem in computer science. In the past few years, we have demonstrated that Kolmogorov complexity is an important tool for analyzing the average-case complexity of algorithms, and we have developed the incompressibility method. In this paper, several simple examples are used to further demonstrate the power and simplicity of this method. We prove bounds on the average-case number of stacks (queues) required by sequential or parallel Queuesort and Stacksort.
Keywords: Kolmogorov complexity; algorithm; average-case analysis; sorting
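For readers unfamiliar with the incompressibility method, the counting fact it rests on can be stated in one line; this is the standard textbook argument, not the paper's specific bounds.

```latex
% At most 2^{n-c}-1 programs are shorter than n-c bits, so most strings are incompressible:
\#\{\, x \in \{0,1\}^n : K(x) < n - c \,\} \;\le\; 2^{\,n-c} - 1
\quad\Longrightarrow\quad
\Pr_{x \in \{0,1\}^n}\bigl[\, K(x) \ge n - c \,\bigr] \;>\; 1 - 2^{-c}.
```

An average-case bound then follows by proving the desired property for a single c-incompressible input and observing that such inputs carry almost all of the weight under the uniform distribution.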
3. A New Approach for Multi-Document Update Summarization (cited by 2)
Authors: 龙翀, 黄民烈, 朱小燕, 李明. Journal of Computer Science & Technology (SCIE, EI, CSCD), 2010, No. 4, pp. 739-749 (11 pages).
Fast-changing knowledge on the Internet can be acquired more efficiently with the help of automatic document summarization and updating techniques. This paper describes a novel approach for multi-document update summarization. The best summary is defined to be the one which has the minimum information distance to the entire document set. The best update summary has the minimum conditional information distance to a document cluster, given that a prior document cluster has already been read. Experiments on the DUC/TAC 2007 to 2009 datasets (http://duc.nist.gov/, http://www.nist.gov/tac/) show that our method correlates closely with the human summaries and outperforms other systems such as LexRank in many categories under the ROUGE evaluation criterion.
Keywords: data mining; text mining; Kolmogorov complexity; information distance
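The selection rule in the abstract, picking the summary with the minimum (conditional) information distance to the document set, can be approximated with any off-the-shelf compressor. The sketch below is a hypothetical illustration: zlib sizes stand in for Kolmogorov complexities, the C(y+x) - C(y) conditional approximation and all function names are choices made here, and none of it reproduces the authors' implementation.

```python
# Hypothetical sketch: compression-approximated information distance for
# summary selection. zlib sizes stand in for Kolmogorov complexities.
import zlib

def C(s: str) -> int:
    """Compressed length of s, a crude stand-in for K(s)."""
    return len(zlib.compress(s.encode("utf-8"), 9))

def cond_C(x: str, y: str) -> int:
    """Approximate K(x | y) by C(y + x) - C(y)."""
    return max(C(y + x) - C(y), 0)

def info_distance(x: str, y: str) -> int:
    """Approximate the (max-form) information distance max{K(x|y), K(y|x)}."""
    return max(cond_C(x, y), cond_C(y, x))

def best_summary(candidates: list[str], documents: str) -> str:
    """Pick the candidate summary closest, in information distance, to the documents."""
    return min(candidates, key=lambda s: info_distance(s, documents))

def best_update_summary(candidates: list[str], new_docs: str, prior: str) -> str:
    """Illustrative update variant: condition both sides on the already-read cluster."""
    return min(candidates, key=lambda s: info_distance(prior + s, prior + new_docs))
```

The update variant is meant only to convey the idea that material already covered by the prior cluster contributes little extra compressed length, so it stops pulling candidates toward itself.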
4. The new AI is general and mathematically rigorous
Author: Jürgen Schmidhuber. Frontiers of Electrical and Electronic Engineering in China (CSCD), 2010, No. 3, pp. 347-362 (16 pages).
Most traditional artificial intelligence (AI) systems of the past decades are either very limited, or based on heuristics, or both. The new millennium, however, has brought substantial progress in the field of theoretically optimal and practically feasible algorithms for prediction, search, inductive inference based on Occam's razor, problem solving, decision making, and reinforcement learning in environments of a very general type. Since inductive inference is at the heart of all inductive sciences, some of the results are relevant not only for AI and computer science but also for physics, provoking nontraditional predictions based on Zuse's thesis of the computer-generated universe. We first briefly review the history of AI since Gödel's 1931 paper, then discuss recent post-2000 approaches that are currently transforming general AI research into a formal science.
Keywords: prediction; search; inductive inference; Occam's razor; Speed Prior; super-Omega; limit computability; generalizations of Kolmogorov complexity; digital physics; optimal universal problem solvers; Gödel machine; artificial creativity and curiosity; artificial intelligence (AI) as a formal science