Journal Articles: 4 articles found
1. On Clausius, Boltzmann and Shannon Notions of Entropy (Cited by: 1)
Author: Francisco Balibrea. Journal of Modern Physics, 2016, No. 2, pp. 219-227 (9 pages).
Discrete dynamical systems are given by a pair (X, f), where X is a compact metric space and f: X→X is a continuous map. Over the years, a long list of results has appeared to make precise and understand the complexity of such systems. Among them, one of the most popular is topological entropy. In modern applications, other conditions on X and f have been considered: for example, X can be non-compact, or f can be discontinuous (only at a finite number of points, with bounded or even unbounded jumps in the values of f). Such systems are interesting from a theoretical point of view in Topological Dynamics and appear frequently in applied sciences such as Electronics and Control Theory. In this paper, we review the origins of the notion of entropy and study some of its developments leading to modern notions of entropy. At the same time, we incorporate some mathematical foundations of these old and new ideas up to the appearance of Shannon entropy. To this end, we start with the first introduction of the notion of entropy in thermodynamics by R. Clausius and its evolution by L. Boltzmann, until the appearance in the twentieth century of the Shannon and Kolmogorov-Sinai entropies and the subsequent topological entropy. In turn, these notions have evolved to other recent situations where it is necessary to give extended versions of them adapted to new problems. Of special interest is to appreciate the connections between the notions of entropy of Boltzmann and Shannon. Since this history is long, we do not deal with the Kolmogorov-Sinai entropy or with topological entropy and modern approaches.
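For orientation (this summary is not part of the abstract), the three classical notions of entropy that the paper traces can be written as
\[
dS = \frac{\delta Q_{\mathrm{rev}}}{T} \ \ \text{(Clausius)}, \qquad
S = k_B \ln W \ \ \text{(Boltzmann)}, \qquad
H(p) = -\sum_{i=1}^{n} p_i \log p_i \ \ \text{(Shannon)},
\]
where δQ_rev is the reversibly exchanged heat at temperature T, W is the number of microstates compatible with the given macrostate, and p = (p_1, …, p_n) is a probability distribution.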
Keywords: Clausius, Boltzmann and Shannon entropies; Information Theory
2. Information Theory of Cartography: An Information-theoretic Framework for Cartographic Communication (Cited by: 11)
Authors: Zhilin LI, Peichao GAO, Zhu XU. Journal of Geodesy and Geoinformation Science, 2021, No. 1, pp. 1-16 (16 pages).
Maps are one of the means of communication created by human beings. Cartographers have long compared maps to natural languages in an effort to establish a "cartographic language" or "map language". One such effort is to adapt Shannon's Information Theory, which originated in digital communication, to cartography so as to establish an entropy-based theory of cartographic communication. However, success has been very limited, although research started as early as the mid-1960s. The bottleneck was found to be the lack of appropriate measures for the spatial (configurational) information of (graphic and image) maps, since the classic Shannon entropy can only characterize statistical information and fails to capture configurational information. Fortunately, after over 40 years of development, some of these bottleneck problems have been solved. More precisely, generalized Shannon entropies for the metric and thematic information of (graphic) maps have been developed, and the first feasible solution for computing the Boltzmann entropy of image maps has been invented, which is capable of measuring the spatial information of not only numerical images but also categorical maps. With such progress, it is now feasible to build the "Information Theory of Cartography". In this paper, a framework for such a theory is proposed and some key issues are identified. Some of these issues have already been tackled, while others still need effort; accordingly, a research agenda is set for future action. Once all these issues are tackled, the theory will mature and become a theoretical basis of cartography. It is expected that the Information Theory of Cartography will play an increasingly important role in the discipline, because more and more researchers have advocated that information is more fundamental than matter and energy.
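To illustrate the bottleneck described in the abstract, the following minimal Python sketch (not code from the paper; the function name and the toy maps are illustrative assumptions) shows that the classic Shannon entropy of a categorical map depends only on class frequencies, so maps with very different spatial configurations receive the same value:

import numpy as np

def shannon_entropy(categorical_map: np.ndarray) -> float:
    """Classic Shannon entropy (in bits) of the class-frequency histogram.

    Note: the value depends only on how often each class occurs,
    not on where the cells are located in the map.
    """
    _, counts = np.unique(categorical_map, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Two 4x4 categorical maps with identical class frequencies but very
# different spatial configurations ...
clustered = np.array([[0, 0, 1, 1],
                      [0, 0, 1, 1],
                      [2, 2, 3, 3],
                      [2, 2, 3, 3]])
dispersed = np.array([[0, 1, 2, 3],
                      [2, 3, 0, 1],
                      [1, 0, 3, 2],
                      [3, 2, 1, 0]])

# ... receive the same Shannon entropy (2.0 bits each), showing that the
# classic measure is blind to spatial (configurational) information.
print(shannon_entropy(clustered), shannon_entropy(dispersed))

The generalized Shannon entropies and the Boltzmann entropy of image maps mentioned in the abstract are precisely designed to distinguish such configurations; they are not reproduced here.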
Keywords: Information Theory of Cartography; cartographic communication; spatial information of maps; generalized Shannon entropy; Boltzmann entropy
3. Two-Particle Boltzmann H-theorem
Authors: Burenmandula, Ying-chun ZHAO, AQILALTU. Acta Mathematicae Applicatae Sinica (SCIE, CSCD), 2015, No. 3, pp. 747-756 (10 pages).
The celebrated H-theorem of Boltzmann has important physical significance. It states that entropy cannot diminish and that the distribution function f(z, t) must tend towards its equilibrium state. In this paper, using the relationship between solutions of the Boltzmann equation and the two-particle Boltzmann equation system, we obtain three forms of the Two-Particle Boltzmann H-theorem from the two-particle Boltzmann equation system of the BBGKY hierarchy, and give an application example of the Two-Particle Boltzmann H-theorem. The relation between entropy and the Two-Particle Boltzmann H-theorem is also obtained.
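For context (this statement is the classical single-particle result, not the two-particle forms derived in the paper), the H-theorem can be written as
\[
H(t) = \int f(z,t)\,\ln f(z,t)\,dz, \qquad \frac{dH}{dt} \le 0,
\]
so that S(t) = -k_B H(t) is non-decreasing and f relaxes towards the equilibrium (Maxwell-Boltzmann) distribution. The paper's two-particle versions concern the two-particle equation system of the BBGKY hierarchy.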
Keywords: two-particle Boltzmann system; two-particle Boltzmann H-theorem; Boltzmann H-theorem; entropy
4. Dynamic statistical information theory (Cited by: 3)
Author: XING Xiusan. Science China (Physics, Mechanics & Astronomy) (SCIE, EI, CAS), 2006, No. 1, pp. 1-37 (37 pages).
In recent years we extended Shannon's static statistical information theory to dynamic processes and established a Shannon dynamic statistical information theory, whose core is the evolution law of dynamic entropy and dynamic information. We also proposed a corresponding Boltzmann dynamic statistical information theory. Based on the fact that the state-variable evolution equations of the respective dynamic systems, i.e. the Fokker-Planck equation and the Liouville diffusion equation, can be regarded as their information-symbol evolution equations, we derived the nonlinear evolution equations of Shannon dynamic entropy density and dynamic information density, and the nonlinear evolution equations of Boltzmann dynamic entropy density and dynamic information density, which describe respectively the evolution law of dynamic entropy and dynamic information. The evolution equations of these two kinds of dynamic entropy and dynamic information show in unison that the time rate of change of the dynamic entropy densities is caused by their drift, diffusion and production in state-variable space inside the systems and in coordinate space in the transmission processes, and that the time rate of change of the dynamic information densities originates from their drift, diffusion and dissipation in state-variable space inside the systems and in coordinate space in the transmission processes. Entropy and information are thereby combined with the state of the systems and its law of motion. Furthermore, we presented the formulas of the two kinds of entropy production rates and information dissipation rates, and the expressions of the two kinds of drift information flows and diffusion information flows. We proved that the two kinds of information dissipation rates (or decrease rates of the total information) are equal to their corresponding entropy production rates (or increase rates of the total entropy) in the same dynamic system. We obtained the formulas of the two kinds of dynamic mutual information and dynamic channel capacities reflecting the dynamic dissipation characteristics of the transmission processes, which reduce to their maximum, the present static mutual information and static channel capacity, in the limit case where the ratio of channel length to information transmission rate approaches zero. All these unified and rigorous theoretical formulas and results are derived from the evolution equations of dynamic information and dynamic entropy without adding any extra assumption. In this review, we give an overview of the above main ideas, methods and results, and discuss the similarities and differences between the two kinds of dynamic statistical information theories.
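As rough orientation only (the paper's specific evolution equations are not reproduced in the abstract), the basic objects involved are of the form
\[
\frac{\partial \rho(x,t)}{\partial t}
= -\frac{\partial}{\partial x}\bigl[A(x)\,\rho(x,t)\bigr]
+ \frac{\partial^{2}}{\partial x^{2}}\bigl[D(x)\,\rho(x,t)\bigr],
\qquad
S(t) = -\int \rho(x,t)\,\ln \rho(x,t)\,dx,
\]
where ρ is the probability density of the state variable x, A is the drift coefficient, D the diffusion coefficient, and S the Shannon entropy functional; the dynamic-entropy evolution equations described above are obtained by differentiating such functionals along the Fokker-Planck (or Liouville diffusion) dynamics.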
Keywords: evolution equation of Shannon information (entropy); evolution equation of Boltzmann information (entropy); information (entropy) flow; information (entropy) diffusion; entropy production rate; information dissipation rate; dynamic mutual information; dynamic channel capacity