Journal Articles
4 articles found
1. Shannon Entropy as a Measurement of the Information in a Multiconfiguration Dirac-Fock Wavefunction
Authors: Wan Jianjie. Chinese Physics Letters (SCIE, CAS, CSCD), 2015, No. 2, pp. 52-55 (4 pages)
Discrete Shannon entropy is applied to describe the information in a multiconfiguration Dirac-Fock wavefunction. The dependence of the Shannon entropy is shown as the configuration space is enlarged, and it reaches saturation when there are enough configuration state wavefunctions to obtain convergent energy levels; that is, the calculation procedure in the multiconfiguration Dirac-Fock method is an entropy saturation process. At the same accuracy level, the basis set with the smallest entropy best describes the energy state. Additionally, a connection can be set up between sudden changes of the Shannon information entropies and energy level crossings along an isoelectronic sequence, which is helpful for finding the energy level crossings of interest in interpreting and foreseeing the inversion scheme of energy levels for an x-ray laser.
Keywords: Shannon entropy, information, multiconfiguration Dirac-Fock wavefunction
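The discrete Shannon entropy this abstract refers to can be sketched in a few lines: for a normalized multiconfiguration expansion with mixing coefficients c_i, the entropy is S = -Σ c_i² ln c_i². The coefficient values below are invented for illustration (a minimal sketch, not MCDF output):

```python
import numpy as np

# Hypothetical mixing coefficients of a multiconfiguration expansion
# |Psi> = sum_i c_i |CSF_i>; the values are illustrative, not MCDF output.
c = np.array([0.98, 0.15, 0.10, 0.05, 0.03])
c /= np.linalg.norm(c)        # enforce sum_i c_i**2 = 1

p = c**2                      # weight of each configuration state function
S = -np.sum(p * np.log(p))    # discrete Shannon entropy (in nats)
print(S)                      # small S: one CSF dominates the expansion
```

A single dominant CSF gives S near 0, while weight spread evenly over n CSFs gives the maximum ln n; this is why, at fixed accuracy, the basis with the smallest entropy is the most compact description of the state.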
2. Shannon information entropies for the position-dependent mass Schrödinger problem with a hyperbolic well
Authors: Sun Guo-Hua, Duan Popov, Oscar Camacho-Nieto, Dong Shi-Hai. Chinese Physics B (SCIE, EI, CAS, CSCD), 2015, No. 10, pp. 45-52 (8 pages)
The Shannon information entropy for the Schrödinger equation with a nonuniform solitonic mass is evaluated for a hyperbolic-type potential. The number of nodes of the wave functions in the transformed space z is broken when they are recovered to the original space x. The position Sx and momentum Sp information entropies for six low-lying states are calculated. We notice that Sx decreases with increasing mass-barrier width a and becomes negative beyond a particular width a, while Sp first increases with a and then decreases with it. The negative Sx exists for probability densities that are highly localized. We find that the probability density ρ(x) for n = 1, 3, 5 is greater than 1 at position x = 0. Some interesting features of the information entropy densities ρs(x) and ρs(p) are demonstrated. The Bialynicki-Birula-Mycielski (BBM) inequality is also tested for these states and found to hold.
Keywords: position-dependent mass, Shannon information entropy, hyperbolic potential, Fourier transform
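The BBM inequality tested in this abstract, Sx + Sp ≥ 1 + ln π in one dimension with ħ = 1, can be checked numerically on a grid. The sketch below uses a constant-mass Gaussian ground state, which saturates the bound, rather than the paper's position-dependent-mass potential; the grid size and widths are illustrative assumptions:

```python
import numpy as np

# Check case for the BBM inequality Sx + Sp >= 1 + ln(pi):
# harmonic-oscillator ground state (hbar = m = omega = 1), which saturates it.
N, L = 4096, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
psi = np.pi ** -0.25 * np.exp(-x ** 2 / 2)       # normalized Gaussian

rho_x = np.abs(psi) ** 2
Sx = -np.sum(rho_x * np.log(rho_x + 1e-300)) * dx

# momentum-space density via FFT with continuum normalization
# (the grid-offset phase factor is irrelevant for |phi|^2)
p = 2 * np.pi * np.fft.fftfreq(N, d=dx)
phi = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi)
rho_p = np.abs(phi) ** 2
dp = 2 * np.pi / (N * dx)
Sp = -np.sum(rho_p * np.log(rho_p + 1e-300)) * dp

print(Sx + Sp, 1 + np.log(np.pi))  # equal for the Gaussian (bound saturated)
```

For localized non-Gaussian densities Sx alone can go negative, as the abstract notes, but the sum Sx + Sp always stays above 1 + ln π.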
3. Dynamics of Information Entropies of Atom-Field Entangled States Generated via the Jaynes-Cummings Model
Authors: R. Pakniat, M. K. Tavassoly, M. H. Zandi. Communications in Theoretical Physics (SCIE, CAS, CSCD), 2016, No. 3, pp. 266-272 (7 pages)
In this paper we study the dynamical evolution of Shannon information entropies in position and momentum spaces for two classes of (nonstationary) atom-field entangled states, which are obtained via the Jaynes-Cummings model and its generalization. We focus on the interaction of two-level and Λ-type three-level atoms with a single-mode quantized field. Three-dimensional plots of the entropy densities in position and momentum spaces are presented numerically versus the corresponding coordinates and time. It is observed that for particular values of the parameters of the systems, entropy squeezing in position space occurs. Finally, we show that the well-known BBM (Beckner, Bialynicki-Birula and Mycielski) inequality, which is a stronger statement of the Heisenberg uncertainty relation, is properly satisfied.
Keywords: Shannon information entropy, entropy squeezing, BBM inequality, Jaynes-Cummings model
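A reduced flavor of the entropy dynamics in this abstract can already be seen in the exactly solvable resonant Jaynes-Cummings model. The sketch below tracks the Shannon entropy of the two atomic populations (a measure of atom-field entanglement while the joint state stays pure), not the position-space entropies studied in the paper; the coupling g and photon number n are illustrative assumptions:

```python
import numpy as np

# Resonant Jaynes-Cummings model, field initially in Fock state |n>, atom
# excited. The exact dynamics stay in the two-level subspace {|e,n>, |g,n+1>}
# with populations cos^2 / sin^2 of the Rabi angle g*sqrt(n+1)*t; we follow
# the Shannon entropy of those populations in time.
g, n = 1.0, 4
t = np.linspace(0.0, 2 * np.pi, 400)
theta = g * np.sqrt(n + 1) * t
p_e = np.cos(theta) ** 2        # probability the atom is still excited
p_g = 1.0 - p_e
S = -(p_e * np.log(p_e + 1e-300) + p_g * np.log(p_g + 1e-300))
print(S.max())                  # peaks near ln 2 at maximal entanglement
```

The entropy oscillates between 0 (disentangled) and ln 2 (maximally entangled) at the Rabi frequency, the simplest example of the periodic entropy dynamics the full model generalizes.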
4. Dynamic statistical information theory (cited 3 times)
Authors: XING Xiusan. Science China (Physics, Mechanics & Astronomy) (SCIE, EI, CAS), 2006, No. 1, pp. 1-37 (37 pages)
In recent years we extended Shannon's static statistical information theory to dynamic processes and established a Shannon dynamic statistical information theory, whose core is the evolution law of dynamic entropy and dynamic information. We also proposed a corresponding Boltzmann dynamic statistical information theory. Based on the fact that the state-variable evolution equations of the respective dynamic systems, i.e. the Fokker-Planck equation and the Liouville diffusion equation, can be regarded as their information-symbol evolution equations, we derived the nonlinear evolution equations of the Shannon dynamic entropy density and dynamic information density, and the nonlinear evolution equations of the Boltzmann dynamic entropy density and dynamic information density, which describe respectively the evolution law of dynamic entropy and dynamic information. The evolution equations of these two kinds of dynamic entropy and dynamic information show in unison that the time rate of change of the dynamic entropy densities is caused by their drift, diffusion, and production in state-variable space inside the systems and in coordinate space in the transmission processes; and that the time rate of change of the dynamic information densities originates from their drift, diffusion, and dissipation in state-variable space inside the systems and in coordinate space in the transmission processes. Entropy and information are thus combined with the state of the systems and its law of motion. Furthermore, we presented the formulas of the two kinds of entropy production rates and information dissipation rates, and the expressions of the two kinds of drift information flows and diffusion information flows.
We proved that the two kinds of information dissipation rates (or the decrease rates of the total information) equal their corresponding entropy production rates (or the increase rates of the total entropy) in the same dynamic system. We obtained the formulas of the two kinds of dynamic mutual information and dynamic channel capacity reflecting the dynamic dissipation characteristics in the transmission processes, which reduce to their maxima, namely the present static mutual information and static channel capacity, in the limit where the ratio of channel length to information transmission rate approaches zero. All these unified and rigorous theoretical formulas and results are derived from the evolution equations of dynamic information and dynamic entropy without adding any extra assumption. In this review, we give an overview of the above main ideas, methods, and results, and discuss the similarity and difference between the two kinds of dynamic statistical information theories.
Keywords: evolution equation of Shannon information (entropy), evolution equation of Boltzmann information (entropy), information (entropy) flow, information (entropy) diffusion, entropy production rate, information dissipation rate, dynamic mutual information, dynamic channel capacity
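The entropy-production side of the review's theory can be illustrated with the simplest Fokker-Planck equation, zero drift and constant diffusion D, whose Gaussian solution has variance var(t) = var0 + 2Dt. The sketch below evaluates the Shannon entropy of that analytic solution on a grid and shows it growing monotonically; D, var0, and the times are illustrative assumptions, not quantities from the paper:

```python
import numpy as np

# Free diffusion (Fokker-Planck equation with zero drift, diffusion D): a
# Gaussian initial density stays Gaussian with var(t) = var0 + 2*D*t, and its
# Shannon (differential) entropy grows in time -- pure entropy production,
# a minimal numerical illustration, not the review's general formalism.
D, var0 = 0.5, 0.1
x = np.linspace(-20.0, 20.0, 2001)
dx = x[1] - x[0]

def gaussian(var):
    return np.exp(-x ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def entropy(rho):
    return -np.sum(rho * np.log(rho + 1e-300)) * dx

S = [entropy(gaussian(var0 + 2 * D * t)) for t in (0.0, 1.0, 2.0, 4.0)]
print(S)   # strictly increasing: diffusion produces entropy monotonically
```

Each value matches the closed form S(t) = (1/2) ln(2πe var(t)), so the entropy production rate here is dS/dt = D/var(t), positive as long as D > 0.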