Maps are one of the means of communication created by human beings. Cartographers have long compared maps to natural languages in an effort to establish a "cartographic language" or "map language". One such effort is to adapt Shannon's information theory, which originated in digital communication, to cartography so as to establish an entropy-based theory of cartographic communication. However, although this research began as early as the mid-1960s, its success has been very limited. The bottleneck was the lack of appropriate measures for the spatial (configurational) information of (graphic and image) maps: the classic Shannon entropy characterizes statistical information but fails to capture configurational information. Fortunately, after over 40 years of development, some of these bottleneck problems have been solved. More precisely, generalized Shannon entropies for the metric and thematic information of (graphic) maps have been developed, and the first feasible solution for computing the Boltzmann entropy of image maps has been invented, which can measure the spatial information of both numerical images and categorical maps. With such progress, it is now feasible to build an "Information Theory of Cartography". This paper proposes a framework for such a theory and identifies some key issues; some have already been tackled, while others still require effort, so a research agenda is set for future action. Once these issues are resolved, the theory will mature into a theoretical basis for cartography. The Information Theory of Cartography is expected to play an increasingly important role in the discipline, because more and more researchers argue that information is more fundamental than matter and energy.
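The statistical-versus-configurational distinction can be made concrete with a small sketch (the maps and the helper function below are illustrative, not from the paper): two categorical maps with identical class frequencies receive the same Shannon entropy regardless of how their cells are arranged in space.

```python
import math
from collections import Counter

def shannon_entropy(cells):
    """Shannon entropy (bits) of the class-frequency distribution of a categorical map."""
    counts = Counter(cells)
    n = len(cells)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Two 3x3 categorical maps with identical class frequencies (5 A, 4 B)
# but very different spatial layouts:
clustered = ["A", "A", "A", "A", "A", "B", "B", "B", "B"]
dispersed = ["A", "B", "A", "B", "A", "B", "A", "B", "A"]

# Shannon entropy sees only the frequency histogram, so both maps score
# identically -- exactly the limitation for configurational information
# that the abstract describes.
assert abs(shannon_entropy(clustered) - shannon_entropy(dispersed)) < 1e-9
```

This is why a configuration-sensitive measure such as the Boltzmann entropy of image maps is needed in addition to the classic Shannon entropy.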
Information-based models for radiation emitted by a black body and passed through a scattering medium are analyzed. In the limit of no scattering, the model reverts to the black-body radiation law. The advantage of this mathematical model is that it includes the effect of scattering of the radiation between source and detector. For the case in which the exact form of the scattering mechanism is not known, a model with a single scattering parameter is derived, along with a simplified version useful for analyzing large data sets.
The increasing dependence on data highlights the need for a detailed understanding of its behavior, including the challenges involved in processing and evaluating it. However, current research lacks a comprehensive framework for measuring the worth of data elements, hindering effective navigation of the changing digital environment. This paper aims to fill this gap by introducing the concept of "data components." It proposes a graph-theoretic representation model with a clear mathematical definition and demonstrates the advantages of data components over traditional processing methods. It also introduces an information measurement model for calculating the information entropy of data components and establishing their increased informational value, and it assesses the value of information, suggesting a pricing mechanism based on its significance. In conclusion, the paper establishes a robust framework for understanding and quantifying the value of implicit information in data, laying the groundwork for future research and practical applications.
Synthetic aperture radar (SAR) is portrayed as a multiple-access channel. An information-theoretic approach is applied to the SAR imaging system, and the information content about a target that can be extracted from its radar image is evaluated by the average mutual information measure. A conditional (transition) probability density function (PDF) of the SAR imaging system is derived by analyzing the system, and a closed form of the information content is found. It is shown that, as the number of looks increases, the information content obtained from each independent sample of echoes decreases while the total information content obtained by the SAR imaging system increases. Because the total average mutual information also defines a measure of radiometric resolution for radar images, it follows that the radiometric resolution of a radar image of terrain is improved by spatial averaging. In addition, the imaging process and the data-compression process for SAR are each treated as an independent generalized communication channel; the effects of data compression on radiometric resolution are studied and conclusions drawn.
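The paper derives the exact conditional PDF of the SAR system; the qualitative multilook behavior it reports can be mimicked with a much simpler Gaussian-channel analogy (the model and the `total_info_bits` helper are illustrative assumptions, not the paper's derivation): averaging independent looks raises total information while each extra look contributes less.

```python
import math

def total_info_bits(n_looks, snr):
    """Total mutual information (bits) about a Gaussian target parameter after
    averaging n_looks independent, equally noisy looks (Gaussian-channel analogy)."""
    return 0.5 * math.log2(1 + n_looks * snr)

snr = 1.0
totals = [total_info_bits(n, snr) for n in range(1, 6)]
increments = [totals[0]] + [t2 - t1 for t1, t2 in zip(totals, totals[1:])]

# Total information grows with the number of looks...
assert all(t2 > t1 for t1, t2 in zip(totals, totals[1:]))
# ...while the information contributed by each additional look shrinks,
# mirroring the multilook trade-off described in the abstract.
assert all(i2 < i1 for i1, i2 in zip(increments, increments[1:]))
```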
This paper introduces computer vision from an information-theory perspective. We discuss how vision can be thought of as a decoding problem in which the goal is to find the most efficient encoding of the visual scene. This requires probabilistic models capable of capturing the complexity and ambiguities of natural images. We start by describing classic Markov random field (MRF) models of images, stressing the importance of efficient inference and learning algorithms for these models and emphasizing approaches that use concepts from information theory. Next, we introduce more powerful, recently developed image models that better handle the complexities of natural images; these models use stochastic grammars and hierarchical representations and are trained on images from increasingly large databases. Finally, we describe how techniques from information theory can be used to analyze vision models and measure the effectiveness of different visual cues.
Future communication systems will include different types of messages requiring different transmission rates, packet lengths, and service qualities. We address the power-optimization issues of communication systems conveying multiple message types based on finite-delay information theory. Given both the normalized transmission rate and the packet length of a system, the actual residual decoding error rate is a function of the transmission power. We propose a generalized power-allocation framework for multiple message types. Two different optimization cost functions are adopted: the number of service-quality violations encountered and the sum log-ratio of the residual decoding error rates. We provide the optimal analytical solution for the former cost function and a heuristic solution based on a genetic algorithm for the latter. Finally, the performance of the proposed solutions is evaluated numerically.
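The abstract's premise, a residual decoding error rate that depends on transmit power at fixed rate and packet length, can be sketched with the standard normal approximation for the AWGN channel (a common finite-blocklength tool; the paper's exact channel model and rates are not given, so the numbers here are illustrative):

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def residual_error(power, rate, n):
    """Normal-approximation estimate of the block decoding error rate of an
    AWGN channel at SNR `power`, rate `rate` (bits/channel use), blocklength n:
    eps ~ Q( (n*(C - R) + 0.5*log2 n) / sqrt(n*V) )."""
    c = 0.5 * math.log2(1 + power)                                   # capacity
    v = (power * (power + 2)) / (2 * (power + 1) ** 2) * math.log2(math.e) ** 2  # dispersion
    return q_func((n * (c - rate) + 0.5 * math.log2(n)) / math.sqrt(n * v))

# At rate 0.5 bit/use and packet length n = 200, the error rate falls
# monotonically as transmit power rises -- the function a power allocator
# would trade off across message types.
errs = [residual_error(p, 0.5, 200) for p in (0.8, 1.0, 1.5, 2.0)]
assert all(e2 < e1 for e1, e2 in zip(errs, errs[1:]))
```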
Although numerous advances have been made in information technology in the past decades, there is still a lack of progress in information systems dynamics (ISD), owing to the lack of a mathematical foundation for describing information and of an analytical framework for evaluating information systems. The value of ISD lies in its ability to guide the design, development, application, and evaluation of large-scale information system-of-systems (SoSs), just as mechanical dynamics theories guide mechanical systems engineering. This paper reports a breakthrough in these fundamental challenges by proposing a framework for information space, improving a mathematical theory for information measurement, and proposing a dynamic configuration model for information systems, thereby establishing a basic theoretical framework for ISD. The proposed methodologies have been successfully applied and verified in the Smart Court SoSs Engineering Project of China, achieving significant improvements in the quality and efficiency of Chinese court informatization. The proposed ISD provides an innovative paradigm for the analysis, design, development, and evaluation of large-scale complex information systems, such as electronic government and smart cities.
This research integrates Bekenstein's bound and Landauer's principle, providing a unified framework for understanding the limits of information and energy in physical systems. By combining these principles, we explore the implications for black-hole thermodynamics, astrophysics, astronomy, information theory, and the search for new laws of nature. The results include an estimate of the number of bits stored in a black hole (less than 1.4 × 10³⁰ bits/m³), enhancing our understanding of information storage in extreme gravitational environments. This integration offers insights into the fundamental nature of information and energy, with impact across multiple disciplines.
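The paper's volumetric bit density is its own result; as a hedged back-of-envelope companion, the standard Bekenstein-Hawking entropy gives the number of bits associated with a black hole's horizon area (the constants and the `horizon_bits` helper are illustrative, not taken from the paper):

```python
import math

# Physical constants (SI, CODATA-rounded).
G = 6.674e-11      # gravitational constant
c = 2.998e8        # speed of light
hbar = 1.055e-34   # reduced Planck constant

def horizon_bits(mass_kg):
    """Bekenstein-Hawking entropy of a Schwarzschild black hole, in bits:
    S / (k ln 2) = A * c^3 / (4 * G * hbar * ln 2), A = horizon area."""
    r_s = 2 * G * mass_kg / c**2       # Schwarzschild radius
    area = 4 * math.pi * r_s**2
    return area * c**3 / (4 * G * hbar * math.log(2))

solar_mass = 1.989e30  # kg
print(f"{horizon_bits(solar_mass):.2e}")  # on the order of 1e77 bits
```

For a solar-mass black hole this yields roughly 10⁷⁷ bits, the familiar scale against which volumetric estimates like the paper's can be compared.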
Both consciousness and quantum phenomena are subjective and indeterministic. In this paper, we propose that consciousness is a quantum phenomenon. A quantum theory of consciousness (QTOC) is presented based on a new interpretation of quantum physics. We show that this QTOC can address the mind-body problem, the hard problem of consciousness. It also provides a physical foundation and mathematical formulation for studying consciousness and neural networks. We demonstrate how to apply it to develop and extend various models of consciousness, and we present the theory's predictions of a universal quantum vibrational field and of large-scale, nearly instantaneous synchrony of brainwaves among different parts of the brain, the body, people, and objects. The correlation between Schumann resonances and some brainwaves is explained. Recent progress in quantum information theory, especially regarding quantum entanglement and quantum error-correcting codes, is applied to the study of memory and sheds new light on neuroscience.
Since the 1970s, the application of information processing and information technology has gradually come to play a dominant role in Western economies, especially the American economy. Since the mid-1980s, investment by the United States, Europe, and Japan in information technology has increased drastically, making information …
The "Cantonese Cuisine Master" project is an important policy proposed by China to preserve Cantonese cuisine culture, promote employment, and achieve targeted poverty reduction and rural revitalization. Confronted with demands for more diverse education, the education system faces an essential opportunity and task in considering how to construct high-quality online courses and pursue higher-quality "Cantonese Cuisine Master" projects in line with the new era. Based on instructional media theory and information processing theory, this paper clarifies the demand, dilemmas, and development strategy of online course construction for culinary majors, and explores its construction and practice through the example of "A Bite of Teochew Cuisine," a Guangdong first-class course.
Vocabulary acquisition is an intricate process closely related to memory. In cognitive psychology, a large number of studies on the memory system have been conducted within the framework of information processing theory, placing great value on second-language learners' cognitive processes. This study probes second-language vocabulary acquisition from the perspective of information processing theory, in the hope of helping learners acquire vocabulary more scientifically and efficiently.
This study explores the capabilities of ChatGPT, specifically in relation to consciousness and its performance in the Turing Test. The article begins by examining the diverse perspectives among cognitive and AI researchers regarding ChatGPT's ability to pass the Turing Test. It introduces a hierarchical categorization of versions of the test, suggesting that ChatGPT approaches success in the test, albeit primarily with naïve users; expert users, conversely, can easily identify its limitations. The paper presents various theories of consciousness, with a particular focus on the Integrated Information Theory (IIT) proposed by Tononi, which serves as the framework for assessing ChatGPT's level of consciousness. Through an evaluation based on the five axioms and theorems of IIT, the study finds that ChatGPT surpasses previous AI systems in certain aspects but falls significantly short of achieving consciousness, particularly when compared with biological sentient beings. The paper concludes by emphasizing the importance of recognizing ChatGPT and similar generative AI models as highly advanced and intelligent tools that nonetheless distinctly lack the attributes of consciousness found in advanced living organisms.
In quantum information theory (QIT), the classical measures of information content in probability distributions are replaced by resultant entropic descriptors containing nonclassical terms generated by the state phase or its gradient (electronic current). The classical Shannon (S[p]) and Fisher (I[p]) information terms probe the entropic content of incoherent local events of particle localization, embodied in the probability distribution p, while their nonclassical phase companions, S[Φ] and I[Φ], provide the relevant coherence information supplements. Thermodynamic-like couplings between the entropic and energetic descriptors of molecular states are shown to be precluded by the principles of quantum mechanics. The maximum of resultant entropy determines the phase-equilibrium state, defined by the "thermodynamic" phase related to the electronic density, which can be used to describe reactants in hypothetical stages of a bimolecular chemical reaction. Information channels of molecular systems and their entropic bond indices are summarized, complete-bridge propagations are examined, and sequential cascades involving the complete sets of atomic-orbital intermediates are interpreted as Markov chains. The QIT description is applied to reactive systems R = A―B, composed of acidic (A) and basic (B) reactants. Electronegativity-equalization processes are investigated, as are the implications of concerted patterns of electronic flows in equilibrium states of the complementarily arranged substrates. Quantum communications between reactants are explored, and QIT descriptors of the A―B bond multiplicity and composition are extracted.
In this paper, a new approach to relativistic information entropy is used to assess relative uncertainties in structural reliability assessment. The approach combines information theory with relativistic theory and can measure the relativity of parameter uncertainty and system uncertainty in structural reliability theory with respect to the same generalized relativistic reference system, allowing structural reliability to be assessed reasonably.
Based on nonlinear prediction and information theory, the vertical heterogeneity of predictability and the information loss rate in the geopotential height field over the Northern Hemisphere are obtained. On the seasonal-to-interannual time scale, predictability is low in the lower troposphere and high in the mid-upper troposphere, although within the mid-upper troposphere over subtropical ocean areas predictability is relatively poor; these conclusions also hold on the seasonal time scale. On the interannual time scale the pattern reverses: predictability becomes high in the lower troposphere and low in the mid-upper troposphere. On the whole, the interannual trend is more predictable than the seasonal trend. The average information loss rate is low over the mid-east Pacific, the west of North America, the Atlantic, and Eurasia, while the atmosphere elsewhere has a relatively high information loss rate on all time scales. Two channels are found steadily over the subtropical Pacific and Atlantic Oceans; there are also unstable channels. The influence of the four seasons on predictability and information communication is studied: predictability is low no matter which season's data are removed, and each season except winter plays an important role in the existence of the channels. Predictability and teleconnections are paramount issues in atmospheric science, and teleconnections may be established by communication channels, so this work is of interest in revealing the vertical structure of the predictability distribution, the channel locations, and the contributions of different time scales to them and their variations across seasons.
A normalized measure is established to quantify the degree of observability of a discrete-time, stochastically autonomous system. The measure is based on generalized information-theoretic quantities (generalized entropy and mutual information) of the system state and the observations, where the state may be a discrete or a continuous random vector. Some important properties are presented. For the linear case, an explicit formula for the degree of observability is derived, and the equivalence between the proposed measure and the traditional rank condition is proved. Curves for the degree of observability are depicted in a simple example.
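The underlying idea, that mutual information between state and observation quantifies how observable the state is, has a closed form in the scalar linear-Gaussian case (this sketch illustrates the principle only; it is not the paper's normalized measure or its explicit formula):

```python
import math

def observability_bits(h, var_x, var_v):
    """Mutual information I(x; y) in bits between a scalar Gaussian state
    x ~ N(0, var_x) and the noisy observation y = h*x + v, v ~ N(0, var_v)."""
    return 0.5 * math.log2(1 + h**2 * var_x / var_v)

# A stronger observation gain extracts more information about the state,
# and h = 0 (an unobservable state) yields exactly zero bits, matching
# the intuition behind a rank-style observability test.
assert observability_bits(1.0, 1.0, 0.1) > observability_bits(0.5, 1.0, 0.1)
assert observability_bits(0.0, 1.0, 0.1) == 0.0
```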
We consider a quadratic Gaussian distributed lossy source coding setup with the additional constraint of identical reconstructions at the encoder and the decoder. The setup consists of two correlated Gaussian sources: one must be reconstructed to within a distortion constraint and match a corresponding reconstruction at the encoder, while the other serves as coded side information. We study the trade-off between the rates of the two encoders for a given distortion constraint on the reconstruction; an explicit characterization of this trade-off is the main result of the paper. We also give close inner and outer bounds for the discrete memoryless version of the problem.
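A useful reference point for such rate-distortion trade-offs is the classic point-to-point Gaussian rate-distortion function, which the distributed setup generalizes (this sketch is standard background, not the paper's two-encoder characterization):

```python
import math

def gaussian_rd(var, d):
    """Rate-distortion function of a memoryless Gaussian source with variance
    `var` under mean-squared-error distortion d: R(D) = max(0, 0.5*log2(var/D))."""
    return max(0.0, 0.5 * math.log2(var / d))

# Halving the allowed distortion costs exactly half a bit per sample...
assert abs(gaussian_rd(1.0, 0.25) - gaussian_rd(1.0, 0.5) - 0.5) < 1e-12
# ...and distortion at or above the source variance needs no rate at all.
assert gaussian_rd(1.0, 2.0) == 0.0
```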
Market data for financial studies typically derive from either historical transactions or contemporaneous surveys of sentiment and perceptions. The research communities analyzing these two categories of source data see themselves as distinct, each with advantages not shared by the other. This research investigates those claims in an information-theoretic context and suggests where methods and controls can be improved. It develops a Fisher information metric for Likert scales and explores the effect of particular survey design decisions and results on information content. The Fisher information metric outperforms earlier metrics by converging reliably to values that are intuitive in the sense that the information captured from subjects appears fairly stable. The analysis suggests that the bias and response dispersion inherent in specific surveys may require sample sizes several orders of magnitude larger to compensate for information loss and to support valid conclusions at a given significance and power. The conclusions prioritize quality of design and the factors relevant to survey design, and illustrative examples provide guidance for assessing the information content of a survey.
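One common way to attach a Fisher information to a Likert item, offered here as an illustrative assumption rather than the paper's construction, is to model the response as a latent unit-variance Gaussian thresholded at fixed cut points; coarser scales then retain less of the latent information:

```python
import math

def norm_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def likert_probs(theta, cuts):
    """Category probabilities when a latent N(theta, 1) response is reported
    on a Likert scale with the given ordered cut points."""
    edges = [-math.inf] + list(cuts) + [math.inf]
    return [norm_cdf(b - theta) - norm_cdf(a - theta) for a, b in zip(edges, edges[1:])]

def fisher_information(theta, cuts, eps=1e-5):
    """I(theta) = sum_k (dp_k/dtheta)^2 / p_k, derivatives taken numerically."""
    p = likert_probs(theta, cuts)
    p_hi = likert_probs(theta + eps, cuts)
    p_lo = likert_probs(theta - eps, cuts)
    return sum(((hi - lo) / (2 * eps)) ** 2 / pk for pk, hi, lo in zip(p, p_hi, p_lo))

# A 5-point scale with evenly spaced cuts retains most of the latent Fisher
# information (1.0 for a unit-variance Gaussian); a 2-point scale loses more,
# which is the kind of information loss the sample-size argument rests on.
five = fisher_information(0.0, [-1.5, -0.5, 0.5, 1.5])
two = fisher_information(0.0, [0.0])
assert two < five < 1.0
```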
Funding (Information Theory of Cartography): National Natural Science Foundation of China (Nos. 41930104, 41971330); Hong Kong Research Grants Council General Research Fund (No. 152219/18E).
Funding (data components): EU H2020 Research and Innovation Program under the Marie Skłodowska-Curie Grant Agreement (Project DEEP, Grant No. 101109045); National Key R&D Program of China (Grant No. 2018YFB1800804); National Natural Science Foundation of China (Nos. 61925105 and 62171257); Tsinghua University-China Mobile Communications Group Co., Ltd. Joint Institute; the Fundamental Research Funds for the Central Universities, China (No. FRF-NP-20-03).
Funding (computer vision): NSF grants IIS-0917141 and 0613563; AFOSR FA9550-08-1-0489.
Funding (power optimization for multiple message types): National Key Basic Research Program of China (No. 2013CB329201); Key Program of the National Natural Science Foundation of China (No. 61631018); Key Research Program of Frontier Sciences of CAS (No. QYZDY-SSW-JSC003); Key Project in Science and Technology of Guangdong Province (No. 2014B010119001); Shenzhen Peacock Plan (No. 1108170036003286); the Fundamental Research Funds for the Central Universities.
Funding (information systems dynamics): National Key Research and Development Program of China (2016YFC0800801); Research and Innovation Project of China University of Political Science and Law (10820356); the Fundamental Research Funds for the Central Universities.
文摘Although numerous advances have been made in information technology in the past decades,there is still a lack of progress in information systems dynamics(ISD),owing to the lack of a mathematical foundation needed to describe information and the lack of an analytical framework to evaluate information systems.The value of ISD lies in its ability to guide the design,development,application,and evaluation of largescale information system-of-systems(So Ss),just as mechanical dynamics theories guide mechanical systems engineering.This paper reports on a breakthrough in these fundamental challenges by proposing a framework for information space,improving a mathematical theory for information measurement,and proposing a dynamic configuration model for information systems.In this way,it establishes a basic theoretical framework for ISD.The proposed theoretical methodologies have been successfully applied and verified in the Smart Court So Ss Engineering Project of China and have achieved significant improvements in the quality and efficiency of Chinese court informatization.The proposed ISD provides an innovative paradigm for the analysis,design,development,and evaluation of large-scale complex information systems,such as electronic government and smart cities.
Abstract: This research aims to integrate Bekenstein's bound and Landauer's principle, providing a unified framework to understand the limits of information and energy in physical systems. By combining these principles, we explore the implications for black hole thermodynamics, astrophysics, astronomy, information theory, and the search for new laws of nature. The result includes an estimation of the number of bits stored in a black hole (less than 1.4 × 10^30 bits/m^3), enhancing our understanding of information storage in extreme gravitational environments. This integration offers valuable insights into the fundamental nature of information and energy, impacting scientific advancements in multiple disciplines.
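For orientation on the scales involved, the Bekenstein bound I ≤ 2πRE/(ħc ln 2) can be evaluated directly. The worked example below applies it to a solar-mass black hole (where the bound is saturated by the Bekenstein-Hawking entropy); it is a generic illustration, not a reproduction of the specific assumptions behind the abstract's 1.4 × 10^30 bits/m^3 figure.

```python
import math

# Physical constants (SI units)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
hbar = 1.0546e-34    # reduced Planck constant, J s

def bekenstein_bits(radius_m, energy_j):
    # Bekenstein bound: a region of radius R holding energy E contains
    # at most I = 2*pi*R*E / (hbar * c * ln 2) bits of information.
    return 2.0 * math.pi * radius_m * energy_j / (hbar * c * math.log(2.0))

M_sun = 1.989e30                 # solar mass, kg
R = 2.0 * G * M_sun / c**2       # Schwarzschild radius, ~3 km
E = M_sun * c**2                 # rest-mass energy, J
bits = bekenstein_bits(R, E)     # ~1.5e77 bits for a solar-mass black hole
```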
Abstract: Both consciousness and quantum phenomena are subjective and indeterministic. In this paper, we propose that consciousness is a quantum phenomenon. A quantum theory of consciousness (QTOC) is presented based on a new interpretation of quantum physics. We show that this QTOC can address the mind-body problem and the hard problem of consciousness. It also provides a physics foundation and a mathematical formulation for studying consciousness and neural networks. We demonstrate how to apply it to develop and extend various models of consciousness. We present the predictions of this theory about the existence of a universal quantum vibrational field and the large-scale, nearly instantaneous synchrony of brainwaves among different parts of the brain, body, people, and objects. The correlation between Schumann resonances and some brainwaves is explained. Recent progress in quantum information theory, especially regarding quantum entanglement and quantum error-correcting codes, is applied to study memory and sheds new light on neuroscience.
Abstract: Since the 1970s, the application of information processing and information technology has gradually played a dominant role in Western economies, especially the American economy. Since the mid-1980s, investment by the United States, Europe, and Japan in information technology has increased drastically, making information
Funding: The research result of "A Bite of Teochew Cuisine" under the Guangdong Quality Project (Open Online Course) and of "The Creation of Excellent Science Popularization Works for Chinese Molecular Cooking Micro-course" under the Guangdong Science and Technology Program (Project No. 2019A141405059).
Abstract: The "Cantonese Cuisine Master" project is an important policy proposed by China to inherit Cantonese cuisine culture, promote employment, and achieve targeted poverty reduction and rural revitalization. Confronted with the demand for more diverse education, it is an essential opportunity and task for the education system to consider how to construct high-quality online courses and pursue a higher-quality "Cantonese Cuisine Master" project in line with the new era. Based on instructional media theory and information processing theory, this paper further clarifies the demand, dilemmas, and development strategy of online course construction for culinary majors, and explores its construction and practice through the example of "A Bite of Teochew Cuisine," a Guangdong first-class course.
Abstract: Vocabulary acquisition is an intricate process closely related to memory. In cognitive psychology, a large number of studies on the memory system have been conducted based on information processing theory, placing great value on second language learners' cognitive processes. This study probes into second language vocabulary acquisition from the perspective of information processing theory, in the hope of helping learners acquire vocabulary more scientifically and efficiently.
Abstract: This study explores the capabilities of ChatGPT, specifically in relation to consciousness and its performance in the Turing Test. The article begins by examining the diverse perspectives among cognitive and AI researchers regarding ChatGPT's ability to pass the Turing Test. It introduces a hierarchical categorization of the test versions, suggesting that ChatGPT approaches success in the test, albeit primarily with naïve users; expert users, conversely, can easily identify its limitations. The paper presents various theories of consciousness, with a particular focus on the Integrated Information Theory (IIT) proposed by Tononi, which serves as the framework for assessing ChatGPT's level of consciousness. Through an evaluation based on the five axioms and theorems of IIT, the study finds that ChatGPT surpasses previous AI systems in certain aspects; however, it falls significantly short of achieving consciousness, particularly when compared to biological sentient beings. The paper concludes by emphasizing the importance of recognizing ChatGPT and similar generative AI models as highly advanced and intelligent tools that nevertheless lack the consciousness attributes found in advanced living organisms.
Abstract: In Quantum Information Theory (QIT), the classical measures of information content in probability distributions are replaced by the corresponding resultant entropic descriptors containing the nonclassical terms generated by the state phase or its gradient (electronic current). The classical Shannon (S[p]) and Fisher (I[p]) information terms probe the entropic content of incoherent local events of particle localization, embodied in the probability distribution p, while their nonclassical phase companions, S[Φ] and I[Φ], provide the relevant coherence information supplements. Thermodynamic-like couplings between the entropic and energetic descriptors of molecular states are shown to be precluded by the principles of quantum mechanics. The maximum of resultant entropy determines the phase-equilibrium state, defined by a "thermodynamic" phase related to the electronic density, which can be used to describe reactants in hypothetical stages of a bimolecular chemical reaction. Information channels of molecular systems and their entropic bond indices are summarized, the complete-bridge propagations are examined, and sequential cascades involving the complete sets of atomic-orbital intermediates are interpreted as Markov chains. The QIT description is applied to reactive systems R = A―B, composed of the acidic (A) and basic (B) reactants. The electronegativity equalization processes are investigated, and implications of the concerted patterns of electronic flows in equilibrium states of the complementarily arranged substrates are examined. Quantum communications between reactants are explored, and the QIT descriptors of the A―B bond multiplicity/composition are extracted.
Abstract: In this paper, a new approach to relativistic information entropy is used to assess relative uncertainties in structural reliability assessment. The approach combines information theory with relativity theory, and can be used to measure the relativity of parameter uncertainty and system uncertainty in structural reliability theory against the same generalized relativistic reference system. Structural reliability can therefore be assessed more reasonably with this approach.
Funding: Project supported by the National Key Basic Research and Development Program of China (Grant Nos. 2012CB955902 and 2013CB430204) and the National Natural Science Foundation of China (Grant Nos. 41305059, 41305100, 41275096, and 41105070).
Abstract: Based on nonlinear prediction and information theory, the vertical heterogeneity of predictability and the information loss rate in the geopotential height field are obtained over the Northern Hemisphere. On the seasonal-to-interannual time scale, predictability is low in the lower troposphere and high in the mid-upper troposphere; however, within the mid-upper troposphere over subtropical ocean areas, predictability is relatively poor. These conclusions also hold on the seasonal time scale. On the interannual time scale, predictability becomes high in the lower troposphere and low in the mid-upper troposphere, contrary to the former case. On the whole, the interannual trend is more predictable than the seasonal trend. The average information loss rate is low over the mid-east Pacific, the west of North America, the Atlantic, and Eurasia, while the atmosphere over other places has a relatively high information loss rate on all time scales. Two channels are found steadily over the subtropical Pacific and Atlantic Oceans; there are also unstable channels. The influence of the four seasons on predictability and information communication is studied: predictability is low no matter which season's data are removed, and each season except winter plays an important role in the existence of the channels. Predictability and teleconnections are paramount issues in atmospheric science, and teleconnections may be established through communication channels; this work therefore reveals the vertical structure of the predictability distribution, the channel locations, the contributions of different time scales to them, and their variations across seasons.
Abstract: A normalized measure is established to provide quantitative information about the degree of observability of a discrete-time, stochastically autonomous system. The measure is based on generalized information-theoretic quantities (generalized entropy, mutual information) of the system state and the observations, where the system state can be a discrete or a continuous random vector. Some important properties are presented. For the linear case, an explicit formula for the degree of observability is derived, and the equivalence between the proposed measure and the traditional rank condition is proved. Curves for the degree of observability are depicted in a simple example.
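The linear Gaussian case the abstract mentions rests on a standard identity: for y = Hx + v with x ~ N(0, Σx) and v ~ N(0, R), the mutual information I(x; y) has a closed form in terms of determinants. The sketch below uses that identity with hypothetical matrices (the paper's exact normalization is not given here) to show the information dropping when H is rank-deficient, i.e., when one state direction is unobservable:

```python
import numpy as np

def mutual_information(Sigma_x, H, R):
    # For y = H x + v with x ~ N(0, Sigma_x) and v ~ N(0, R):
    #   I(x; y) = 0.5 * ln( det(H Sigma_x H^T + R) / det(R) )   (in nats)
    Sigma_y = H @ Sigma_x @ H.T + R
    return 0.5 * np.log(np.linalg.det(Sigma_y) / np.linalg.det(R))

Sigma_x = np.eye(2)                    # prior state covariance (assumed)
R = 0.01 * np.eye(2)                   # observation-noise covariance (assumed)

H_full = np.eye(2)                     # full rank: observes both state components
H_rank1 = np.array([[1.0, 0.0],
                    [1.0, 0.0]])       # rank-deficient: second component unseen

mi_full = mutual_information(Sigma_x, H_full, R)    # higher: fully observable
mi_rank1 = mutual_information(Sigma_x, H_rank1, R)  # lower: unobservable direction
```

This is consistent with the abstract's rank-condition equivalence: as the noise shrinks, the mutual information stays bounded along unobservable directions but grows without bound along observable ones.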
Abstract: We consider a quadratic Gaussian distributed lossy source coding setup with the additional constraint of identical reconstructions at the encoder and the decoder. The setup consists of two correlated Gaussian sources, wherein one of them has to be reconstructed to within some distortion constraint and match a corresponding reconstruction at the encoder, while the other source acts as coded side information. We study the trade-off between the rates of the two encoders for a given distortion constraint on the reconstruction. An explicit characterization of this trade-off is the main result of the paper. We also give close inner and outer bounds for the discrete memoryless version of the problem.
Abstract: Market data for financial studies typically derives from either historical transactions or contemporaneous surveys of sentiment and perceptions. The research communities analyzing these two opposing categories of source data see themselves as distinct, each claiming advantages not shared by the other. This research investigates these claims in an information-theoretic context and suggests where methods and controls can be improved. The current research develops a Fisher information metric for Likert scales and explores the effect of particular survey design decisions or results on the information content. The Fisher information metric outperforms earlier metrics by converging reliably to values that are intuitive in the sense that they suggest the information captured from subjects is fairly stable. The results of the analysis suggest that the varying bias and response dispersion inherent in specific surveys may require increases in sample size of several orders of magnitude to compensate for information loss and to derive valid conclusions at a given significance and power of tests. A prioritization of design quality and the factors relevant to survey design are presented in the conclusions, and illustrative examples provide insight and guidance for assessing the information content of a survey.
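One common way to attach Fisher information to a Likert scale is a thresholded latent-normal model: the respondent's true attitude is N(θ, 1), but only the category it falls into is observed. The sketch below uses that model — an assumption for illustration, since the paper's exact metric and design factors are not given here — to show two effects the abstract alludes to: a finer scale carries more information about θ (but never reaches the value 1 of a continuous response), and a poorly centered scale (large |θ|, i.e., response bias pushing answers into an extreme category) loses most of the information.

```python
import math

def phi(x):
    # Standard normal pdf
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def Phi(x):
    # Standard normal cdf
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

def likert_fisher_info(thresholds, theta=0.0):
    # Fisher information about the latent mean theta when a N(theta, 1)
    # response is reported only as the Likert category it falls into:
    #   I(theta) = sum_k (dp_k/dtheta)^2 / p_k
    cuts = [-math.inf] + sorted(thresholds) + [math.inf]
    info = 0.0
    for lo, hi in zip(cuts[:-1], cuts[1:]):
        p = Phi(hi - theta) - Phi(lo - theta)     # category probability
        dp = phi(lo - theta) - phi(hi - theta)    # d p_k / d theta
        if p > 0.0:
            info += dp * dp / p
    return info

i5 = likert_fisher_info([-1.5, -0.5, 0.5, 1.5])             # 5-point scale
i7 = likert_fisher_info([-2.5, -1.5, -0.5, 0.5, 1.5, 2.5])  # 7-point scale
```

Since the 7-point partition refines the 5-point one, i7 ≥ i5 by the data-processing inequality, and both stay below the continuous-response information of 1.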