Funding: Supported by the National Natural Science Foundation of China (Grant No. 12075197), the Fundamental Research Fund for the Central Universities (Grant No. 20720210024), and the Natural Science Foundation of Fujian Province (Grant No. 2023J01006).
Abstract: A thermodynamic formalism describing the efficiency of information learning is proposed, which is applicable to stochastic thermodynamic systems with multiple internal degrees of freedom. The learning rate, entropy production rate, and entropy flow from the system to the environment under coarse-grained dynamics are derived. The Cauchy–Schwarz inequality is applied to establish a lower bound on the entropy production rate of an internal state. This bound is tighter than the Clausius inequality, which leads to an upper bound on the efficiency of learning. The results are verified in cellular networks with information processing.
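As background, the quantities named in the abstract obey a standard relation in bipartite information thermodynamics, sketched below in LaTeX. This is a minimal sketch under the usual bipartite assumptions, not the paper's derivation; the symbols $l_y$, $\sigma_y$, and $\eta$ are illustrative notation and may differ from the paper's.

% Background sketch only (standard bipartite information thermodynamics),
% not the paper's specific derivation; notation chosen for illustration.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

Consider a bipartite Markov system with an external process $x$ and an
internal (sensor) state $y$. The learning rate $l_y$ is the rate at which
transitions of $y$ change the mutual information $I(X;Y)$:
\begin{equation}
  l_y \;=\; \left.\frac{\mathrm{d}I(X;Y)}{\mathrm{d}t}\right|_{y\text{-transitions}} .
\end{equation}
The second law for the internal subsystem bounds its entropy production rate
$\sigma_y$ from below by the learning rate,
\begin{equation}
  \sigma_y \;\ge\; l_y ,
  \qquad
  \eta \;\equiv\; \frac{l_y}{\sigma_y} \;\le\; 1 \quad (l_y > 0) ,
\end{equation}
so the efficiency of learning $\eta$ cannot exceed unity. Any tighter lower
bound on $\sigma_y$, such as one obtained from the Cauchy--Schwarz inequality,
correspondingly sharpens the upper bound on $\eta$.

\end{document}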