Abstract
Does non-transitivity in information theory have an analog in thermodynamics? A non-transitive game, “Swap”, is used as a toy thermodynamic model to explore concepts such as temperature, heat flow, equilibrium and entropy. These concepts, found to be inadequate for non-transitive thermodynamics, need to be generalized. Two kinds of temperature, statistical and kinetic, are distinguished. Statistical temperature is a parameter in statistical distributions. Kinetic temperature is proportional to the expected kinetic energy under a given distribution. The two are identical for Maxwell-Boltzmann statistics but differ for non-Maxwellian statistics when a force is present. Fourier’s law of conduction and entropy should be expressed using statistical temperature, not kinetic temperature. Kinetic temperature is always scalar, but statistical temperature and statistical entropy in non-transitive systems have circulation, thereby allowing continuous, circular heat flow. Entropy is relative to the underlying statistics, in analogy to the Kullback-Leibler divergence in information theory. The H-theorem, limited by its assumptions of homogeneity and indistinguishability, covers only statistically homogeneous systems. It does not cover non-transitive, statistically heterogeneous systems that combine different distributions such as Maxwell-Boltzmann, biased half-Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein. The second law can be preserved if it is generalized and expressed in terms of statistical temperature and statistical entropy.
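As a concrete illustration of what non-transitivity means here, the following Python sketch uses the classic non-transitive dice, a standard textbook example rather than the “Swap” game itself, whose rules are not reproduced in this abstract: each die beats the next with probability greater than 1/2, so the “beats” relation forms a cycle rather than an ordering.

```python
import itertools

# Classic non-transitive dice: a generic illustration of
# non-transitivity, NOT the paper's "Swap" game.
DICE = {
    "A": [2, 2, 4, 4, 9, 9],
    "B": [1, 1, 6, 6, 8, 8],
    "C": [3, 3, 5, 5, 7, 7],
}

def win_probability(x: str, y: str) -> float:
    """Probability that die x rolls strictly higher than die y."""
    wins = sum(a > b for a, b in itertools.product(DICE[x], DICE[y]))
    return wins / (len(DICE[x]) * len(DICE[y]))

for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
    print(f"P({x} beats {y}) = {win_probability(x, y):.3f}")
# Each probability is 5/9 ≈ 0.556: A beats B, B beats C, and C
# beats A, so no die is "best" -- the relation is non-transitive.
```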
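For reference, the two standard relations the abstract alludes to, stated here for a monatomic gas and a discrete distribution; these are background facts, not results of the paper. Under Maxwell-Boltzmann statistics the kinetic temperature is fixed by the mean kinetic energy,

\[
\tfrac{3}{2} k_B T_{\mathrm{kin}} = \left\langle \tfrac{1}{2} m v^2 \right\rangle = \int \tfrac{1}{2} m v^2 \, f_{\mathrm{MB}}(\mathbf{v}) \, d^3 v ,
\]

while the Kullback-Leibler divergence measures information relative to a reference distribution \(q\),

\[
D_{\mathrm{KL}}(p \,\|\, q) = \sum_i p_i \ln \frac{p_i}{q_i} .
\]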