Abstract
HNN is a class of neural networks that learns Hamiltonian systems from data on the basis of a physical prior. This paper explains, through error analysis, the influence of different integrators, treated as hyperparameters, on HNN. If we define the network target as a mapping that attains zero loss on any training set, then conventional integrators cannot guarantee that HNN possesses a network target. We introduce inverse modified equations and rigorously prove that HNNs based on symplectic integrators possess network targets, and that the difference between the network target and the original Hamiltonian depends on the accuracy order of the integrator. Numerical experiments show that the phase flow of the Hamiltonian system obtained by a symplectic HNN does not exactly preserve the original Hamiltonian, but does preserve the network target; that the loss of the network target on the training and test sets is much smaller than that of the original Hamiltonian; and that on prediction problems, symplectic HNNs exhibit stronger generalization ability and higher accuracy than non-symplectic HNNs. Thus, symplectic integrators are of critical importance for HNN.
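The phenomenon described above can be illustrated on a toy problem without any neural network. The following sketch (not from the paper; the harmonic oscillator and the specific modified Hamiltonian `H_modified` are assumptions for illustration) applies the symplectic Euler method to H(q, p) = (p² + q²)/2. The scheme does not conserve H itself, but for this linear system it exactly conserves the O(h)-close perturbed Hamiltonian H̃(q, p) = (p² + q² − hqp)/2, which plays the role of the network target: the quantity the trained symplectic HNN actually preserves instead of H.

```python
# Symplectic Euler on the harmonic oscillator H(q, p) = (p^2 + q^2)/2.
# The original H drifts/oscillates by O(h) along the numerical trajectory,
# while the modified Hamiltonian Htilde = (p^2 + q^2 - h*q*p)/2 is
# conserved to round-off -- the analogue of the "network target".

def symplectic_euler_step(q, p, h):
    """One step of symplectic Euler for H = (p^2 + q^2)/2."""
    p_new = p - h * q      # dH/dq = q; the usually implicit update is explicit here
    q_new = q + h * p_new  # dH/dp = p, evaluated at the updated momentum
    return q_new, p_new

def H(q, p):
    return 0.5 * (p * p + q * q)

def H_modified(q, p, h):
    # Exactly conserved quantity of the scheme for this linear system.
    return 0.5 * (p * p + q * q - h * q * p)

h = 0.1
q, p = 1.0, 0.0
H0, Ht0 = H(q, p), H_modified(q, p, h)
max_drift_H, max_drift_Ht = 0.0, 0.0
for _ in range(1000):
    q, p = symplectic_euler_step(q, p, h)
    max_drift_H = max(max_drift_H, abs(H(q, p) - H0))
    max_drift_Ht = max(max_drift_Ht, abs(H_modified(q, p, h) - Ht0))

print(max_drift_H)   # O(h) deviation of the original Hamiltonian
print(max_drift_Ht)  # round-off only: the modified Hamiltonian is preserved
```

Training a symplectic HNN on exact trajectory data reverses this picture: the data come from the flow of H, so the network converges to the inverse-modified Hamiltonian whose numerical flow reproduces the data, and that target, not H, is what the learned system conserves.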
Authors
Zhu Aiqing, Jin Pengzhan, Tang Yifa (LSEC, Institute of Computational Mathematics and Scientific/Engineering Computing, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190, China; School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing 100049, China)
Source
Mathematica Numerica Sinica (《计算数学》)
Indexed in CSCD and the Peking University Core Journal list (北大核心)
2020, No. 3, pp. 370-384 (15 pages)
Funding
Major Project on "New Generation Artificial Intelligence" of the Ministry of Science and Technology of China (2018AAA0101002);
National Natural Science Foundation of China (11771438).
Keywords
Neural networks
HNN
Network target
Inverse modified equations
Symplectic integrator
Error analysis