Abstract
The development of second-order optimization algorithms for the loss functions of artificial neural networks is surveyed, and the KSD (Krylov Subspace Descent) optimization algorithm for neural network loss functions is improved. In place of the fixed Krylov subspace dimension used in KSD, the MKSD (Modified KSD) algorithm is proposed, in which the subspace dimension adapts to intermediate computational results. Numerical examples are given in which fully connected neural networks for different problems are trained with the MKSD, KSD, and SGD (Stochastic Gradient Descent) algorithms. The numerical results show that MKSD has certain advantages over the other methods.
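To illustrate the idea behind Krylov subspace descent, the following is a minimal sketch, not the paper's exact method: it builds an orthonormal basis of the Krylov subspace span{g, Hg, ..., H^(k-1)g} from gradient and Hessian-vector products, then steps along the negative gradient projected into that subspace. The function names, the fixed step size, and the demo quadratic loss are all illustrative assumptions.

```python
import numpy as np

def krylov_subspace_step(grad_fn, hvp_fn, w, k=4, step=0.1):
    """One illustrative Krylov-subspace descent step.

    Builds an orthonormal basis of span{g, Hg, ..., H^(k-1)g}
    by Lanczos-style iteration with Gram-Schmidt, then moves
    along the negative gradient projected onto that subspace.
    (The paper minimizes the loss over the subspace; the fixed
    step size here is a simplification.)
    """
    g = grad_fn(w)
    V = [g / np.linalg.norm(g)]
    for _ in range(k - 1):
        v = hvp_fn(w, V[-1])            # Hessian-vector product
        for u in V:                      # Gram-Schmidt orthogonalization
            v = v - (u @ v) * u
        n = np.linalg.norm(v)
        if n < 1e-12:                    # subspace exhausted: stop early
            break
        V.append(v / n)
    V = np.stack(V, axis=1)              # d x m orthonormal basis matrix
    alpha = -step * (V.T @ g)            # coefficients in the subspace
    return w + V @ alpha

# Demo on a quadratic loss f(w) = 0.5 w^T A w - b^T w,
# whose gradient is A w - b and Hessian is A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda w: A @ w - b
hvp = lambda w, v: A @ v

w = np.zeros(2)
for _ in range(50):
    w = krylov_subspace_step(grad, hvp, w, k=2)
print(np.linalg.norm(A @ w - b))        # residual shrinks toward 0
```

An adaptive variant in the spirit of MKSD would vary `k` between iterations based on the computed results (for example, the early-termination check above) rather than keeping it fixed.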
Authors
ZHANG Zhenyu (张振宇)
LIN Muyang (林沐阳)
ZHANG Zhenyu, LIN Muyang (School of Mathematics, Shanghai University of Finance and Economics, Shanghai 200433; Shanghai University of Finance and Economics Zhejiang College, Jinhua 321013)
Source
《工程数学学报》
CSCD
Peking University Core Journal (北大核心)
2022, No. 5, pp. 681-694 (14 pages)
Chinese Journal of Engineering Mathematics
Funding
National Natural Science Foundation of China (11671246).