
Improvement of Adam Algorithm Based on Convolution Network (Cited by: 1)
Abstract: As a common optimization algorithm for convolutional neural networks, the Adam algorithm has the advantage of fast convergence, but it often exhibits extreme learning rates in the late stage of training and may even fail to converge. To address this, the Yadamod algorithm is proposed: an activation function is added to the second-order momentum term of Adam, and the learning rate is smoothed with an exponentially weighted average and constrained with dynamic bounds, which resolves the extreme learning rate problem. A convergence analysis of Yadamod in the stochastic nonconvex setting is carried out using the stochastic first-order (SFO) complexity framework. The effectiveness of Yadamod is verified with the ResNet-34 and ResNet-50 convolutional neural networks on the CIFAR-10 and CIFAR-100 datasets, respectively. The experimental results show that the algorithm outperforms Adam in both stability and optimization performance.
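The abstract names the ingredients of Yadamod (an activation function on the second-order momentum term, an exponentially weighted average of the learning rate, and dynamic bounds) without giving formulas. The Python sketch below shows one way these three modifications to Adam could fit together, assuming a softplus activation on the second-moment denominator, AdaBound-style dynamic bounds, and AdaMod-style smoothing of the per-parameter step size. The function names and hyperparameter values are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def init_state(param):
    """Zero-initialized optimizer state for one parameter tensor."""
    return {'t': 0,
            'm': np.zeros_like(param),   # first-moment estimate
            'v': np.zeros_like(param),   # second-moment estimate
            's': np.zeros_like(param)}   # smoothed step size

def yadamod_step(param, grad, state, lr=1e-3, betas=(0.9, 0.999, 0.999),
                 eps=1e-8, final_lr=0.1, gamma=1e-3):
    """One hypothetical Yadamod-style update (sketch, not the paper's formulas)."""
    beta1, beta2, beta3 = betas
    state['t'] += 1
    t = state['t']

    # Standard Adam moment estimates with bias correction.
    state['m'] = beta1 * state['m'] + (1 - beta1) * grad
    state['v'] = beta2 * state['v'] + (1 - beta2) * grad ** 2
    m_hat = state['m'] / (1 - beta1 ** t)
    v_hat = state['v'] / (1 - beta2 ** t)

    # Assumed activation on the second-moment term: softplus keeps the
    # denominator smooth and bounded below by log(2), capping the step size.
    denom = np.logaddexp(0.0, np.sqrt(v_hat))  # softplus(sqrt(v_hat))

    # Dynamic bounds (AdaBound-style): clip the per-parameter step size to
    # an interval that narrows toward final_lr as training proceeds.
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))
    step_size = np.clip(lr / (denom + eps), lower, upper)

    # Exponentially weighted average of the step size (AdaMod-style); the
    # element-wise minimum suppresses sudden spikes, i.e. extreme rates.
    state['s'] = beta3 * state['s'] + (1 - beta3) * step_size
    step_size = np.minimum(step_size, state['s'])

    return param - step_size * m_hat

# Toy usage: minimize f(w) = 0.5 * ||w||^2, whose gradient is w itself.
w = np.array([1.0, -2.0, 3.0])
state = init_state(w)
for _ in range(2000):
    w = yadamod_step(w, grad=w, state=state, lr=0.1)
print(w)  # drifts toward the minimum at 0
```

Note the ordering: the smoothing term starts at zero, so early steps are deliberately damped, and the clip-then-minimum combination means the effective rate can never exceed either the dynamic upper bound or its own running average, which is one plausible reading of how the abstract's "dynamic boundary" and "exponentially weighted average" jointly rule out extreme learning rates.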
Authors: DONG Wenjing, ZHAO Yueai (School of Mathematics and Statistics, Taiyuan Normal University, Jinzhong, Shanxi 030619, China)
Source: Journal of Taiyuan Normal University (Natural Science Edition), 2023, No. 3, pp. 5-12 (8 pages)
Funding: National Social Science Fund of China project (ZOBJL080); Shanxi "1331 Project" platform project (PT201818); Shanxi Key R&D Program project (201803D121088)
Keywords: activation function; Adam algorithm; exponentially weighted average; convergence analysis