Abstract
This paper proposes an adaptive learning rate optimization algorithm that automatically adjusts the learning rate according to changes in the loss function, scaling the adjustment by the magnitude of that change. A tanh function bounds the update, so the learning rate does not change drastically even when the loss fluctuates sharply. In the later stage of convergence, the learning rate is held within a fixed range to prevent the slow convergence or poor accuracy that late-stage learning-rate changes can cause. Experiments on the Boston house price prediction and MNIST handwritten digit recognition datasets, comparing the proposed algorithm with classical gradient descent optimizers, verify its feasibility and efficiency.
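The abstract does not give the update rule in closed form. As a rough illustration only, the sketch below assumes a multiplicative update whose step is bounded by tanh, plus a freeze threshold for the late stage of convergence; the names (`adapt_lr`, `base_rate`, `freeze_eps`) and all parameter values are invented for this sketch, not taken from the paper.

```python
import math

def adapt_lr(lr, prev_loss, loss,
             base_rate=0.1, freeze_eps=1e-3,
             lr_min=1e-4, lr_max=1.0):
    """Illustrative adaptive learning-rate update (not the paper's exact rule).

    The adjustment grows with the loss change, but tanh saturates at +/-1,
    so even a huge loss swing changes lr by at most a factor of
    (1 + base_rate). Once the loss change is tiny (late convergence),
    lr is frozen at its current value.
    """
    delta = prev_loss - loss              # > 0 means the loss decreased
    if abs(delta) < freeze_eps:           # late stage: keep lr fixed
        return lr
    factor = 1.0 + base_rate * math.tanh(delta)
    return min(max(lr * factor, lr_min), lr_max)

# Usage: plain gradient descent on f(x) = x^2 with the adaptive rate.
x, lr, prev = 5.0, 0.1, float("inf")
for _ in range(50):
    loss = x * x
    lr = adapt_lr(lr, prev, loss)
    x -= lr * 2 * x                       # gradient of x^2 is 2x
    prev = loss
```

The key design point the abstract emphasizes is the saturation: because `tanh` is bounded, a single bad minibatch cannot blow up the learning rate, and the freeze threshold keeps the rate stable once the loss has flattened.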
Authors
SONG Mei-jia, JIA He-ming, LIN Zhi-xing, LU Ren-sheng, LIU Qing-xin
(Center of Network, Sanming University, Sanming 365004, China; School of Information Engineering, Sanming University, Sanming 365004, China; School of Computer Science and Technology, Hainan University, Haikou 570228, China)
Source
Journal of Sanming University (《三明学院学报》), 2021, No. 6, pp. 36-44 (9 pages)
Funding
Natural Science Foundation of Fujian Province (2021J011128)
Open Project of the Fujian E-commerce Engineering Center (KBX2109)
National Undergraduate Innovation and Entrepreneurship Training Program (202111311022X, 202111311014)
Keywords
learning rate
gradient descent
optimization algorithm
self-adaptation