Abstract
The role of the activation function in a convolutional neural network is to activate the features of neurons and then retain and map them; it is the key to an artificial neural network's ability to simulate human brain mechanisms and solve nonlinear problems. To address the oscillation, non-convergence, and even over-fitting observed in traditional convolutional neural networks, this paper optimises the ReLU activation function and proposes a new rectified activation function, called the ReLU threshold function. Training on the caltech101 and caltech256 datasets shows that its image-classification performance is better than that of the ReLU function: with the Alexnet network model, classification accuracy on caltech101 improves from 97.7% to 99.3%, and on caltech256 from 65.4% to 92.3%.
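This record gives only the abstract, so the exact definition of the proposed ReLU threshold function is not available here. As a point of reference, the following is a minimal sketch of standard ReLU alongside a hypothetical thresholded variant; the function name `relu_threshold` and the threshold parameter `theta` are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def relu(x):
    """Standard ReLU: passes positive inputs through, zeroes out the rest."""
    return np.maximum(0.0, x)

def relu_threshold(x, theta=1.0):
    """Hypothetical thresholded-ReLU sketch (theta is an assumed
    parameter; the paper's actual formulation is not given in this
    abstract): activate only inputs that exceed the threshold."""
    return np.where(x > theta, x, 0.0)

x = np.array([-2.0, 0.5, 1.5, 3.0])
print(relu(x))                 # [0.  0.5 1.5 3. ]
print(relu_threshold(x, 1.0))  # [0.  0.  1.5 3. ]
```

Suppressing small positive activations in this way is one common approach to reducing noise-driven oscillation during training, which is consistent with (but not confirmed by) the motivation stated in the abstract.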
Authors
LIU Xiaowen, GUO Dabo, LI Cong (College of Physics and Electronic Engineering, Shanxi University, Taiyuan 030006, China)
Source
Journal of Test and Measurement Technology (《测试技术学报》), 2019, No. 2, pp. 121-125 (5 pages)
Funding
Shanxi Province Basic Research Program (201601D102033, 201801D121118)