Abstract
To address the problem that Rectified Linear Units (ReLUs) limit the expressive ability of deep learning models, while the smooth Softplus function lacks sparse representation ability, a method using a rectified nonlinear unit as the neuron activation is proposed, combining the sparse representation of ReLUs with the smoothness of Softplus. The performance of different activation functions is analyzed, and image classification experiments are conducted on the MNIST and CIFAR-10 benchmark databases using convolutional neural networks with different activation functions. Experimental results show that the rectified nonlinear activation function not only accelerates network convergence but also improves recognition accuracy, and that these gains do not depend on the choice of pooling method.
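The trade-off described above can be made concrete with a short sketch. A minimal NumPy illustration of the two standard activations and one plausible way to combine them, assuming (since the abstract does not give the exact formula) that the proposed unit rectifies a zero-shifted Softplus; `rectified_softplus` is a hypothetical name for illustration, not necessarily the paper's definition:

```python
import numpy as np

def relu(x):
    # ReLU: outputs exactly zero for x <= 0 (sparse),
    # but is non-smooth at x = 0.
    return np.maximum(0.0, x)

def softplus(x):
    # Softplus: smooth everywhere, log(1 + e^x),
    # but never exactly zero, so no sparse representation.
    return np.log1p(np.exp(x))

def rectified_softplus(x):
    # Hypothetical combination (an assumption, not the paper's
    # confirmed formula): shift Softplus down by log(2) so it
    # crosses zero at x = 0, then rectify. This keeps Softplus'
    # smooth positive part while restoring ReLU-style hard zeros.
    return np.maximum(0.0, np.log1p(np.exp(x)) - np.log(2.0))
```

For large negative inputs the shifted Softplus is negative, so the rectification clamps it to exactly zero, giving the sparsity that plain Softplus lacks.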
Source
《科学技术与工程》
PKU Core Journal (北大核心)
2015, No. 34, pp. 221-225 (5 pages)
Science Technology and Engineering
Keywords
deep learning; convolutional neural networks; pattern recognition; rectified nonlinear units function