
Convolutional neural network based on PReLUs-Softplus nonlinear excitation function (cited by: 13)
Abstract: Aiming at the problem that the expression ability and recognition effect of a convolutional neural network (CNN) are affected by the excitation function of the convolutional layer, a new nonlinear excitation function, PReLUs-Softplus, was proposed and applied to the convolutional layer of the network. Contrast experiments on image recognition were performed on the MNIST and CIFAR-10 standard databases with both the new network and networks using traditional excitation functions. The results show that, compared with networks using traditional excitation functions, the convolutional neural network with the PReLUs-Softplus excitation function converges faster in image recognition under different pooling methods and effectively reduces the recognition error rate.
Authors: 郜丽鹏 (Gao Lipeng), 郑辉 (Zheng Hui)
Source: Journal of Shenyang University of Technology (《沈阳工业大学学报》), 2018, No. 1, pp. 54-59. Indexed in EI, CAS, and the Peking University Core Journals list.
Funding: National Natural Science Foundation of China (Grant No. 61571146)
Keywords: deep learning; convolutional neural network; excitation function; pattern recognition; nonlinear mapping; pooling; network structure; image recognition
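The abstract names the two ingredients of the proposed activation, PReLU and Softplus, but does not give its functional form. Below is a minimal sketch of one plausible combination, using PReLU's leaky slope on the negative side and a Softplus-style smooth curve (shifted to pass through the origin) on the positive side. The piecewise split, the constant shift, and the slope value `a` are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def prelu(x, a=0.25):
    # Parametric ReLU: identity for x >= 0, learnable slope a for x < 0
    return np.where(x >= 0, x, a * x)

def softplus(x):
    # Smooth approximation of ReLU: log(1 + e^x)
    return np.log1p(np.exp(x))

def prelu_softplus(x, a=0.25):
    # Hypothetical PReLUs-Softplus combination (illustrative only):
    # Softplus minus log(2) on the positive side, so the curve passes
    # through the origin, and PReLU's leaky slope on the negative side.
    # The paper's exact formulation may differ.
    return np.where(x >= 0, softplus(x) - np.log(2.0), a * x)
```

Unlike plain ReLU, this kind of activation keeps a small nonzero gradient for negative inputs (avoiding "dead" units) while the Softplus branch stays smooth, which is the sort of property the abstract credits for faster convergence.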