
A Comparative Study on the Performance of Three Kinds of Activation Functions in Deep Learning (Cited by: 3)
Abstract: Taking as research objects the Sigmoid function, the hyperbolic tangent (Tanh) function, and the Rectified Linear Unit (ReLU) function, which are widely used in current neural network research, this study compares the performance of the three activation functions for a given network topology using Playground, the neural-network visualization platform in TensorFlow, the machine learning framework released by Google. The classification results of neural networks based on the three activation functions are observed while adjusting the learning rate and the network depth, respectively. By analyzing the experimental results, the application scenarios of the three activation functions are discussed, and suggestions are given for selecting activation functions in deep neural networks.
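The three activation functions compared in the abstract have standard closed-form definitions. As a minimal illustrative sketch (not code from the paper itself, which uses the interactive TensorFlow Playground), they can be written in NumPy as:

```python
import numpy as np

def sigmoid(x):
    """Sigmoid: squashes inputs into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    """Hyperbolic tangent: squashes inputs into (-1, 1), zero-centered."""
    return np.tanh(x)

def relu(x):
    """Rectified Linear Unit: passes positive inputs, zeroes out negatives."""
    return np.maximum(0.0, x)

# Compare output ranges over a sample interval, echoing the paper's
# observation that the functions differ in saturation behavior.
x = np.linspace(-5.0, 5.0, 101)
for name, f in [("Sigmoid", sigmoid), ("Tanh", tanh), ("ReLU", relu)]:
    y = f(x)
    print(f"{name}: output range [{y.min():.2f}, {y.max():.2f}]")
```

Sigmoid and Tanh saturate for large |x| (gradients vanish), while ReLU keeps a constant gradient of 1 for positive inputs, which is one reason the paper's choice of activation interacts with network depth and learning rate.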
Authors: ZHOU Chang; MI Hongjuan (Department of Information Engineering, Lanzhou University of Finance and Economics, Lanzhou 730020)
Source: Journal of Beijing Electronic Science and Technology Institute, 2017, No. 4, pp. 27-32 (6 pages)
Keywords: activation function; deep learning; Sigmoid; Tanh; ReLU