Journal Article

Learning performance of convolutional neural networks with different pooling models
(不同池化模型的卷积神经网络学习性能研究)

Cited by: 81
Abstract  Objective: Deep learning algorithms based on convolutional neural networks are attracting wide attention in the field of image processing. To further improve the accuracy of feature extraction, speed up parameter convergence, and optimize the learning performance of the network, an improved dynamic adaptive pooling algorithm is proposed by comparing the effect of different pooling models on learning performance. Method: A convolutional neural network model is constructed and trained with different pooling models, and the results of the trained model are verified at different numbers of iterations. To compensate for the low accuracy and slow convergence of existing algorithms, a new dynamic adaptive pooling model is constructed from the networks trained with the different pooling models, and its effect on recognition accuracy and convergence rate at different numbers of iterations is studied. Result: Comparative experiments show that the convolutional neural network using the dynamic adaptive pooling algorithm has the best learning performance: on the handwritten digit dataset the convergence rate improves by up to 18.55%, and the image misrecognition rate is reduced by up to 20%. Conclusion: The dynamic adaptive pooling algorithm not only makes feature extraction in the convolutional neural network more precise but also greatly improves the convergence rate and model accuracy, thereby optimizing the learning performance of the network. The model can be further extended to other deep learning algorithms related to convolutional neural networks.
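The sketch below is only an illustration of the comparison described in the Method part of the abstract: the same small network is built with different pooling layers so their accuracy and convergence speed can be compared across iterations. It assumes PyTorch and 28x28 grayscale digit input (MNIST-style); the MixedPool2d blend is a hypothetical stand-in for an "adaptive" pooling layer, not the paper's dynamic adaptive pooling formula, which the abstract does not specify.

# Minimal PyTorch sketch; MixedPool2d is an assumed illustration, not the paper's method.
import torch
import torch.nn as nn


class MixedPool2d(nn.Module):
    """Blend max and average pooling with one learnable weight (hypothetical example)."""

    def __init__(self, kernel_size=2, stride=2):
        super().__init__()
        self.max_pool = nn.MaxPool2d(kernel_size, stride)
        self.avg_pool = nn.AvgPool2d(kernel_size, stride)
        self.mix = nn.Parameter(torch.zeros(1))  # sigmoid(0) = 0.5: start as an even blend

    def forward(self, x):
        a = torch.sigmoid(self.mix)
        return a * self.max_pool(x) + (1.0 - a) * self.avg_pool(x)


def make_cnn(pool_factory):
    """LeNet-style network for 1x28x28 digit images; only the pooling layers differ."""
    return nn.Sequential(
        nn.Conv2d(1, 6, kernel_size=5, padding=2), nn.ReLU(), pool_factory(),  # -> 6x14x14
        nn.Conv2d(6, 16, kernel_size=5), nn.ReLU(), pool_factory(),            # -> 16x5x5
        nn.Flatten(), nn.Linear(16 * 5 * 5, 10),
    )


# Same architecture, three pooling choices; each would be trained identically on the
# digit dataset, recording error rate and iterations to convergence for comparison.
models = {
    "max":   make_cnn(lambda: nn.MaxPool2d(2, 2)),
    "avg":   make_cnn(lambda: nn.AvgPool2d(2, 2)),
    "mixed": make_cnn(lambda: MixedPool2d(2, 2)),
}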
Source: Journal of Image and Graphics (《中国图象图形学报》), CSCD, Peking University Core Journal, 2016, No. 9, pp. 1178-1190 (13 pages)
Funding: National Natural Science Foundation of China (61172144); General Project of Scientific and Technological Research of the Liaoning Provincial Department of Education (L2015216)
Keywords: deep learning; convolutional neural network; image recognition; feature extraction; algorithm convergence; dynamic adaptive pooling
