
Image Classification Algorithm Based on Wavelet Pooling
Abstract: This paper proposes a wavelet pooling algorithm based on two-level wavelet decomposition. The algorithm abandons nearest-neighbor interpolation in favor of an organic, subband-based approach that represents feature content more accurately and with fewer artifacts. The proposed pooling method is compared with max pooling, average pooling, mixed pooling, and stochastic pooling on the benchmark image datasets MNIST and CIFAR-10, and is shown to produce equal or better results, demonstrating its effectiveness.
Authors: CHE Da-qing; LV Jian-qiu (South China Agricultural University, College of Mathematics and Informatics, Guangzhou 510642, China; South China Agricultural University, Institute of Innovation Methods, Guangzhou 510642, China; Guangdong Institute of Science and Technology Management and Planning, Guangzhou 510642, China)
Source: Techniques of Automation and Applications (《自动化技术与应用》), 2022, No. 7, pp. 98-100, 162 (4 pages)
Keywords: image classification; wavelet pooling; algorithm improvement
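The abstract describes pooling by a two-level wavelet decomposition: the first-level detail subbands are discarded, and the feature map is reconstructed from the second-level coefficients alone, halving each spatial dimension. A minimal NumPy sketch of this kind of scheme using the orthonormal Haar wavelet (the function names and the choice of Haar are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def haar_dwt2(x):
    """One level of the 2-D orthonormal Haar transform.
    Returns the approximation cA and the detail subbands (cH, cV, cD)."""
    a = x[0::2, 0::2]; b = x[0::2, 1::2]
    c = x[1::2, 0::2]; d = x[1::2, 1::2]
    cA = (a + b + c + d) / 2.0   # approximation
    cH = (a + b - c - d) / 2.0   # horizontal detail
    cV = (a - b + c - d) / 2.0   # vertical detail
    cD = (a - b - c + d) / 2.0   # diagonal detail
    return cA, (cH, cV, cD)

def haar_idwt2(cA, details):
    """Inverse of haar_dwt2 (perfect reconstruction)."""
    cH, cV, cD = details
    h, w = cA.shape
    out = np.empty((2 * h, 2 * w), dtype=cA.dtype)
    out[0::2, 0::2] = (cA + cH + cV + cD) / 2.0
    out[0::2, 1::2] = (cA + cH - cV - cD) / 2.0
    out[1::2, 0::2] = (cA - cH + cV - cD) / 2.0
    out[1::2, 1::2] = (cA - cH - cV + cD) / 2.0
    return out

def wavelet_pool(x):
    """Wavelet pooling of one feature map: decompose to level 2,
    discard the first-level detail subbands, and reconstruct a single
    synthesis level from the second-level subbands, so the output has
    half the spatial resolution of the input."""
    cA1, _ = haar_dwt2(x)       # level 1; detail subbands discarded
    cA2, d2 = haar_dwt2(cA1)    # level 2
    return haar_idwt2(cA2, d2)  # back up one level: half resolution
```

For example, an 8x8 feature map pools to 4x4. Unlike max pooling, the output is driven by the wavelet subband coefficients rather than a local extremum, which is the source of the smoother, lower-artifact behavior the abstract claims.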
