
A dynamic method of constructing deep belief network models
Abstract: Deep belief networks lack a principled way to determine the depth of the network and the number of hidden units in each layer. To address this, a dynamic method of constructing deep belief network models is proposed, based on the idea of the greedy algorithm. While the network is built layer by layer from the bottom, the number of hidden units in the current layer is adjusted according to the misclassification rate on a validation set. Once the current model is optimal, the number of hidden units in that layer is fixed and the depth of the network is increased by one layer; the number of hidden units in the next layer is then adjusted in the same way until the whole model is complete. Finally, the number of hidden units in each layer is fine-tuned according to the reconstruction error. Experimental results show that a deep belief network constructed with this method achieves higher classification accuracy than one constructed from the reconstruction error alone.
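The greedy layer-by-layer construction described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the RBM training itself is stubbed out, and `evaluate` stands in for "train the stack with this layout and return its validation misclassification rate", which the paper obtains from a real validation set. All names and the `max_depth` cap are illustrative assumptions.

```python
def grow_dbn(candidate_units, evaluate, max_depth=3):
    """Greedily choose hidden-unit counts one layer at a time.

    For each new layer, try every candidate width, keep the one with the
    lowest validation error, then freeze that layer and move to the next.
    Stop adding layers once a new layer no longer improves the error.
    """
    layout = []                      # frozen unit counts, bottom to top
    best_err = float("inf")
    for _ in range(max_depth):
        # Try each candidate width for the current (topmost) layer.
        trial_err, trial_units = min(
            (evaluate(layout + [n]), n) for n in candidate_units
        )
        if trial_err >= best_err:    # a deeper model no longer helps
            break
        layout.append(trial_units)   # fix this layer's width
        best_err = trial_err
    return layout, best_err
```

For example, with a toy error function whose optimum is a total of 250 units, the procedure settles on a two-layer layout and then stops, since a third layer cannot improve the error further. The paper's final reconstruction-error fine-tuning step would adjust each frozen width slightly afterwards; it is omitted here for brevity.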
Authors: WU Qiang; YANG Xiaobing (College of Information Engineering, China Jiliang University, Hangzhou 310018, China)
Source: Journal of China University of Metrology, 2018, No. 1, pp. 64-70.
Keywords: dynamic construction; deep belief network; model depth; number of hidden units

