Research on RBM Networks Training Based on Improved Parallel Tempering Algorithm (Cited by: 7)

Abstract: Current training algorithms for restricted Boltzmann machines (RBMs) are mainly sampling-based, typically multi-step Gibbs sampling. When a sampling algorithm is used to compute the gradient, the sampling gradient is only an approximation of the true gradient, and the considerable error between the two seriously degrades network training. To address this problem, this paper first analyzes the numerical error and the direction error between the sampling gradient and the true gradient, as well as their influence on training performance. These problems are then analyzed theoretically from the perspective of Markov sampling, and a gradient-fixing model is established that adjusts both the magnitude and the direction of the sampling gradient. On this basis, a training algorithm based on improved parallel tempering, the GFPT (gradient fixing parallel tempering) algorithm, is proposed. Finally, comparative experiments between GFPT and existing algorithms are presented. Simulation results demonstrate that GFPT greatly reduces the error between the sampling gradient and the true gradient and substantially improves the training performance of RBM networks.
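
For context, the "true gradient" the abstract refers to is the standard RBM log-likelihood gradient, and the "sampling gradient" is its Monte Carlo approximation. A LaTeX sketch of the two, in standard textbook notation assumed here for illustration rather than taken from the paper:

\frac{\partial \ln P(v)}{\partial w_{ij}}
  = \langle v_i h_j \rangle_{\mathrm{data}} - \langle v_i h_j \rangle_{\mathrm{model}},
\qquad
\widehat{\nabla} w_{ij}
  = \langle v_i h_j \rangle_{\mathrm{data}}
  - \frac{1}{K} \sum_{k=1}^{K} v_i^{(k)} h_j^{(k)}

Here the (v^{(k)}, h^{(k)}) are states drawn from a Markov chain. Because the chain only approximates the model distribution, \widehat{\nabla} deviates from the true gradient both in magnitude (the numerical error) and in angle (the direction error), which is what the paper's gradient-fixing model adjusts.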
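The parallel tempering that GFPT improves on runs one Gibbs chain per inverse temperature and occasionally swaps neighbouring chains with a Metropolis test, which helps the beta = 1 chain mix. Below is a minimal NumPy sketch of that baseline sampler, assuming a binary RBM with energy E(v, h) = -v^T W h - b^T v - c^T h; all parameter names and the temperature ladder are assumptions, and the paper's gradient-fixing step is not reproduced here.

# Minimal, illustrative parallel-tempering Gibbs sampler for a binary RBM.
# NOT the paper's GFPT implementation; energy form and names are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def energy(v, h, W, b, c):
    # RBM energy: E(v, h) = -v^T W h - b^T v - c^T h
    return -(v @ W @ h + b @ v + c @ h)

def gibbs_step(v, W, b, c, beta):
    # One Gibbs sweep at inverse temperature beta: sample h | v, then v | h.
    h = (rng.random(c.size) < sigmoid(beta * (c + v @ W))).astype(float)
    v = (rng.random(b.size) < sigmoid(beta * (b + W @ h))).astype(float)
    return v, h

def pt_sample(W, b, c, betas, n_steps=50):
    # One (v, h) chain per temperature, initialized from random visible states.
    n_v = b.size
    states = [gibbs_step(rng.integers(0, 2, n_v).astype(float), W, b, c, beta)
              for beta in betas]
    for _ in range(n_steps):
        # Gibbs-update every chain at its own temperature.
        states = [gibbs_step(v, W, b, c, beta)
                  for (v, _), beta in zip(states, betas)]
        # Metropolis swap between adjacent temperatures; the acceptance
        # ratio is exp((beta_k - beta_{k+1}) * (E_k - E_{k+1})).
        for k in range(len(betas) - 1):
            e_k = energy(*states[k], W, b, c)
            e_next = energy(*states[k + 1], W, b, c)
            log_a = (betas[k] - betas[k + 1]) * (e_k - e_next)
            if np.log(rng.random()) < log_a:
                states[k], states[k + 1] = states[k + 1], states[k]
    # The beta = 1 chain (last entry) supplies samples from the model.
    return states[-1]

# Toy usage: estimate the negative-phase statistics <v h^T>_model.
W = rng.normal(0.0, 0.1, (6, 3))
b = np.zeros(6)
c = np.zeros(3)
v, h = pt_sample(W, b, c, betas=np.linspace(0.2, 1.0, 5))
neg_stats = np.outer(v, h)  # Monte Carlo estimate of the model expectation

The returned beta = 1 state is what a sampling-based trainer would plug into the negative phase of the gradient; GFPT's contribution, per the abstract, is to correct the magnitude and direction of that estimate before the parameter update.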
Source: Acta Automatica Sinica (《自动化学报》), EI / CSCD / Peking University Core, 2017, No. 5, pp. 753-764 (12 pages).
Funding: Supported by the National Natural Science Foundation of China (61573285, 61305133) and the Fundamental Research Funds for the Central Universities (3102015BJ(Ⅱ)GH01).
Keywords: deep learning, restricted Boltzmann machine (RBM), sampling algorithm, Markov theory, parallel tempering, GFPT (gradient fixing parallel tempering)
