
基于Bagging-Down SGD算法的分布式深度网络 (Cited by: 1)

Distributed deep networks based on Bagging-Down SGD algorithm
Abstract: Training on large amounts of data with a distributed deep learning algorithm can learn good data representations, but traditional distributed deep learning algorithms suffer from slow training or low training accuracy on big datasets. The Bootstrap aggregating-down stochastic gradient descent (Bagging-Down SGD) algorithm is proposed, focusing on improving the training speed of distributed deep networks. Bagging-Down SGD adds a speed controller on top of many single-machine models: it statistically aggregates the parameter values computed by each machine and reduces the frequency of global parameter updates, decoupling single-machine training from parameter updating to a certain extent. This improves the training speed of the whole distributed model while preserving training accuracy. Experiments show that the algorithm is general and can learn the structures of different kinds of data.
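The aggregation scheme described in the abstract (independent single-machine training on bootstrap samples, with a speed controller that merges parameters statistically and only at a reduced frequency) can be illustrated with a minimal sketch. This is not the authors' implementation; the objective, worker count, and merge-by-averaging choice are illustrative assumptions on a toy 1-D least-squares problem.

```python
import random

def local_sgd_step(w, data, lr=0.1):
    # One SGD step on a bootstrap resample for the 1-D least-squares
    # objective f(w) = mean((w*x - y)^2) over the sampled points.
    batch = [random.choice(data) for _ in range(4)]  # Bootstrap aggregating
    grad = sum(2 * x * (w * x - y) for x, y in batch) / len(batch)
    return w - lr * grad

def bagging_down_sgd(data, workers=4, rounds=20, local_steps=5):
    # Each worker trains independently from the current global parameters;
    # a "speed controller" merges the per-worker parameters by averaging
    # only once per round, so global updates are far less frequent than
    # local SGD steps (training and updating are partially decoupled).
    global_w = 0.0
    for _ in range(rounds):
        worker_params = []
        for _ in range(workers):
            w = global_w
            for _ in range(local_steps):
                w = local_sgd_step(w, data)
            worker_params.append(w)
        global_w = sum(worker_params) / workers  # statistical merge
    return global_w

random.seed(0)
# Synthetic data drawn from y = 3*x, so the fitted w should approach 3.
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]
w = bagging_down_sgd(data)
```

In a real distributed deployment the averaging step would be a network aggregation (e.g. a parameter server collecting worker parameters), but the frequency trade-off is the same: fewer global merges mean less communication per local SGD step.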
Authors: QIN Chao; GAO Xiaoguang; CHEN Daqing (School of Electronics and Information, Northwestern Polytechnical University, Xi'an 710100, China; London South Bank University, London SE1 0AA, England)
Source: Systems Engineering and Electronics (《系统工程与电子技术》), indexed in EI / CSCD / PKU Core, 2019, Issue 5, pp. 1021-1027 (7 pages)
Funding: Supported by the National Natural Science Foundation of China (61573285)
Keywords: deep network; distributed; Bootstrap aggregating-down stochastic gradient descent (Bagging-Down SGD); speed controller