Journal Articles — 1 result found
Multi-GPU Based Recurrent Neural Network Language Model Training
Authors: Xiaoci Zhang, Naijie Gu, Hong Ye. 《国际计算机前沿大会会议论文集》, 2016, No. 1, pp. 124-126 (3 pages).
Recurrent neural network language models (RNNLMs) have been applied in a wide range of research fields, including natural language processing and speech recognition. One challenge in training RNNLMs is the heavy computational cost of the crucial back-propagation (BP) algorithm. This paper presents an effective approach to training recurrent neural networks on multiple GPUs, where parallelized stochastic gradient descent (SGD) is applied. Results on text-based experiments show that the proposed approach achieves a 3.4× speedup on 4 GPUs over a single GPU, without any loss in language model perplexity.
Keywords: recurrent neural network language models (RNNLMs)
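
The abstract describes parallelized SGD across GPUs, which in its synchronous data-parallel form works as follows: each GPU holds a replica of the model, computes gradients on its own shard of the minibatch via back-propagation, and the per-shard gradients are averaged before a single weight update. The paper's actual implementation is not reproduced here; the following is a minimal PyTorch sketch of that scheme under stated assumptions, with all names (TinyRNNLM, sync_sgd_step, the toy vocabulary and layer sizes) invented for illustration.

```python
# Hedged sketch of synchronous data-parallel SGD for a toy RNN language
# model. This is NOT the authors' code; it only illustrates the general
# replicate / scatter / average-gradients / update pattern.
import torch
import torch.nn as nn

VOCAB, EMBED, HIDDEN, SEQ, BATCH = 1000, 64, 128, 20, 32  # toy sizes

class TinyRNNLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMBED)
        self.rnn = nn.RNN(EMBED, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, VOCAB)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.out(h)

# Use every visible GPU; fall back to CPU so the sketch runs anywhere.
devices = ([torch.device(f"cuda:{i}") for i in range(torch.cuda.device_count())]
           or [torch.device("cpu")])
master = TinyRNNLM().to(devices[0])
replicas = [TinyRNNLM().to(d) for d in devices]
opt = torch.optim.SGD(master.parameters(), lr=0.1)

def sync_sgd_step(tokens, targets):
    # 1) Broadcast the current master weights to every replica.
    for rep in replicas:
        rep.load_state_dict(master.state_dict())
    # 2) Scatter the minibatch: each device processes an equal shard.
    shards = list(zip(tokens.chunk(len(devices)), targets.chunk(len(devices))))
    grads = []
    for rep, dev, (x, y) in zip(replicas, devices, shards):
        logits = rep(x.to(dev))
        loss = nn.functional.cross_entropy(
            logits.reshape(-1, VOCAB), y.to(dev).reshape(-1))
        rep.zero_grad()
        loss.backward()  # back-propagation on this shard only
        grads.append([p.grad.to(devices[0]) for p in rep.parameters()])
    # 3) Average per-shard gradients and take one SGD step on the master.
    for p, *shard_grads in zip(master.parameters(), *grads):
        p.grad = torch.stack(shard_grads).mean(dim=0)
    opt.step()

# Toy data: random token ids stand in for a real text corpus.
tokens = torch.randint(0, VOCAB, (BATCH, SEQ))
targets = torch.randint(0, VOCAB, (BATCH, SEQ))
sync_sgd_step(tokens, targets)
```

Production multi-GPU training would typically use an all-reduce primitive (e.g., torch.nn.parallel.DistributedDataParallel) rather than the explicit broadcast-and-average loop above, which is written out here only to make the parallelized-SGD data flow visible.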