

GFW Accelerated Scheduling Algorithm Based on Convolutional Neural Network
Abstract: The wide application of neural networks has drawn increasing attention to their training, and the demand for higher accuracy makes training more difficult, so accelerating neural network training has become a research focus. In the training of a neural network, the convolutional layers occupy most of the training time, so accelerating them is the key to accelerating the whole network. This paper proposes the GFW accelerated scheduling algorithm, which selects among different convolution algorithms at each layer according to the size of the input feature maps and the number of convolution kernels, so as to achieve the best overall training performance. The experiments analyze the accelerated training of a 9-layer convolutional network in detail. The results show that the GFW algorithm achieves a 2.901x speedup over the GEMM convolution algorithm, a 1.467x speedup over the FFT algorithm, and a 1.318x speedup over the Winograd algorithm.
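The core idea of the abstract, selecting a convolution algorithm per layer based on feature-map size and kernel count, can be sketched as a simple dispatcher. This is a minimal illustration only: the thresholds and the im2col/GEMM reference path below are assumptions for demonstration, not the paper's actual decision rule or implementation.

```python
import numpy as np

def choose_algorithm(image_size, num_kernels, kernel_size):
    """Illustrative per-layer dispatch. The cutoffs here are
    hypothetical; the paper derives its own layer-wise choices."""
    if kernel_size <= 3:
        return "winograd"  # small kernels favor Winograd's low multiply count
    if image_size >= 64 and num_kernels >= 32:
        return "fft"       # large maps amortize the FFT transform cost
    return "gemm"          # im2col + GEMM as the general fallback

def im2col_gemm_conv(x, w):
    """Reference GEMM-based convolution (valid padding, stride 1).
    x: (C, H, W) input, w: (K, C, R, S) kernels -> (K, H-R+1, W-S+1)."""
    C, H, W = x.shape
    K, _, R, S = w.shape
    out_h, out_w = H - R + 1, W - S + 1
    # Unfold each receptive field into a column of the im2col matrix.
    cols = np.empty((C * R * S, out_h * out_w))
    idx = 0
    for i in range(out_h):
        for j in range(out_w):
            cols[:, idx] = x[:, i:i + R, j:j + S].ravel()
            idx += 1
    # One matrix multiply computes all output positions at once.
    return (w.reshape(K, -1) @ cols).reshape(K, out_h, out_w)
```

In a scheduler of this kind, the dispatch decision is made once per layer (shapes are fixed after the network is defined), so the selection overhead is negligible compared with the convolution itself.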
Author: SONG Tie (School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China)
Source: Software (《软件》), 2019, No. 3, pp. 217-221 (5 pages)
Keywords: convolutional neural network; GEMM; FFT; Winograd algorithm; GFW scheduling algorithm