Abstract
Convolution plays a vital role in statistics, signal processing, image processing, and deep learning. In deep neural networks, convolution is one of the fundamental computational units, used to extract features from the input, so accelerating convolution and improving its computational efficiency remain pressing problems. In recent years, many studies have shown that distributed computing frameworks can speed up the computation of convolutional neural networks and thereby improve the training efficiency of deep learning. However, distributed systems commonly suffer from the straggler problem: slow nodes can delay the completion time of the entire system, which makes stragglers an open issue in distributed deep learning. Targeting 2D convolution, this paper combines the Winograd algorithm with distributed coding and proposes an optimized distributed 2D convolution algorithm. The Winograd algorithm effectively accelerates an individual 2D convolution, while distributed coding mitigates the impact of straggler nodes on the overall computation latency through a redundancy-based coding scheme. The proposed algorithm therefore accelerates 2D convolution while effectively mitigating the straggler problem in distributed systems, improving the computational efficiency of distributed convolution.
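As a concrete illustration of the Winograd minimal-filtering idea the abstract refers to, the following is a minimal sketch of the classic 1D case F(2, 3), which computes two outputs of a 3-tap correlation with 4 multiplications instead of 6. The transform matrices and the function name `winograd_f23` are standard textbook material and illustrative choices, not the paper's actual implementation; the paper's method applies the 2D analogue inside a coded distributed framework.

```python
import numpy as np

# Winograd minimal filtering F(2, 3): 2 outputs of a 3-tap correlation
# using 4 element-wise multiplications instead of 6.
B_T = np.array([[1,  0, -1,  0],     # input transform
                [0,  1,  1,  0],
                [0, -1,  1,  0],
                [0,  1,  0, -1]], dtype=float)
G = np.array([[1.0,  0.0, 0.0],      # filter transform
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]])
A_T = np.array([[1, 1,  1,  0],      # inverse (output) transform
                [0, 1, -1, -1]], dtype=float)

def winograd_f23(d, g):
    """d: input tile of 4 samples, g: 3-tap filter -> 2 outputs."""
    U = G @ g        # transformed filter (4 values)
    V = B_T @ d      # transformed input tile (4 values)
    M = U * V        # the 4 multiplications
    return A_T @ M   # fold back to 2 outputs

d = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([1.0, 1.0, 2.0])
# Direct correlation: y0 = d0*g0 + d1*g1 + d2*g2 = 9,
#                     y1 = d1*g0 + d2*g1 + d3*g2 = 13
print(winograd_f23(d, g))  # → [ 9. 13.]
```

The 2D form F(2×2, 3×3) nests this construction over rows and columns, which is what makes Winograd attractive for the small 3×3 kernels common in convolutional networks.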
Authors
苑晨宇
谢在鹏
朱晓瑞
屈志昊
徐媛媛
YUAN Chen-yu; XIE Zai-peng; ZHU Xiao-rui; QU Zhi-hao; XU Yuan-yuan (School of Computer and Information, Hohai University, Nanjing 211100, China)
Source
《计算机科学》
CSCD
Peking University Core Journal (北大核心)
2021, No. 2, pp. 47-54 (8 pages)
Computer Science
Funding
National Key R&D Program of China (2016YFC0402710)
Key Program of the National Natural Science Foundation of China (61832005).