xCCL: A Survey of Industry-Led Collective Communication Libraries for Deep Learning
Authors: Adam Weingram, Yuke Li, Hao Qi, Darren Ng, Liuyao Dai, Xiaoyi Lu
Journal of Computer Science & Technology (SCIE, EI, CSCD), 2023, No. 1, pp. 166-195 (30 pages)
Abstract: Machine learning techniques have become ubiquitous both in industry and academic applications. Increasing model sizes and training data volumes necessitate fast and efficient distributed training approaches. Collective communications greatly simplify inter- and intra-node data transfer and are an essential part of the distributed training process, as information such as gradients must be shared between processing nodes. In this paper, we survey the current state-of-the-art collective communication libraries (namely xCCL, including NCCL, oneCCL, RCCL, MSCCL, ACCL, and Gloo), with a focus on the industry-led ones for deep learning workloads. We investigate the design features of these xCCLs, discuss their use cases in industry deep learning workloads, compare their performance with industry-made benchmarks (i.e., NCCL Tests and PARAM), and discuss key takeaways and interesting observations. We believe our survey sheds light on potential research directions of future designs for xCCLs.
Keywords: collective, deep learning, distributed training, GPUDirect, RDMA (remote direct memory access)
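The abstract notes that collectives carry gradient exchanges between processing nodes during distributed training. As a minimal illustrative sketch (not taken from the survey itself), the snippet below shows PyTorch's torch.distributed driving NCCL, one of the surveyed xCCLs, to all-reduce a stand-in gradient tensor across GPUs; the script name and launch command are assumptions for illustration.

```python
# Minimal sketch of a gradient all-reduce through NCCL via torch.distributed.
# Assumed launch (hypothetical script name):
#   torchrun --nproc_per_node=<num_gpus> allreduce_sketch.py
import os

import torch
import torch.distributed as dist


def main():
    # torchrun sets RANK, WORLD_SIZE, and LOCAL_RANK in the environment;
    # the NCCL backend handles the GPU-to-GPU collective communication.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Stand-in for a locally computed gradient tensor on this rank's GPU.
    grad = torch.ones(1024, device=f"cuda:{local_rank}") * dist.get_rank()

    # All-reduce sums the tensor across all ranks in place; afterwards every
    # rank holds the identical aggregated result.
    dist.all_reduce(grad, op=dist.ReduceOp.SUM)
    grad /= dist.get_world_size()  # average, as in data-parallel SGD

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

After the collective completes, every rank holds the same averaged tensor, which is the property data-parallel training relies on before each optimizer step.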