Journal Article

Analysis of Accelerating Training of Distributed Models through Intra Network Model Parameter Distribution
Abstract: This paper presents a downlink communication optimization scheme based on randomized rounding, whose algorithm achieves an approximation ratio of O(log|V|), where |V| is the number of programmable switches. Large-scale simulation experiments show that, compared with state-of-the-art solutions, the scheme reduces downlink communication overhead by 14.5% to 35.8%.
Authors: ZHAN Hanfeng; XU Hongli (School of Computer Science and Technology, University of Science and Technology of China, Anhui 230027, China; Suzhou Institute for Advanced Study, University of Science and Technology of China, Jiangsu 215123, China)
Source: Electronic Technology (Shanghai), 2024, No. 2, pp. 58-60 (3 pages)
Keywords: distributed model training; intra network computing; multicast transmission
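
The abstract gives only the high-level idea of the scheme; its actual formulation is not reproduced in this record. For orientation, the following is a minimal, hypothetical Python sketch of the generic LP-relaxation-plus-randomized-rounding recipe that typically yields an O(log|V|) approximation for set-cover-style placement problems. Here the downlink distribution is modeled, purely as an assumption and not as the paper's model, as covering every worker with at least one selected programmable switch; all instance data, names, and the fallback step are illustrative.

```python
# Illustrative sketch only: LP relaxation + randomized rounding for a
# set-cover-style covering problem, the standard route to an O(log|V|)
# approximation ratio. The instance below (switches, workers, costs) is
# hypothetical and NOT taken from the paper.
import math
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Hypothetical instance: switch j can serve worker i iff cover[i][j] == 1,
# and selecting switch j for the multicast incurs downlink cost cost[j].
cover = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [1, 0, 0, 1],
])  # rows: workers, columns: programmable switches
cost = np.array([3.0, 2.0, 2.0, 1.0])
num_workers, num_switches = cover.shape

# LP relaxation: minimize cost^T x  s.t.  cover @ x >= 1,  0 <= x <= 1.
res = linprog(cost, A_ub=-cover, b_ub=-np.ones(num_workers),
              bounds=[(0.0, 1.0)] * num_switches)
x_frac = res.x

# Randomized rounding: repeat O(log|V|) independent rounds; in each round
# select switch j with probability x_frac[j]. The union covers every worker
# with high probability at expected cost O(log|V|) times the LP optimum.
rounds = max(1, math.ceil(2 * math.log(num_switches + 1)))
chosen = np.zeros(num_switches, dtype=bool)
for _ in range(rounds):
    chosen |= rng.random(num_switches) < x_frac

# Fallback: greedily add a covering switch for any still-uncovered worker.
for i in range(num_workers):
    if not cover[i][chosen].any():
        chosen[np.argmax(cover[i])] = True

print("fractional LP cost:", res.fun)
print("selected switches:", np.flatnonzero(chosen), "rounded cost:", cost[chosen].sum())
```

Repeating the independent rounding O(log|V|) times is what drives both the high-probability coverage guarantee and the logarithmic loss in expected cost relative to the LP optimum; how the paper's scheme instantiates this for downlink multicast of model parameters is described in the full text.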