Journal Articles
3 articles found
1. Stochastic Gradient Compression for Federated Learning over Wireless Network
Authors: Lin Xiaohan, Liu Yuan, Chen Fangjiong, Huang Yang, Ge Xiaohu. China Communications (SCIE, CSCD), 2024, Issue 4, pp. 230-247 (18 pages).
As a mature distributed machine learning paradigm, federated learning enables wireless edge devices to collaboratively train a shared AI model by stochastic gradient descent (SGD). However, devices need to upload high-dimensional stochastic gradients to the edge server during training, which causes a severe communication bottleneck. To address this problem, we compress the communication by sparsifying and quantizing the stochastic gradients of edge devices. We first derive a closed form of the communication compression in terms of sparsification and quantization factors. Then, the convergence rate of this communication-compressed system is analyzed and several insights are obtained. Finally, we formulate and solve the quantization resource allocation problem with the goal of minimizing the convergence upper bound, under the constraint of multiple-access channel capacity. Simulations show that the proposed scheme outperforms the benchmarks.
Keywords: federated learning; gradient compression; quantization; resource allocation; stochastic gradient descent (SGD)
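The compression pipeline this abstract describes, sparsify the high-dimensional stochastic gradient and then quantize the surviving entries before the uplink, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the top-k rule, the `levels` parameter, and the unbiased stochastic rounding are generic stand-ins for the paper's sparsification and quantization factors.

```python
import numpy as np

def topk_sparsify(grad, k):
    """Keep the k largest-magnitude entries of the gradient, zero the rest."""
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return sparse

def stochastic_quantize(grad, levels):
    """Unbiased stochastic quantization onto `levels` uniform levels of
    [0, max|grad|], keeping the sign of each entry."""
    norm = np.max(np.abs(grad))
    if norm == 0:
        return grad
    scaled = np.abs(grad) / norm * levels        # values in [0, levels]
    lower = np.floor(scaled)
    prob = scaled - lower                        # round up with this probability
    quantized = lower + (np.random.rand(*grad.shape) < prob)
    return np.sign(grad) * quantized * norm / levels

# Example: compress a simulated stochastic gradient before uplink transmission.
g = np.random.randn(10_000).astype(np.float32)
g_compressed = stochastic_quantize(topk_sparsify(g, k=1_000), levels=15)
```

Only the k retained indices, their quantized magnitudes, and the scaling norm would need to be transmitted, which is the source of the uplink savings analyzed in the paper.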
2. Edge-Federated Self-Supervised Communication Optimization Framework Based on Sparsification and Quantization Compression
Authors: Yifei Ding. Journal of Computer and Communications, 2024, Issue 5, pp. 140-150 (11 pages).
The federated self-supervised framework is a distributed machine learning method that combines federated learning and self-supervised learning, and it can effectively address the difficulty traditional federated learning has in processing large-scale unlabeled data. Existing federated self-supervised frameworks suffer from low communication efficiency and high communication delay between clients and the central server. We therefore add edge servers to the federated self-supervised framework to reduce the pressure that frequent communication between the two ends places on the central server. A communication compression scheme using gradient quantization and sparsification is proposed to optimize the communication of the entire framework, and the algorithm of the sparse communication compression module is improved. Experiments show that the learning rate of the improved sparse communication compression module changes more smoothly and stably, and that our communication compression scheme effectively reduces the overall communication overhead.
Keywords: Communication Optimization; Federated Self-Supervision; Sparsification; Gradient Compression; Edge Computing
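A minimal sketch of the client-to-edge compression this abstract describes: each client uploads a top-k sparsified update while keeping the dropped part as a local residual (error feedback), and the edge server aggregates the sparse updates before forwarding a single message toward the central server. The class names, the error-feedback rule, and the plain averaging are illustrative assumptions, not the paper's exact sparse communication compression module.

```python
import numpy as np

class CompressedClient:
    """Client that uploads top-k sparsified updates and keeps the
    compression error locally (error feedback) for the next round."""
    def __init__(self, dim, k):
        self.k = k
        self.residual = np.zeros(dim, dtype=np.float32)

    def compress_update(self, grad):
        corrected = grad + self.residual              # add back what was dropped before
        idx = np.argpartition(np.abs(corrected), -self.k)[-self.k:]
        sparse = np.zeros_like(corrected)
        sparse[idx] = corrected[idx]
        self.residual = corrected - sparse            # remember the dropped part
        return sparse

def edge_aggregate(updates):
    """Edge server averages sparse client updates before forwarding
    one (much smaller) message to the central server."""
    return np.mean(updates, axis=0)

# One round with three clients attached to one edge server.
dim, k = 5_000, 500
clients = [CompressedClient(dim, k) for _ in range(3)]
grads = [np.random.randn(dim).astype(np.float32) for _ in clients]
edge_update = edge_aggregate([c.compress_update(g) for c, g in zip(clients, grads)])
```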
3. Image Semantic Segmentation for Autonomous Driving Based on Improved U-Net
Authors: Chuanlong Sun, Hong Zhao, Liang Mu, Fuliang Xu, Laiwei Lu. Computer Modeling in Engineering & Sciences (SCIE, EI), 2023, Issue 7, pp. 787-801 (15 pages).
Image semantic segmentation has become an essential part of autonomous driving. To further improve the generalization ability and robustness of semantic segmentation algorithms, a lightweight network based on the Squeeze-and-Excitation attention mechanism (SE) and Depthwise Separable Convolution (DSC) is designed. Meanwhile, Adam-GC, an Adam optimization algorithm based on Gradient Compression (GC), is proposed to improve the training speed, segmentation accuracy, generalization ability, and stability of the network. To verify and compare the effectiveness of the proposed network, the trained model is evaluated on the Cityscapes semantic segmentation dataset. The results show that the network achieves 78.02% MIoU on the Cityscapes validation set, outperforming the baseline network and other recent semantic segmentation networks. Besides meeting stability and accuracy requirements, this result is of particular significance for the development of image semantic segmentation.
Keywords: deep learning; semantic segmentation; attention mechanism; depthwise separable convolution; gradient compression
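The two building blocks named in this abstract, the Squeeze-and-Excitation (SE) attention mechanism and Depthwise Separable Convolution (DSC), can be sketched in PyTorch as below. The layer sizes and the way the blocks are chained are illustrative assumptions; the paper's improved U-Net architecture and its Adam-GC optimizer are not reproduced here.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation: reweight channels by globally pooled statistics."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(x.mean(dim=(2, 3)))              # squeeze: global average pooling
        return x * w.view(b, c, 1, 1)                # excite: per-channel rescaling

class DepthwiseSeparableConv(nn.Module):
    """3x3 depthwise convolution followed by a 1x1 pointwise convolution."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

# A lightweight encoder block combining both ideas (sizes are hypothetical).
block = nn.Sequential(DepthwiseSeparableConv(64, 128), nn.ReLU(inplace=True), SEBlock(128))
out = block(torch.randn(1, 64, 256, 256))            # -> shape (1, 128, 256, 256)
```

DSC replaces a standard convolution with a per-channel spatial filter plus a 1x1 channel mixer, which is what makes the network lightweight, while SE adds a cheap channel-attention rescaling on top.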