Stochastic Gradient Compression for Federated Learning over Wireless Network  


Authors: Lin Xiaohan, Liu Yuan, Chen Fangjiong, Huang Yang, Ge Xiaohu

Affiliations: [1] School of Electronic and Information Engineering, South China University of Technology, Guangzhou 510641, China; [2] Key Laboratory of Dynamic Cognitive System of Electromagnetic Spectrum Space, Ministry of Industry and Information Technology, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China; [3] School of Electronic Information and Communications, Huazhong University of Science and Technology, Wuhan 430074, China

Source: China Communications, 2024, No. 4, pp. 230-247 (18 pages)

Funding: Supported in part by the National Key Research and Development Program of China under Grant 2020YFB1807700, and in part by the National Science Foundation of China under Grant U200120122.

Abstract: As a mature distributed machine learning paradigm, federated learning enables wireless edge devices to collaboratively train a shared AI model by stochastic gradient descent (SGD). However, during training the devices must upload high-dimensional stochastic gradients to the edge server, which causes a severe communication bottleneck. To address this problem, we compress the communication by sparsifying and quantizing the stochastic gradients of the edge devices. We first derive a closed form of the communication compression in terms of the sparsification and quantization factors. Then, the convergence rate of this communication-compressed system is analyzed and several insights are obtained. Finally, we formulate and solve the quantization resource allocation problem of minimizing the convergence upper bound under the constraint of the multiple-access channel capacity. Simulations show that the proposed scheme outperforms the benchmarks.
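The sparsify-then-quantize pipeline in the abstract can be sketched as below. This is an illustrative assumption, not the paper's exact scheme: `topk_sparsify` stands in for a generic sparsification operator, and `stochastic_quantize` uses a QSGD-style unbiased quantizer; the paper's operators, bit allocation, and channel-capacity constraint may differ.

```python
import numpy as np

def topk_sparsify(grad, k):
    """Keep only the k largest-magnitude entries of the gradient; zero the rest."""
    out = np.zeros_like(grad)
    idx = np.argsort(np.abs(grad))[-k:]
    out[idx] = grad[idx]
    return out

def stochastic_quantize(grad, bits):
    """Unbiased stochastic quantization to 2**bits - 1 magnitude levels per sign.

    Each entry is scaled by the vector norm, then randomly rounded to an
    adjacent level so that E[output] = grad (QSGD-style).
    """
    levels = 2 ** bits - 1
    norm = np.linalg.norm(grad)
    if norm == 0:
        return grad
    scaled = np.abs(grad) / norm * levels   # each entry now lies in [0, levels]
    lower = np.floor(scaled)
    prob = scaled - lower                   # probability of rounding up
    q = lower + (np.random.rand(*grad.shape) < prob)
    return np.sign(grad) * q * norm / levels

np.random.seed(0)
g = np.random.standard_normal(1000)
# Sparsify to 10% of the entries, then quantize the survivors with 4 bits.
compressed = stochastic_quantize(topk_sparsify(g, 100), bits=4)
print(np.count_nonzero(compressed) <= 100)  # sparsity is preserved
```

Zero entries stay zero under the quantizer (their rounding-up probability is 0), so sparsification and quantization compose cleanly; only the nonzero indices, their quantized levels, and the norm need to be transmitted.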

Keywords: federated learning; gradient compression; quantization; resource allocation; stochastic gradient descent (SGD)

Classification: TN92 [Electronics and Telecommunications: Communication and Information Systems]; TP181 [Electronics and Telecommunications: Information and Communication Engineering]
