高效联邦学习:范数加权聚合算法  

Efficient federated learning: norm-weighted aggregation algorithm


Authors: Chen Pan (陈攀); Zhang Hengru (张恒汝)[1,2]; Min Fan (闵帆) (School of Computer Science, Southwest Petroleum University, Chengdu 610500, China; Laboratory of Machine Learning, Southwest Petroleum University, Chengdu 610500, China)

Affiliations: [1] School of Computer Science, Southwest Petroleum University, Chengdu 610500, China; [2] Laboratory of Machine Learning, Southwest Petroleum University, Chengdu 610500, China

Source: Application Research of Computers (《计算机应用研究》), 2024, No. 3, pp. 694-699 (6 pages)

Funding: National Natural Science Foundation of China (61902328); Applied Basic Research Project of the Nanchong Science and Technology Bureau (SXHZ040, SXHZ051).

Abstract: In federated learning, non-independent and identically distributed (non-IID) data across clients slows the convergence of the global model and significantly increases communication costs. Existing methods collect information about clients' label distributions to determine aggregation weights for local models and thereby accelerate convergence, but this may leak client privacy. To address the slower convergence caused by non-IID data without leaking client privacy, this paper proposes the FedNA aggregation algorithm. FedNA achieves this goal in two ways. First, it assigns aggregation weights based on the L1 norm of each local model's class weight updates, so that each local model's contribution is retained. Second, it sets the class weight updates corresponding to a client's missing classes to 0, mitigating the impact of missing classes on aggregation. Experiments were conducted under four different data distributions on two datasets. The results show that, compared with FedAvg, FedNA reduces the number of iterations required to reach a stable state by up to 890, lowering communication costs by 44.5%. FedNA protects client privacy while accelerating the convergence of the global model and reducing communication costs, making it suitable for scenarios that require user privacy protection and are sensitive to communication efficiency.
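The aggregation rule described in the abstract can be illustrated with a minimal sketch in Python (NumPy). The function name, the data shapes, and the per-class weighting granularity below are illustrative assumptions, not the paper's actual implementation:

    import numpy as np

    def fedna_style_aggregate(class_weight_updates, present_classes):
        # class_weight_updates: one (num_classes, dim) array per client,
        #   holding that client's update to the classifier's class weights.
        # present_classes: one set per client with the class labels that
        #   actually appear in that client's local data.
        num_classes, dim = class_weight_updates[0].shape
        aggregated = np.zeros((num_classes, dim))
        for c in range(num_classes):
            # Second idea in the abstract: a client missing class c
            # contributes a zero update for that class.
            updates = [u[c] if c in present else np.zeros(dim)
                       for u, present in zip(class_weight_updates, present_classes)]
            # First idea: aggregation weights proportional to the L1 norm
            # of each (possibly zeroed) class weight update.
            norms = np.array([np.abs(u).sum() for u in updates])
            if norms.sum() > 0:
                weights = norms / norms.sum()
                aggregated[c] = sum(w * u for w, u in zip(weights, updates))
        return aggregated

    # Toy usage: three clients, four classes, 5-dimensional class weights.
    rng = np.random.default_rng(0)
    updates = [rng.normal(size=(4, 5)) for _ in range(3)]
    present = [{0, 1, 2, 3}, {0, 1}, {2, 3}]
    print(fedna_style_aggregate(updates, present).shape)  # (4, 5)

Because the weights are derived only from the norms of the transmitted updates, the server never needs to collect the clients' label distributions, which is the privacy motivation stated in the abstract.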

Keywords: federated learning; communication cost; privacy preservation; non-IID; aggregation; weight update

CLC Number: TP181 [Automation and Computer Technology - Control Theory and Control Engineering]

 
