Attention-driven feature separation method for personalized federated learning
(注意力机制驱动的个性化联邦学习特征分离方法)


Authors: Zhang Xiaoqin (张晓琴), Jin Xixing (金西兴), Lu Yanjun (陆艳军), Cao Zeyu (曹泽宇)

Affiliations: [1] School of Computer Science & Engineering, Chongqing University of Technology, Chongqing 400054, China; [2] Chongqing Communication Design Institute Co., Ltd., Chongqing 400041, China

Source: Application Research of Computers (《计算机应用研究》), 2025, No. 4, pp. 1102-1107 (6 pages)

Funding: Chongqing Key Project for Technological Innovation and Application Development (CSTB2022TIAD-KPX0054); Chongqing University of Technology Graduate Education High-Quality Development Project (gzlcx20243154).

Abstract: This paper proposes FedAM, an attention-driven feature separation method for personalized federated learning, to address the poor model convergence and lack of personalized solutions that traditional federated learning faces in highly heterogeneous data environments. FedAM decomposes the model into a feature extraction layer and a model head, and adds an attention module that extracts global and personalized information separately, achieving adaptive, dynamic separation of global and personalized features. In addition, FedAM introduces a correlation alignment loss to balance personalization and generalization. Experimental results show that FedAM delivers excellent performance: it remains robust even under frequent client dropouts and adapts flexibly to heterogeneous data environments, significantly improving both personalization and generalization. FedAM effectively improves the overall performance and adaptability of federated learning models and provides strong support for complex federated learning scenarios.
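To make the mechanism summarized in the abstract concrete, below is a minimal PyTorch-style sketch of the general idea, not the authors' FedAM implementation: the names (PersonalizedClientModel, coral_loss), the layer sizes, the sigmoid attention gate, and the loss weight are all illustrative assumptions. It shows a client model split into a feature extractor and a locally kept head, an attention module that routes each feature dimension toward a global or a personalized branch, and a CORAL-style correlation alignment term applied to the global branch.

```python
# Illustrative sketch only (assumed names and architecture), not the paper's FedAM code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PersonalizedClientModel(nn.Module):
    """Client model: shared feature extractor + attention-based feature split + local head."""

    def __init__(self, in_dim: int, feat_dim: int, num_classes: int):
        super().__init__()
        # Feature extraction layer: in FedAM-style methods this part would be
        # aggregated by the server across clients.
        self.feature_extractor = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        # Attention module: per-dimension weights in (0, 1) deciding how much of each
        # feature goes to the "global" versus the "personalized" branch.
        self.attention = nn.Sequential(nn.Linear(feat_dim, feat_dim), nn.Sigmoid())
        # Model head: kept local to the client (personalized parameters).
        self.head = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, x: torch.Tensor):
        z = self.feature_extractor(x)
        a = self.attention(z)
        z_global = a * z              # share of the representation treated as global
        z_personal = (1.0 - a) * z    # complementary, client-specific share
        logits = self.head(torch.cat([z_global, z_personal], dim=1))
        return logits, z_global, z_personal


def coral_loss(z_a: torch.Tensor, z_b: torch.Tensor) -> torch.Tensor:
    """CORAL-style correlation alignment: squared Frobenius distance between the
    feature covariance matrices of two batches, normalized by feature dimension."""
    def covariance(z: torch.Tensor) -> torch.Tensor:
        zc = z - z.mean(dim=0, keepdim=True)
        return zc.t() @ zc / max(z.size(0) - 1, 1)

    d = z_a.size(1)
    return torch.norm(covariance(z_a) - covariance(z_b), p="fro") ** 2 / (4 * d * d)


# Example local update: cross-entropy for the personalized task plus a correlation
# alignment term on the global branch. The reference features would normally come
# from the downloaded global model; a random tensor stands in for them here.
model = PersonalizedClientModel(in_dim=32, feat_dim=64, num_classes=10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x, y = torch.randn(16, 32), torch.randint(0, 10, (16,))
logits, z_global, z_personal = model(x)
z_reference = torch.randn(16, 64)
loss = F.cross_entropy(logits, y) + 0.5 * coral_loss(z_global, z_reference)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

In a full federated round, only the feature extractor (and possibly the attention module) would typically be uploaded for aggregation while the head stays on the client; how FedAM actually partitions and aggregates these parameters is specified in the paper itself.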

Keywords: data heterogeneity; attention mechanism; parameter separation; personalized federated learning

Classification: TP181 [Automation and Computer Technology - Control Theory and Control Engineering]

 
