Authors: Gao Yujia (高雨佳), Wang Pengfei (王鹏飞), Liu Liang (刘亮) [2,3], Ma Huadong (马华东) [1,2]
Affiliations: [1] School of Computer Science (National Pilot Software Engineering School), Beijing University of Posts and Telecommunications, Beijing 100876; [2] Beijing Key Laboratory of Intelligent Telecommunications Software and Multimedia (Beijing University of Posts and Telecommunications), Beijing 100876; [3] School of Artificial Intelligence, Beijing University of Posts and Telecommunications, Beijing 100876
Source: Journal of Computer Research and Development (《计算机研究与发展》), 2024, Issue 1, pp. 196-208 (13 pages)
Funding: National Natural Science Foundation of China (61932013, 62061146002, 62225204); National 111 Project (B18008).
Abstract: Federated learning is a distributed machine learning framework in which clients train a global model collaboratively without transmitting their data to the server, addressing the problems of data silos and data privacy. It works well when clients have similar data characteristics and distributions. In many scenarios, however, client data differ in distribution, quantity, and concept, which makes global model training difficult. Personalized federated learning has therefore been proposed as a new federated learning paradigm: it aims to guarantee the effectiveness of each client's personalized model through collaboration between the clients and the server. Intuitively, tighter collaboration among clients with similar data characteristics and distributions facilitates the construction of personalized models. However, because client data are invisible to the server, extracting fine-grained client features and defining the collaborative relationships between clients is challenging. We design an attention-enhanced meta-learning network (AMN) to address this issue. AMN uses the parameters of each client's base model as input features and trains a meta-learning network that provides an additional meta-model for each client, automatically analyzing the similarity between client features. With its two-layer design, AMN effectively balances client personality and commonality and produces a fused model that incorporates useful information from other clients. Because the meta-learning network and the clients' local base networks must be trained simultaneously, we design an alternating training strategy that trains them in an end-to-end manner. To demonstrate the effectiveness of the method, we conduct extensive experiments on two benchmark datasets against eight baseline methods. Compared with the best-performing existing personalized federated learning methods, our method improves model performance by 3.39% and 2.45% on average on the two datasets.
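The abstract's core mechanism, attention over client base-model parameters to decide how much each client borrows from the others, can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not the authors' AMN implementation: the function attention_fuse, the cosine-similarity scoring, and the temperature parameter are assumptions made for this example.

```python
# Hypothetical sketch: attention-weighted fusion of client base-model parameters.
# Each client receives its own fused ("meta") model, with larger weights given to
# clients whose parameters (a proxy for data characteristics) are more similar.
import torch
import torch.nn.functional as F

def attention_fuse(client_params: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    """client_params: (num_clients, dim) flattened base-model parameters.
    Returns a (num_clients, dim) tensor holding one fused model per client."""
    normed = F.normalize(client_params, dim=1)   # unit-norm parameter vectors
    scores = normed @ normed.t() / temperature   # pairwise cosine-similarity scores
    attn = F.softmax(scores, dim=1)              # row i: weights client i assigns to every client
    return attn @ client_params                  # attention-weighted parameter fusion

if __name__ == "__main__":
    torch.manual_seed(0)
    params = torch.randn(4, 10)    # toy example: 4 clients, 10-dimensional flattened models
    fused = attention_fuse(params)
    print(fused.shape)             # torch.Size([4, 10])
```

In the paper's alternating training strategy, a fusion step of this kind would be interleaved with local updates of each client's base model, so the fusion weights and the base models are learned jointly end to end; how the similarity scores are actually computed in AMN is defined in the paper itself.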
Keywords: federated learning; attention mechanism; deep learning; meta-learning; distributed machine learning
Classification: TP18 (Automation and Computer Technology: Control Theory and Control Engineering)