A Personalized Federated Learning Algorithm Based on Meta-Learning and Knowledge Distillation

Authors: SUN Yanhua [1,2]; SHI Yahui; WANG Zhuwei [1,2]; LI Meng [1,2]; SI Pengbo [1,2]

Affiliations: [1] School of Information and Communication Engineering, Beijing University of Technology, Beijing 100124, China; [2] Beijing Laboratory of Advanced Information Networks, Beijing University of Technology, Beijing 100124, China

Source: Journal of Beijing University of Posts and Telecommunications, 2023, No. 1, pp. 12-18 (7 pages)

Funding: Beijing Natural Science Foundation (L202016).

Abstract: In federated learning (FL), client data are heterogeneous, so a single unified model trained by FL cannot meet each client's performance requirements. To address this issue, a personalized federated learning algorithm, meta-distillation federated learning, is proposed: it combines knowledge distillation and meta-learning with FL and embeds the personalization process into FL training. In each global iteration, each client's local model (the student model) distills the global model (the teacher model) while feeding its own state back to the teacher model, which is updated accordingly; each client thus obtains a better teacher model for personalized learning. Simulation results show that, compared with existing personalized algorithms, the proposed algorithm improves personalization accuracy while achieving a good trade-off between global accuracy and personalization accuracy.
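The teacher-student loop described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: it assumes linear softmax models and plain gradient steps, and the simplified teacher-feedback update (along with the names `distill_round`, `softmax` and the parameters `tau`, `alpha`) is an assumption standing in for the paper's actual meta-update.

```python
import numpy as np

def softmax(logits, tau=1.0):
    """Temperature-scaled softmax, applied row-wise."""
    z = logits / tau
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def distill_round(teacher_w, student_w, x, y, tau=2.0, alpha=0.5, lr=0.1):
    """One client round of the meta-distillation idea (simplified):
    the student distills the teacher, then the teacher is updated
    from the student's feedback before the next round."""
    n, k = x.shape[0], teacher_w.shape[1]
    onehot = np.eye(k)[y]

    # Student step: mix of task loss (cross-entropy on y) and
    # distillation loss (match the teacher's soft targets at tau).
    p_student = softmax(x @ student_w, tau)
    p_teacher = softmax(x @ teacher_w, tau)
    hard = softmax(x @ student_w)  # temperature-1 predictions for CE
    grad_s = x.T @ (alpha * (hard - onehot)
                    + (1.0 - alpha) * (p_student - p_teacher)) / n
    student_w = student_w - lr * grad_s

    # Feedback step: nudge the teacher toward reducing the *updated*
    # student's task loss, a crude stand-in for the paper's meta-update.
    p_new = softmax(x @ student_w)
    grad_t = x.T @ (p_new - onehot) / n
    teacher_w = teacher_w - lr * grad_t
    return teacher_w, student_w
```

Calling `distill_round` repeatedly on a client's local data plays the role of one client's contribution to a global iteration: the student personalizes against the teacher's soft targets, and the teacher improves from the student's feedback rather than remaining fixed during distillation.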

Keywords: federated learning; meta-learning; knowledge distillation; personalization

Classification: TP181 (Automation and Computer Technology: Control Theory and Control Engineering)

 
