AccFed: Federated Learning Acceleration Based on Model Partitioning in Internet of Things  (Cited by: 1)


Authors: CAO Shaohua [1], CHEN Hui [1], CHEN Shu, ZHANG Hanqing, ZHANG Weishan [1]

Affiliation: [1] College of Computer Science and Technology, China University of Petroleum (East China), Qingdao 266580, China

Source: Journal of Electronics & Information Technology (电子与信息学报), 2023, No. 5, pp. 1678-1687 (10 pages)

Funding: National Natural Science Foundation of China (62072469); Postgraduate Innovation Engineering Project (YCX2021129); Open Project of the State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences (20210114).

Abstract: With the rapid development of the Internet of Things (IoT), the deep integration of Artificial Intelligence (AI) and Edge Computing (EC) has given rise to Edge AI. However, because IoT devices have limited computing and communication resources and typically require privacy protection, accelerating Edge AI while preserving privacy remains a challenge. Federated Learning (FL), an emerging distributed learning paradigm, has great potential for privacy preservation and for improving model performance, but its communication and local training are inefficient. To address these challenges, this paper proposes an FL acceleration framework, AccFed. First, a Device-Edge-Cloud synergy training algorithm based on model partitioning is proposed, which adapts to the current network state and accelerates FL local training. Then, a multi-round iteration and re-aggregation algorithm is designed to accelerate FL aggregation. Finally, experimental results show that AccFed outperforms the baselines in training accuracy, convergence speed, and training time.
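The two mechanisms named in the abstract can be illustrated with a brief sketch. The following is a minimal PyTorch sketch under stated assumptions, not the authors' implementation: `partition`, `cut_layer`, `local_split_training`, and `reaggregate` are hypothetical names chosen here to show (a) splitting a model at a cut layer so the IoT device runs the shallow layers and the edge server runs the deep layers, and (b) a FedAvg-style weighted re-aggregation applied only after several local rounds.

```python
# Minimal sketch of model-partitioned device-edge training and multi-round
# re-aggregation; assumes a plain float-parameter nn.Sequential model.
# All names below are illustrative, not taken from the paper.
import torch
import torch.nn as nn

def partition(model: nn.Sequential, cut_layer: int):
    """Split a sequential model: layers [0, cut_layer) stay on the IoT
    device, layers [cut_layer, end) run on the edge server."""
    layers = list(model.children())
    return nn.Sequential(*layers[:cut_layer]), nn.Sequential(*layers[cut_layer:])

def local_split_training(device_part, edge_part, loader, lr=0.01):
    """One local FL round: only the cut-layer activation and its gradient
    cross the device-edge link, instead of raw data or the full model."""
    opt_d = torch.optim.SGD(device_part.parameters(), lr=lr)
    opt_e = torch.optim.SGD(edge_part.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for x, y in loader:
        opt_d.zero_grad()
        opt_e.zero_grad()
        smashed = device_part(x)                 # device-side forward pass
        act = smashed.detach().requires_grad_()  # activation "sent" to the edge
        loss = loss_fn(edge_part(act), y)        # edge-side forward pass
        loss.backward()                          # edge-side backward pass
        smashed.backward(act.grad)               # cut-layer gradient "returned"
        opt_e.step()
        opt_d.step()

def reaggregate(global_model, client_models, weights):
    """FedAvg-style weighted averaging, invoked only after several local
    iterations (the "multi-round iteration then re-aggregation" step)."""
    state = global_model.state_dict()
    for key in state:
        state[key] = sum(w * m.state_dict()[key]
                         for w, m in zip(weights, client_models))
    global_model.load_state_dict(state)
    return global_model
```

For example, with `model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))` and `cut_layer = 2`, the device keeps the first linear layer and its activation while the edge server hosts the classifier head; a slower link would favor a cut point with a smaller transmitted activation, which is the kind of network-state-dependent choice the abstract refers to.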

Keywords: Edge intelligence (Edge AI); Federated Learning (FL); Device-Edge-Cloud synergy; Model partitioning

Classification: TN929.5 [Electronics and Telecommunications—Communication and Information Systems]; TP399 [Electronics and Telecommunications—Information and Communication Engineering]

 
