Multimodal personalized federated learning based on Transformer


Authors: CAO Xingjian; SUN Gang; YU Hongfang (School of Information and Communication Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China)

Affiliation: [1] School of Information and Communication Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China

Source: Journal of University of Electronic Science and Technology of China, 2025, No. 2, pp. 242-249 (8 pages)

Funding: National Key R&D Program of China (2021YFB3101001).

Abstract: In the context of the rapid development of the Internet of Things, processing multimodal data from diverse information-collection devices, especially multi-sensory data such as visual signals, audio signals, and text, is crucial for deploying machine learning in real applications. The outstanding performance of the Transformer architecture and the large models derived from it in natural language processing and computer vision has driven the pursuit of the ability to process complex multimodal data. However, this also raises challenges in data privacy and in meeting personalized needs. To address these challenges, this paper proposes a personalized federated learning method based on a multimodal Transformer. The method supports federated learning over heterogeneous data modalities and, while protecting the data privacy of the participants, trains multimodal models that better match each participant's personalized needs. The proposed method significantly improves the performance of the multimodal personalized models: compared with the baseline method, accuracy increases by 15%, marking a breakthrough in the application-scenario limitations of multimodal personalized federated learning.
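The abstract does not spell out the training procedure, so the following Python/PyTorch sketch is only a hypothetical illustration of the general idea it describes: clients holding heterogeneous modalities keep personalized modality encoders and task heads local, while a shared Transformer backbone is averaged across clients (a FedAvg-style update). All class and function names (MultimodalClientModel, local_train, average_backbones) and all hyperparameters are assumptions made for illustration, not the authors' implementation.

# Minimal, hypothetical sketch of personalized federated learning with a
# shared Transformer backbone and per-client (personalized) modality layers.
# Illustrative only; it is not the method published in the paper.
import copy
import torch
import torch.nn as nn


class MultimodalClientModel(nn.Module):
    """Per-client model: a private encoder projects the client's own modality
    (e.g. image patches, audio frames, or token embeddings) into a common
    d_model space; a shared Transformer encoder processes the sequence; a
    private head produces the client's task output."""

    def __init__(self, input_dim: int, d_model: int = 64, num_classes: int = 4):
        super().__init__()
        self.encoder = nn.Linear(input_dim, d_model)          # stays local (personalized)
        self.backbone = nn.TransformerEncoder(                # shared via federated averaging
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2,
        )
        self.head = nn.Linear(d_model, num_classes)           # stays local (personalized)

    def forward(self, x):                                     # x: (batch, tokens, input_dim)
        h = self.backbone(self.encoder(x))
        return self.head(h.mean(dim=1))                       # mean-pool over tokens


def local_train(model, data, labels, epochs: int = 1, lr: float = 1e-3):
    """One round of local training on a client's private data (data never leaves the client)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(data), labels)
        loss.backward()
        opt.step()
    return model


def average_backbones(models):
    """FedAvg applied only to the shared Transformer backbone; encoders and
    heads never leave the clients, which is what keeps the models personalized."""
    avg = copy.deepcopy(models[0].backbone.state_dict())
    for key in avg:
        avg[key] = torch.stack(
            [m.backbone.state_dict()[key].float() for m in models]
        ).mean(0)
    for m in models:
        m.backbone.load_state_dict(avg)


if __name__ == "__main__":
    # Two clients with heterogeneous modalities, represented here by different
    # input dimensions (toy random data standing in for audio vs. text features).
    clients = [MultimodalClientModel(input_dim=20), MultimodalClientModel(input_dim=32)]
    datasets = [
        (torch.randn(16, 10, 20), torch.randint(0, 4, (16,))),
        (torch.randn(16, 10, 32), torch.randint(0, 4, (16,))),
    ]
    for round_idx in range(3):                                # federated rounds
        for model, (x, y) in zip(clients, datasets):
            local_train(model, x, y)                          # local, private training
        average_backbones(clients)                            # share only the backbone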

Keywords: multimodal; Transformer; federated learning; personalization

Classification: TP301 [Automation and Computer Technology: Computer System Architecture]

 
