Federated learning on non-IID and long-tailed data via dual-decoupling  (Cited by: 1)

Authors: Zhaohui WANG, Hongjiao LI, Jinguo LI, Renhao HU, Baojin WANG

Affiliation: [1] College of Computer Science and Technology, Shanghai University of Electric Power, Shanghai 201306, China

Source: Frontiers of Information Technology & Electronic Engineering, 2024, Issue 5, pp. 728-741 (14 pages)

Funding: supported by the National Natural Science Foundation of China (No. 61702321).

Abstract: Federated learning (FL), a cutting-edge distributed machine learning paradigm, aims to generate a global model by collaboratively training client models without revealing local private data. The co-occurrence of non-independent and identically distributed (non-IID) data and long-tailed distributions in FL is a challenge that substantially degrades aggregation performance. In this paper, we present a corresponding solution called federated dual-decoupling via model and logit calibration (FedDDC) for non-IID and long-tailed distributions. The model is characterized by three aspects. First, we decouple the global model into the feature extractor and the classifier to fine-tune the components affected by the joint problem. For the biased feature extractor, we propose a client confidence re-weighting scheme to assist calibration, which assigns optimal weights to each client. For the biased classifier, we apply a classifier re-balancing method for fine-tuning. Then, we calibrate and integrate the client-confidence re-weighted logits with the re-balanced logits to obtain unbiased logits. Finally, we apply decoupled knowledge distillation, for the first time in this joint problem, to enhance the accuracy of the global model by distilling the knowledge of the unbiased model. Numerous experiments demonstrate that our approach outperforms state-of-the-art methods on non-IID and long-tailed data in FL.
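To make the logit-calibration and distillation steps of the abstract concrete, below is a minimal PyTorch-style sketch. The convex combination rule in `calibrate_logits`, the hyper-parameters `alpha`, `beta`, `gamma`, and `temperature`, and the masking trick for the non-target distribution are illustrative assumptions; the distillation term follows the general form of decoupled knowledge distillation (target-class vs. non-target-class terms), not the authors' exact implementation.

```python
# Illustrative sketch only: hyper-parameters and the combination rule are
# assumptions, not the FedDDC reference implementation.
import torch
import torch.nn.functional as F


def calibrate_logits(reweighted_logits, rebalanced_logits, alpha=0.5):
    """Combine client-confidence re-weighted logits with classifier
    re-balanced logits into a single set of calibrated ("unbiased") logits.
    The convex combination with weight `alpha` is an assumption."""
    return alpha * reweighted_logits + (1.0 - alpha) * rebalanced_logits


def decoupled_kd_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, beta=1.0, gamma=8.0):
    """Decoupled knowledge distillation: split the KD loss into a
    target-class term (TCKD) and a non-target-class term (NCKD).
    The weights `beta`/`gamma` are illustrative defaults."""
    t = temperature
    num_classes = student_logits.size(1)
    target_mask = F.one_hot(labels, num_classes).bool()

    s_prob = F.softmax(student_logits / t, dim=1)
    t_prob = F.softmax(teacher_logits / t, dim=1)

    # Binary (target vs. non-target) probability masses.
    s_bin = torch.stack([(s_prob * target_mask).sum(1),
                         (s_prob * ~target_mask).sum(1)], dim=1)
    t_bin = torch.stack([(t_prob * target_mask).sum(1),
                         (t_prob * ~target_mask).sum(1)], dim=1)
    tckd = F.kl_div(s_bin.clamp_min(1e-8).log(), t_bin, reduction="batchmean")

    # Distributions over non-target classes only (target logit masked out).
    s_nt = F.log_softmax(student_logits / t - 1000.0 * target_mask, dim=1)
    t_nt = F.softmax(teacher_logits / t - 1000.0 * target_mask, dim=1)
    nckd = F.kl_div(s_nt, t_nt, reduction="batchmean")

    return (beta * tckd + gamma * nckd) * (t ** 2)
```

In this reading, the calibrated logits would serve as the teacher signal and the aggregated global model as the student, so that the unbiased knowledge is distilled back into the global model; how the paper wires these pieces together should be checked against the full text.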

Keywords: Federated learning; Non-IID; Long-tailed data; Decoupling learning; Knowledge distillation

Classification: TP18 [Automation and Computer Technology - Control Theory and Control Engineering]
