FastProtector: An Efficient Federated Learning Method Supporting Gradient Privacy Protection (cited by: 2)

Authors: LIN Li, ZHANG Xiaoying, SHEN Wei, WANG Wanxiang (College of Computer Science, Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China; Beijing Key Laboratory of Trusted Computing, Beijing 100124, China)

Affiliations: [1] College of Computer Science, Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China; [2] Beijing Key Laboratory of Trusted Computing, Beijing 100124, China

Source: Journal of Electronics & Information Technology (《电子与信息学报》), 2023, Issue 4, pp. 1356-1365 (10 pages)

Funding: National Natural Science Foundation of China (61502017); General Program of the Science and Technology Plan of the Beijing Municipal Education Commission (KM201710005024)

Abstract: Federated learning suffers from participant privacy leakage through shared gradients. Existing gradient protection schemes based on homomorphic encryption incur a large time cost and carry the risk of gradient leakage through collusion between participants and the aggregation server. To address this, a new federated learning method called FastProtector is proposed, which introduces the idea of Sign Stochastic Gradient Descent (SignSGD) when homomorphic encryption is used to protect participant gradients. Exploiting the property that a majority vote over positive and negative gradient signs can determine the aggregation result while still allowing the model to converge, the gradients are quantized and the gradient update mechanism is improved, which reduces the overhead of gradient encryption. Meanwhile, an additive secret sharing scheme is proposed to protect the gradient ciphertext against collusion attacks between a malicious aggregation server and participants. Experiments on the MNIST and CIFAR-10 datasets show that the proposed method reduces total encryption and decryption time by about 80% while maintaining high model accuracy.
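The two core ideas in the abstract — quantizing each gradient to its sign and letting the majority of signs decide the aggregate (SignSGD), and splitting the quantized gradient into additive shares so no single server sees it in the clear — can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the paper's implementation: FastProtector additionally protects the values with homomorphic encryption, which is omitted here, and all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
MODULUS = 2**31  # working modulus for additive secret sharing (assumed)

def sign_quantize(grad):
    """SignSGD-style quantization: keep only the sign of each component."""
    return np.sign(grad).astype(np.int64)

def additive_shares(x, modulus=MODULUS):
    """Split an integer vector into two additive shares mod `modulus`.
    Each share alone is uniformly random and reveals nothing about x."""
    r = rng.integers(0, modulus, size=x.shape)
    return (x - r) % modulus, r

# Three participants with toy local gradients
grads = [np.array([0.3, -1.2, 0.5]),
         np.array([0.1, -0.4, -0.2]),
         np.array([-0.7, -0.9, 0.8])]

signed = [sign_quantize(g) for g in grads]

# Each participant splits its quantized gradient into two shares;
# server A and server B each see only one share per participant,
# so neither server (nor a colluding participant) learns a full gradient.
shares_a, shares_b = zip(*(additive_shares(s % MODULUS) for s in signed))

# Each server sums the shares it holds; only the two sums are recombined.
sum_a = sum(shares_a) % MODULUS
sum_b = sum(shares_b) % MODULUS
recombined = (sum_a + sum_b) % MODULUS
# Map back from mod-space to signed integers
recombined = np.where(recombined > MODULUS // 2, recombined - MODULUS, recombined)

# Majority vote: the sign of the summed signs is the aggregate update direction.
agg = np.sign(recombined)
print(agg)  # → [ 1 -1  1]
```

The first and third coordinates are positive in two of the three local gradients, and the second is negative in all three, so the majority-vote aggregate is [+1, -1, +1]; only the sum of shares is ever reconstructed, never an individual participant's gradient.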

Keywords: low encryption overhead; collusion attack; federated learning; gradient protection

CLC Classification: TN918 (Electronics & Telecommunications — Communication and Information Systems); TP181 (Electronics & Telecommunications — Information and Communication Engineering)

 
