自适应隐私预算分配的差分隐私梯度下降算法  

Differential Privacy Gradient Descent Algorithm with Adaptive Privacy Budget Allocation


Authors: LI Jie-wen (李界雯); CHEN Jia-jia (陈佳佳); LI Shi-yi (李师毅) (School of Statistics, Shanxi University of Finance and Economics, Taiyuan 030006, China; School of Computer and Information Technology, Shanxi University, Taiyuan 030006, China)

Affiliations: [1] School of Statistics, Shanxi University of Finance and Economics, Taiyuan 030006, Shanxi, China; [2] School of Computer and Information Technology, Shanxi University, Taiyuan 030006, Shanxi, China

Source: Mathematics in Practice and Theory (《数学的实践与认识》), 2024, No. 7, pp. 129-140 (12 pages)

Funding: National Natural Science Foundation of China (62072293); Shanxi Provincial Basic Research Program (202103021223304, 202303021221054); Shanxi Provincial Graduate Education and Teaching Reform Project (2022YJJG010); Shanxi Provincial Higher Education Teaching Reform and Innovation Project (J20220570); Shanxi Provincial Research and Teaching Funding Program for Returned Overseas Scholars (2024-002).

Abstract: In the era of big data, using machine learning models to solve problems also carries the security risk of sensitive data leakage. Protecting the model training process with differential privacy is an effective way to address this problem. Although existing differentially private gradient descent algorithms have achieved notable results, their iterative training process still has shortcomings. From the perspective of privacy budget allocation, this paper proposes a differentially private gradient descent algorithm that adaptively selects the learning rate and the privacy budget. Specifically, the privacy budget of each iteration is allocated adaptively under the premise of guaranteeing model convergence, so as to provide the strongest possible privacy protection and prevent the leakage of model parameters. Intuitively, at the beginning of the optimization the gradients are expected to be large, so they need not be measured precisely; as the parameters approach their optimal values, the gradients become smaller and must therefore be measured more precisely. Finally, experimental results verify the privacy and effectiveness of the proposed algorithm, which outperforms existing algorithms on different data sets.
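For readers who want a concrete picture of the general idea, below is a minimal, illustrative Python sketch of differentially private gradient descent for logistic regression in which the total privacy budget is split across iterations by an increasing schedule, so that later iterations (where gradients are smaller and must be measured more precisely) receive a larger share of the budget and hence less noise. The linearly increasing schedule, the Laplace mechanism with L1 clipping, simple sequential composition, and the names adaptive_budget_schedule and dp_gd_logistic are all illustrative assumptions; this is not the paper's exact allocation rule or learning-rate selection.

```python
# A minimal sketch, assuming: Laplace noise on the clipped average gradient,
# simple sequential composition (sum of per-iteration eps equals the total
# budget), and a linearly increasing budget schedule. Not the paper's algorithm.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def adaptive_budget_schedule(total_eps, T):
    """Split total_eps across T iterations, giving later steps a larger share."""
    weights = np.arange(1, T + 1, dtype=float)   # linearly increasing weights (an assumption)
    return total_eps * weights / weights.sum()

def dp_gd_logistic(X, y, total_eps, T=100, lr=0.1, clip=1.0, rng=None):
    """Differentially private gradient descent for unregularized logistic regression."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    w = np.zeros(d)
    eps_schedule = adaptive_budget_schedule(total_eps, T)
    for t in range(T):
        # Per-example gradients, clipped in L1 norm so the Laplace mechanism's
        # sensitivity of the averaged gradient is bounded by 2 * clip / n.
        grads = (sigmoid(X @ w) - y)[:, None] * X
        norms = np.maximum(1.0, np.abs(grads).sum(axis=1) / clip)
        grad = (grads / norms[:, None]).mean(axis=0)
        sensitivity = 2.0 * clip / n
        noise = rng.laplace(0.0, sensitivity / eps_schedule[t], size=d)
        w -= lr * (grad + noise)   # noisy gradient step
    return w

# Toy usage: synthetic data, total privacy budget eps = 1.0.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 5))
    y = (X @ rng.normal(size=5) + 0.1 * rng.normal(size=500) > 0).astype(float)
    w_priv = dp_gd_logistic(X, y, total_eps=1.0, rng=rng)
    acc = ((sigmoid(X @ w_priv) > 0.5) == y).mean()
    print(f"training accuracy under eps=1.0: {acc:.3f}")
```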

Keywords: differential privacy; privacy budget allocation; gradient descent; logistic regression

Classification codes: TP309 [Automation and Computer Technology - Computer System Architecture]; TP181 [Automation and Computer Technology - Computer Science and Technology]

 
