Differential Privacy Algorithm under Deep Neural Networks

Authors: ZHOU Zhiping [1,2]; QIAN Xinyu

Affiliations: [1] School of Internet of Things Engineering, Jiangnan University, Wuxi 214122, China; [2] Engineering Research Center of Internet of Things Technology Applications of Ministry of Education, Jiangnan University, Wuxi 214122, China

Source: Journal of Electronics & Information Technology, 2022, No. 5, pp. 1773-1781 (9 pages)

Abstract: Considerable gradient redundancy exists in the gradient descent process of deep neural networks, so applying a differential privacy mechanism to resist membership inference attacks introduces excessive noise. To address this problem, the gradient matrix is decomposed with the Funk-SVD matrix factorization algorithm, noise is added separately to the low-dimensional feature subspace matrix and the residual matrix, and the redundant gradient noise is eliminated during gradient reconstruction. The norms of the decomposed matrices are recalculated and combined with smooth sensitivity to reduce the noise scale. Meanwhile, according to the correlation between input features and the output, more privacy budget is allocated to features with larger correlation coefficients to improve training accuracy. Finally, an adaptive gradient clipping algorithm based on the mean norm of the decomposed matrices is proposed to address slow convergence. The moments accountant is used to compute the cumulative privacy loss under multiple optimization strategies. Experiments on the standard MNIST and CIFAR-10 datasets verify that the proposed algorithm, deep neural networks under differential privacy based on Funk-SVD (FSDP), bridges the gap with the non-private model more effectively.
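
The core perturbation step described in the abstract can be sketched in a few lines of NumPy: clip a per-layer gradient matrix, fit a Funk-SVD style low-rank factorization, add Gaussian noise separately to the low-dimensional factor and the residual matrix, and reconstruct the noisy gradient. This is an illustrative sketch rather than the authors' implementation; the rank k, clipping bound C, noise multiplier sigma, learning rate, and all function names are assumptions, and the smooth-sensitivity calibration, correlation-based budget allocation, and adaptive clipping steps are omitted.

import numpy as np

def clip_by_norm(grad, C):
    # Scale the gradient so its Frobenius norm is at most the clipping bound C.
    norm = np.linalg.norm(grad)
    return grad * min(1.0, C / (norm + 1e-12))

def funk_svd_factorize(grad, k, lr=0.01, epochs=50, reg=0.02):
    # Funk-SVD style low-rank factorization grad ~= P @ Q, fitted here by
    # gradient steps on the dense reconstruction error (illustrative variant).
    m, n = grad.shape
    rng = np.random.default_rng(0)
    P = rng.normal(scale=0.1, size=(m, k))
    Q = rng.normal(scale=0.1, size=(k, n))
    for _ in range(epochs):
        err = grad - P @ Q                   # reconstruction error
        P += lr * (err @ Q.T - reg * P)      # update low-dimensional factor
        Q += lr * (P.T @ err - reg * Q)      # update basis factor
    return P, Q

def perturb_gradient(grad, k=4, C=1.0, sigma=1.0):
    # Clip, factorize, add Gaussian noise to the low-dimensional factor and the
    # residual matrix separately, then reconstruct the noisy gradient.
    g = clip_by_norm(grad, C)
    P, Q = funk_svd_factorize(g, k)
    residual = g - P @ Q
    rng = np.random.default_rng()
    P_noisy = P + rng.normal(scale=sigma * C, size=P.shape)
    residual_noisy = residual + rng.normal(scale=sigma * C, size=residual.shape)
    return P_noisy @ Q + residual_noisy

# Toy usage: perturb an 8 x 6 "gradient" matrix.
g = np.random.default_rng(1).normal(size=(8, 6))
print(perturb_gradient(g).shape)             # (8, 6)

Splitting the noise between the compact factor and the residual and then reconstructing mirrors the structure of the approach; in the paper, the noise scale of each part is further reduced through recalculated matrix norms and smooth sensitivity.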

Keywords: differential privacy; Funk-SVD; smooth sensitivity; correlation; gradient clipping

Classification codes: TN918 (Electronics and Telecommunications: Communication and Information Systems); TP309 (Electronics and Telecommunications: Information and Communication Engineering)

 
