Differentially private SGD with random features  

Authors: WANG Yi-guang, GUO Zheng-chu

Affiliations: [1] Polytechnic Institute of Zhejiang University, Zhejiang University, Hangzhou 310015, China; [2] School of Mathematical Sciences, Zhejiang University, Hangzhou 310058, China

Source: Applied Mathematics (A Journal of Chinese Universities, Ser. B), 2024, Issue 1, pp. 1-23 (23 pages)

Funding: Supported by the Zhejiang Provincial Natural Science Foundation of China (LR20A010001) and the National Natural Science Foundation of China (12271473 and U21A20426).

Abstract: In large-scale machine learning, it is crucial to reduce computational complexity and memory demands while maintaining generalization performance. Moreover, since collected data may contain sensitive information, privacy-preserving machine learning algorithms are also of great significance. This paper studies the performance of a differentially private stochastic gradient descent (SGD) algorithm based on random features. First, the algorithm maps the original data into a low-dimensional feature space, avoiding the large storage requirements that traditional kernel methods incur on large-scale data. It then iteratively optimizes the parameters by stochastic gradient descent. Finally, an output perturbation mechanism injects random noise to guarantee the privacy of the algorithm. We prove that the proposed algorithm satisfies differential privacy and achieves fast convergence rates under mild conditions.
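The three-step pipeline summarized above (random-feature mapping, SGD, output perturbation) can be illustrated with a short sketch. The code below is a minimal Python illustration, not the paper's algorithm: it assumes a Gaussian kernel for the random Fourier features, the hyperparameters (feature dimension D, step size, regularization) are arbitrary, and the `sensitivity` constant is a hypothetical placeholder where the paper derives an exact bound and noise calibration.

```python
import numpy as np

def rff_features(X, W, b):
    # Random Fourier features phi(x) = sqrt(2/D) * cos(W x + b),
    # whose inner products approximate a Gaussian kernel.
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

def dp_sgd_random_features(X, y, D=200, gamma=1.0, lam=0.1,
                           step=0.05, epsilon=1.0, delta=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # Step 1: map the data into a D-dimensional space. For the kernel
    # exp(-gamma * ||x - x'||^2), sample W ~ N(0, 2*gamma*I) and
    # b ~ Uniform[0, 2*pi]; only the n-by-D matrix Phi is stored.
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(D, d))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    Phi = rff_features(X, W, b)

    # Step 2: one pass of SGD on regularized least squares.
    theta = np.zeros(D)
    for i in rng.permutation(n):
        grad = (Phi[i] @ theta - y[i]) * Phi[i] + lam * theta
        theta -= step * grad

    # Step 3: output perturbation. Add Gaussian noise scaled by the
    # standard Gaussian-mechanism formula; the L2-sensitivity bound
    # below is a placeholder assumption, not the paper's constant.
    sensitivity = 2.0 * step
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return theta + rng.normal(0.0, sigma, size=D), (W, b)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 5))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
    theta, (W, b) = dp_sgd_random_features(X, y)
    print("train MSE:", np.mean((rff_features(X, W, b) @ theta - y) ** 2))
```

The design point the abstract emphasizes is visible here: training touches only the n-by-D feature matrix rather than the n-by-n kernel matrix, so memory grows linearly in the sample size, and noise is added once to the final iterate rather than at every gradient step.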

Keywords: learning theory; differential privacy; stochastic gradient descent; random features; reproducing kernel Hilbert spaces

Classification: TP181 (Automation and Computer Technology: Control Theory and Control Engineering); TP309 (Automation and Computer Technology: Control Science and Engineering)

 
