An Overview of Stochastic Quasi-Newton Methods for Large-Scale Machine Learning  被引量:2


Authors: Tian-De Guo, Yan Liu, Cong-Ying Han

Affiliations: [1] School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing 101408, China; [2] School of Statistics and Data Science, KLMDASR, LEBPS, and LPMC, Nankai University, Tianjin 300071, China

Source: Journal of the Operations Research Society of China, 2023, Issue 2, pp. 245-275 (31 pages)

Funding: supported by the National Key R&D Program of China (No. 2021YFA1000403); the National Natural Science Foundation of China (Nos. 11731013, 12101334, and U19B2040); the Natural Science Foundation of Tianjin (No. 21JCQNJC00030); and the Fundamental Research Funds for the Central Universities.

Abstract: Numerous intriguing optimization problems arise from the advancement of machine learning. Stochastic first-order methods are the predominant choice for these problems due to their high efficiency. However, the negative effects of noisy gradient estimates and the high nonlinearity of the loss function result in a slow convergence rate. Second-order algorithms have typical advantages in dealing with highly nonlinear and ill-conditioned problems. This paper reviews recent developments in stochastic variants of quasi-Newton methods, which construct Hessian approximations using only gradient information. We concentrate on BFGS-based methods in stochastic settings and highlight the algorithmic improvements that enable these algorithms to work in various scenarios. Future research on stochastic quasi-Newton methods should focus on enhancing their applicability, lowering computational and storage costs, and improving the convergence rate.
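The core idea the abstract describes — building a Hessian approximation from gradient information alone via the BFGS update — can be sketched in a few lines. The following is a minimal illustrative example, not the paper's own method: the quadratic objective, exact line search, and all names are assumptions chosen so the example is self-contained, and a stochastic variant (e.g. online BFGS) would replace `grad` with a minibatch estimate.

```python
# Minimal BFGS sketch on f(x) = 0.5 * x^T A x in plain Python (no deps).
# The update uses only the curvature pair (s, y) built from gradient
# differences -- no Hessian evaluations, the defining quasi-Newton trait.

A = [[3.0, 0.5],
     [0.5, 1.0]]                      # SPD matrix: Hessian of the quadratic

def grad(x):                          # exact gradient A @ x
    return [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def bfgs_update(H, s, y):
    """H <- (I - rho*s*y^T) H (I - rho*y*s^T) + rho*s*s^T, rho = 1/(y^T s)."""
    n = len(s)
    rho = 1.0 / dot(y, s)
    L = [[(1.0 if i == j else 0.0) - rho * s[i] * y[j] for j in range(n)]
         for i in range(n)]
    R = [[(1.0 if i == j else 0.0) - rho * y[i] * s[j] for j in range(n)]
         for i in range(n)]
    LH = [[sum(L[i][k] * H[k][j] for k in range(n)) for j in range(n)]
          for i in range(n)]
    return [[sum(LH[i][k] * R[k][j] for k in range(n)) + rho * s[i] * s[j]
             for j in range(n)] for i in range(n)]

x = [1.0, 1.0]
H = [[1.0, 0.0], [0.0, 1.0]]          # start from the identity
g = grad(x)
for _ in range(2):                    # 2-D quadratic: exact in 2 steps
    p = [-sum(H[i][j] * g[j] for j in range(2)) for i in range(2)]
    Ap = grad(p)                      # A @ p (the gradient map is linear)
    alpha = -dot(g, p) / dot(p, Ap)   # exact line search for the quadratic
    s = [alpha * pi for pi in p]
    x = [xi + si for xi, si in zip(x, s)]
    g_new = grad(x)
    y = [gn - gi for gn, gi in zip(g_new, g)]
    H = bfgs_update(H, s, y)          # y^T s > 0 holds for SPD A
    g = g_new

print(max(abs(gi) for gi in g))       # gradient ~ 0 at the minimizer
```

With exact line search on a 2-D quadratic this recovers conjugate directions, so the minimizer is reached in two iterations; the survey's stochastic setting is precisely where this clean behavior breaks down and the reviewed algorithmic modifications become necessary.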

Keywords: stochastic quasi-Newton methods; BFGS; large-scale machine learning

Classification: O17 (Science / Mathematics)

 
