Anderson Acceleration of Gradient Methods with Energy for Optimization Problems  


Authors: Hailiang Liu, Jia-Hao He, Xuping Tian

Affiliations: [1] Department of Mathematics, Iowa State University, Ames, IA, USA; [2] Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, USA

Source: Communications on Applied Mathematics and Computation (应用数学与计算数学学报), 2024, Issue 2, pp. 1299-1318 (20 pages)

Funding: Partially supported by the National Science Foundation (Grant DMS No. 1812666).

Abstract: Anderson acceleration (AA) is an extrapolation technique designed to speed up fixed-point iterations. For optimization problems, we propose a novel algorithm that combines AA with the energy adaptive gradient method (AEGD) [arXiv:2010.05109]. The feasibility of our algorithm is ensured by the convergence theory for AEGD, even though AEGD is not a fixed-point iteration. We provide rigorous convergence rates for AA applied to gradient descent (GD), improved by an acceleration factor given by the gain at each implementation of AA-GD. Our experimental results show that the proposed AA-AEGD algorithm requires little hyperparameter tuning and exhibits superior convergence speed.
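To make the AA-GD iteration mentioned in the abstract concrete, below is a minimal Python sketch of Anderson acceleration (the standard Type-II formulation) applied to the gradient-descent map g(x) = x - eta * grad(x), treated as a fixed-point iteration. The function name anderson_gd, its signature, and its defaults are illustrative assumptions, not the authors' implementation; in particular, the paper's AA-AEGD also carries AEGD's energy variable (see arXiv:2010.05109), which this sketch omits.

```python
import numpy as np

def anderson_gd(grad, x0, eta=0.1, m=5, iters=100):
    """Sketch of Type-II Anderson acceleration on the GD map
    g(x) = x - eta * grad(x), treated as a fixed-point iteration."""
    x = x0.astype(float)
    G, F = [], []                  # histories of g(x_k) and residuals f_k = g(x_k) - x_k
    for _ in range(iters):
        gx = x - eta * grad(x)     # one plain GD step (the fixed-point map)
        fx = gx - x                # residual; vanishes at a stationary point
        G.append(gx)
        F.append(fx)
        if len(F) > m + 1:         # keep a window of at most m differences
            G.pop(0)
            F.pop(0)
        if len(F) == 1:
            x = gx                 # no history yet: fall back to plain GD
            continue
        # Least-squares fit of the current residual by differences of past residuals
        dF = np.stack([F[i + 1] - F[i] for i in range(len(F) - 1)], axis=1)
        dG = np.stack([G[i + 1] - G[i] for i in range(len(G) - 1)], axis=1)
        gamma, *_ = np.linalg.lstsq(dF, fx, rcond=None)
        x = gx - dG @ gamma        # extrapolated (accelerated) iterate
    return x
```

As a usage check, on an ill-conditioned quadratic such as grad = lambda x: np.diag([1.0, 10.0, 100.0]) @ x, with eta = 0.005 and x0 = np.ones(3), the extrapolated iterates should reach a given tolerance in markedly fewer steps than plain GD with the same step size, consistent with the acceleration factor described in the abstract.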

Keywords: Anderson acceleration (AA); Gradient descent (GD); Energy stability

Classification: O24 [Science: Computational Mathematics]; O29 [Science: Mathematics]

 
