Global Convergence of a Class of Memory Gradient Method with the Wolfe Line Search


Authors: CHEN Cuiling; HAN Caihong; LUO Liling; CHEN Yu (College of Mathematics and Statistics, Guangxi Normal University, Guilin 541004, China; School of Computing and Information, University of Pittsburgh, Pittsburgh, PA 15238, USA)

Affiliations: [1] College of Mathematics and Statistics, Guangxi Normal University, Guilin 541004, Guangxi, China; [2] School of Computing and Information, University of Pittsburgh, Pittsburgh, PA 15238, USA

Source: Mathematica Applicata, 2018, No. 4, pp. 884-889 (6 pages)

Funding: Supported by the National Natural Science Foundation of China (11761014); the Guangxi Natural Science Foundation (2017GXNSFAA198243); the Guangxi Basic Ability Improvement Project for Young and Middle-aged Teachers of Colleges and Universities (2017KY0068, KY2016YB069); and the Guangxi Higher Education Undergraduate Course Teaching Reform Project (2017JGB147)

Abstract: In this paper, we first propose a memory gradient algorithm and discuss its descent property and global convergence under the Wolfe line search. Further, we extend the algorithm to a more general form. Finally, we test the numerical performance of this class of algorithms and compare the results with the PRP, FR, HS, LS, DY and CD conjugate gradient methods; the numerical results indicate that the class of algorithms proposed in this paper is effective.
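To illustrate the kind of iteration the abstract describes, the sketch below implements a generic memory gradient method, where the search direction combines the current negative gradient with the previous direction, d_k = -g_k + beta_k * d_{k-1}, and the step size is chosen by a Wolfe line search. The memory coefficient beta_k used here (scaled by gradient and direction norms so that d_k remains a descent direction for beta < 1) is an illustrative assumption, not the specific formula proposed in the paper; the bisection-style Wolfe search is likewise a minimal sketch.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Return a step size satisfying the (weak) Wolfe conditions via bisection."""
    lo, hi = 0.0, np.inf
    alpha = 1.0
    fx = f(x)
    dg = grad(x) @ d          # directional derivative; < 0 for a descent direction
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * dg:   # Armijo condition fails
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * dg:       # curvature condition fails
            lo = alpha
            alpha = 2.0 * alpha if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def memory_gradient(f, grad, x0, beta=0.4, tol=1e-6, max_iter=1000):
    """Memory gradient iteration d_k = -g_k + beta_k * d_{k-1} with Wolfe steps."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                     # first step is plain steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x = x + alpha * d
        g = grad(x)
        # Illustrative memory coefficient (an assumption, not the paper's formula):
        # with bk = beta * ||g|| / ||d_prev||, one gets g @ d <= -(1 - beta) * ||g||^2,
        # so d stays a descent direction for 0 < beta < 1.
        bk = beta * np.linalg.norm(g) / max(np.linalg.norm(d), 1e-12)
        d = -g + bk * d
    return x

# Usage: minimize the convex quadratic f(x) = x0^2 + 2*x1^2, minimizer at the origin.
f = lambda x: x[0] ** 2 + 2 * x[1] ** 2
grad = lambda x: np.array([2 * x[0], 4 * x[1]])
x_star = memory_gradient(f, grad, [3.0, -2.0])
```

On this test problem the iterates drive the gradient norm below the tolerance, so `x_star` lands near the origin; the scaling in `bk` is what guarantees descent regardless of the angle between the new gradient and the stored direction.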

Keywords: unconstrained optimization; memory gradient method; Wolfe line search; global convergence

Classification: O221 [Science: Operations Research and Cybernetics]
