A New Super-memory Gradient Method for Unconstrained Optimization (in English)    Cited by: 24

Author: SHI Zhen-jun (时贞军) [1]

Affiliation: [1] School of Operations Research and Management, Qufu Normal University

Source: Advances in Mathematics (China), 2006, No. 3, pp. 265-274 (10 pages)

Funding: Supported by NSFC (No. 10171054), the Postdoctoral Fund of China, and the K. C. Wong Postdoctoral Fund of CAS (No. 6765700)

Abstract: A new super-memory gradient method for unconstrained optimization is proposed. At each iteration the algorithm takes a linear combination of the current negative gradient and the previous negative gradient as the search direction, and determines the step-size by an exact line search or by the Armijo line search. Under very weak conditions the algorithm is shown to be globally convergent with a linear convergence rate: convergence is proved for the exact line search, and global convergence is also established under the Armijo line search. Because the method avoids storing and computing any matrices associated with the Hessian of the objective function, it is well suited to large-scale unconstrained optimization problems. Numerical experiments show that the algorithm is more efficient in practical computation than standard conjugate gradient algorithms.
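For illustration only, the following Python sketch shows the general shape of a memory gradient iteration of the kind the abstract describes: the search direction mixes the current negative gradient with the previous one, and the step-size is chosen by a backtracking Armijo line search. The fixed mixing weight theta, the Armijo parameters, and the descent safeguard are illustrative assumptions; they are not the coefficient formulas analyzed in the paper.

```python
import numpy as np

def armijo_step(f, x, d, g, beta=0.5, sigma=1e-4, max_backtracks=50):
    """Backtracking Armijo line search: find alpha with
    f(x + alpha*d) <= f(x) + sigma*alpha*g.dot(d)."""
    alpha = 1.0
    fx = f(x)
    gd = g.dot(d)
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + sigma * alpha * gd:
            return alpha
        alpha *= beta
    return alpha

def memory_gradient(f, grad, x0, theta=0.8, tol=1e-6, max_iter=1000):
    """Sketch of a memory gradient iteration: the search direction is a
    linear combination of the current and previous negative gradients.
    The fixed convex weight `theta` is an illustrative assumption, not
    the coefficient rule used in the paper."""
    x = np.asarray(x0, dtype=float)
    g_prev = None
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        if g_prev is None:
            d = -g                            # first step: steepest descent
        else:
            d = -theta * g - (1.0 - theta) * g_prev
            if g.dot(d) >= 0:                 # safeguard: keep d a descent direction
                d = -g
        alpha = armijo_step(f, x, d, g)       # Armijo step-size
        x = x + alpha * d
        g_prev = g
    return x

# Usage on a simple quadratic test function.
if __name__ == "__main__":
    A = np.diag([1.0, 10.0])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    print(memory_gradient(f, grad, np.array([5.0, 1.0])))
```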

Keywords: unconstrained optimization; super-memory gradient method; global convergence; numerical experiments

Classification: O221.2 [Science — Operations Research and Control Theory]
