Authors: YE Jianhao (叶建豪); CHEN Hongsheng (陈鸿升); GUO Ziteng (郭子腾) (College of Computer Science, Dongguan University of Technology, Dongguan 523000, China)
Affiliation: [1] College of Computer Science and Technology, Dongguan University of Technology, Dongguan 523000, Guangdong, China
Source: Operations Research and Management Science (《运筹与管理》), 2024, No. 7, pp. 119-122 (4 pages)
Funding: National Natural Science Foundation of China, General Program (11961011, 11971106).
Abstract: Optimization methods have been developed over recent decades, primarily using mathematical approaches to study the optimization paths and solutions of various systems and to provide a scientific basis for decision-makers. Their purpose is to find the best plan for the rational use of human, material, and financial resources in the system under study, to enhance the system's efficiency and benefits, and ultimately to achieve the system's optimal goal. Optimization methods can be divided into unconstrained and constrained methods. Unconstrained methods include the steepest descent method, Newton's method, conjugate direction methods (among them the conjugate gradient method), and variable metric methods. Constrained methods include the simplex method, the graphical method for linear programming, penalty function methods for equality constraints, and the Rosen gradient projection method, among others. The conjugate gradient method requires only first-order derivative information, yet it overcomes the slow convergence of the steepest descent method and avoids the drawback of Newton's method, which must store and compute the Hessian matrix and its inverse. Its low memory requirements and simple iterations make it an effective method for solving large-scale unconstrained optimization problems; different conjugate gradient parameters correspond to different conjugate gradient methods. In recent years, with the development of machine learning, fuzzy theory, neural networks, and the increasing maturity of computer technology, optimization methods have received growing attention, and the conjugate gradient method has naturally attracted more scholars to in-depth study and research. Current research on the conjugate gradient method falls into two categories: the first directly improves the conjugate gradient parameter; the second mixes different conjugate gradient methods, for example by forming a convex combination of two existing methods in an attempt to construct a new algorithm. Different hybrid approaches differ in their advantages, drawbacks, and convergence characteristics. In this paper, based on a two-term descent PRP method and a three-term descent PRP method, we propose a family of descent PRP methods that reduces to each of the two for particular parameter values. Moreover, the algorithm possesses the sufficient descent property independently of the line search. Under suitable conditions, we prove that the algorithm is globally convergent with an Armijo-type line search. Numerical experiments on large-scale unconstrained optimization problems show that the algorithm is effective.
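To make the ingredients mentioned in the abstract concrete, the following is a minimal sketch of a *generic* nonlinear conjugate gradient iteration with the classical PRP parameter and a backtracking Armijo line search. It is an illustration of the standard PRP/Armijo framework only, not the hybrid descent PRP family proposed in the paper; the function names, parameter values, and stopping rule are assumptions for the example.

```python
import numpy as np

def prp_cg_armijo(f, grad, x0, tol=1e-6, max_iter=1000, sigma=1e-4, rho=0.5):
    """Generic PRP conjugate gradient method with Armijo backtracking.

    This is an illustrative sketch, not the paper's proposed algorithm:
    beta is the classical PRP parameter and the line search is plain
    Armijo backtracking with fixed sigma and rho.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g  # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking: find t with f(x + t d) <= f(x) + sigma * t * g^T d
        t, gtd = 1.0, g @ d
        while f(x + t * d) > f(x) + sigma * t * gtd:
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        # Classical PRP parameter: beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g_new @ (g_new - g) / (g @ g)
        d = -g_new + beta * d  # two-term direction update
        x, g = x_new, g_new
    return x
```

The paper's contribution replaces the direction update above with a parameterized family that interpolates between a two-term and a three-term descent PRP update, so that sufficient descent holds regardless of the line search; this sketch only shows where such a modification would sit.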
Keywords: PRP method; Armijo-type line search; global convergence; unconstrained optimization
Classification: O224 [Science — Operations Research and Cybernetics]