Global Convergence of Supermemory Gradient Method

Article ID: 1001-4543(2006)02-0142-05
Received: 2004-12-02
Revised: 2005-04-10

Global Convergence of Supermemory Gradient Method
DU Shou-qiang, WANG Chun-jie, CHEN Yuan-yuan. Global Convergence of Supermemory Gradient Method [J]. Journal of Shanghai Second Polytechnic University, 2006, 23(2): 142-146.
Authors: DU Shou-qiang, WANG Chun-jie, CHEN Yuan-yuan
Affiliations: 1. College of Mathematics, Qingdao University, Qingdao 266071, Shandong, P.R. China; 2. Shandong Foreign Trade College, Qingdao 266071, Shandong, P.R. China
Abstract: Unconstrained optimization algorithms are studied. The steepest descent method, Newton's method, and the nonlinear FR, PRP, and DY conjugate gradient methods, all effective for large-scale unconstrained optimization problems, are described, together with the exact, Wolfe, and Armijo line-search conditions. The focus is the supermemory gradient method, one of the more computationally efficient methods for solving unconstrained optimization problems. A class of supermemory gradient algorithms under a Wolfe-type inexact line search is presented, and its global convergence is proved under mild conditions. This provides a reference for solving large-scale unconstrained optimization problems and for comparing the various algorithms.
Keywords: unconstrained optimization; supermemory gradient method; global convergence
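The supermemory gradient method named in the abstract combines the current negative gradient with several previous search directions, and the step length is chosen by a Wolfe-type inexact line search. The sketch below illustrates one common form of this idea; the fixed memory weight `beta`, the bisection-style Wolfe search, the steepest-descent safeguard, and the quadratic test function are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Bisection-style search for a step satisfying the weak Wolfe conditions."""
    lo, hi = 0.0, np.inf
    fx, gd = f(x), grad(x) @ d   # value and directional derivative at x
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gd:   # Armijo condition fails: shrink
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gd:       # curvature condition fails: grow
            lo = alpha
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def supermemory_gradient(f, grad, x0, m=3, beta=0.1, tol=1e-8, max_iter=500):
    """d_k = -g_k + beta * (sum of the m previous directions); one common form."""
    x = np.asarray(x0, dtype=float)
    memory = []                                       # previous search directions
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g + beta * sum(memory[-m:]) if memory else -g
        if g @ d >= 0:                                # safeguard: keep d a descent direction
            d = -g
        alpha = wolfe_line_search(f, grad, x, d)
        x = x + alpha * d
        memory.append(d)
    return x

# Example: strongly convex quadratic with minimizer at (1, -2) (assumed test problem).
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
g = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
x_star = supermemory_gradient(f, g, [0.0, 0.0])       # approximately [1., -2.]
```

The memory term reuses earlier directions at essentially no extra gradient cost, which is why such methods are attractive for large-scale problems; the convergence theory in the paper concerns conditions under which iterations of this kind converge globally.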
This article is indexed by CNKI, VIP (Weipu), Wanfang Data, and other databases.