A Modified Three-Term Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization
Authors: Wujie Hu, Gonglin Yuan, Hongtruong Pham
Affiliations: 1. Key Laboratory of Wireless Sensor Network & Communication, Shanghai Institute of Micro-System and Information Technology, Chinese Academy of Sciences, Shanghai, 201899, China; 2. Shanghai Research Center for Wireless Communication, Shanghai, 201210, China; 3. College of Physics and Electronic Information Engineering, Qinghai University for Nationalities, Xining, 810007, China.
Abstract: It is well known that Newton and quasi-Newton algorithms are effective for small- and medium-scale smooth problems, since they make full use of the gradient's information, but they fail on nonsmooth problems. Bundle methods successfully address both smooth and nonsmooth problems; regrettably, they too are effective only for small- and medium-scale models, because they must store and update the information of the bundle. The conjugate gradient algorithm, by contrast, is effective for both large-scale smooth and nonsmooth optimization models, owing to its simplicity: it uses only objective-function information, with nonsmoothness handled via Moreau-Yosida regularization. A modified three-term conjugate gradient algorithm is therefore proposed; it possesses the sufficient descent property and a trust-region feature. Global convergence is established under mild assumptions, and numerical tests show it to be more efficient than similar optimization algorithms.
Keywords: conjugate gradient; large-scale; trust region; global convergence
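The abstract does not reproduce the paper's specific search-direction formulas, but the general shape of a three-term conjugate gradient iteration with the sufficient descent property can be sketched as follows. This is an illustrative sketch of a classical HS-type three-term direction (d = -g + βd - θy), not the authors' modified algorithm; the function `three_term_cg`, the Armijo backtracking line search, and the smooth quadratic test problem are all assumptions for demonstration, and the Moreau-Yosida regularization step for nonsmooth objectives is omitted.

```python
import numpy as np

def three_term_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Generic three-term conjugate gradient method (illustrative sketch).

    Direction update: d_{k+1} = -g_{k+1} + beta_k * d_k - theta_k * y_k,
    which satisfies d_{k+1}^T g_{k+1} = -||g_{k+1}||^2 exactly
    (the sufficient descent property mentioned in the abstract).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple Armijo backtracking line search (placeholder for the
        # line search actually used in the paper).
        alpha, c = 1.0, 1e-4
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                       # gradient difference
        denom = max(d @ y, 1e-12)           # safeguard against division by zero
        beta = (g_new @ y) / denom          # HS-type conjugacy parameter
        theta = (g_new @ d) / denom         # weight of the third term
        d = -g_new + beta * d - theta * y   # three-term direction
        x, g = x_new, g_new
    return x
```

For a convex quadratic f(x) = ½xᵀAx - bᵀx, the iteration drives the gradient norm to zero and recovers the minimizer A⁻¹b; for the nonsmooth problems targeted by the paper, f and grad would instead be the Moreau-Yosida regularization of the objective and its (Lipschitz-continuous) gradient.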