Optimizing the GARCH Model–An Application of Two Global and Two Local Search Methods
Authors: Kwami Adanu
Affiliation: (1) Michigan State University, East Lansing, USA
Abstract: Results from our optimization exercise clearly show the advantage of using random search algorithms when the search for the global optimum is expected to be difficult. When the number of parameters in the model is relatively small (nine parameters), Differential Evolution performs better than the Genetic Algorithm; when the number of parameters is relatively large (fifteen parameters), the reverse is true. A comparison of the Quasi-Newton and Simplex methods shows that both the Quasi-Newton algorithm of SHAZAM and the simplex algorithm of fminsearch are sensitive to starting values. However, allowing SHAZAM to set its own starting values, or setting them with the PRESAMP option, produced the best results for SHAZAM. The general conclusion of this paper is that the choice of optimization technique for difficult optimization problems like the one attempted here should be based on the attributes of the problem; when in doubt, multiple techniques should be applied and the estimated results evaluated.
Keywords: GARCH, global optimum, genetic algorithm, differential evolution, Quasi-Newton algorithm, Simplex method
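
The abstract compares global random-search methods with local routines (SHAZAM's Quasi-Newton and MATLAB's fminsearch simplex) for maximizing a GARCH likelihood. The sketch below is a minimal illustration of that comparison, not the paper's code: it assumes a GARCH(1,1) model with Gaussian errors, uses simulated returns in place of the paper's data, and lets SciPy's differential_evolution and Nelder-Mead routines stand in for the Differential Evolution code and fminsearch, respectively.

import numpy as np
from scipy.optimize import differential_evolution, minimize

def garch11_neg_loglik(params, returns):
    """Negative Gaussian log-likelihood of a GARCH(1,1) model.
    params = (omega, alpha, beta); returns is a mean-zero return series."""
    omega, alpha, beta = params
    # Penalize values that violate positivity/stationarity constraints
    # (needed for the unconstrained Nelder-Mead run).
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return 1e10
    n = len(returns)
    sigma2 = np.empty(n)
    sigma2[0] = np.var(returns)  # initialize with the sample variance
    for t in range(1, n):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * sigma2) + returns ** 2 / sigma2)

# Hypothetical simulated data standing in for the paper's series.
rng = np.random.default_rng(0)
r = rng.standard_normal(1000) * 0.01

bounds = [(1e-8, 1e-2), (0.0, 0.5), (0.0, 0.999)]

# Global search: differential evolution over the bounded parameter box.
de_result = differential_evolution(garch11_neg_loglik, bounds, args=(r,), seed=0)

# Local search: Nelder-Mead simplex (the method behind fminsearch),
# whose outcome depends on the starting values supplied in x0.
nm_result = minimize(garch11_neg_loglik, x0=[1e-5, 0.05, 0.90],
                     args=(r,), method="Nelder-Mead")

print("differential evolution:", de_result.x, de_result.fun)
print("Nelder-Mead simplex:   ", nm_result.x, nm_result.fun)

Rerunning the local search from different x0 values while keeping the bounded global search fixed illustrates the abstract's point: the simplex result varies with its starting values, whereas the global search only requires a parameter box.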
This record is indexed in SpringerLink and other databases.