Improved atom search optimization by combining amplitude random compensation and step size evolution mechanism
Citation: LIU Wei, GUO Zhiqing, LIU Guangwei, JIN Bao, WANG Dong. Improved atom search optimization by combining amplitude random compensation and step size evolution mechanism[J]. CAAI Transactions on Intelligent Systems, 2022, 17(3): 602-616.
Authors: LIU Wei      GUO Zhiqing      LIU Guangwei      JIN Bao      WANG Dong
Affiliation: 1. College of Science, Liaoning Technical University, Fuxin 123000, China; 2. Institute of Intelligent Engineering and Mathematics, Liaoning Technical University, Fuxin 123000, China; 3. Institute of Mathematics and Systems Science, Liaoning Technical University, Fuxin 123000, China; 4. College of Mining, Liaoning Technical University, Fuxin 123000, China
Abstract: The atom search optimization (ASO) algorithm suffers from weak optimization accuracy and easily falls into local extrema. To overcome this, we approach the problem from the perspectives of population diversity, parameter adaptability, and position dynamics, proposing an improved atom search optimization algorithm (IASO) that integrates chaos optimization, amplitude random compensation, and a step size evolution mechanism, and successfully applying it to classification tasks. First, a tent chaotic map is introduced to make the distribution of the atomic population in the search space more uniform. Then, an amplitude function is constructed to randomly perturb the algorithm parameters, and a step size evolution factor is added to the atomic position update, enhancing the global search ability and convergence of the algorithm. Finally, the improved algorithm is applied to parameter optimization of the error back-propagation (BP) neural network. Numerical experiments comparing IASO with six metaheuristic algorithms on 20 benchmark functions show that IASO not only performs well in solving multidimensional benchmark functions but also achieves higher classification accuracy than the two comparison algorithms when optimizing the parameters of the BP neural network.
Keywords: meta-heuristic algorithms  atom search optimization  tent chaos optimization  amplitude random compensation  step size evolution mechanism  parameter optimization of BP neural network  classification  machine learning
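The tent-chaos initialization mentioned in the abstract can be sketched as follows. This is an illustrative sketch only, assuming the standard tent map; the paper's exact map variant, parameters, and boundary handling are not specified on this page, and the function name and signature are hypothetical.

```python
import random

def tent_chaos_init(pop_size, dim, lower, upper, mu=1.99, seed=0):
    """Initialize a population with a tent chaotic map (illustrative sketch;
    the paper's exact Tent variant and parameters may differ)."""
    rnd = random.Random(seed)
    x = rnd.uniform(0.01, 0.99)  # chaotic seed strictly inside (0, 1)
    population = []
    for _ in range(pop_size):
        individual = []
        for _ in range(dim):
            # Tent map: x -> mu*x on [0, 0.5), mu*(1 - x) on [0.5, 1].
            # mu is kept slightly below 2 so that the orbit does not
            # collapse to 0 under binary floating-point arithmetic.
            x = mu * x if x < 0.5 else mu * (1.0 - x)
            # Map the chaotic value from [0, 1) onto the search bounds.
            individual.append(lower + x * (upper - lower))
        population.append(individual)
    return population

pop = tent_chaos_init(pop_size=30, dim=10, lower=-100.0, upper=100.0)
print(len(pop), len(pop[0]))  # 30 10
```

Compared with uniform random initialization, the chaotic sequence is deterministic yet ergodic over the unit interval, which is the property the abstract relies on to spread atoms more evenly across the search space.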