
Received: 2020-10-12
Revised: 2021-01-11

Improved AdaBoost algorithm based on base classifier coefficients and diversity
Citation: ZHU Liang, XU Hua, CUI Xin. Improved AdaBoost algorithm based on base classifier coefficients and diversity[J]. Journal of Computer Applications, 2021, 41(8): 2225-2231.
Authors:ZHU Liang  XU Hua  CUI Xin
Affiliation:School of Artificial Intelligence and Computer Science, Jiangnan University, Wuxi Jiangsu 214122, China
Abstract: To address the low combination efficiency of base classifiers and the overfitting problem of the traditional AdaBoost algorithm, an improved algorithm based on base classifier coefficients and diversity, WD AdaBoost (AdaBoost based on Weight and Double-fault measure), was proposed. First, a new method for computing the base classifier coefficients was given according to the error rates of the base classifiers and the distribution of the sample weights, improving the combination efficiency of the base classifiers. Second, the double-fault measure was introduced into the base classifier selection strategy of WD AdaBoost to increase the diversity among base classifiers. On five datasets from different practical application fields, the CeffAda algorithm, which uses only the new coefficient computation method, reduced the test error by 1.2 percentage points on average compared with the traditional AdaBoost algorithm; meanwhile, WD AdaBoost achieved lower error rates than WLDF_Ada, AD_Ada (Adaptive to Detection AdaBoost), sk_AdaBoost and other algorithms. Experimental results show that WD AdaBoost can integrate base classifiers more efficiently, resist overfitting, and improve classification performance.
Keywords: weight; diversity; AdaBoost; double-fault measure; classification performance
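The abstract does not give the paper's exact coefficient formula or selection rule, but the two ideas it names can be illustrated together. Below is a minimal sketch, assuming decision stumps as base classifiers, the classical AdaBoost coefficient in place of the paper's new one, and a hypothetical selection score that adds the double-fault measure (weighted by an assumed trade-off parameter `lam`) against the previous round's classifier:

```python
import numpy as np

def double_fault(pred_a, pred_b, y):
    """Double-fault measure: fraction of samples that BOTH classifiers
    misclassify. Lower values indicate greater diversity."""
    return ((pred_a != y) & (pred_b != y)).mean()

def candidate_stumps(X):
    """Enumerate simple threshold stumps (feature, threshold, sign)."""
    return [(f, t, sign)
            for f in range(X.shape[1])
            for t in np.unique(X[:, f])
            for sign in (1, -1)]

def stump_predict(stump, X):
    f, t, sign = stump
    return np.where(X[:, f] <= t, sign, -sign)

def wd_adaboost(X, y, rounds=10, lam=0.5):
    """Diversity-aware AdaBoost sketch: each round selects the stump that
    minimizes weighted error plus lam * double-fault with the previous
    round's classifier (lam and this rule are illustrative assumptions)."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # sample weight distribution
    ensemble, prev_pred = [], None
    for _ in range(rounds):
        best, best_score = None, np.inf
        for s in candidate_stumps(X):
            pred = stump_predict(s, X)
            err = w[pred != y].sum()  # weighted error of this candidate
            if err > 0.5:             # worse than random: skip
                continue
            score = err
            if prev_pred is not None:
                score += lam * double_fault(pred, prev_pred, y)
            if score < best_score:
                best, best_score = s, score
        pred = stump_predict(best, X)
        err = max(w[pred != y].sum(), 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)   # classical coefficient
        w *= np.exp(-alpha * y * pred)          # reweight samples
        w /= w.sum()
        ensemble.append((alpha, best))
        prev_pred = pred
    return ensemble

def predict(ensemble, X):
    agg = sum(a * stump_predict(s, X) for a, s in ensemble)
    return np.where(agg >= 0, 1, -1)
```

The selection step is where this sketch departs from plain AdaBoost: a candidate with slightly higher weighted error can still win if it makes its mistakes on different samples than the previous classifier, which is the diversity effect the double-fault measure captures.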
Indexed by: Wanfang Data and other databases.
