
Nonlinear Support Vector Machine for Training Data Reduction in Kernel Space

Citation: WANG Xiao, LIU Xiao-fang. Nonlinear support vector machine for training data reduction in kernel space [J]. Journal of University of Electronic Science and Technology of China, 2013, 42(4): 592-596.
Authors: WANG Xiao, LIU Xiao-fang
Affiliation: 1. School of Computer Science, Sichuan University of Science and Engineering, Zigong, Sichuan 643000
Funding: Key Project of the Sichuan Provincial Department of Education (11ZA124); Open Fund of the Artificial Intelligence Key Laboratory of Sichuan Province (2011RYJ02)

Abstract: To address the high computational cost of large data sets in kernel space, the nonlinear support vector machine (NSVM) method is proposed to reduce the training data of a classifier. First, subsets of the full training data are extracted by NSVM, kernel principal component analysis (KPCA), and greedy KPCA (GKPCA), respectively. The classifier is then trained on each subset, and the classification results are evaluated by the error rates on the training and test data. Across two data sets and two classifiers, the classifier trained on the KPCA subset performs worse than those trained on the NSVM and GKPCA subsets, while the classifier trained on the GKPCA subset generalizes worse than the one trained on the NSVM subset. Simulation results indicate that training on the subset extracted by NSVM preserves the generalization ability of the classifier while reducing the computational complexity of the classification algorithm.

Keywords: classifier; greedy kernel principal component analysis; kernel principal component analysis; nonlinear support vector machine; support vector; training data
Received: 2013-03-06
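The reduction pipeline described in the abstract (extract a subset from the full training data with an NSVM, retrain on the subset, and compare error rates) can be illustrated with a minimal sketch. This is not the paper's implementation: it uses scikit-learn's `SVC` with an RBF kernel, a synthetic two-moons data set, and illustrative parameter values, and it takes the support vectors of the full-data model as the reduced subset.

```python
# Hedged sketch of support-vector-based training-data reduction.
# Assumptions: scikit-learn's SVC as the nonlinear SVM, a synthetic
# data set, and illustrative hyperparameters (not from the paper).
import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=1000, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)

# Step 1: fit a nonlinear (RBF-kernel) SVM on the full training data.
full = SVC(kernel="rbf", gamma=1.0).fit(X_tr, y_tr)

# Step 2: keep only the support vectors as the reduced training subset.
X_sub, y_sub = X_tr[full.support_], y_tr[full.support_]

# Step 3: retrain on the subset and compare test error rates.
reduced = SVC(kernel="rbf", gamma=1.0).fit(X_sub, y_sub)

print("subset size: %d of %d" % (len(X_sub), len(X_tr)))
print("full-data test error:      %.3f" % (1 - full.score(X_te, y_te)))
print("subset-trained test error: %.3f" % (1 - reduced.score(X_te, y_te)))
```

For a separable problem an SVM retrained on its own support vectors recovers the same decision function, which is why the subset-trained classifier's error rate stays close to the full-data one while the training set shrinks.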
