Similar documents
20 similar documents found (search time: 15 ms)
1.
A novel fuzzy compensation multi-class support vector machine (total citations: 6; self-citations: 0; citations by others: 6)
This paper presents a novel fuzzy compensation multi-class support vector machine (FCM-SVM) to alleviate the outlier and noise sensitivity of the traditional support vector machine (SVM) in multi-class data classification. The basic idea is to give the penalty term a dual effect by treating every data point as belonging to both the positive and the negative class, but with different memberships. We fuzzify the penalty term, compensate the classification with weights, reconstruct the optimization problem and its constraints, reconstruct the Lagrangian, and present the theoretical derivation. In this way the new fuzzy compensation multi-class support vector machine is expected to generalize better while remaining insensitive to outliers. Experimental results on benchmark and real data sets show that the proposed method reduces the effect of noisy data and yields a higher classification rate than the traditional multi-class SVM.

2.
Support vector machines (SVMs), initially proposed for two-class classification problems, have been very successful in pattern recognition. For multi-class problems, the standard hyperplane-based SVMs are built by constructing and combining several maximal-margin hyperplanes, with each class of data confined to an area bounded by those hyperplanes. Instead of hyperplanes, hyperspheres that tightly enclose the data of each class can be used. Since a class-specific hypersphere is constructed for each class separately, spherical-structured SVMs handle the multi-class classification problem naturally. In addition, the center and radius of each class-specific hypersphere characterize the distribution of examples from that class, and may be useful for dealing with imbalance problems. In this paper, we incorporate the concept of maximal margin into spherical-structured SVMs. The proposed approach also has the advantage of a new parameter for controlling the number of support vectors. Experimental results show that the proposed method performs well on both artificial and benchmark datasets.
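The per-class hypersphere idea can be sketched very simply. The following is a hedged simplification, not the paper's maximal-margin spheres: each class gets a crude enclosing sphere (center = class mean, radius = distance to the farthest class sample), and a point is assigned to the class whose sphere it falls into most deeply, measured by distance to center relative to the radius.

```python
import numpy as np

def fit_hyperspheres(X, y):
    """Fit one crude enclosing hypersphere per class:
    center = class mean, radius = distance to the farthest class sample."""
    spheres = {}
    for c in np.unique(y):
        Xc = X[y == c]
        center = Xc.mean(axis=0)
        radius = np.linalg.norm(Xc - center, axis=1).max()
        spheres[c] = (center, radius)
    return spheres

def predict(spheres, X):
    """Assign each point to the class with the smallest
    distance-to-center relative to that class's radius."""
    classes = sorted(spheres)
    scores = np.stack(
        [np.linalg.norm(X - spheres[c][0], axis=1) / spheres[c][1]
         for c in classes], axis=1)
    return np.array(classes)[scores.argmin(axis=1)]

# Three well-separated synthetic Gaussian blobs
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(loc, 0.3, size=(30, 2))
                    for loc in ([0, 0], [5, 0], [0, 5])])
y = np.repeat([0, 1, 2], 30)
acc = (predict(fit_hyperspheres(X, y), X) == y).mean()
```

Because each sphere is fitted from one class alone, adding a new class requires fitting only one more sphere, which is the structural advantage the abstract points to.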

3.
This paper extends previous work on the smooth support vector machine (SSVM) from binary to k-class classification using a single-machine approach, calling it the multi-class smooth SVM (MSSVM). The study implements MSSVM for a ternary classification problem and labels it TSSVM. For k>3, it proposes a one-vs.-one-vs.-rest (OOR) scheme that decomposes the problem into k(k−1)/2 ternary classification subproblems, based on the assumption of ternary voting games; the k-class problem can then be solved via a series of TSSVMs. Numerical experiments compare the classification accuracy of the TSSVM/OOR, one-vs.-one, and one-vs.-rest schemes on nine UCI datasets. Results show that TSSVM/OOR outperforms one-vs.-one and one-vs.-rest on all datasets. Further error analyses show that the prediction confidence of OOR is significantly higher than that of the one-vs.-one scheme. By design, OOR can also detect a hidden (unknown) class directly: a "leave-one-class-out" experiment on the pendigits dataset demonstrates this ability, with OOR achieving a significantly better hidden-class detection rate than one-vs.-one and one-vs.-rest.
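The OOR decomposition can be sketched as follows. This is a hedged stand-in: a generic kernel SVM plays the role of the paper's TSSVM base learner, each of the k(k−1)/2 subproblems separates class i, class j, and "rest" (labelled −1), and only votes for i or j are counted, which is also what allows an all-"rest" sample to be flagged as hidden.

```python
import numpy as np
from itertools import combinations
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
classes = np.unique(y)
votes = np.zeros((len(X), len(classes)))

# k(k-1)/2 ternary subproblems: class i vs. class j vs. rest (-1)
for i, j in combinations(classes, 2):
    y3 = np.where(y == i, i, np.where(y == j, j, -1))
    pred = SVC(kernel="rbf", gamma="scale").fit(X, y3).predict(X)
    for c in (i, j):                 # "rest" predictions cast no vote
        votes[pred == c, c] += 1

y_pred = votes.argmax(axis=1)
acc = (y_pred == y).mean()
```

A sample for which every ternary classifier answers "rest" collects zero votes, which is the mechanism behind the hidden-class detection experiment described above.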

4.
In this paper, we propose a support vector machine with automatic confidence (SVMAC) for pattern classification. The contributions of this work are twofold. First, we develop an algorithm for calculating the label confidence value of each training sample, so that the label confidence values of all training samples can be considered when training support vector machines. Second, we propose a method for incorporating the label confidence value of each training sample into learning and derive the corresponding quadratic programming problems. To demonstrate the effectiveness of SVMACs, a series of experiments is performed on three benchmark pattern classification problems and a challenging gender classification problem. Experimental results show that the generalization performance of our SVMACs is superior to that of traditional SVMs.

5.
The support vector machine (SVM) has high generalisation ability for binary classification problems, but its extension to multi-class problems is still an ongoing research issue. Among existing multi-class SVM methods, the one-against-one method is one of the most suitable for practical use. This paper presents a new multi-class SVM method that reduces the number of hyperplanes of the one-against-one method and thus returns fewer support vectors. The proposed algorithm works as follows: while producing the boundary of a class, no further hyperplanes are constructed if the discriminating hyperplanes of neighbouring classes already separate the remaining classes. A large number of experiments show that the training time of the proposed method is the lowest among existing multi-class SVM methods, and that its testing time is less than that of the one-against-one method because of the reduced number of hyperplanes and support vectors. By reducing the number of hyperplanes, the proposed method also resolves unclassifiable regions and alleviates over-fitting much better than the one-against-one method. We additionally present a directed acyclic graph SVM (DAGSVM) based testing methodology that improves the testing time of the DAGSVM method.
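For reference, the baseline one-against-one scheme this entry improves on is what scikit-learn's `SVC` uses internally: it trains k(k−1)/2 pairwise hyperplanes and classifies by majority vote among them. A minimal sketch on iris (k = 3, so 3 pairwise decision functions):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

# decision_function_shape="ovo" exposes the raw pairwise scores
clf = SVC(kernel="rbf", gamma="scale",
          decision_function_shape="ovo").fit(Xtr, ytr)
n_pairwise = clf.decision_function(Xte).shape[1]   # k(k-1)/2 = 3 for iris
acc = clf.score(Xte, yte)
```

The method in the abstract aims to prune some of these pairwise hyperplanes (and hence their support vectors) when neighbouring boundaries already do the separating work.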

6.
张钊  费一楠  宋麟  王锁柱 《计算机应用》2008,28(7):1681-1683
To address the multi-class classification problem in support vector machine theory and the sensitivity of SVMs to noisy data, this paper proposes a binary-tree-based fuzzy support vector machine algorithm for multi-class classification. The algorithm introduces a fuzzy membership function into the binary-tree-based multi-class SVM algorithm: according to the differing influence of each sample on the classification result, a KNN-based fuzzy membership measure computes a corresponding membership value, which yields different penalty values, so that data unimportant to the classification result can be ignored when constructing the separating hyperplane. Experiments show that the algorithm has good noise resistance and classification performance.
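The KNN-based membership idea can be sketched as follows (a hedged reading of the abstract, with synthetic data and an arbitrary choice of k = 7): a sample surrounded mostly by same-class neighbours gets membership near 1, a likely noise point gets a small membership, and the membership is passed to the SVM as a per-sample penalty weight.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.repeat([0, 1], 50)
y[0] = 1                       # inject one noisy label into the first blob

k = 7
nn = NearestNeighbors(n_neighbors=k + 1).fit(X)    # +1: first hit is the point itself
_, idx = nn.kneighbors(X)
# membership = fraction of the k nearest neighbours sharing the sample's label
membership = (y[idx[:, 1:]] == y[:, None]).mean(axis=1)

# per-sample penalty weights: noise points barely influence the hyperplane
clf = SVC(kernel="linear").fit(X, y, sample_weight=membership + 1e-3)
noisy_weight = membership[0]
clean_weight = membership[1:50].mean()
```

The mislabelled sample receives a near-zero weight, so the hyperplane is constructed almost as if it were absent, which is the noise-resistance mechanism the abstract describes.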

7.
Predicting corporate credit ratings using statistical and artificial intelligence (AI) techniques has received considerable research attention in the literature. In recent years, multi-class support vector machines (MSVMs) have become a very appealing machine-learning approach due to their good performance. Researchers have proposed a variety of techniques for adapting support vector machines (SVMs), originally devised for binary classification, to multi-class classification. However, most of these focus on classifying samples into nominal categories; the unique characteristic of credit ratings, their ordinality, has seldom been considered. This study proposes a new type of MSVM classifier (named OMSVM) that extends binary SVMs by applying an ordinal pairwise partitioning (OPP) strategy, handling multiple ordinal classes efficiently and effectively. To validate OMSVM, we applied it to a real-world case of bond rating and compared the results with those of conventional MSVM approaches and other AI techniques, including MDA, MLOGIT, CBR, and ANNs. The results showed that the proposed model improves classification performance over other typical multi-class classification techniques while using fewer computational resources.
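Ordinal decomposition of the kind OPP exploits can be sketched in a few lines. This is a hedged sketch in the spirit of ordinal partitioning, not the paper's exact OMSVM: for ordered ratings 0 < 1 < 2, train K−1 binary SVMs answering "is the rating greater than r?", then count the positive answers to recover an ordinal prediction.

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic ordinal data: three ordered "rating" classes along one direction
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(loc, 0.5, (40, 2)) for loc in (0, 3, 6)])
y = np.repeat([0, 1, 2], 40)

# K-1 cumulative binary subproblems: "rating > 0?" and "rating > 1?"
thresholds = [0, 1]
clfs = [SVC(kernel="linear").fit(X, (y > r).astype(int)) for r in thresholds]

# The predicted rating is the number of "greater than" answers
y_pred = sum(clf.predict(X) for clf in clfs)
acc = (y_pred == y).mean()
```

Unlike a nominal one-vs.-one scheme, each binary subproblem here respects the ordering of the classes, which is the property that matters for credit ratings.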

8.
Surface electromyogram (s-EMG) signal recognition using neural networks identifies the relations between s-EMG patterns, but it is not sufficiently satisfying for the user because s-EMG signals change with muscle wasting, changes in electrode position, and so on. The support vector machine (SVM) is one of the most powerful tools for solving classification problems, but it lacks an online learning technique. In this article, we propose an online learning method using an SVM with a pairwise coupling technique for s-EMG recognition and compare its performance with the original SVM and a neural network. Simulation results show that the proposed method is better than the original SVM. This work was presented in part at the 13th International Symposium on Artificial Life and Robotics, Oita, Japan, January 31–February 2, 2008.

9.
Support vector machines (SVMs) are theoretically well-justified machine learning techniques that have been successfully applied to many real-world domains. Optimization methodologies play a central role in finding solutions of SVMs. This paper reviews representative and state-of-the-art techniques for optimizing the training of SVMs, especially SVMs for classification. The objective is to give readers an overview of the basic elements of, and recent advances in, training SVMs, and to enable them to develop and implement new optimization strategies in their own SVM-related research.
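One of the simplest optimization strategies in the family such a review covers is subgradient descent on the primal objective of a linear SVM (regularised hinge loss). A minimal sketch on synthetic data, not an efficient solver:

```python
import numpy as np

# Two separable Gaussian blobs with labels -1 / +1
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.concatenate([-np.ones(50), np.ones(50)])

# Primal objective: (lam/2)||w||^2 + (1/n) sum max(0, 1 - y_i (w.x_i + b))
w, b, lam, eta = np.zeros(2), 0.0, 0.01, 0.1
for _ in range(2000):
    viol = y * (X @ w + b) < 1            # samples violating the margin
    grad_w = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / len(X)
    grad_b = -y[viol].sum() / len(X)
    w, b = w - eta * grad_w, b - eta * grad_b

acc = (np.sign(X @ w + b) == y).mean()
```

Only margin violators contribute to the subgradient, which is the primal counterpart of the fact that only support vectors matter in the dual formulations the review surveys.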

10.
This paper proposes a locality correlation preserving support vector machine (LCPSVM) that combines margin maximization between classes with local correlation preservation of class data. It is an SVM-like algorithm that explicitly considers the locality correlation within each class in the margin and penalty terms of the optimization function. Canonical correlation analysis (CCA) reveals hidden correlations between two datasets, and a variant of CCA that implements locality preservation has previously been proposed by integrating local information into the CCA objective function. Inspired by this idea, we propose a locality correlation preserving within-class scatter matrix to replace the within-class scatter matrix in the minimum class variance support vector machine (MCVSVM). This substitution keeps the locality correlation of the data and inherits the properties of SVM and similar modified support vector machines. LCPSVM is discussed under linearly separable, small sample size, and nonlinearly separable conditions, and experimental results on benchmark datasets demonstrate its effectiveness.

11.
In this paper we discuss sparse least squares support vector machines (sparse LS SVMs) trained in the empirical feature space, which is spanned by the mapped training data. First, we show that the kernel associated with the empirical feature space gives the same value as the kernel associated with the feature space whenever one of its arguments is mapped into the empirical feature space by the mapping function associated with the feature space. Using this fact, we show that training and testing of kernel-based methods can be done in the empirical feature space, and that training LS SVMs in the empirical feature space reduces to solving a set of linear equations. We then derive sparse LS SVMs by restricting the training data to those that are linearly independent in the empirical feature space, selected by Cholesky factorization. Support vectors correspond to the selected training data and do not change even if the value of the margin parameter is changed; thus, for linear kernels, the number of support vectors is at most the number of input variables. Computer experiments show that we can reduce the number of support vectors without deteriorating the generalization ability.
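The "set of linear equations" that LS-SVM training reduces to can be shown concretely. A minimal sketch in the function-estimation form with ±1 targets and an RBF kernel (the regularisation parameter `gamma` and kernel width are hypothetical choices):

```python
import numpy as np

# LS-SVM training = one bordered linear system:
#   [ 0   1^T          ] [b    ]   [0]
#   [ 1   K + I/gamma  ] [alpha] = [y]
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(-1.5, 1, (40, 2)), rng.normal(1.5, 1, (40, 2))])
y = np.concatenate([-np.ones(40), np.ones(40)])

def rbf(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

gamma, n = 10.0, len(X)
K = rbf(X, X)
A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
              [np.ones((n, 1)), K + np.eye(n) / gamma]])
sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
b, alpha = sol[0], sol[1:]

# Decision function: sign(sum_i alpha_i K(x, x_i) + b)
acc = (np.sign(K @ alpha + b) == y).mean()
```

Note that every `alpha` is generically nonzero here, which is exactly the density problem the paper's Cholesky-based selection of linearly independent training data is designed to fix.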

Shigeo Abe received the B.S. degree in Electronics Engineering, the M.S. degree in Electrical Engineering, and the Dr. Eng. degree, all from Kyoto University, Kyoto, Japan, in 1970, 1972, and 1984, respectively. After 25 years in industry, he was appointed full professor of Electrical Engineering at Kobe University in April 1997. He is now a professor at the Graduate School of Science and Technology, Kobe University. His research interests include pattern classification and function approximation using neural networks, fuzzy systems, and support vector machines. He is the author of Neural Networks and Fuzzy Systems (Kluwer, 1996), Pattern Classification (Springer, 2001), and Support Vector Machines for Pattern Classification (Springer, 2005). Dr. Abe was awarded an outstanding paper prize from the Institute of Electrical Engineers of Japan in 1984 and 1995. He is a member of the IEEE, INNS, and several Japanese societies.

12.
黄颖  李伟  刘发升 《计算机应用》2007,27(11):2821-2824
Analyzing existing fuzzy support vector machines, this paper proposes an improved algorithm, the dual-membership fuzzy support vector machine (DM-FSVM). In the traditional fuzzy support vector machine model, the membership function of each training sample contains only one membership degree, whereas in DM-FSVM each training sample has two membership degrees. The method retains the advantages of the traditional fuzzy support vector machine while making fuller use of limited samples, increasing its classification generalization ability. Experiments show that the algorithm improves classification accuracy.

13.
杨文柱  卢素魁  王思乐 《计算机应用》2011,31(12):3446-3448
A multi-class support vector machine based classification method for foreign fibers in cotton is proposed to solve the problem of classifying such fibers online. The method first extracts color, shape, and texture features from foreign-fiber target images to form feature vectors that accurately describe the targets; it then constructs three multi-class support vector machines with different architectures for classifying the foreign fibers; finally, the three multi-class support vector machines are tested by cross-validation. The test results show that the one-against-one multi-class support vector machine based on a directed acyclic graph is better suited, in both classification accuracy and speed, to the online classification of foreign fibers in cotton.

14.
The SVM has received increasing interest in areas ranging from its original application, pattern recognition, to other applications such as regression estimation, owing to its remarkable generalization performance. Unfortunately, the SVM is considerably slow in the test phase because of the number of support vectors, which has been a serious limitation for some applications. To overcome this problem, we propose an adaptive algorithm named feature vector selection (FVS) that selects feature vectors from the support vector solution, based on the vector correlation principle and a greedy algorithm. The adaptive algorithm improves the sparsity of the solution and reduces the time cost of testing. Because the number of feature vectors can be selected adaptively as required, the generalization/complexity trade-off can be controlled directly. Computer simulations on regression estimation and pattern recognition show that FVS is a promising algorithm for simplifying the support vector machine solution.

15.
The roller bearing is one of the most widely used rotary elements in rotating machinery. The nature of a roller bearing's vibration reveals its condition, and the features that capture this nature must be extracted by indirect means. Statistical parameters such as kurtosis, standard deviation, and maximum value form a set of features widely used in fault diagnostics. Finding good features that discriminate the different fault conditions of the bearing is often a problem, and selecting them is an important phase in pattern recognition that requires detailed domain knowledge. This paper addresses the feature selection process using a decision tree and uses a kernel-based neighborhood-score multi-class support vector machine (MSVM) for classification. The vibration signal from a piezoelectric transducer is captured for the following conditions: good bearing, bearing with inner race fault, bearing with outer race fault, and bearing with both inner and outer race faults. The statistical features are extracted and classified successfully using the MSVM, and the results are compared with those of the binary support vector machine (SVM).
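Statistical feature extraction of the kind fed to the classifier above can be sketched on synthetic "vibration" signals. The fault model here is purely illustrative (a smooth Gaussian signal versus one with impulsive spikes mimicking a race fault), and a generic RBF SVM stands in for the paper's MSVM:

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def features(sig):
    """Statistical features named in the abstract: kurtosis, std, max."""
    return [kurtosis(sig), sig.std(), np.abs(sig).max()]

signals, labels = [], []
for _ in range(40):
    good = rng.normal(0, 1, 1000)                # healthy bearing: Gaussian
    faulty = rng.normal(0, 1, 1000)
    faulty[rng.integers(0, 1000, 20)] += 8.0     # impulsive spikes: fault
    signals += [features(good), features(faulty)]
    labels += [0, 1]

X, y = np.array(signals), np.array(labels)
acc = SVC(kernel="rbf", gamma="scale").fit(X, y).score(X, y)
```

Kurtosis in particular reacts strongly to impulsive spikes, which is why it is a standard discriminator in bearing fault diagnostics.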

16.
Since the computational cost of the classification phase of a traditional support vector machine is proportional to the number of support vectors, a fast classification algorithm that reduces the support vectors is proposed to speed up classification decisions. The algorithm applies fuzzy c-means clustering to the original support vectors at a specified ratio, constructs a minimal least-squares linear regression model under the principle of minimum classification error, and solves for the new support vector coefficients and the bias of the decision function. Experiments on artificial and benchmark data sets show that after reducing the support vectors by 50%, the classification speed can be increased by 50% while keeping any loss of classification accuracy statistically insignificant.

17.
18.
The existing multi-label support vector machine (Rank-SVM) has extremely high computational complexity and lacks an intrinsic zero point for determining relevant labels. In this paper, we propose a novel support vector machine for multi-label classification by both simplifying Rank-SVM and adding a zero label, resulting in a quadratic programming problem in which each class has an independent equality constraint. When the Frank-Wolfe method is used to solve this quadratic programming problem iteratively, the linear programming problem at each step divides into a series of sub-problems, which dramatically reduces the computational cost. For the well-known Yeast data set, our training procedure runs about 12 times faster than Rank-SVM in a C++ environment. Experiments on five benchmark data sets show that our method is a powerful candidate for multi-label classification, compared with five state-of-the-art multi-label classification techniques.
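The "intrinsic zero point" idea can be illustrated with a plain binary-relevance stand-in (an assumption; the paper solves a joint QP instead): one decision function per label, with a label predicted relevant exactly when its score is above zero, so no separate ranking threshold has to be calibrated.

```python
import numpy as np
from sklearn.svm import LinearSVC

# Synthetic multi-label data: two labels, each a linear rule on the features
rng = np.random.default_rng(0)
X = rng.normal(0, 1, (200, 4))
Y = np.stack([(X[:, 0] + X[:, 1] > 0),
              (X[:, 2] - X[:, 3] > 0)], axis=1).astype(int)

# One linear SVM decision function per label
scores = np.stack(
    [LinearSVC(C=1.0).fit(X, Y[:, j]).decision_function(X) for j in range(2)],
    axis=1)

Y_pred = (scores > 0).astype(int)      # zero is the relevance threshold
subset_acc = (Y_pred == Y).all(axis=1).mean()
```

Rank-SVM, by contrast, only ranks labels and needs an extra thresholding step to split relevant from irrelevant ones, which is exactly the gap the zero label in the abstract closes.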

19.
The support vector machine (SVM), a classic nonlinear classifier for pattern recognition, maps training samples from a low-dimensional space in which they are not linearly separable to a high-dimensional space in which they are, and then classifies them; this paper mainly trains an SVM to distinguish faces from non-faces. The SVM has a complete mathematical derivation and rigorous algorithmic logic. Overall it is more complex than the Adaboost algorithm, but it performs well when samples are scarce, giving it a sample-size advantage. Its supporting theory, including generalization theory, optimization theory, and kernel functions, is also widely used in other machine learning algorithms such as neural networks, and has been proven highly reliable over decades. This paper also discusses principal component analysis (PCA) for data compression and dimensionality reduction; as a preprocessing step it is of great help, substantially lowering the input dimensionality of the SVM and greatly reducing computation and detection time.
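The PCA-then-SVM pipeline described above can be sketched directly; the digits dataset stands in for the face/non-face data, and 20 components is an arbitrary illustrative choice:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# 64-dimensional images compressed to 20 principal components before the SVM
X, y = load_digits(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

clf = make_pipeline(PCA(n_components=20), SVC(kernel="rbf", gamma="scale"))
clf.fit(Xtr, ytr)
acc = clf.score(Xte, yte)
n_dims = clf.named_steps["pca"].n_components_
```

Fitting PCA inside the pipeline ensures the projection is learned from training data only, and the reduced input dimensionality speeds up both SVM training and detection, as the abstract notes.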

20.
By modifying the least-squares support vector machine, an extremely simple and fast classifier, the direct support vector machine, is obtained. Compared with the least-squares support vector machine, this classifier only needs to invert a matrix of smaller size directly, greatly reducing the computation without lowering the classification accuracy. The invertibility of this matrix is proven theoretically, guaranteeing the uniqueness of the separating surface. For the linear case, the Sherman-Morrison-Woodbury formula is used to reduce the dimension of the matrix to be inverted, further decreasing the computational complexity and making the method applicable to larger sample sets. Numerical experiments show that the new classifier is feasible and has the advantages described above.
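The Sherman-Morrison-Woodbury identity the abstract relies on replaces an n×n inversion by a k×k one when the update has small rank: for invertible A and n×k, k×n factors U, V, (A + UV)⁻¹ = A⁻¹ − A⁻¹U(I + VA⁻¹U)⁻¹VA⁻¹. A numerical check of the identity (the matrices here are arbitrary test data):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 5
A = np.eye(n) * 3.0                    # cheap-to-invert diagonal part
U = rng.normal(size=(n, k))
V = rng.normal(size=(k, n))            # U @ V is a rank-k update

Ainv = np.eye(n) / 3.0
small = np.linalg.inv(np.eye(k) + V @ Ainv @ U)   # only a k x k inverse
smw = Ainv - Ainv @ U @ small @ V @ Ainv

direct = np.linalg.inv(A + U @ V)      # the full n x n inverse, for comparison
err = np.abs(smw - direct).max()
```

In the linear case described above, the kernel matrix has exactly this low-rank-plus-identity structure, so the inversion cost scales with the feature dimension rather than the sample count.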
