1.
A novel fuzzy compensation multi-class support vector machine (total citations: 6; self-citations: 0; by others: 6)
This paper presents a novel fuzzy compensation multi-class support vector machine (FCM-SVM) to address the outlier- and noise-sensitivity problem of the traditional support vector machine (SVM) in multi-class data classification. The basic idea is to give the penalty term a dual effect by treating every data point as belonging to both the positive and the negative class, but with different memberships. We fuzzify the penalty term, compensate the classification with weights, reconstruct the optimization problem and its constraints, rebuild the Lagrangian formulation, and present the theoretical derivation. In this way, the new fuzzy compensation multi-class support vector machine is expected to achieve better generalization ability while preserving its insensitivity to outliers. Experimental results on benchmark and real data sets show that the proposed method reduces the effect of noisy data and yields a higher classification rate than the traditional multi-class SVM.
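The dual-membership idea above can be sketched in a few lines. The distance-based membership formula and the helper names (`fuzzy_memberships`, `dual_penalty_weights`) are illustrative placeholders, not the paper's actual definitions:

```python
import math

def fuzzy_memberships(points, center, radius, delta=1e-6):
    """Illustrative membership scheme (not the paper's formula): points
    near the class center get membership close to 1, points near the
    class boundary get membership close to 0."""
    return [1.0 - math.dist(x, center) / (radius + delta) for x in points]

def dual_penalty_weights(C, memberships):
    """Each point contributes to the penalty term twice: once as a member
    of its own class (weight C*m) and once, with the complementary
    membership, in the opposite role (weight C*(1-m))."""
    return [(C * m, C * (1.0 - m)) for m in memberships]
```

A noisy point far from its class center then gets a small membership, so its penalty weight as a class member shrinks and its influence on the decision boundary is damped.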
2.
Support vector machines (SVMs), initially proposed for two-class classification problems, have been very successful in pattern recognition. For multi-class problems, the standard hyperplane-based SVMs are built by constructing and combining several maximal-margin hyperplanes, with each class of data confined to a region bounded by those hyperplanes. Instead of hyperplanes, hyperspheres that tightly enclose the data of each class can be used. Since a class-specific hypersphere is constructed for each class separately, spherical-structured SVMs handle the multi-class classification problem naturally. In addition, the center and radius of the class-specific hypersphere characterize the distribution of examples from that class, and may be useful for dealing with class-imbalance problems. In this paper, we incorporate the concept of maximal margin into spherical-structured SVMs. The proposed approach also offers a new parameter for controlling the number of support vectors. Experimental results show that the proposed method performs well on both artificial and benchmark datasets.
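A minimal sketch of how class-specific hyperspheres, once fitted, could be used for prediction. The signed-distance scoring rule here is an illustrative assumption; the paper's actual decision rule may differ:

```python
import math

def sphere_score(x, center, radius):
    """Signed distance to the hypersphere surface: negative inside,
    positive outside. Smaller means x fits the class better."""
    return math.dist(x, center) - radius

def classify(x, spheres):
    """spheres: {label: (center, radius)}. Assign x to the class whose
    hypersphere it is deepest inside (or least outside)."""
    return min(spheres, key=lambda lbl: sphere_score(x, *spheres[lbl]))
```

Because each sphere is fitted per class, adding a new class only requires fitting one more sphere, which is what makes the spherical structure convenient for multi-class problems.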
3.
Chih-Cheng Chang, Yuh-Jye Lee 《Pattern recognition》2011,44(6):1235-1244
This paper extends previous work on the smooth support vector machine (SSVM) from binary to k-class classification based on a single-machine approach, calling the result the multi-class smooth SVM (MSSVM). The study implements MSSVM for the ternary classification problem and labels it TSSVM. For k > 3, the study proposes a one-vs.-one-vs.-rest (OOR) scheme that decomposes the problem into k(k−1)/2 ternary classification subproblems, based on the assumption of ternary voting games. The k-class classification problem can thus be solved via a series of TSSVMs. Numerical experiments compare the classification accuracy of the TSSVM/OOR, one-vs.-one and one-vs.-rest schemes on nine UCI datasets. Results show that TSSVM/OOR outperforms one-vs.-one and one-vs.-rest on all datasets. Further error analyses show that the prediction confidence of OOR is significantly higher than that of the one-vs.-one scheme. By the nature of its design, OOR can directly detect a hidden (unknown) class. A “leave-one-class-out” experiment on the pendigits dataset demonstrates the hidden-class detection ability of the proposed OOR method: OOR performs significantly better than one-vs.-one and one-vs.-rest in the hidden-class detection rate.
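The OOR decomposition into k(k−1)/2 ternary subproblems can be enumerated directly (`oor_subproblems` is a hypothetical helper name; each subproblem would be handed to one TSSVM):

```python
from itertools import combinations

def oor_subproblems(labels):
    """One-vs.-one-vs.-rest: one ternary subproblem per unordered class
    pair (i, j), with all remaining classes pooled as the 'rest' class."""
    return [(i, j, [c for c in labels if c not in (i, j)])
            for i, j in combinations(labels, 2)]
```

For k classes this yields exactly k(k−1)/2 subproblems, the same count as one-vs.-one, but each subproblem also models "rest", which is what allows a hidden class to be flagged.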
4.
Ji Zheng, Bao-Liang Lu 《Neurocomputing》2011,74(11):1926-1935
In this paper, we propose a support vector machine with automatic confidence (SVMAC) for pattern classification. The main contributions of this work are twofold. One is an algorithm for calculating the label confidence value of each training sample, so that the label confidence values of all training samples can be considered when training support vector machines. The other is a method for incorporating the label confidence value of each training sample into learning, together with the derivation of the corresponding quadratic programming problems. To demonstrate the effectiveness of the proposed SVMACs, a series of experiments are performed on three benchmark pattern classification problems and a challenging gender classification problem. Experimental results show that the generalization performance of SVMACs is superior to that of traditional SVMs.
5.
The support vector machine (SVM) has a high generalisation ability for solving binary classification problems, but its extension to multi-class problems is still an ongoing research issue. Among existing multi-class SVM methods, the one-against-one method is among the most suitable for practical use. This paper presents a new multi-class SVM method that reduces the number of hyperplanes of the one-against-one method and thus returns fewer support vectors. The proposed algorithm works as follows: while producing the boundary of a class, no further hyperplanes are constructed if the discriminating hyperplanes of neighbouring classes already separate the remaining classes. A large number of experiments show that the training time of the proposed method is the lowest among existing multi-class SVM methods. The experimental results also show that the testing time of the proposed method is less than that of the one-against-one method, owing to the reduction in hyperplanes and support vectors. By reducing the number of hyperplanes, the proposed method resolves unclassifiable regions and alleviates the over-fitting problem much better than the one-against-one method. We also present a directed acyclic graph SVM (DAGSVM) based testing methodology that improves the testing time of the DAGSVM method.
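The one-against-one voting that the method above prunes can be sketched as follows; the `pairwise` decision functions stand in for trained binary SVMs (assumed inputs, not part of the paper):

```python
from collections import Counter
from itertools import combinations

def ovo_predict(x, labels, pairwise):
    """pairwise[(i, j)] is a binary decision function returning i or j.
    The final label is the winner of the round-robin majority vote."""
    votes = Counter(pairwise[(i, j)](x) for i, j in combinations(labels, 2))
    return votes.most_common(1)[0][0]
```

Every removed hyperplane removes one entry from `pairwise` and its support vectors, which is exactly where the claimed testing-time saving comes from.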
6.
7.
A corporate credit rating model using multi-class support vector machines with an ordinal pairwise partitioning approach (total citations: 1; self-citations: 0; by others: 1)
Kyoung-jae Kim 《Computers & Operations Research》2012,39(8):1800-1811
Predicting corporate credit ratings using statistical and artificial intelligence (AI) techniques has received considerable research attention in the literature. In recent years, multi-class support vector machines (MSVMs) have become a very appealing machine-learning approach due to their good performance. Researchers have proposed a variety of techniques for adapting support vector machines (SVMs), originally devised for binary classification, to multi-class classification. However, most of these focus on classifying samples into nominal categories; the unique characteristic of credit rating, its ordinality, has seldom been considered. This study proposes a new type of MSVM classifier (named OMSVM) that extends binary SVMs by applying an ordinal pairwise partitioning (OPP) strategy. The model can efficiently and effectively handle multiple ordinal classes. To validate OMSVM, we applied it to a real-world case of bond rating and compared the results with those of conventional MSVM approaches and other AI techniques, including MDA, MLOGIT, CBR and ANNs. The results show that the proposed model improves classification performance relative to typical multi-class classification techniques while using fewer computational resources.
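One common form of ordinal pairwise partitioning walks the rating scale with threshold classifiers. This is a hedged sketch of such a "forward" scheme, not necessarily the exact OPP variant used in OMSVM; the `above` classifiers stand in for trained binary SVMs:

```python
def opp_forward_predict(x, ordered_classes, above):
    """Forward ordinal partitioning: walk the rating scale from the
    lowest class upward; above[c](x) answers 'does x rate above c?'.
    Stop at the first 'no'; otherwise assign the top class."""
    for c in ordered_classes[:-1]:
        if not above[c](x):
            return c
    return ordered_classes[-1]
```

Exploiting the order this way needs only k−1 binary classifiers for k ordinal classes, instead of the k(k−1)/2 of an order-blind one-vs.-one scheme, which is consistent with the computational saving the abstract reports.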
8.
Shuji Kawano, Dai Okumura, Hiroki Tamura, Hisasi Tanaka, Koichi Tanno 《Artificial Life and Robotics》2009,13(2):483-487
Research on surface electromyogram (s-EMG) signal recognition using neural networks identifies relations among s-EMG patterns. However, it is not sufficiently satisfying for the user, because s-EMG signals change with muscle wasting, changes in electrode position, etc. A support vector machine (SVM) is one of the most powerful tools for solving classification problems, but it lacks an online learning technique. In this article, we propose an online learning method using an SVM with a pairwise coupling technique for s-EMG recognition. We compared its performance with that of the original SVM and a neural network. Simulation results showed that our proposed method performs better than the original SVM.
This work was presented in part at the 13th International Symposium on Artificial Life and Robotics, Oita, Japan, January 31–February 2, 2008.
9.
John Shawe-Taylor, Shiliang Sun 《Neurocomputing》2011,74(17):3609-3618
Support vector machines (SVMs) are theoretically well-justified machine learning techniques that have also been successfully applied to many real-world domains. Optimization methodologies play a central role in finding SVM solutions. This paper reviews representative and state-of-the-art techniques for optimizing the training of SVMs, especially SVMs for classification. The objective is to give readers an overview of the basic elements of, and recent advances in, training SVMs, and to enable them to develop and implement new optimization strategies for SVM-related research.
10.
This paper proposes a locality correlation preserving support vector machine (LCPSVM) by combining margin maximization between classes with local correlation preservation within each class. It is a Support Vector Machine (SVM)-like algorithm that explicitly accounts for the locality correlation within each class in the margin and penalty terms of the optimization function. Canonical correlation analysis (CCA) is used to reveal hidden correlations between two datasets, and a variant of the correlation analysis model that implements locality preservation has been proposed by integrating local information into the objective function of CCA. Inspired by this idea, we propose a locality correlation preserving within-class scatter matrix to replace the within-class scatter matrix in the minimum class variance support vector machine (MCVSVM). This substitution keeps the locality correlation of the data and inherits the properties of the SVM and similar modified support vector machines. LCPSVM is discussed under linearly separable, small-sample-size and nonlinearly separable conditions, and experimental results on benchmark datasets demonstrate its effectiveness.
11.
Shigeo Abe 《Pattern Analysis & Applications》2007,10(3):203-214
In this paper we discuss sparse least squares support vector machines (sparse LS SVMs) trained in the empirical feature space, which is spanned by the mapped training data. First, we show that the kernel associated with the empirical feature space gives the same value as the kernel associated with the feature space if one of the kernel's arguments is mapped into the empirical feature space by the mapping function associated with the feature space. Using this fact, we show that training and testing of kernel-based methods can be done in the empirical feature space, and that training LS SVMs in the empirical feature space reduces to solving a set of linear equations. We then derive sparse LS SVMs by restricting consideration to the linearly independent training data in the empirical feature space, found by Cholesky factorization. Support vectors correspond to the selected training data, and they do not change even if the value of the margin parameter is changed. Thus, for linear kernels, the number of support vectors is at most the number of input variables. Computer experiments show that we can reduce the number of support vectors without deteriorating the generalization ability.
Shigeo Abe received the B.S. degree in Electronics Engineering, the M.S. degree in Electrical Engineering, and the Dr. Eng. degree, all from Kyoto University, Kyoto, Japan, in 1970, 1972, and 1984, respectively. After 25 years in industry, he was appointed full professor of Electrical Engineering at Kobe University in April 1997. He is now a professor at the Graduate School of Science and Technology, Kobe University. His research interests include pattern classification and function approximation using neural networks, fuzzy systems, and support vector machines. He is the author of Neural Networks and Fuzzy Systems (Kluwer, 1996), Pattern Classification (Springer, 2001), and Support Vector Machines for Pattern Classification (Springer, 2005). Dr. Abe was awarded an outstanding paper prize from the Institute of Electrical Engineers of Japan in 1984 and 1995. He is a member of IEEE, INNS, and several Japanese societies.
12.
13.
14.
The SVM has been receiving increasing interest in areas ranging from its original application in pattern recognition to other applications such as regression estimation, due to its remarkable generalization performance. Unfortunately, the SVM is considerably slow in the test phase, a cost driven by the number of support vectors, which has been a serious limitation for some applications. To overcome this problem, we propose an adaptive algorithm named feature vector selection (FVS) that selects feature vectors from the support vector solution, based on the vector correlation principle and a greedy algorithm. Through this adaptive algorithm, the sparsity of the solution is improved and the testing time is reduced. Because the number of feature vectors can be selected adaptively as required, the generalization/complexity trade-off can be controlled directly. Computer simulations on regression estimation and pattern recognition show that FVS is a promising algorithm for simplifying the support vector machine solution.
15.
V. Sugumaran, G.R. Sabareesh, K.I. Ramachandran 《Expert systems with applications》2008,34(4):3090-3098
The roller bearing is one of the most widely used rotary elements in rotating machinery. The nature of a roller bearing's vibration reveals its condition, and the features that capture this nature must be extracted through indirect means. Statistical parameters such as kurtosis, standard deviation and maximum value form a set of features widely used in fault diagnostics. Finding good features that discriminate between the different fault conditions of the bearing is often a problem. Selecting good features is an important phase in pattern recognition and requires detailed domain knowledge. This paper addresses the feature selection process using a decision tree, and uses a kernel-based neighborhood-score multi-class support vector machine (MSVM) for classification. The vibration signal from a piezoelectric transducer is captured for the following conditions: good bearing, bearing with inner race fault, bearing with outer race fault, and bearing with both inner and outer race faults. The statistical features are extracted from the signal and classified successfully using the MSVM. The results of the MSVM are compared with those of a binary support vector machine (SVM).
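The statistical features named above are straightforward to compute from a raw signal; a minimal sketch (this uses population kurtosis, for which a Gaussian signal gives a value near 3):

```python
import math

def statistical_features(signal):
    """Mean, standard deviation, maximum and kurtosis of a signal, of
    the kind used as vibration features in bearing fault diagnosis."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((s - mean) ** 2 for s in signal) / n
    std = math.sqrt(var)
    kurt = sum((s - mean) ** 4 for s in signal) / (n * var ** 2)
    return {"mean": mean, "std": std, "max": max(signal), "kurtosis": kurt}
```

In a fault-diagnosis pipeline, one such feature vector per vibration segment would form the input that the decision tree filters and the MSVM classifies.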
16.
17.
18.
Jianhua Xu 《Expert systems with applications》2012,39(5):4796-4804
The existing multi-label support vector machine (Rank-SVM) has extremely high computational complexity and lacks an intrinsic zero point for determining relevant labels. In this paper, we propose a novel support vector machine for multi-label classification that both simplifies Rank-SVM and adds a zero label, resulting in a quadratic programming problem in which each class has an independent equality constraint. When the Frank-Wolfe method is used to solve this quadratic programming problem iteratively, the linear programming problem of each step decomposes into a series of sub-problems, which dramatically reduces the computational cost. For the well-known Yeast data set, our training procedure runs about 12 times faster than Rank-SVM in a C++ environment. Experiments on five benchmark data sets show that our method is a powerful candidate for multi-label classification, compared with five state-of-the-art multi-label classification techniques.
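The zero-label idea, deciding label relevance by comparison against an explicit zero label's score, can be sketched as follows (the score dictionary and the `__zero__` key are illustrative assumptions, not the paper's notation):

```python
def relevant_labels(scores, zero_label="__zero__"):
    """Multi-label prediction with an explicit zero label: every label
    scoring above the zero label is predicted relevant."""
    threshold = scores[zero_label]
    return {lbl for lbl, s in scores.items()
            if lbl != zero_label and s > threshold}
```

This is what gives the model an intrinsic threshold: relevance is read off directly, with no separately tuned cutoff as in plain Rank-SVM.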
19.
The support vector machine (SVM) is a classic nonlinear classifier for pattern recognition: it maps training samples from a low-dimensional space in which they are not linearly separable into a high-dimensional space in which they are, and then performs the classification. This paper mainly trains an SVM to distinguish faces from non-faces. The mathematical derivation of the SVM is complete and its algorithmic logic rigorous; overall it is more complex than the AdaBoost algorithm, but it performs well when training samples are scarce, giving it a sample-size advantage. The theory supporting it includes generalization theory, optimization theory and kernel functions; these theories have also been widely applied by the research community to other machine learning algorithms such as neural networks, and decades of use have shown them to be highly reliable. This paper also discusses principal component analysis (PCA) for data compression and dimensionality reduction. PCA provides great help in data preprocessing, substantially lowering the input dimensionality of the SVM and greatly reducing computation and detection time.
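As a rough illustration of PCA-based dimensionality reduction ahead of an SVM, here is a minimal power-iteration sketch that extracts the leading principal component; a real pipeline would keep several components and use a proper eigensolver:

```python
import math
import random

def top_principal_component(X, iters=200, seed=0):
    """Leading principal component via power iteration on the sample
    covariance matrix. X is a list of equal-length feature rows."""
    n, d = len(X), len(X[0])
    mean = [sum(row[j] for row in X) / n for j in range(d)]
    Xc = [[row[j] - mean[j] for j in range(d)] for row in X]
    # covariance matrix (d x d) of the centered data
    cov = [[sum(Xc[i][a] * Xc[i][b] for i in range(n)) / n
            for b in range(d)] for a in range(d)]
    rng = random.Random(seed)
    v = [rng.random() for _ in range(d)]
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

def project(X, v):
    """Reduce each sample to its coordinate along the component v."""
    return [sum(xi * vi for xi, vi in zip(row, v)) for row in X]
```

The projected coordinates, not the raw pixels, would then be fed to the SVM, which is the dimensionality reduction the paper credits with the large speed-up.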