Similar Literature
20 similar records found.
1.
In this paper, we propose a new unconstrained twin support vector regression model in the primal space (UPTSVR). With the addition of a regularization term to the formulation of the problem, the structural risk is minimized. The proposed formulation solves two smaller-sized unconstrained minimization problems having continuous, piecewise quadratic objective functions by gradient-based iterative methods. However, since their objective functions contain the non-smooth ‘plus’ function, two approaches are taken: (i) replace the non-smooth ‘plus’ function with a smooth approximation; (ii) apply a generalized derivative of the non-smooth ‘plus’ function. These lead to five algorithms whose pseudo-codes are also given. Experimental results obtained on a number of synthetic and real-world benchmark datasets, compared against standard support vector regression (SVR) and twin SVR (TSVR), clearly demonstrate the effectiveness of the proposed method.
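Smoothing route (i) is commonly realized with the integral-of-sigmoid approximation p(x, α) = x + (1/α)·log(1 + exp(−αx)) of the ‘plus’ function max(0, x), standard in the smooth-SVM literature. The sketch below (function names are ours, not from the paper) illustrates how the approximation tightens as α grows:

```python
import math

def plus(x):
    """The non-smooth 'plus' function max(0, x)."""
    return max(0.0, x)

def smooth_plus(x, alpha=10.0):
    """Smooth approximation p(x, a) = x + (1/a) * log(1 + exp(-a*x)).
    Larger alpha gives a tighter approximation; the maximum gap to
    max(0, x) is log(2)/alpha, attained at x = 0."""
    if -alpha * x > 700:       # exp would overflow; the expression tends to 0
        return 0.0
    return x + math.log(1.0 + math.exp(-alpha * x)) / alpha

# The worst-case gap shrinks like log(2)/alpha as alpha grows:
for a in (1.0, 10.0, 100.0):
    gap = max(abs(smooth_plus(x / 10.0, a) - plus(x / 10.0))
              for x in range(-50, 51))
    print(a, round(gap, 6))
```

Because p is smooth and strictly convex in x, a gradient-based or Newton-type iteration can be applied directly to the smoothed objective.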

2.
Twin support vector regression (TSVR) was proposed recently as a novel regressor that tries to find a pair of nonparallel planes, i.e., ε-insensitive up- and down-bounds, by solving two related SVM-type problems. However, it may yield a suboptimal solution, since its objective function is only positive semi-definite and it lacks complexity control. To address this shortcoming, we develop a novel SVR algorithm termed smooth twin SVR (STSVR). The idea is to reformulate TSVR as a strongly convex problem, which yields a unique global optimal solution for each subproblem. To solve the proposed optimization problem, we first adopt a smoothing technique to convert the original constrained quadratic programming problems into unconstrained minimization problems, and then use the well-known Newton–Armijo algorithm to solve the smooth TSVR. The effectiveness of the proposed method is demonstrated via experiments on synthetic and real-world benchmark datasets.

3.
Training twin support vector regression via linear programming
This paper improves the recently proposed twin support vector regression (TSVR) by formulating it as a pair of linear programming problems instead of quadratic programming problems. The use of the 1-norm distance in the linear programming TSVR, as opposed to the square of the 2-norm in the quadratic programming TSVR, leads to better generalization performance and less computational time. The effectiveness of the enhanced method is demonstrated by experimental results on artificial and benchmark datasets.

4.
Twin support vector regression (TSVR) was proposed recently as a novel regressor that tries to find a pair of nonparallel planes, i.e., \(\epsilon\)-insensitive up- and down-bounds, by solving two related SVM-type problems. Though TSVR exhibits good performance compared with conventional methods like SVR, it suffers from the following issues: (1) it lacks model complexity control and thus may incur overfitting and suboptimal solutions; (2) it needs to solve a pair of quadratic programming problems which are relatively complex to implement; (3) it is sensitive to outliers; and (4) its solution is not sparse. To address these problems, we propose in this paper a novel regression algorithm termed robust and sparse twin support vector regression. The central idea is to reformulate TSVR as a convex problem by first introducing a regularization technique and then deriving a linear programming (LP) formulation which is not only simple but also allows robustness and sparseness. Instead of solving the resulting LP problem in the primal, we present a Newton algorithm with Armijo step-size to solve the corresponding exact exterior penalty problem. Experimental results on several publicly available benchmark data sets show the feasibility and effectiveness of the proposed method.

5.
A Fast Primal-Space Twin Support Vector Regression Algorithm
Twin support vector regression (TSVR) obtains the regression function by quickly optimizing a pair of smaller-scale support vector machine problems. This paper proposes directly optimizing the TSVR objective function in the original input space with Newton's method, thereby effectively overcoming the performance loss incurred when TSVR obtains only an approximate optimum via the dual quadratic programming problems. Numerical experiments show that this method not only improves the performance of TSVR but also reduces training time.

6.
Twin support vector regression (TSVR) finds ε-insensitive up- and down-bound functions by solving a pair of smaller-sized quadratic programming problems (QPPs) rather than a single large one as in classical SVR, which greatly improves its computational speed. However, the local information among samples is not exploited in TSVR. To make full use of the knowledge in the samples and improve prediction accuracy, a K-nearest-neighbor-based weighted TSVR (KNNWTSVR) is proposed in this paper, in which the local information among samples is utilized. Specifically, a larger weight is given to a training sample if it has more K-nearest neighbors; otherwise, a smaller weight is given to it. Moreover, to further enhance computational speed, a successive overrelaxation approach is employed to solve the QPPs. Experimental results on eight benchmark datasets and a real dataset demonstrate that our weighted TSVR not only yields lower prediction error but also requires less running time in comparison with other algorithms.
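The neighborhood-based weighting idea can be sketched as follows. This is a minimal 1-D illustration of one plausible reading of the scheme (the function name and normalization are ours, not the paper's exact formulation): each sample is weighted by how often it appears among the k nearest neighbors of the other samples, so isolated points (likely outliers) receive small weights.

```python
def knn_weights(points, k=2):
    """Weight each sample by how many other samples count it among
    their k nearest neighbors (illustrative sketch, 1-D samples).
    Returns weights normalized to sum to 1."""
    n = len(points)
    counts = [0] * n
    for i in range(n):
        # indices of the k nearest neighbors of point i (excluding i)
        nearest = sorted((j for j in range(n) if j != i),
                         key=lambda j: abs(points[j] - points[i]))[:k]
        for j in nearest:
            counts[j] += 1
    total = sum(counts)
    return [c / total for c in counts]

pts = [0.0, 0.1, 0.2, 5.0]   # 5.0 is an isolated point
w = knn_weights(pts, k=2)
print(w)   # the isolated sample receives the smallest weight
```

In a weighted TSVR these weights would scale each sample's contribution to the loss, so outliers influence the fitted bound functions less.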

7.
Lip-Movement Parameter Prediction Based on Support Vector Regression
The support vector machine learning method replaces the empirical risk minimization principle of traditional machine learning with the structural risk minimization principle, and shows excellent performance in learning from limited samples. This new statistical learning method is applied to the study of multimedia interaction: support vector regression is used to predict lip-movement parameters from speech. By applying principal component analysis to the linear prediction coefficients of speech, the dimensionality of the acoustic feature parameters is effectively reduced. The best SVR learning parameters are selected by combining cross-validation with steepest-descent optimization. On arbitrary digit strings of Mandarin 0–9, prediction of the lip-height parameter achieved a mean squared error of 0.0096, a mean amplitude error of 7.2%, and a correlation coefficient of 0.8. These results outperform those of an optimized artificial neural network described in the paper, indicating the potential of this method.

8.

In this paper, a simple and linearly convergent Lagrangian support vector machine algorithm for the dual of twin support vector regression (TSVR) is proposed. Although the algorithm requires matrix inverses at the outset, it is shown that these can be obtained by subtracting from the identity matrix a scalar multiple of the inverse of a positive semi-definite matrix that arises in the original formulation of TSVR. The algorithm can be easily implemented and does not need any optimization packages. To demonstrate its effectiveness, experiments were performed on well-known synthetic and real-world datasets. Similar or better generalization performance of the proposed method, obtained in less training time than the standard and twin support vector regression methods, clearly exhibits its suitability and applicability.


9.
Twin support vector machine (TSVM), least squares TSVM (LSTSVM), and energy-based LSTSVM (ELS-TSVM) satisfy only the empirical risk minimization principle. Moreover, the matrices in their formulations are only positive semi-definite. To overcome these problems, we propose in this paper a robust energy-based least squares twin support vector machine algorithm, called RELS-TSVM for short. Unlike TSVM, LSTSVM, and ELS-TSVM, our RELS-TSVM maximizes the margin with a positive definite matrix formulation and implements the structural risk minimization principle, which is the essence of statistical learning theory. Furthermore, RELS-TSVM utilizes energy parameters to reduce the effect of noise and outliers. Experimental results on several synthetic and real-world benchmark datasets show that RELS-TSVM not only yields better classification performance but also has lower training time compared to ELS-TSVM, LSPTSVM, LSTSVM, TBSVM, and TSVM.

10.
A Weighted Robust Support Vector Regression Method
张讲社, 郭高. 《计算机学报》, 2005, 28(7): 1171-1177
A weighted robust support vector regression method (WRSVR) based on soft rejection of outliers is presented. The basic idea is as follows: first, an approximate regression function is obtained by standard support vector regression (SVR); based on this approximate model, a weighted SVR objective is defined and solved with efficient SVR techniques to obtain a new approximate model; this new model is then used to define another weighted SVR objective whose solution gives a more accurate approximation; the process is repeated until convergence. The purpose of the weighting is the soft rejection of outliers. The method is simple in concept, strongly robust, and easy to implement. Experiments show that the new WRSVR algorithm is more robust than standard SVR, the robust support vector regression network (RSVR), and the weighted least squares support vector machine (WLS-SVM), and that its approximation accuracy is affected by outliers far less than that of SVM, RSVR, and WLS-SVM.

11.
Support Vector Echo-State Machine for Chaotic Time-Series Prediction
A novel chaotic time-series prediction method based on support vector machines (SVMs) and echo-state mechanisms is proposed. The basic idea is to replace the "kernel trick" with a "reservoir trick" in dealing with nonlinearity, that is, to perform linear support vector regression (SVR) in the high-dimensional "reservoir" state space; the solution thus benefits from the structural risk minimization principle. We call the resulting models support vector echo-state machines (SVESMs). SVESMs are a special kind of recurrent neural network (RNN) with a convex objective function, so their solution is global, optimal, and unique. SVESMs are especially efficient in dealing with real-life nonlinear time series, and their generalization ability and robustness are obtained through a regularization operator and a robust loss function. The method is tested on the benchmark prediction problem of the Mackey-Glass time series and applied to real-life time series such as the monthly sunspot series and the runoff series of the Yellow River, and the prediction results are promising.

12.
Primal least squares twin support vector regression
The training algorithm of classical twin support vector regression (TSVR) can be attributed to the solution of a pair of quadratic programming problems (QPPs) with inequality constraints in the dual space. However, this solution is affected by time and memory constraints when dealing with large datasets. In this paper, we present a least squares version of TSVR in the primal space, termed primal least squares TSVR (PLSTSVR). By introducing the least squares method, the inequality constraints of TSVR are transformed into equality constraints. Furthermore, we directly solve the two QPPs with equality constraints in the primal space instead of the dual space; thus, we need only solve two systems of linear equations instead of two QPPs. Experimental results on artificial and benchmark datasets show that PLSTSVR has accuracy comparable to TSVR but with considerably less computational time. We further investigate its validity in predicting the opening price of stocks.

13.
In this paper, an extreme learning machine (ELM) for the ε-insensitive error loss regression problem, formulated in the 2-norm as an unconstrained optimization problem in primal variables, is proposed. Since the objective function of this unconstrained optimization problem is not twice differentiable, the popular generalized Hessian matrix and smoothing approaches are considered, which lead to optimization problems whose solutions are determined using a fast Newton–Armijo algorithm. The main advantage of the algorithm is that at each iteration only a system of linear equations is solved. In numerical experiments on a number of synthetic and real-world datasets, the results of the proposed method are compared with those of ELM using additive and radial basis function hidden nodes and of support vector regression (SVR) using a Gaussian kernel. Similar or better generalization performance of the proposed method on the test data, in comparable computational time relative to ELM and SVR, clearly illustrates its efficiency and applicability.
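The loss underlying this formulation is the standard squared ε-insensitive loss, max(|r| − ε, 0)², whose flat region inside the ε-tube is what makes the objective not twice differentiable. A minimal sketch (the function name is ours; the definition is the standard one, not code from the paper):

```python
def eps_insensitive_sq(residual, eps=0.1):
    """Squared eps-insensitive loss: max(|r| - eps, 0)^2.
    Residuals inside the eps-tube cost nothing; outside it,
    the cost grows quadratically with the excess."""
    excess = abs(residual) - eps
    return excess * excess if excess > 0 else 0.0

print(eps_insensitive_sq(0.05))  # inside the tube -> 0.0
print(eps_insensitive_sq(0.3))   # (0.3 - 0.1)^2, approximately 0.04
```

The kink at |r| = ε is exactly where a generalized Hessian or a smoothing surrogate is needed before a Newton-type method can be applied.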

14.
Support vector regression (SVR), built on statistical learning theory and the structural risk minimization criterion, is a powerful tool for small-sample regression problems, and the choice of SVR parameters directly affects its learning performance and generalization ability. This paper treats SVR parameter selection as a combinatorial optimization problem, defines the objective function of that problem, and uses a real-coded quantum evolutionary algorithm (RQEA) to solve it and thereby select the SVR parameters, forming RQEA-SVR, which is then applied to traffic flow prediction. Simulation experiments show that RQEA is an effective method for selecting SVR parameters and that the resulting model performs well on traffic flow prediction.

15.
Nonlinear Multifunctional Sensor Signal Reconstruction Based on Support Vector Regression
In multifunctional sensor signal reconstruction, function regression is usually performed under the empirical risk minimization criterion, which, with small samples, easily leads to poor generalization and overfitting. This paper uses support vector regression to reconstruct nonlinear multifunctional sensor signals; the support vector machine is a new machine learning method based on the structural risk minimization criterion that effectively suppresses overfitting and improves generalization. Simulation results show that the error rate of signals reconstructed by this algorithm is below 0.4%, demonstrating good reconstruction quality and validating the effectiveness of the algorithm.

16.
Least Squares Support Vector Machines for Water Consumption Prediction
To overcome the long modeling time of the standard support vector machine and to predict urban water consumption accurately, an effective prediction model is needed. The least squares support vector machine adopted here is based on structural risk minimization; building on the standard SVM, it converts the quadratic programming problem into a system of linear equations and uses a radial basis kernel, so the LS-SVM model has fewer parameters to determine than the standard SVM, which greatly speeds up modeling. In addition, an adaptive dynamic clonal selection algorithm from artificial immune systems is employed to search accurately and quickly for the optimal LS-SVM parameters. Applied to daily urban water consumption prediction, the model learns quickly, exhibits good nonlinear modeling and generalization ability, and achieves high prediction accuracy.

17.
The recently proposed twin support vector machine (TWSVM) achieves much faster training than the classical support vector machine with comparable performance. However, it considers only the empirical risk minimization principle, which leads to poor generalization in real-world applications. In this paper, we formulate a robust minimum class variance twin support vector machine (RMCV-TWSVM). RMCV-TWSVM effectively overcomes this shortcoming of TWSVM by introducing a pair of uncertain class variance matrices into its objective functions. As a special case, we present a particular form of the uncertain class variance matrices obtained by combining the empirical positive and negative class variance matrices. Computational results on several synthetic as well as benchmark datasets indicate the significant advantages of the proposed classifier in both computational time and test accuracy.

18.

Support vector machine (SVM) has proved to be a successful approach to machine learning. Two typical SVM models are the L1-loss model for support vector classification (SVC) and the ε-L1-loss model for support vector regression (SVR). Due to the non-smoothness of the L1-loss function in the two models, most traditional approaches focus on solving the dual problem. In this paper, we propose an augmented Lagrangian method for the L1-loss model that is designed to solve the primal problem. By tackling the non-smooth term in the model with Moreau–Yosida regularization and the proximal operator, the subproblem in the augmented Lagrangian method reduces to a non-smooth linear system, which can be solved via the quadratically convergent semismooth Newton's method. Moreover, the high computational cost of semismooth Newton's method can be significantly reduced by exploiting the sparse structure of the generalized Jacobian. Numerical results on various datasets from LIBLINEAR show that the proposed method is competitive with the most popular solvers in both speed and accuracy.
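For an absolute-value (L1-type) term, the proximal operator mentioned above has the classical closed form prox_{λ|·|}(v) = sign(v)·max(|v| − λ, 0), known as soft thresholding. A minimal sketch of that scalar formula (illustrative only, not code from the paper):

```python
def soft_threshold(v, lam):
    """Proximal operator of lam * |x| (soft thresholding):
    shrink v toward zero by lam, and clip to zero inside [-lam, lam]."""
    if v > lam:
        return v - lam
    if v < -lam:
        return v + lam
    return 0.0

for v in (-2.0, -0.3, 0.0, 0.5, 2.0):
    print(v, soft_threshold(v, 0.5))
```

Applying this operator elementwise is what turns the non-smooth subproblem into a piecewise-linear system amenable to a semismooth Newton step.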

19.
A Support Vector Regression Model for Regional Logistics Demand Forecasting and Its Application
To improve the forecasting of regional logistics demand, and starting from the intrinsic relationship between indicators of influencing factors such as the regional economy and regional logistics demand, the support vector regression (SVR) method, based on the structural risk minimization criterion, is applied to build an "influencing factors → regional logistics demand" SVR forecasting model. With appropriately selected parameters and kernel function, the model is used to forecast the logistics demand of Shanghai, and it is found to achieve small relative errors on both the training and the test data.

20.
丁立中, 贾磊, 廖士中. 《软件学报》, 2014, 25(9): 2149-2159
Model selection is a key problem in support vector learning. Existing model selection methods adopt a nested two-level optimization framework: the inner level performs support vector learning, while the outer level performs model selection by minimizing an estimate of the generalization error. This framework is complicated and computationally inefficient. Simplifying the traditional two-level framework, this paper proposes a simultaneous multi-parameter tuning method for support vector learning that accomplishes model selection and learner training in a single optimization process. First, the parameters and hyperparameters of support vector learning are merged into one parameter vector, and the constrained optimization problems of support vector classification and regression are rewritten with the sequential unconstrained minimization technique (SUMT), yielding a multivariate unconstrained definition of the simultaneous tuning model. Then, the local Lipschitz continuity of the objective function of the simultaneous tuning model and the boundedness of its level sets are proved. On this basis, a simultaneous tuning algorithm is designed and implemented using the variable metric method (VMM). Further, based on the properties of the model, convergence of the algorithm is proved and its complexity is analyzed comparatively. Finally, experiments verify the convergence of the simultaneous tuning algorithm and compare its effectiveness against alternatives. The theoretical proofs and experimental analysis show that simultaneous tuning is a sound and efficient method for support vector model selection.
