Similar Documents
20 similar documents found (search time: 265 ms)
1.
In real-world applications, the acquired data are often contaminated by noise or outliers. The cerebellar model articulation controller (CMAC) is a neurological model whose learning mechanism imitates the human cerebellum. CMAC has an attractive learning-speed property: only a small subset of memory cells addressed by the input space determines the output. The fuzzy cerebellar model articulation controller (FCMAC) incorporates fuzzy concepts into CMAC to improve accuracy. However, distributing errors into the addressed hypercubes can yield unacceptable learning performance when the input data contain noise or outliers. The robust fuzzy cerebellar model articulation controller (RFCMAC) embeds M-estimator-based robust learning into FCMAC to suppress the influence of noise or outliers. Meanwhile, support vector regression (SVR), an algorithm grounded in statistical learning theory, has been applied successfully to a number of regression problems in the presence of noise or outliers. Unfortunately, the practical application of SVR is limited by the need for the user to define a set of parameters to obtain good performance. In this paper, a robust learning algorithm based on SVR and RFCMAC is proposed. The proposed algorithm combines the advantage of SVR, the ability to resist corruption effects, with the advantages of RFCMAC, fast learning and accurate approximation. Additionally, particle swarm optimization (PSO) is applied to obtain the best parameter setting for SVR. Simulation results show that the proposed algorithm outperforms the other algorithms.
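The PSO-tuned-SVR component of this approach can be sketched on its own. The following is a minimal illustration, assuming scikit-learn's `SVR` as the regressor and a hand-rolled swarm; the particle count, inertia/acceleration coefficients, and log-parameter search ranges are arbitrary illustrative choices, not the paper's settings:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Toy regression data with a few injected outliers.
X = np.linspace(0, 4 * np.pi, 120).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(120)
y[::25] += 2.0  # outliers

def fitness(log_params):
    """Cross-validated score of an SVR parameterized by (log C, log eps, log gamma)."""
    C, eps, gamma = np.exp(log_params)
    model = SVR(C=C, epsilon=eps, gamma=gamma)
    return cross_val_score(model, X, y, cv=3,
                           scoring="neg_mean_absolute_error").mean()

# A minimal particle swarm over the three log-parameters.
n_particles, n_iter = 10, 8
pos = rng.uniform(-3, 3, size=(n_particles, 3))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 3))
    # Velocity update: inertia + pull toward personal and global bests.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, -5.0, 5.0)
    vals = np.array([fitness(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()

best_C, best_eps, best_gamma = np.exp(gbest)
```

The swarm only ever calls `fitness`, so the same loop tunes any black-box parameter set.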

2.
A water-quality system is an open, complex, nonlinear dynamical system with time-varying behavior. Although research on water-quality prediction methods has produced some results, difficulties with prediction accuracy and computational complexity remain. This paper therefore proposes a water-quality prediction algorithm based on least squares support vector regression (LS-SVR). The support vector machine is a widely used machine learning model in which a kernel function maps nonlinear data from a low-dimensional space into a high-dimensional one, where linear classification and regression become possible. LS-SVR lets all samples participate in the regression fit, so the loss function no longer depends only on a small set of support vectors; instead, all samples contribute to correcting the error, which improves prediction accuracy. At the same time, the algorithm converts the standard SVR training problem, a convex quadratic program with inequality constraints, into a system of linear equations, which speeds up computation and addresses the nonlinear, complex nature of water-quality prediction.
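The core computational point of LS-SVR, replacing the SVR quadratic program with a single linear system, can be shown directly. Below is a minimal sketch (a noisy sine stands in for a water-quality series); the RBF kernel and the `C` and `gamma_k` values are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(A, B, gamma_k=1.0):
    """Gaussian (RBF) kernel matrix between row-sample matrices A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma_k * d2)

def lssvr_fit(X, y, C=10.0, gamma_k=1.0):
    """LS-SVR training: one (n+1)x(n+1) linear system instead of a QP.

    Solves [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y].
    """
    n = len(y)
    K = rbf_kernel(X, X, gamma_k)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    return alpha, b

def lssvr_predict(X_train, alpha, b, X_new, gamma_k=1.0):
    return rbf_kernel(X_new, X_train, gamma_k) @ alpha + b

rng = np.random.default_rng(1)
X = np.linspace(0, 6, 80).reshape(-1, 1)
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(80)
alpha, b = lssvr_fit(X, y)
y_hat = lssvr_predict(X, alpha, b, X)
```

Note the trade-off the abstract mentions: because every sample appears in the linear system, essentially all `alpha` coefficients are nonzero, so LS-SVR gives up the sparsity of standard SVR in exchange for speed.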

3.
The support vector machine is a novel machine learning method based on statistical learning theory that has been widely used for classification and regression problems. The standard support vector machine algorithm requires solving a quadratic programming problem, which is generally slow when there are many training samples. To improve computational speed, this paper introduces a support vector regression algorithm based on linear programming, derives several new regression models from it, applies them to chaotic time series prediction, and compares their predictive performance. In practical applications, the appropriate model can be chosen flexibly according to the situation.
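A linear-programming formulation of ε-SVR can be written down concretely: penalizing the L1 norm of the kernel coefficients instead of the quadratic term turns training into an LP. The sketch below is an assumed textbook-style formulation, not any of the specific models proposed in the paper, and uses `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

def rbf(A, B, g=2.0):
    return np.exp(-g * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

def lp_svr_fit(X, y, C=10.0, eps=0.05, g=2.0):
    """epsilon-SVR recast as a linear program (L1 penalty on coefficients).

    Variable vector: [alpha (n), alpha_star (n), xi (n), xi_star (n), b_pos, b_neg],
    all nonnegative; the model is f(x) = K (alpha - alpha_star) + (b_pos - b_neg).
    """
    n = len(y)
    K = rbf(X, X, g)
    c = np.concatenate([np.ones(n), np.ones(n),
                        C * np.ones(n), C * np.ones(n), [0.0, 0.0]])
    I = np.eye(n)
    one = np.ones((n, 1))
    # y - f(x) <= eps + xi
    A1 = np.hstack([-K, K, -I, np.zeros((n, n)), -one, one])
    b1 = eps - y
    # f(x) - y <= eps + xi_star
    A2 = np.hstack([K, -K, np.zeros((n, n)), -I, one, -one])
    b2 = eps + y
    res = linprog(c, A_ub=np.vstack([A1, A2]), b_ub=np.concatenate([b1, b2]),
                  bounds=[(0, None)] * (4 * n + 2), method="highs")
    z = res.x
    coef = z[:n] - z[n:2 * n]
    b = z[4 * n] - z[4 * n + 1]
    return coef, b

def lp_svr_predict(X_train, coef, b, X_new, g=2.0):
    return rbf(X_new, X_train, g) @ coef + b

rng = np.random.default_rng(2)
X = np.linspace(-1, 1, 60).reshape(-1, 1)
y = np.sin(3 * X).ravel() + 0.03 * rng.standard_normal(60)
coef, b = lp_svr_fit(X, y)
y_hat = lp_svr_predict(X, coef, b, X)
```

The LP has 4n+2 variables and 2n constraints, so for moderate n it solves in a fraction of the time a dense QP would take.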

4.
The bag-of-features classification method built on the scale-invariant feature transform (BoF-SIFT) with a support vector machine achieves good gesture recognition results, but its computational complexity is high and its real-time performance poor. This paper therefore proposes a gesture recognition method that fuses Hu moments with a bag-of-features built on speeded-up robust features (BoF-SURF) and a support vector machine (SVM). In the bag-of-features model, the SURF algorithm replaces SIFT for feature extraction, improving real-time performance, and Hu moments are introduced to describe global gesture features, further improving the recognition rate. Experimental results show that the algorithm surpasses the BoF-SIFT support vector machine method in both real-time performance and recognition rate.

5.
An Efficient Fast Approximate Control Vector Parameterization Method
Control vector parameterization (CVP) is currently the mainstream numerical method for solving optimal operation problems in the process industries. One of its main drawbacks, however, is low computational efficiency: solving the resulting nonlinear programming (NLP) problem requires repeatedly integrating the associated system of differential equations as the control parameters are adjusted, which is the most time-consuming part of the CVP method. To improve efficiency, this paper proposes a novel fast approximation method that substantially reduces the cost of evaluating the differential equation system, the function values, and the gradients. Test results on two classical optimal control problems, together with a comparative study against mature optimal control software developed abroad, show that the proposed fast approximate CVP method performs well in both accuracy and efficiency.
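The standard CVP loop that this paper accelerates, parameterize the control as piecewise constants, integrate the ODE system for each trial parameter vector, and hand the resulting cost to an NLP solver, can be sketched as follows. This is the plain (slow) CVP scheme, not the paper's fast approximation; the toy dynamics x' = -x + u and the cost weights are illustrative assumptions:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

N, T = 10, 1.0          # number of piecewise-constant control segments, horizon

def simulate(u_params):
    """Integrate x' = -x + u(t) with piecewise-constant u, x(0) = 1,
    accumulating the running cost 0.1 * u^2 as a second state."""
    def rhs(t, s):
        k = min(int(t / (T / N)), N - 1)   # active control segment
        u = u_params[k]
        x, _ = s
        return [-x + u, 0.1 * u * u]
    sol = solve_ivp(rhs, (0.0, T), [1.0, 0.0], rtol=1e-7, atol=1e-9)
    xT, J_run = sol.y[0, -1], sol.y[1, -1]
    return xT, J_run

def objective(u_params):
    # Each evaluation re-integrates the ODE -- the expensive step CVP repeats.
    xT, J_run = simulate(u_params)
    return xT ** 2 + J_run

res = minimize(objective, np.zeros(N), method="SLSQP")
```

Every NLP iteration re-solves the ODE (here via finite-difference gradients, so N+1 integrations per step), which is exactly the cost the paper's approximation scheme targets.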

6.
A new optimal force distribution scheme for multiple cooperating robots is proposed, in which the duality theory of nonlinear programming (NLP) is combined with the quadratic programming (QP) approach. The optimal force distribution problem is formulated as a QP problem with both linear and quadratic constraints, and its solution is obtained by an efficient algorithm. The quadratic constraints are important in that they considerably reduce the number of constraints, enabling the dual method of NLP to be used in the solution algorithm. Moreover, the scheme can treat norm constraints without approximation, such as a bound on the norm of the force exerted by each robot. The proposed scheme is more efficient in terms of speed than existing methods. Numerical examples of a two-PUMA-robot task, solved with the proposed method and with a well-known fast method, are compared, and the results indicate that our method is suitable for real-time application.

7.
The recently proposed reduced convex hull support vector regression (RH-SVR) treats support vector regression (SVR) as a classification problem in the dual feature space by introducing an epsilon-tube. In this paper, an efficient and robust adaptive normal direction support vector regression (AND-SVR) is developed by incorporating the geometric algorithm for support vector machine (SVM) classification. Compared with RH-SVR, AND-SVR finds a better shift direction for the training samples, based on the normal direction of the output function in the feature space. Numerical examples on several artificial and UCI benchmark datasets, with comparisons, show that the proposed AND-SVR achieves good generalization performance.

8.
A General Evolutionary Algorithm for Mixed Nonlinear Programming Problems
This paper presents a new evolutionary algorithm for solving nonlinear programming problems, built on Guo's algorithm. Its main features are the introduction of a variable-dimension subspace, the addition of a subspace search procedure and normalized constraints, and extended handling of real-valued, integer, 0-1, and mixed-integer programming problems with equality constraints, making it a general-purpose algorithm for nonlinear programming (NLP) problems. Numerical experiments show that the new algorithm is not only general but also achieves the best solution accuracy compared with existing algorithms.

9.
Generalized Core Vector Machines
Kernel methods, such as the support vector machine (SVM), are often formulated as quadratic programming (QP) problems. However, given $m$ training patterns, a naive implementation of the QP solver takes $O(m^3)$ training time and at least $O(m^2)$ space. Hence, scaling up these QPs is a major stumbling block in applying kernel methods to very large data sets, and a replacement for the naive method of finding QP solutions is highly desirable. Recently, by using approximation algorithms for the minimum enclosing ball (MEB) problem, we proposed the core vector machine (CVM) algorithm, which is much faster and can handle much larger data sets than existing SVM implementations. However, the CVM can only be used with certain kernel functions and kernel methods. For example, the very popular support vector regression (SVR) cannot be used with the CVM. In this paper, we introduce the center-constrained MEB problem and subsequently extend the CVM algorithm. The generalized CVM algorithm can now be used with any linear/nonlinear kernel and can also be applied to kernel methods such as SVR and the ranking SVM. Moreover, like the original CVM, its asymptotic time complexity is again linear in $m$ and its space complexity is independent of $m$. Experiments show that the generalized CVM has comparable performance with state-of-the-art SVM and SVR implementations, but is faster and produces fewer support vectors on very large data sets.
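The MEB approximation underlying the CVM can be illustrated with the classic Badoiu-Clarkson core-set update, which after k iterations yields roughly a (1 + 1/sqrt(k))-approximate enclosing ball. This is a generic sketch of the MEB idea only, not the CVM training algorithm itself:

```python
import numpy as np

def meb_approx(points, n_iter=200):
    """Approximate minimum enclosing ball via the Badoiu-Clarkson update:
    repeatedly pull the center a shrinking step toward the farthest point."""
    c = points.mean(axis=0)                     # start at the centroid
    for k in range(1, n_iter + 1):
        far = points[np.argmax(((points - c) ** 2).sum(axis=1))]
        c = c + (far - c) / (k + 1)             # step size 1/(k+1)
    r = np.sqrt(((points - c) ** 2).sum(axis=1).max())
    return c, r

rng = np.random.default_rng(3)
pts = rng.standard_normal((500, 5))
c, r = meb_approx(pts)
```

The farthest points touched along the way form the core set; its size depends only on the approximation quality, not on the number of points, which is what makes the CVM's linear-in-$m$ time and constant space plausible.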

10.
To achieve robust estimation on noisy data sets, a recursive outlier elimination-based least squares support vector machine (ROELS-SVM) algorithm is proposed in this paper. In this algorithm, statistical information from the error variables of the least squares support vector machine is recursively learned, and a criterion derived from robust linear regression is employed for outlier elimination. In addition, a decremental learning technique is implemented in the recursive training-eliminating stage, which ensures that the outliers are eliminated at low computational cost. The proposed algorithm is compared with the re-weighted least squares support vector machine on multiple data sets, and the results demonstrate the remarkably robust performance of ROELS-SVM.
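The train-then-eliminate recursion can be sketched with an LS-SVM stand-in (kernel ridge regression, which shares the LS-SVM squared-loss/linear-system structure) and a MAD-based elimination criterion. The kernel parameters, the 3-sigma-style threshold, and one-at-a-time elimination below are illustrative assumptions, not the paper's exact criterion or its decremental-learning implementation:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(4)
X = np.linspace(0, 6, 150).reshape(-1, 1)
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(150)
outlier_idx = rng.choice(150, size=8, replace=False)
y[outlier_idx] += rng.choice([-3.0, 3.0], size=8)   # gross outliers

keep = np.ones(150, dtype=bool)
for _ in range(15):                                  # recursive train-eliminate loop
    model = KernelRidge(kernel="rbf", gamma=0.5, alpha=1.0)
    model.fit(X[keep], y[keep])
    resid = y[keep] - model.predict(X[keep])
    # Robust residual scale via the median absolute deviation (MAD).
    scale = 1.4826 * np.median(np.abs(resid - np.median(resid)))
    bad = np.abs(resid) > 3.0 * scale
    if not bad.any():
        break                                        # no outliers left
    idx = np.where(keep)[0]
    keep[idx[np.argmax(np.abs(resid))]] = False      # drop the worst point, refit

clean_pred = model.predict(X)
```

Dropping one point per pass mirrors the recursive structure; the paper's decremental-learning trick avoids refitting the whole linear system after each elimination.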

11.
In this paper, an extreme learning machine (ELM) for the ε-insensitive error loss regression problem, formulated in the 2-norm as an unconstrained optimization problem in primal variables, is proposed. Since the objective function of this unconstrained optimization problem is not twice differentiable, the popular generalized Hessian matrix and smoothing approaches are considered, leading to optimization problems whose solutions are determined using a fast Newton-Armijo algorithm. The main advantage of the algorithm is that at each iteration only a system of linear equations must be solved. In numerical experiments on a number of interesting synthetic and real-world datasets, the results of the proposed method are compared with those of ELM using additive and radial basis function hidden nodes and of support vector regression (SVR) using a Gaussian kernel. Similar or better generalization performance of the proposed method on the test data, in comparable computational time, over ELM and SVR clearly illustrates its efficiency and applicability.
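The basic ELM regression pipeline referenced here, a random hidden layer followed by a least-squares output solve, is easy to sketch. Note that this uses the classical squared loss, not the paper's ε-insensitive formulation with Newton-Armijo; the layer size and seeds are arbitrary:

```python
import numpy as np

def elm_fit(X, y, n_hidden=60, seed=0):
    """Extreme learning machine: hidden weights are random and FIXED;
    only the output weights beta are learned, by ordinary least squares."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                   # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

rng = np.random.default_rng(5)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sinc(X).ravel() + 0.02 * rng.standard_normal(200)
model = elm_fit(X, y)
y_hat = elm_predict(model, X)
```

Because only the linear readout is trained, the whole fit is one least-squares solve, which is the speed advantage ELM trades against SVR's constrained optimization.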

12.

In this paper, we introduce a new algorithm for solving nonlinear programming (NLP) problems. It is an extension of Guo's algorithm [1] with enhanced capabilities for solving NLP problems: a) an extended variable subspace; b) a search process over subspaces and normalized constraints; c) an adaptive penalty function; and d) the ability to deal with integer, 0-1, and mixed-integer NLP problems that have equality constraints. These four enhancements make the algorithm a more robust and universal solver for nonlinear programming problems. We present numerical experiments showing that the new algorithm is not only more robust and universal than its competitors, but also attains a higher performance level than others reported in the literature.

13.
Wang Xiaoming, Control and Decision (控制与决策), 2010, 25(4): 556-561
Starting from the idea that support vector regression (SVR) can be realized by constructing a support vector machine classification problem, this paper extends the minimum class variance support vector machine (MCVSVM) to regression estimation and proposes the minimum variance support vector regression (MVSVR) algorithm. The method inherits the robustness and strong generalization ability of MCVSVM. The relationship between MVSVR and standard SVR is analyzed, the solution in the case of a singular scatter matrix is discussed, and the nonlinear case of MVSVR is also addressed. Experiments show that the method is feasible and exhibits stronger generalization ability.

14.
Nonlinear Multifunctional Sensor Signal Reconstruction Based on Support Vector Regression
In multifunctional sensor signal reconstruction, function regression is usually carried out under the empirical risk minimization principle, which tends to generalize poorly and overfit when samples are few. This paper applies support vector regression to reconstruct nonlinear multifunctional sensor signals. The support vector machine is a new machine learning method based on the structural risk minimization principle, which effectively suppresses overfitting and improves generalization performance. Simulation results show a signal reconstruction error rate below 0.4%, a good reconstruction that verifies the effectiveness of the algorithm.

15.
An Image Edge Detection Method Based on Support Vector Machines
The support vector machine (SVM) is a new machine learning method. Grounded in statistical learning theory, it handles small-sample learning problems well, and its excellent learning performance has made it a research focus in the international machine learning community. Support vector regression (SVR) is an important branch of SVM that has been applied successfully to system identification and nonlinear system prediction, with good results. This paper studies edge detection on images represented by SVR. A worked example demonstrates the feasibility of the method in practice, and experimental results show that the algorithm effectively improves edge detection and offers a useful reference for other edge detection methods.

16.
Using a conjugate gradient method, a novel iterative support vector machine (FISVM) is proposed, which is capable of generating a new nonlinear classifier. We attempt to solve a modified primal problem of the proximal support vector machine (PSVM) and show that its solution reduces to solving just a system of linear equations, as opposed to the quadratic programming problem in SVM. The algorithm not only requires no special optimization solvers, such as linear or quadratic programming tools, but also guarantees fast convergence. The full algorithm needs merely four lines of MATLAB code and gives results similar to or better than those of several recent learning algorithms in terms of classification accuracy. Besides, the proposed stand-alone approach avoids the instability in classification performance seen in the smooth support vector machine, the generalized proximal support vector machine, PSVM, and the reduced support vector machine. Experiments carried out on UCI datasets show the effectiveness of our approach.

17.
In this article, a feature selection algorithm for hyperspectral data based on a recursive support vector machine (R-SVM) is proposed. The new algorithm follows the scheme of a state-of-the-art feature selection algorithm, SVM recursive feature elimination (SVM-RFE), and uses a new ranking criterion derived from the R-SVM. Multiple SVMs are used to address the multiclass problem. The algorithm is applied to Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data to select the most informative bands, and the resulting band subsets are compared with those of SVM-RFE, using classification accuracy to evaluate the effectiveness of the feature selection. The experimental results for an agricultural case study indicate that the feature subset generated by the newly proposed algorithm is generally competitive with SVM-RFE in terms of classification accuracy and is more robust in the presence of noise.
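The SVM-RFE baseline the article compares against is available off the shelf. The sketch below assumes scikit-learn and a synthetic stand-in for hyperspectral bands (the first five columns are the informative "bands"); it shows the recursive elimination loop, not the article's R-SVM ranking criterion:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import LinearSVC

# 40 "bands", only 5 informative; shuffle=False keeps them in columns 0..4.
X, y = make_classification(n_samples=300, n_features=40, n_informative=5,
                           n_redundant=0, shuffle=False, random_state=0)

# SVM-RFE: repeatedly fit a linear SVM and drop the band with the
# smallest squared weight, until the requested subset size remains.
svc = LinearSVC(C=1.0, dual=False, max_iter=5000)
rfe = RFE(estimator=svc, n_features_to_select=5, step=1)
rfe.fit(X, y)
selected = np.where(rfe.support_)[0]
```

`rfe.ranking_` gives the elimination order, which is the band ranking that the new R-SVM criterion replaces.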

18.
A Weighted Robust Support Vector Regression Method
Zhang Jiangshe, Guo Gao, Chinese Journal of Computers (计算机学报), 2005, 28(7): 1171-1177
This paper presents a class of weighted robust support vector regression methods (WRSVR) based on soft rejection of outliers. The basic idea is as follows: first obtain an approximate regression function with standard support vector regression (SVR); based on this approximate model, form a weighted SVR objective and solve it with efficient SVR techniques to obtain a new approximate model; then use the new model to form another weighted SVR objective whose solution is a more accurate approximation; repeat this process until convergence. The purpose of the weighting is to softly reject outliers. The method is simple in concept, strongly robust, and easy to implement. Experiments show that the new WRSVR algorithm is more robust than standard SVR, the robust support vector network (RSVR), and weighted least squares support vector machines (WLS-SVM), and that its approximation accuracy is affected far less by outliers than that of SVM, RSVR, and WLS-SVM.
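The reweight-and-refit iteration can be sketched with an off-the-shelf SVR that accepts per-sample weights. The MAD-based soft-rejection weighting below is an illustrative choice standing in for the paper's weighting scheme, not its exact formula:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(6)
X = np.linspace(0, 6, 120).reshape(-1, 1)
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(120)
y[::15] += 2.5                        # inject gross outliers

weights = np.ones(120)
for _ in range(5):                    # reweight-and-refit until (near) convergence
    model = SVR(C=10.0, epsilon=0.05, gamma=1.0)
    model.fit(X, y, sample_weight=weights)
    resid = y - model.predict(X)
    s = 1.4826 * np.median(np.abs(resid))          # robust scale (MAD)
    a = np.abs(resid) / (s + 1e-12)
    # Soft rejection: full weight inside 2s, decaying linearly to 0 beyond 4s.
    weights = np.clip((4.0 - a) / 2.0, 0.0, 1.0)

robust_pred = model.predict(X)
```

Points with weight 0 are effectively removed from the fit (their box constraint collapses), which is exactly the "soft rejection" effect the abstract describes.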

19.
In this paper, a Newton-conjugate gradient (CG) augmented Lagrangian method is proposed for solving path-constrained dynamic process optimization problems. The path constraints are reduced to a single final-time constraint by using a novel constraint aggregation function. A control vector parameterization (CVP) approach then converts the constraint-simplified dynamic optimization problem into a nonlinear programming (NLP) problem with inequality constraints. By constructing an augmented Lagrangian function, the inequality constraints are folded into the augmented objective, yielding a box-constrained NLP problem, which is solved by a line search Newton-CG approach, also known as the truncated Newton (TN) method. By constructing the Hamiltonian functions of the objective and constraint functions, two adjoint systems are generated to calculate the gradients needed during the NLP solution. Simulation examples demonstrate the effectiveness of the algorithm.

20.
A Newton Algorithm for Nonlinear Regression Problems
For large-scale nonlinear regression problems, a Newton algorithm based on a static reservoir is proposed. The reservoir builds a high-dimensional feature space, converting the original problem into a linear support vector regression problem whose size depends on the reservoir dimension, which is then solved with a Newton algorithm. A robust loss function suppresses the influence of outliers on the predictions. Comparisons with support vector regression (SVR) and the reservoir Tikhonov regularization method verify the speed, high prediction accuracy, and good robustness of the proposed method.
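The static-reservoir-plus-robust-Newton recipe can be sketched as a fixed random nonlinear expansion followed by Newton/IRLS iterations on a Huber loss; the Huber loss is an assumed robust loss here and may differ from the paper's choice, and all sizes and parameters are illustrative:

```python
import numpy as np

def reservoir_features(X, n_res=100, seed=0):
    """Static 'reservoir': a fixed random nonlinear expansion into n_res dims."""
    r = np.random.default_rng(seed)
    W = r.standard_normal((X.shape[1], n_res)) / np.sqrt(X.shape[1])
    b = r.uniform(-1, 1, n_res)
    return np.tanh(X @ W + b)

def huber_newton(H, y, delta=0.3, lam=1e-3, n_iter=30):
    """Newton/IRLS for ridge-regularized Huber regression in reservoir space."""
    w = np.zeros(H.shape[1])
    for _ in range(n_iter):
        r_ = y - H @ w
        # IRLS weights psi(r)/r for the Huber loss: 1 inside delta, decaying outside.
        wt = np.where(np.abs(r_) <= delta, 1.0, delta / np.abs(r_))
        A = (H * wt[:, None]).T @ H + lam * np.eye(H.shape[1])
        w = np.linalg.solve(A, (H * wt[:, None]).T @ y)
    return w

rng = np.random.default_rng(7)
X = np.linspace(-2, 2, 200).reshape(-1, 1)
y = np.sinc(X).ravel() + 0.03 * rng.standard_normal(200)
y[::20] += 2.0                        # outliers the robust loss should resist

H = reservoir_features(X)
w = huber_newton(H, y)
y_hat = H @ w
```

Because the reservoir is fixed, each Newton/IRLS step is one linear solve in the reservoir dimension, independent of the (potentially large) sample count apart from the matrix products.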


