RLS Algorithm with a Regularization Factor for MFNNs with Multi-Output Neuron Models
Cite this article: Shen Yanjun, Wang Bingwen, Zhang Linguo. RLS Algorithm with a Regularization Factor for MFNNs with Multi-Output Neuron Models [J]. Computer Applications and Software, 2005, 22(11): 102-104.
Authors: Shen Yanjun  Wang Bingwen  Zhang Linguo
Affiliation: School of Science, Three Gorges University, Yichang, Hubei 443002, China
Abstract: In neural network learning, combining the recursive least squares (RLS) algorithm with a regularization factor both improves the generalization ability of the network and makes training robust to noise in the learning samples. However, when the network is large, the per-iteration computational complexity and storage requirements of this algorithm become substantial. This paper applies the RLS algorithm with a regularization factor to multilayer feedforward neural networks (MFNNs) built from multi-output neuron models. Simulation results show that the method greatly simplifies the network structure and reduces the per-iteration computational complexity and storage requirements.
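The paper's exact regularized variant is not reproduced here, but the core idea can be illustrated with a minimal sketch: standard RLS updates for a linear (output-layer) weight matrix, where initializing the inverse-covariance matrix as P = (δI)⁻¹ is mathematically equivalent to adding a ridge-style regularization penalty δ‖W‖² to the least-squares cost. The function name `rls_fit` and the toy data are illustrative, not from the paper; note how a multi-output model shares one gain vector per sample, which is the kind of saving the paper exploits.

```python
import numpy as np

def rls_fit(X, D, delta=1e-2):
    """Sketch of RLS with a regularization factor (not the paper's exact algorithm).

    Fits W (n_in x n_out) so that X @ W approximates D, one sample at a time.
    Initializing P = I/delta corresponds to the ridge penalty delta*||W||^2.
    """
    n_in, n_out = X.shape[1], D.shape[1]
    W = np.zeros((n_in, n_out))
    P = np.eye(n_in) / delta                  # inverse covariance; delta = regularization factor
    for x, d in zip(X, D):
        x = x[:, None]                        # column vector (n_in, 1)
        k = P @ x / (1.0 + x.T @ P @ x)       # gain vector, shared across all outputs
        W += k @ (d[None, :] - x.T @ W)       # one rank-1 update handles every output
        P -= k @ (x.T @ P)                    # covariance update
    return W

# Toy check: recover a known linear map from noisy data.
rng = np.random.default_rng(0)
W_true = rng.standard_normal((3, 2))
X = rng.standard_normal((200, 3))
D = X @ W_true + 0.01 * rng.standard_normal((200, 2))
W_hat = rls_fit(X, D)
```

Because the gain `k` depends only on the input, a model with many outputs reuses it for every output column, so the per-sample cost grows only linearly in the number of outputs.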

Keywords: Multi-output neuron model  Regularization factor  Recursive least squares (RLS) algorithm  Generalization ability
Received: 2003-12-02
Revised: 2003-12-02

A REGULARIZER FOR RLS ALGORITHM IN MFNN WITH MULTIOUTPUT NEURAL MODEL
Shen Yanjun, Wang Bingwen, Zhang Linguo. A REGULARIZER FOR RLS ALGORITHM IN MFNN WITH MULTIOUTPUT NEURAL MODEL [J]. Computer Applications and Software, 2005, 22(11): 102-104.
Authors: Shen Yanjun  Wang Bingwen  Zhang Linguo
Affiliation: School of Science, Three Gorges University, Yichang, Hubei 443002, China
Abstract: Recursive least squares (RLS)-based algorithms are a class of fast online training algorithms for multilayer feedforward neural networks (MFNNs). A regularizer can improve the generalization of the trained networks. When RLS methods are used together with a regularizer, both the generalization ability and the convergence speed improve. However, this algorithm achieves its better performance at the expense of much greater computational and storage requirements. In this paper, RLS with a regularizer is used to train MFNNs with multi-output neuron models (MO-MFNNs). Several simulations show that the modified method reduces the computational complexity and storage requirements while improving the generalization ability of the networks.
Keywords:Multi-output neural model Regularizer RLS algorithm Generalization
This article is indexed by CNKI, VIP, Wanfang Data, and other databases.