Improved Algorithm of BP Neural Networks Based on the Activation Function with Four Adjustable Parameters

Cite this article: LI En-yu, YANG Ping-xian, SUN Xing-bo. Improved Algorithm of BP Neural Networks Based on the Activation Function with Four Adjustable Parameters[J]. Microelectronics & Computer, 2008, 25(11)
Authors: LI En-yu, YANG Ping-xian, SUN Xing-bo
Affiliation: Department of Electronic and Information Engineering, Sichuan University of Science & Engineering, Zigong, Sichuan 643000, China
Funding: Supported by a project of the Sichuan Provincial Department of Education and a Sichuan Provincial Key Laboratory fund

Abstract: To improve the performance of BP neural networks, a new activation function with four adjustable parameters, based on the standard sigmoidal function, is proposed. During learning it can adjust the steepness, position, and mapping range of the activation function simultaneously, giving it stronger nonlinear mapping capability. The corresponding learning algorithm for BP neural networks is also derived. Simulation results show that, compared with the traditional standard sigmoidal function, the improved function increases the convergence speed by more than a factor of 10, reduces the convergence error to less than 0.4% of the traditional error, and effectively reduces the number of hidden-layer nodes, so the learning ability is greatly improved.

Keywords: neural network, BP algorithm, activation function, adjustable parameters
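The abstract describes a sigmoid-like activation whose steepness, position, and mapping range are all adapted during learning. A minimal sketch of one such four-parameter form, f(x) = a / (1 + e^(-c(x - d))) + b, follows; the parameter names a, b, c, d and this exact functional form are assumptions for illustration, and the paper's formulation and learning rules may differ:

```python
import numpy as np

def sigmoid4(x, a=1.0, b=0.0, c=1.0, d=0.0):
    # a: scale of the mapping range, b: offset of the range,
    # c: steepness, d: position (horizontal shift).
    # With a=1, b=0, c=1, d=0 this reduces to the standard sigmoid.
    return a / (1.0 + np.exp(-c * (x - d))) + b

def sigmoid4_grads(x, a=1.0, b=0.0, c=1.0, d=0.0):
    # Partial derivatives needed if gradient descent is to adapt the
    # four parameters alongside the network weights.
    s = 1.0 / (1.0 + np.exp(-c * (x - d)))  # standard sigmoid core
    return {
        "x": a * c * s * (1.0 - s),          # for backpropagating the error
        "a": s,
        "b": 1.0,
        "c": a * (x - d) * s * (1.0 - s),
        "d": -a * c * s * (1.0 - s),
    }
```

For example, choosing a=2 and b=-1 maps the output range from (0, 1) to (-1, 1), which is the kind of range adjustment the abstract attributes to the proposed function.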
This article is indexed in CNKI, VIP (Weipu), Wanfang Data, and other databases.
