A Fast BP Learning Algorithm Based on Error Amplification (cited by: 10)

An Algorithm for Fast Convergence of Back Propagation by Enlarging Error


Authors: 杨博 (Yang Bo)[1], 王亚东 (Wang Yadong)[1], 苏小红 (Su Xiaohong)[1]

Affiliation: [1] School of Computer Science, Harbin Institute of Technology, Harbin 150001, China

Source: Journal of Computer Research and Development (《计算机研究与发展》), 2004, No. 5, pp. 774-779 (6 pages)

Funding: National Natural Science Foundation of China (69975005; 60273083)

Abstract: Gradient-descent BP learning algorithms tend to slow down when neurons enter the saturation (flat-spot) regions of the sigmoid activation function. To remove this effect on late-stage training, a fast BP learning algorithm based on error amplification is proposed for multi-layer artificial neural networks. By adaptively enlarging the error term of each output unit in the weight-update rule, the algorithm keeps the weight adjustments from stalling in saturated regions, maintains a high effective learning rate, and converges quickly to the desired accuracy with a small mean square error; it is also easy to implement. Experiments on the well-established 3-parity and Soybean benchmark data sets show that the algorithm learns faster and more reliably than existing methods such as the Delta-bar-Delta algorithm, the momentum method, and the Prime-Offset method, and that it is less computationally intensive and requires less memory than the Levenberg-Marquardt (LM) method.
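The weight-update idea described in the abstract, adaptively enlarging the output error so updates do not stall when sigmoid units saturate, can be sketched as follows. The paper's exact amplification function is not reproduced here, so the gain schedule below (scaling the error by a factor that grows as the sigmoid derivative o(1-o) shrinks), as well as the network size and learning-rate values, are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_bp(X, T, n_hidden=8, lr=0.5, epochs=5000, gain=3.0, enlarge=True, seed=0):
    """Batch BP on a one-hidden-layer sigmoid network, optionally with
    error amplification in the saturated (flat-spot) regions."""
    rng = np.random.default_rng(seed)
    W1 = rng.uniform(-1.0, 1.0, (X.shape[1], n_hidden))
    W2 = rng.uniform(-1.0, 1.0, (n_hidden, T.shape[1]))
    for _ in range(epochs):
        H = sigmoid(X @ W1)          # hidden activations
        O = sigmoid(H @ W2)          # output activations
        err = T - O
        if enlarge:
            # Assumed amplification: 4*O*(1-O) equals 1 at O=0.5 and tends to 0
            # as O saturates, so the factor ranges from 1 (unsaturated) up to
            # 1+gain (fully saturated), countering the vanishing derivative.
            err = err * (1.0 + gain * (1.0 - 4.0 * O * (1.0 - O)))
        delta_o = err * O * (1.0 - O)                 # output-layer delta
        delta_h = (delta_o @ W2.T) * H * (1.0 - H)    # back-propagated delta
        W2 += lr * H.T @ delta_o
        W1 += lr * X.T @ delta_h
    O = sigmoid(sigmoid(X @ W1) @ W2)
    return float(np.mean((T - O) ** 2))               # final mean square error

# 3-parity benchmark: the target is the parity (XOR) of three input bits.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)],
             dtype=float)
T = (X.sum(axis=1) % 2).reshape(-1, 1)

mse_plain = train_bp(X, T, enlarge=False)
mse_amplified = train_bp(X, T, enlarge=True)
print(f"plain BP mse: {mse_plain:.4f}  amplified BP mse: {mse_amplified:.4f}")
```

The printed comparison is illustrative only; the concrete amplification function, gain value, and schedule in the published algorithm may differ from this sketch.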

Keywords: back propagation; multi-layer artificial neural network; error amplification; saturation region; parity problem; Soybean data set

Classification: TP18 [Automation and Computer Technology / Control Theory and Control Engineering]

 
