Convergence of BP Algorithm for Training MLP with Linear Output

Authors: Hongmei Shao, Wei Wu, Wenbin Liu

Affiliations: [1] College of Mathematics and Computational Science, China University of Petroleum, Dongying 257061, China; [2] Department of Applied Mathematics, Dalian University of Technology, Dalian 116023, China; [3] Institute of Computational Mathematics and Management Science, University of Kent, UK

Source: Numerical Mathematics: A Journal of Chinese Universities (English Series), 2007, Issue 3, pp. 193-202 (10 pages)

Funding: This research was supported by the National Natural Science Foundation of China (10471017).

Abstract: The capability of multilayer perceptrons (MLPs) to approximate continuous functions with arbitrary accuracy has been demonstrated in the past decades. The back-propagation (BP) algorithm is the most popular learning algorithm for training MLPs. In this paper, a simple iteration formula is used to select the learning rate for each cycle of the training procedure, and a convergence result is presented for the BP algorithm for training an MLP with a hidden layer and a linear output unit. The monotonicity of the error function is also guaranteed during the training iteration.
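
For concreteness, the sketch below illustrates the kind of network and training procedure the abstract describes: batch BP (gradient descent) for an MLP with one sigmoid hidden layer and a single linear output unit, with the quadratic error monitored for monotone decrease. This is only an illustrative assumption, not the paper's algorithm; in particular, the paper's iteration formula for selecting the learning rate per cycle is not given in the abstract, so a small constant rate is used as a placeholder.

```python
import numpy as np

# Illustrative sketch (not the paper's exact algorithm): batch BP training of an
# MLP with one sigmoid hidden layer and a linear output unit. The paper selects
# the learning rate eta_k per training cycle by an iteration formula that is not
# reproduced here; a small constant eta is a placeholder assumption.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy regression data: approximate a smooth function on [0, 1].
X = rng.uniform(0.0, 1.0, size=(200, 1))
y = np.sin(2 * np.pi * X[:, 0])

n_hidden = 10
V = rng.normal(scale=0.5, size=(1, n_hidden))   # input-to-hidden weights
w = rng.normal(scale=0.5, size=n_hidden)        # hidden-to-output weights (linear output)

eta = 0.05  # placeholder learning rate
for cycle in range(2001):
    H = sigmoid(X @ V)              # hidden activations, shape (n, n_hidden)
    out = H @ w                     # linear output unit
    err = out - y
    E = 0.5 * np.mean(err ** 2)     # quadratic error function

    # Gradients of E with respect to w and V (standard backpropagation).
    grad_w = H.T @ err / len(X)
    delta_h = np.outer(err, w) * H * (1.0 - H)
    grad_V = X.T @ delta_h / len(X)

    w -= eta * grad_w
    V -= eta * grad_V

    if cycle % 500 == 0:
        # For a sufficiently small eta, E decreases monotonically across cycles.
        print(f"cycle {cycle}: E = {E:.6f}")
```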

Keywords: multilayer perceptron; BP algorithm; convergence; monotonicity; neural network

Classification: TP183 (Automation and Computer Technology: Control Theory and Control Engineering)
