CONVERGENCE OF GRADIENT METHOD WITH MOMENTUM FOR BACK-PROPAGATION NEURAL NETWORKS  (Cited by: 5)


Authors: Wei Wu, Naimin Zhang, Zhengxue Li, Long Li, Yan Liu

Affiliations: [1] Department of Applied Mathematics, Dalian University of Technology, Dalian 116024, China; [2] Mathematics and Information Science College, Wenzhou University, Wenzhou 325035, China; [3] College of Information Science and Engineering, Dalian Institute of Light Industry, Dalian 116034, China

Source: Journal of Computational Mathematics, 2008, Issue 4, pp. 613-623 (11 pages)

Funding: National Natural Science Foundation of China (10471017); Zhejiang Provincial Natural Science Foundation (Y606009)

Abstract: In this work, a gradient method with momentum for back-propagation (BP) neural networks is considered. The momentum coefficient is chosen in an adaptive manner to accelerate and stabilize the learning procedure of the network weights. Corresponding convergence results are proved.
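The method described in the abstract is the classical heavy-ball iteration, w_{k+1} = w_k - η∇E(w_k) + τ_k (w_k - w_{k-1}), with a momentum coefficient τ_k that varies per step. The sketch below illustrates this update form only; the specific adaptive rule for τ_k used here (scaled by the current gradient norm) is an illustrative assumption, not the rule proved convergent in the paper.

```python
import numpy as np

def heavy_ball(grad, w0, eta=0.1, steps=200):
    """Gradient descent with momentum (heavy-ball form):
        w_{k+1} = w_k - eta * grad(w_k) + tau_k * (w_k - w_{k-1})
    tau_k is chosen adaptively each step; the rule below (proportional to
    the gradient norm, capped at 0.9) is an assumed illustration, not the
    paper's exact choice.
    """
    w_prev = np.asarray(w0, dtype=float).copy()
    w = w_prev - eta * grad(w_prev)  # first step: plain gradient descent
    for _ in range(steps):
        g = grad(w)
        tau = min(0.9, 0.5 * np.linalg.norm(g))  # momentum fades near a minimum
        w_next = w - eta * g + tau * (w - w_prev)
        w_prev, w = w, w_next
    return w

# Usage: minimize the quadratic E(w) = 0.5 * ||w||^2, whose gradient is w itself;
# the iterates should approach the minimizer w = 0.
w_star = heavy_ball(lambda w: w, np.array([2.0, -3.0]))
```

Shrinking the momentum coefficient as the gradient vanishes is one common way to keep the iteration stable near a stationary point, which matches the abstract's stated goal of stabilizing the learning procedure.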

Keywords: back-propagation (BP) neural networks; gradient method; momentum; convergence

Classification: O241 [Science / Computational Mathematics]

 
