Improvement of Learning Rate of Feed Forward Neural Network Based on Weight Gradient (基于权值变化的BP神经网络自适应学习率改进研究)    Cited by: 19


Authors: 朱振国 (ZHU Zhen-Guo) [1]; 田松禄 (TIAN Song-Lu) (College of Information Science and Engineering, Chongqing Jiaotong University, Chongqing 400074, China)

Affiliation: [1] College of Information Science and Engineering, Chongqing Jiaotong University, Chongqing 400074, China

Source: Computer Systems & Applications (《计算机系统应用》), 2018, No. 7, pp. 205-210 (6 pages)

Abstract: In traditional neural networks the learning rate is set by hand from experience; a value that is too large or too small can prevent convergence or make it unacceptably slow. This paper proposes an adaptive learning-rate method based on weight change, which removes the dependence of the learning rate on human experience and improves error accuracy, and combines a normal distribution model with the gradient ascent method to speed up convergence. Taking the BP neural network as an example and comparing it against a network with a fixed learning rate on the classical XOR problem, simulation results show that the improved network converges faster and reaches a smaller error.
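The abstract only outlines the approach; the exact adaptive rule and the role of the normal distribution model are not given here. The following is a minimal, hypothetical Python sketch of the general idea: a small BP (feedforward) network trained on the XOR problem, where the learning rate for each epoch is rescaled by a Gaussian-shaped function of the current weight-gradient magnitude. The network sizes, constants, and the specific rescaling formula are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR training data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 2-4-1 feedforward (BP) network; sizes chosen only for this demo
W1 = rng.normal(scale=1.0, size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=1.0, size=(4, 1))
b2 = np.zeros((1, 1))

eta_min, eta_gain, sigma = 0.5, 2.0, 0.05   # hypothetical constants
mse = 1.0
for epoch in range(50000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    err = out - y
    mse = float(np.mean(err ** 2))
    if mse < 1e-3:
        break

    # backward pass: standard BP gradients for MSE loss with sigmoid units
    d_out = err * out * (1.0 - out)
    gW2 = h.T @ d_out / len(X)
    gb2 = d_out.mean(axis=0, keepdims=True)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    gW1 = X.T @ d_h / len(X)
    gb1 = d_h.mean(axis=0, keepdims=True)

    # Adaptive learning rate (illustrative only): map the current weight-gradient
    # norm through a Gaussian-shaped curve, so a larger weight change gets a larger
    # step and the step shrinks toward eta_min as training settles.
    gnorm = np.sqrt(np.sum(gW1 ** 2) + np.sum(gW2 ** 2))
    eta = eta_min + eta_gain * (1.0 - np.exp(-(gnorm / sigma) ** 2))

    # weight update with the adapted rate
    W1 -= eta * gW1; b1 -= eta * gb1
    W2 -= eta * gW2; b2 -= eta * gb2

print(f"epoch={epoch}, MSE={mse:.5f}, outputs={out.ravel().round(3)}")
```

In the paper the comparison is against the same BP network with a fixed learning rate on the XOR task; the sketch above only shows where an adaptive rule of this kind plugs into the training loop.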

Keywords: neural network; adaptive learning rate; normal distribution model; gradient ascent method; XOR problem

Classification: TP183 [Automation and Computer Technology - Control Theory and Control Engineering]

 
