DBAdam: An Adaptive Gradient Descent Algorithm with Dynamic Bounds


Authors: ZHANG Shuai [1], LIU Yaoqi, JIANG Zhixia [1]

Affiliations: [1] School of Mathematics and Statistics, Changchun University of Science and Technology, Changchun 130022; [2] School of Automotive Engineering, Jilin University, Changchun 130012

Source: Journal of Changchun University of Science and Technology (Natural Science Edition), 2024, No. 5, pp. 105-111 (7 pages)

Funding: Jilin Provincial Natural Science Foundation (YDZJ202201ZYTS519).

Abstract: In neural networks, the gradient descent algorithm is the core component for optimizing the network's weight and bias parameters, and it largely determines the network's performance. Although many adaptive algorithms, such as AdaGrad, RMSProp, and Adam, converge quickly in the early stages of training, their generalization ability is often inferior to that of the SGDM algorithm. To combine the respective advantages of adaptive algorithms and SGDM, the DBAdam algorithm is proposed. Using gradient and learning-rate information, DBAdam constructs dynamic upper- and lower-bound functions on the adaptive learning rate, constraining the learning rate within a controllable range. This allows the algorithm to adapt better to the gradient changes of different parameters and thereby accelerates convergence. DBAdam was evaluated on three benchmark datasets using a variety of deep neural network models, and the results show that the algorithm achieves good convergence performance.
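The bound-and-clip idea described in the abstract can be sketched as follows. The abstract does not give DBAdam's exact bound functions, so everything below is an illustrative assumption: the function name `dbadam_like_step`, the hyperparameter `final_lr`, and the particular lower/upper bound schedules (modeled after AdaBound-style bounds that tighten toward a fixed SGD-like rate as training progresses) are placeholders, not the authors' implementation.

```python
import numpy as np

def dbadam_like_step(param, grad, m, v, t, base_lr=1e-3,
                     beta1=0.9, beta2=0.999, eps=1e-8, final_lr=0.1):
    """One step of a bounded-Adam-style update (illustrative sketch).

    The dynamic bound functions below are placeholders; the paper's
    exact DBAdam bounds are not specified in the abstract.
    """
    # Standard Adam first/second moment estimates with bias correction
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)

    # Per-parameter adaptive step size, as in Adam
    step = base_lr / (np.sqrt(v_hat) + eps)

    # Dynamic lower/upper bounds that tighten toward final_lr as t grows,
    # so the update interpolates from Adam-like toward SGDM-like behavior
    lower = final_lr * (1 - 1 / (beta2 * t + 1))
    upper = final_lr * (1 + 1 / (beta2 * t))
    step = np.clip(step, lower, upper)

    param = param - step * m_hat
    return param, m, v
```

Clipping the adaptive step between a rising lower bound and a falling upper bound keeps early updates fast while preventing the extreme per-parameter rates that hurt Adam's generalization, which matches the trade-off the abstract describes.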

Keywords: neural networks; adaptive algorithms; SGDM algorithm; convergence

Classification: TP181 (Automation and Computer Technology: Control Theory and Control Engineering)
