An incremental learning algorithm based on GRU (cited by: 1)

Authors: HUANG Zhenfeng (黄振峰)[1]; WANG Haoyang (王浩洋) (School of Mechanical Engineering, Guangxi University, Nanning 530004, China)

Affiliation: [1] School of Mechanical Engineering, Guangxi University, Nanning 530004, China

Source: Journal of Guangxi University (Natural Science Edition), 2023, No. 3, pp. 683-691 (9 pages)

Funding: Guangxi Innovation-Driven Development Science and Technology Major Project (Guike AA17204074).

Abstract: To improve the flexibility of the model during incremental learning, this paper proposes an incremental learning algorithm based on the gated recurrent unit (GRU). To keep the model stable, the low-level features of the backbone feature extraction network are shared across tasks. A feature fusion module and a pseudo-incremental learning algorithm are designed on top of the GRU. The feature fusion module fuses the high-level features that the neural network produces for different tasks and rebuilds the correspondence between the feature representations and the classifier, making the network flexible during incremental learning. To verify the stability of the algorithm, comparative experiments with other incremental learning algorithms were conducted on the CIFAR100, miniImageNet, and ImageNet1000 datasets, achieving performance degradation rates of 0.1730, 0.1747, and 0.2230, improvements of 0.0289, 0.0594, and 0.0116 over the best baseline algorithm. To verify the flexibility of the algorithm, it achieves a top-1 accuracy of 0.7534 on the ImageNet1000 dataset, 0.1290 higher than the best baseline algorithm.

Keywords: incremental learning; catastrophic forgetting; neural network; feature fusion; gated recurrent unit

Classification code: TP391.41 [Automation and Computer Technology: Computer Application Technology]
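The abstract only sketches the architecture, but its core idea is a GRU that recurrently fuses the high-level features produced for each task on top of a shared backbone. Below is a minimal, illustrative NumPy sketch of that fusion step; the class name `GRUFusion`, the feature dimension, and the random (untrained) weights are assumptions for illustration, not the paper's actual module.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class GRUFusion:
    """Hypothetical sketch of a GRU-cell feature fusion module:
    it consumes the high-level feature vector of each task in turn
    and keeps a fused hidden state. Weights here are random; a real
    module would be trained end to end with the backbone."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        # Input (W) and recurrent (U) weights for the update gate z,
        # reset gate r, and candidate state h~ of a standard GRU cell.
        self.Wz = rng.standard_normal((dim, dim)) * 0.1
        self.Uz = rng.standard_normal((dim, dim)) * 0.1
        self.Wr = rng.standard_normal((dim, dim)) * 0.1
        self.Ur = rng.standard_normal((dim, dim)) * 0.1
        self.Wh = rng.standard_normal((dim, dim)) * 0.1
        self.Uh = rng.standard_normal((dim, dim)) * 0.1

    def step(self, x, h):
        z = sigmoid(x @ self.Wz + h @ self.Uz)  # how much to update
        r = sigmoid(x @ self.Wr + h @ self.Ur)  # how much history to reuse
        h_cand = np.tanh(x @ self.Wh + (r * h) @ self.Uh)
        return (1 - z) * h + z * h_cand

    def fuse(self, task_features):
        """Run the GRU over the sequence of per-task feature vectors;
        the final hidden state serves as the fused representation."""
        h = np.zeros_like(task_features[0])
        for x in task_features:
            h = self.step(x, h)
        return h


# Example: fuse the (hypothetical) backbone features of three tasks.
fusion = GRUFusion(dim=8)
feats = [np.random.default_rng(i).standard_normal(8) for i in range(3)]
fused = fusion.fuse(feats)
print(fused.shape)  # (8,)
```

The fused vector would then feed a classifier head, so that adding a new task only appends one more step to the fusion sequence rather than retraining the shared low-level backbone.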
