Source: 《图像与信号处理》 (Journal of Image and Signal Processing), 2023, Issue 2, pp. 200-209 (10 pages)
Abstract: With the development of few-shot learning, meta-learning has become a popular few-shot learning framework whose goal is to build models for few-shot classification tasks that adapt quickly to limited data at low computational cost. Recent work on attention has shown that channel attention improves feature extraction to some extent, but it ignores positional information, which is important for learning well from the limited data available in few-shot tasks. Motivated by this, this paper proposes a new method that effectively combines positional information with the extracted features: a classifier equipped with a positional-information attention module is first pre-trained on all base classes, and meta-learning is then performed with a nearest-centroid few-shot classification algorithm. In experiments on two standard benchmarks, the method outperforms current mainstream few-shot image classification methods by 1.23% and 1.02% on the 1-shot and 5-shot tasks of Mini-ImageNet, and by 0.85% and 0.78% on Tiered-ImageNet. The experiments show that the method effectively exploits positional information and improves the accuracy of few-shot image classification.
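For concreteness, the PyTorch sketch below illustrates the two ingredients the abstract names: an attention block that preserves positional information by pooling along the height and width axes separately (a coordinate-attention-style design, which is one common way to realize "positional information attention"), and a nearest-centroid episode classifier. All names here (PositionalAttention, nearest_centroid_logits, the reduction parameter) are illustrative assumptions; the paper's actual module and training code are not given in this abstract.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PositionalAttention(nn.Module):
    """Sketch of a position-aware attention block (coordinate-attention style).

    Unlike plain channel attention, which pools the whole feature map to a
    single vector, this block pools along H and W separately, so the learned
    channel weights retain where a feature occurred. Assumed design, not the
    paper's published architecture.
    """
    def __init__(self, channels, reduction=8):
        super().__init__()
        hidden = max(channels // reduction, 4)
        self.conv1 = nn.Conv2d(channels, hidden, kernel_size=1)
        self.bn = nn.BatchNorm2d(hidden)
        self.conv_h = nn.Conv2d(hidden, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(hidden, channels, kernel_size=1)

    def forward(self, x):
        b, c, h, w = x.shape
        # Direction-aware pooling: one descriptor per row and per column.
        x_h = x.mean(dim=3, keepdim=True)                      # (b, c, h, 1)
        x_w = x.mean(dim=2, keepdim=True).transpose(2, 3)      # (b, c, w, 1)
        y = F.relu(self.bn(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                  # (b, c, h, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.transpose(2, 3)))  # (b, c, 1, w)
        # Reweight features with attention maps that encode position.
        return x * a_h * a_w

def nearest_centroid_logits(support, support_labels, query, n_way):
    """Nearest-centroid classification for one few-shot episode.

    Each class centroid (prototype) is the mean of that class's support
    embeddings; queries score each class by negative Euclidean distance.
    """
    centroids = torch.stack(
        [support[support_labels == k].mean(dim=0) for k in range(n_way)]
    )                                                          # (n_way, d)
    return -torch.cdist(query, centroids)                      # (n_query, n_way)
```

Under this reading of the method, the attention block is inserted into the backbone during base-class pre-training, and the frozen or fine-tuned embeddings are then classified per episode by nearest_centroid_logits; argmax over the returned logits gives the predicted class.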
Keywords: few-shot learning; meta-learning; positional information attention; long-range dependencies; nearest-neighbor classifier
Classification code: TP3 [Automation and Computer Technology — Computer Science and Technology]