Image Retrieval Based on Independent Attention Mechanism (Cited by: 3)


Authors: ZHANG Shunyao; LI Huawang; ZHANG Yonghe; WANG Xinyu; DING Guopeng (Innovation Academy for Microsatellites of Chinese Academy of Sciences, Shanghai 201210, China; ShanghaiTech University, Shanghai 201210, China; University of Chinese Academy of Sciences, Beijing 100094, China)

Affiliations: [1] Innovation Academy for Microsatellites of Chinese Academy of Sciences, Shanghai 201210 [2] ShanghaiTech University, Shanghai 201210 [3] University of Chinese Academy of Sciences, Beijing 100094

Source: Computer Science (《计算机科学》), 2023, No. S01, pp. 318-323 (6 pages)

Abstract: In recent years, deep learning methods have come to dominate content-based image retrieval. To improve the features extracted by off-the-shelf backbones and enable the network to produce more discriminative image descriptors, this paper proposes ICSA (Independent Channel-wise and Spatial Attention), an attention module whose weights are independent of the input features. The main difference between ICSA and other attention mechanisms is that its attention weights stay the same as the input features change, whereas conventional attention modules compute their weights from the input. This also makes the module very compact (only 6.7 kB, 5.2% the size of SENet and 2.6% the size of CBAM) and relatively fast (comparable to SENet in speed, and 14.9% the runtime of CBAM). ICSA's attention is divided into two parts, channel-wise and spatial, which store weights along orthogonal directions of the input features. Experiments on the Pittsburgh dataset show that adding ICSA improves Recall@1 by 0.1% to 2.4% across different backbones.
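The input-independent design described above can be illustrated with a minimal sketch. This is an assumption-laden reconstruction from the abstract alone (the paper's exact layer design, initialization, and gating function may differ): the module stores one learnable weight per channel and one per spatial location, and simply reweights the backbone's feature map without ever computing anything from it.

```python
import numpy as np

class ICSA:
    """Minimal sketch of an input-independent attention module.

    Assumption: details are inferred from the abstract, not the paper.
    The attention weights are stored parameters, so the module holds only
    C channel weights plus H*W spatial weights, regardless of the input.
    """

    def __init__(self, channels, height, width, seed=0):
        rng = np.random.default_rng(seed)
        # Channel-wise attention: one learnable weight per feature channel.
        self.channel_w = rng.standard_normal(channels)
        # Spatial attention: one learnable weight per spatial location.
        self.spatial_w = rng.standard_normal((height, width))

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def __call__(self, feat):
        # feat: (C, H, W) feature map from a backbone.
        c = self._sigmoid(self.channel_w)[:, None, None]  # (C, 1, 1)
        s = self._sigmoid(self.spatial_w)[None, :, :]     # (1, H, W)
        # Reweight along both directions; c and s never depend on feat,
        # which is the "independent" property the abstract describes.
        return feat * c * s
```

With, say, 512 channels and a 32x32 feature map (hypothetical sizes), the module would store 512 + 1024 = 1536 float32 values, about 6 kB, which is consistent in scale with the 6.7 kB reported in the abstract.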

Keywords: content-based image retrieval; attention mechanism; feature enhancement

Classification: TP391 [Automation and Computer Technology - Computer Application Technology]

 
