Authors: WU Lintao; WANG Wenming (School of Computer Science & Technology, Beijing Institute of Technology, Beijing 100081, China)
Source: Experimental Technology and Management, 2023, No. 12, pp. 82-91 (10 pages)
Funding: National Key R&D Program of China (2020AAA0104903)
Abstract: To alleviate the modality discrepancy between pedestrian images in the cross-modality person re-identification task, a Random Channel Nearest Augment method (RCNA) and a U-shape Multiple Mutual-information Embedding network (UMME) are proposed. RCNA selects visible and infrared images of the same identity for data augmentation and generates new pedestrian images; the synthetic samples follow the real data distribution while fusing the shape and structure information of visible images with the semantic information of infrared images, thereby reducing the modality gap between the two. UMME extracts mutual-information features among pedestrians of the same identity through a mutual-information extraction module (UMI), then embeds these features into the semantic features via a feature-integration module (MSIF), strengthening the semantic-feature consistency among same-identity pedestrians. The proposed method achieves Rank-1/mAP of 70.48%/68.34% on SYSU-MM01 and 91.70%/88.42% on RegDB, achieving excellent recognition performance compared with existing methods.
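The channel-level mixing that RCNA is described as performing (combining a visible image and a same-identity infrared image into one synthetic sample) could be sketched roughly as follows. This is a minimal illustration only: the function name `rcna_augment`, the random channel-replacement strategy, and the array shapes are assumptions for the sketch, not the paper's actual implementation.

```python
import numpy as np

def rcna_augment(visible, infrared, rng=None):
    """Hypothetical sketch of one RCNA-style augmentation step.

    visible  : (H, W, 3) uint8 RGB image of a person.
    infrared : (H, W)    uint8 infrared image of the SAME identity.

    Randomly overwrites one or two of the visible image's colour
    channels with the infrared image, so the synthetic sample keeps
    the visible image's shape/structure while injecting infrared
    semantic information.
    """
    rng = np.random.default_rng() if rng is None else rng
    out = visible.copy()
    # Choose how many RGB channels to overwrite (1 or 2 of 3),
    # then pick which ones without repetition.
    k = int(rng.integers(1, 3))
    channels = rng.choice(3, size=k, replace=False)
    for c in channels:
        out[..., c] = infrared
    return out
```

In practice such a step would sit inside the training data pipeline, applied with some probability per sample so the model also sees unmodified visible and infrared images.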
Classification code: TP183 [Automation and Computer Technology - Control Theory and Control Engineering]