Authors: LIU Yibo; XI Zhenghao[1] (刘艺博, 奚峥皓)
Affiliation: [1] College of Electrical and Electronic Engineering, Shanghai University of Engineering Science, Shanghai 201620, China (上海工程技术大学电子电气工程学院)
Source: Computer Engineering and Applications (《计算机工程与应用》), 2023, No. 13, pp. 156-163 (8 pages)
Abstract: Aiming at the problem of poor object-identity association caused by the low correlation of object information in multi-object tracking, this paper proposes a multi-object tracking algorithm based on key point detection and association. First, the object is modeled by its central key point, and CenterNet is used to detect and locate that point. The deep features of the object are combined with the key-point scale features, and a joint feature extractor is constructed based on the explicit-implicit relationship between the two kinds of observations. Then, taking the joint feature as the object state, the object state in the next frame is estimated with a hidden Markov model. Finally, a "secondary correlation" matching mechanism based on the object's motion information and key-point scale information is proposed to associate the estimated state with the detected objects and obtain the optimal matching result. Simulation experiments on the public MOT17 dataset and comparisons with several mainstream algorithms show that the algorithm performs better on tracking-accuracy metrics and is robust to the identity-switch problem.
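The abstract does not spell out how the "secondary correlation" matching is computed, so the following is only a minimal sketch of one plausible reading: a two-stage Hungarian assignment in which stage one matches on the distance between predicted and detected center key points (motion cue) and stage two retries the leftover pairs on key-point scale similarity. All function names, cost definitions, and thresholds here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def center_distance_cost(pred_centers, det_centers, img_diag):
    """Pairwise normalized distance between predicted track centers and
    detected key points (stage-1 motion cue). Shape: (num_tracks, num_dets)."""
    diff = pred_centers[:, None, :] - det_centers[None, :, :]
    return np.linalg.norm(diff, axis=-1) / img_diag


def scale_similarity_cost(pred_scales, det_scales):
    """Pairwise cost from key-point scale (width/height) agreement
    (stage-2 cue): 0 means identical scale, 1 means very different."""
    ratio = np.minimum(pred_scales[:, None, :], det_scales[None, :, :]) / \
            np.maximum(pred_scales[:, None, :], det_scales[None, :, :])
    return 1.0 - ratio.mean(axis=-1)


def two_stage_associate(pred_centers, pred_scales, det_centers, det_scales,
                        img_diag, dist_thresh=0.1, scale_thresh=0.3):
    """Illustrative two-stage ("secondary correlation") matching: stage 1
    assigns on center-point distance, stage 2 retries unmatched pairs on
    scale similarity. Returns (matches, unmatched_tracks, unmatched_dets)."""
    n_trk, n_det = len(pred_centers), len(det_centers)
    matches, un_trk, un_det = [], list(range(n_trk)), list(range(n_det))

    def solve(cost, thresh, trk_idx, det_idx):
        # Hungarian assignment, keeping only pairs below the cost threshold.
        if cost.size == 0:
            return []
        rows, cols = linear_sum_assignment(cost)
        return [(trk_idx[r], det_idx[c]) for r, c in zip(rows, cols)
                if cost[r, c] <= thresh]

    # Stage 1: motion cue (predicted center-point distance).
    cost1 = center_distance_cost(pred_centers, det_centers, img_diag)
    stage1 = solve(cost1, dist_thresh, un_trk, un_det)
    matches += stage1
    un_trk = [t for t in un_trk if t not in {m[0] for m in stage1}]
    un_det = [d for d in un_det if d not in {m[1] for m in stage1}]

    # Stage 2: key-point scale cue on the remaining tracks/detections.
    if un_trk and un_det:
        cost2 = scale_similarity_cost(pred_scales[un_trk], det_scales[un_det])
        stage2 = solve(cost2, scale_thresh, un_trk, un_det)
        matches += stage2
        un_trk = [t for t in un_trk if t not in {m[0] for m in stage2}]
        un_det = [d for d in un_det if d not in {m[1] for m in stage2}]

    return matches, un_trk, un_det
```

The cascaded structure mirrors the abstract's description of using motion information first and key-point scale information second, so that objects missed by the motion cue (e.g. after abrupt movement) still get a chance to keep their identity; the actual cost terms and thresholds used in the paper may differ.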
Keywords: machine vision; multi-object tracking; key point detection; object information correlation
Classification code: TP29 (Automation and Computer Technology — Detection Technology and Automatic Devices)