Authors: 李豪 (LI Hao); 袁广林 (YUAN Guang-lin); 秦晓燕 (QIN Xiao-yan); 琚长瑞 (JU Chang-rui); 朱虹 (ZHU Hong)
Affiliation: [1] Department of Information Engineering, Army Academy of Artillery and Air Defense of PLA, Hefei, Anhui 230031, China
Source: Acta Electronica Sinica (《电子学报》), 2023, Issue 1, pp. 105-116 (12 pages)
Funding: Anhui Provincial Natural Science Foundation (No. 2008085QF325).
Abstract: In recent years, the state representation of the target in object tracking has evolved from a coarse bounding box to a fine-grained segmentation mask. However, existing methods obtain the object mask through region segmentation, which is slow, and the mask accuracy is limited by the tracking bounding box. To address these problems, this paper proposes an object contour tracking method based on a correlation filter with a spatially weighted log-likelihood ratio and Deep Snake. The method consists of three stages: in the first stage, the proposed correlation filter with a spatially weighted log-likelihood ratio estimates the initial bounding box of the object; in the second stage, Deep Snake deforms the initial bounding box into the object contour; in the third stage, the tracking result is fitted from the object contour. Experiments on the OTB (Object Tracking Benchmark)-2015 and VOT (Visual Object Tracking)-2018 datasets show that the proposed method outperforms state-of-the-art tracking approaches.
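The abstract describes a three-stage pipeline (correlation-filter localization, contour deformation via Deep Snake, box fitting from the contour). The sketch below only illustrates that flow under simplifying assumptions and is not the authors' implementation: stage 1 uses a plain FFT cross-correlation in place of the spatially weighted log-likelihood-ratio filter, stage 2 merely samples an initial contour on the estimated box instead of running the Deep Snake network, and stage 3 fits an axis-aligned box to the contour points. All function names and parameters here are hypothetical.

```python
# Structural sketch of the three-stage contour-tracking flow described in the
# abstract. Simplified stand-ins only; not the method proposed in the paper.
import numpy as np


def stage1_correlation_peak(search_patch: np.ndarray,
                            filter_template: np.ndarray) -> tuple[int, int]:
    """Stage 1 (simplified): locate the target as the peak of a circular
    cross-correlation response (the paper uses a spatially weighted
    log-likelihood-ratio correlation filter instead)."""
    response = np.fft.ifft2(np.fft.fft2(search_patch) *
                            np.conj(np.fft.fft2(filter_template))).real
    peak = np.unravel_index(np.argmax(response), response.shape)
    return int(peak[0]), int(peak[1])


def stage2_initial_contour(cy: int, cx: int, h: int, w: int,
                           n_points: int = 40) -> np.ndarray:
    """Stage 2 (simplified): sample an initial contour inscribed in the
    estimated box; Deep Snake would then deform these points to the object."""
    t = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    ys = cy + (h / 2.0) * np.sin(t)
    xs = cx + (w / 2.0) * np.cos(t)
    return np.stack([ys, xs], axis=1)          # (n_points, 2) in (y, x) order


def stage3_fit_box(contour: np.ndarray) -> tuple[float, float, float, float]:
    """Stage 3 (simplified): fit an axis-aligned box (y, x, h, w) to the contour."""
    y_min, x_min = contour.min(axis=0)
    y_max, x_max = contour.max(axis=0)
    return float(y_min), float(x_min), float(y_max - y_min), float(x_max - x_min)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    filter_template = rng.standard_normal((64, 64))
    # Toy search patch: the "target" is the template shifted by (20, 30).
    search_patch = np.roll(filter_template, shift=(20, 30), axis=(0, 1))
    cy, cx = stage1_correlation_peak(search_patch, filter_template)
    contour = stage2_initial_contour(cy, cx, h=20, w=30)
    print(stage3_fit_box(contour))             # box fitted around the contour
```

In the toy run, the correlation peak recovers the synthetic (20, 30) shift, and the fitted box simply re-encloses the sampled contour; in the paper, the contour produced by Deep Snake is what the final tracking result is fitted from.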
Keywords: object tracking; deep active contour; correlation filter; spatial weighting; log-likelihood ratio; video object segmentation
Classification: TP391.41 [Automation and Computer Technology - Computer Application Technology]