Authors: 翟良松[1], 姚莉秀[1], 杨杰[1], 王芳林[1], 刘瑞明[1], 刘洋[1]
Affiliation: [1] Institute of Image Processing and Pattern Recognition, Shanghai Jiao Tong University, Shanghai 200240, China
Source: Infrared and Laser Engineering (《红外与激光工程》), 2008, No. 6, pp. 1101-1105 (5 pages)
Funding: National Natural Science Foundation of China (60675023); National Defense 973 Program (51323020203-2)
Abstract: Target description is one of the most important steps in an object tracking algorithm and is key to building a robust visual tracking system. To describe the target better, a saliency feature extraction algorithm based on visual attention is introduced. By imitating the human visual mechanism, the attention model automatically picks out the parts of a scene that attract attention most strongly, so an attention-based target description captures the target's most salient features and yields a target model that is better separated from the background: the salient pixels of the target are emphasized while the background and less salient pixels are suppressed. To incorporate spatial information, the target's saliency map is projected horizontally and vertically, and the Mean Shift iteration is applied directly to these spatial projections, instead of to a color histogram, to obtain the target's position. Experiments show that the proposed visual attention based spatial projection (VABSP) kernel tracking algorithm tracks moving targets in cluttered scenes more robustly and precisely than the classical Mean Shift tracking algorithm, while running about 50% faster.
Classification: TP391 [Automation and Computer Technology - Computer Application Technology]
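
To make the localization step described in the abstract concrete, below is a minimal Python/NumPy sketch of the idea: project a saliency map onto the horizontal and vertical axes, then run a one-dimensional Mean Shift iteration on each projection to estimate the target centre. The function names (mean_shift_1d, localize_target), the flat kernel, and all parameter values are illustrative assumptions, not the authors' implementation; in particular, the attention-based saliency computation is omitted and replaced by a synthetic blob in the demo.

import numpy as np

def mean_shift_1d(weights, start, bandwidth=15, max_iter=20, eps=0.5):
    """One-dimensional Mean Shift over a projection profile (flat kernel).

    weights : 1-D array of projected saliency values
    start   : initial coordinate of the target centre along this axis
    """
    pos = float(start)
    coords = np.arange(len(weights), dtype=float)
    for _ in range(max_iter):
        lo = max(int(pos - bandwidth), 0)
        hi = min(int(pos + bandwidth) + 1, len(weights))
        w = weights[lo:hi]
        if w.sum() == 0:
            break
        # Weighted mean of coordinates inside the window is the new centre.
        new_pos = (coords[lo:hi] * w).sum() / w.sum()
        if abs(new_pos - pos) < eps:
            pos = new_pos
            break
        pos = new_pos
    return pos

def localize_target(saliency_map, prev_center, bandwidth=15):
    """Localize the target from a saliency map via its spatial projections."""
    proj_x = saliency_map.sum(axis=0)   # vertical projection: profile along x
    proj_y = saliency_map.sum(axis=1)   # horizontal projection: profile along y
    cx = mean_shift_1d(proj_x, prev_center[0], bandwidth)
    cy = mean_shift_1d(proj_y, prev_center[1], bandwidth)
    return (cx, cy)

if __name__ == "__main__":
    # Synthetic saliency blob centred at (x=80, y=60); start the search from (70, 50).
    yy, xx = np.mgrid[0:120, 0:160]
    saliency = np.exp(-(((xx - 80) ** 2) + ((yy - 60) ** 2)) / (2 * 10.0 ** 2))
    print(localize_target(saliency, prev_center=(70, 50)))

Reducing the 2-D search to two 1-D profiles is presumably what allows the reported speed-up over the classical colour-histogram Mean Shift tracker; in the actual method the saliency map would come from the visual attention model rather than the synthetic Gaussian used above.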