Source: Optics and Precision Engineering (《光学精密工程》), 2011, No. 3, pp. 703-708 (6 pages)
Funding: National High-Tech Research and Development Program of China (863 Program), Grant No. 2006AA040307
Abstract: The traditional fragment-based tracking algorithm is computationally expensive, which makes real-time tracking of moving targets difficult. To address this, an improved fragment-based tracking algorithm is proposed. First, to reduce the adverse effect of background noise on tracking performance, the rectangular window containing the target is divided into finer fragments. Then, the search center and search range are determined from the target's motion information, and a hierarchical adaptive search is applied in which each level uses a different search pattern to approach, step by step, the position most similar to the target template, so that little computation is wasted on invalid positions. Finally, the implementation and optimization of the improved algorithm on a DSP are described. Experimental results show that the improved algorithm processes 768 pixel × 576 pixel images on a DM642 at 30 frame/s; compared with the traditional fragment-based tracking algorithm, it improves tracking accuracy and reduces computation time by about 47.5%. The improved algorithm overcomes the drawbacks of the traditional fragment-based algorithm and achieves real-time tracking of moving targets on an embedded system.
Classification: TP391 [Automation and Computer Technology / Computer Application Technology]
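
The abstract walks through three ideas: dividing the target window into a finer grid of fragments, predicting the search center from the target's motion, and running a hierarchical adaptive search whose step size shrinks level by level so that invalid positions are mostly skipped. The C sketch below illustrates how those pieces could fit together; the 4x4 fragment grid, the 32x32 template, the SAD similarity measure, the 8/4/2/1 probe steps, and the synthetic test frame are all assumed for illustration, since the abstract does not give the actual parameters or the DM642-specific optimizations.

/*
 * Minimal sketch of the scheme outlined in the abstract: the target window
 * is split into a grid of fragments, the search center is predicted from the
 * target's motion, and a coarse-to-fine (hierarchical) search probes candidate
 * positions with a shrinking step size.  The 4x4 fragment grid, 32x32 template,
 * SAD cost, 8/4/2/1 step sizes, and the synthetic test image are illustrative
 * assumptions, not the paper's parameters or its DM642-optimized code.
 */
#include <stdio.h>
#include <stdlib.h>
#include <limits.h>

#define IMG_W 768   /* frame size used in the paper's experiments */
#define IMG_H 576
#define TPL_W 32    /* assumed template (target window) size */
#define TPL_H 32
#define FRAG  4     /* assumed 4x4 grid of fragments */

/* SAD of one template fragment placed at (x, y) in the frame;
 * (fx, fy) indexes the fragment inside the FRAG x FRAG grid. */
static long frag_sad(const unsigned char *frame, const unsigned char *tpl,
                     int x, int y, int fx, int fy)
{
    int fw = TPL_W / FRAG, fh = TPL_H / FRAG;
    long sad = 0;
    for (int j = 0; j < fh; ++j)
        for (int i = 0; i < fw; ++i) {
            int ti = fx * fw + i, tj = fy * fh + j;
            int d = frame[(y + tj) * IMG_W + (x + ti)] - tpl[tj * TPL_W + ti];
            sad += d < 0 ? -d : d;
        }
    return sad;
}

/* Fragment-based cost of one candidate window: here simply the sum of the
 * per-fragment SADs; a real tracker could down-weight fragments dominated
 * by background noise. */
static long window_cost(const unsigned char *frame, const unsigned char *tpl,
                        int x, int y)
{
    long cost = 0;
    for (int fy = 0; fy < FRAG; ++fy)
        for (int fx = 0; fx < FRAG; ++fx)
            cost += frag_sad(frame, tpl, x, y, fx, fy);
    return cost;
}

/* Hierarchical adaptive search: starting from a motion-predicted center,
 * probe a 3x3 neighbourhood with step sizes 8, 4, 2, 1 and re-center on the
 * best candidate at each level, so most positions are never evaluated. */
static void hierarchical_search(const unsigned char *frame,
                                const unsigned char *tpl,
                                int pred_x, int pred_y,
                                int *best_x, int *best_y)
{
    int cx = pred_x, cy = pred_y;
    for (int step = 8; step >= 1; step /= 2) {
        long best = LONG_MAX;
        int bx = cx, by = cy;
        for (int dy = -1; dy <= 1; ++dy)
            for (int dx = -1; dx <= 1; ++dx) {
                int x = cx + dx * step, y = cy + dy * step;
                if (x < 0 || y < 0 || x + TPL_W > IMG_W || y + TPL_H > IMG_H)
                    continue;                /* candidate outside the frame */
                long c = window_cost(frame, tpl, x, y);
                if (c < best) { best = c; bx = x; by = y; }
            }
        cx = bx; cy = by;                    /* refine around the new optimum */
    }
    *best_x = cx; *best_y = cy;
}

int main(void)
{
    static unsigned char frame[IMG_W * IMG_H];
    static unsigned char tpl[TPL_W * TPL_H];
    int true_x = 300, true_y = 200;          /* ground-truth target position */

    /* Synthetic frame: a bright blob centered on the target, so the cost
     * surface is roughly unimodal and the search can be demonstrated. */
    for (int j = 0; j < IMG_H; ++j)
        for (int i = 0; i < IMG_W; ++i) {
            int d = abs(i - (true_x + TPL_W / 2)) + abs(j - (true_y + TPL_H / 2));
            frame[j * IMG_W + i] = (unsigned char)(d > 255 ? 0 : 255 - d);
        }
    for (int j = 0; j < TPL_H; ++j)          /* template cut from the frame */
        for (int i = 0; i < TPL_W; ++i)
            tpl[j * TPL_W + i] = frame[(true_y + j) * IMG_W + (true_x + i)];

    int x, y;                                /* prediction is off by (+5, -3) */
    hierarchical_search(frame, tpl, true_x + 5, true_y - 3, &x, &y);
    printf("estimated (%d, %d), ground truth (%d, %d)\n", x, y, true_x, true_y);
    return 0;
}

Because each level only re-centers on the best of at most nine probes, the number of cost evaluations grows with the number of levels rather than with the area of the search window, which is the saving the abstract attributes to the hierarchical search; the motion-predicted start point is what keeps the coarse level close enough to the true position for this to work.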