Authors: 李召龙 [1], 沈同圣 [2], 娄树理 [1] (LI Zhao-long, SHEN Tong-sheng, LOU Shu-li)
Affiliations: [1] Department of Control Engineering, Navy Aeronautical Engineering University, Yantai 264001, Shandong, China; [2] National Defense Science and Technology Information Center, Beijing 100142, China
Source: Laser &amp; Infrared (《激光与红外》), 2017, No. 1, pp. 119-123 (5 pages)
Fund: Supported by the National Natural Science Foundation of China (No. 61303192)
Abstract: Detecting moving objects in a scene by computing the optical flow field is an important research topic in computer vision, and the accuracy of the optical flow computation directly determines the accuracy of target detection. In real videos, background motion often makes moving targets inconspicuous in the resulting optical flow field. To address this, an optical flow computation method based on a blocked integral projection registration algorithm is proposed. First, the motion parameters of the image background are estimated with the proposed blocked integral projection registration algorithm. The background motion is then compensated, and the Lucas-Kanade (L-K) algorithm is applied to compute the optical flow field over the effective region of the compensated images. The algorithm is verified on real video sequences and compared with the classical L-K algorithm. The comparison shows that moving targets are more prominent in the optical flow field obtained with the proposed algorithm, demonstrating its effectiveness.
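The abstract outlines a three-step pipeline: estimate background motion from blocked integral projections, compensate that motion, and then run L-K optical flow on the effective region. Below is a minimal Python/OpenCV sketch of that pipeline. It assumes a purely translational background model; the block grid, the cross-correlation of projection curves, and the use of OpenCV's pyramidal Lucas-Kanade tracker are illustrative choices, not the authors' exact implementation.

```python
# Sketch of: blocked integral projection registration -> background motion
# compensation -> Lucas-Kanade optical flow on the compensated frame.
# Assumes pure translation of the background (an assumption, not the paper's full model).
import cv2
import numpy as np

def projection_shift(p_ref, p_cur):
    """Estimate the 1-D shift d such that p_cur[n] ~ p_ref[n - d],
    via cross-correlation of the mean-removed projection curves."""
    a = p_ref - p_ref.mean()
    b = p_cur - p_cur.mean()
    corr = np.correlate(b, a, mode="full")
    return int(np.argmax(corr)) - (len(a) - 1)

def block_projection_registration(ref, cur, grid=(4, 4)):
    """Estimate the global background translation (dx, dy) by taking the
    median of per-block shifts of column/row integral projections."""
    h, w = ref.shape
    bh, bw = h // grid[0], w // grid[1]
    dxs, dys = [], []
    for i in range(grid[0]):
        for j in range(grid[1]):
            rb = ref[i*bh:(i+1)*bh, j*bw:(j+1)*bw].astype(np.float32)
            cb = cur[i*bh:(i+1)*bh, j*bw:(j+1)*bw].astype(np.float32)
            dxs.append(projection_shift(rb.sum(axis=0), cb.sum(axis=0)))  # column projection -> horizontal shift
            dys.append(projection_shift(rb.sum(axis=1), cb.sum(axis=1)))  # row projection -> vertical shift
    return float(np.median(dxs)), float(np.median(dys))

def compensated_lk_flow(prev_gray, cur_gray):
    """Compensate the estimated background translation, then track sparse
    features with pyramidal Lucas-Kanade; residual motion highlights moving targets."""
    dx, dy = block_projection_registration(prev_gray, cur_gray)
    # Warp the current frame back by the estimated background motion.
    M = np.float32([[1, 0, -dx], [0, 1, -dy]])
    cur_comp = cv2.warpAffine(cur_gray, M, (cur_gray.shape[1], cur_gray.shape[0]))
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                  qualityLevel=0.01, minDistance=7)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, cur_comp, pts, None)
    flow = (nxt - pts)[status.ravel() == 1]
    return flow, (dx, dy)
```

In this sketch, feature points whose residual displacement stays large after compensation correspond to candidate moving targets, which is the effect the comparison with the plain L-K result is meant to show.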
CLC Number: TP391.41 [Automation and Computer Technology: Computer Application Technology]