Affiliations: [1] Zhongyuan University of Technology, Zhengzhou 450007; [2] Henan University of Engineering, Zhengzhou 451191
Source: Science Technology and Engineering (《科学技术与工程》), 2013, No. 35, pp. 10722-10726 (5 pages)
Fund: Supported by the National Natural Science Foundation of China (71173248)
Abstract: To address the complex calibration steps and the difficult, inaccurate image matching of current ranging algorithms, a long-range dynamic-foreground ranging method is proposed. Two cameras in the system capture real-time images of the environment, and a foreground detection algorithm extracts the target object from these images. Based on the perspective projection model, the three-dimensional coordinates of the target's mapping point on the calibration board are computed from the distance between the two cameras, the distance between the calibration board and the cameras, and the width and height of the calibration board. From the coordinates of the two cameras, the equations of two spatial lines are then obtained, and their intersection gives the physical coordinates of the target. Finally, the Euclidean distance formula yields the distance from the target to the camera. Experiments show that when the target distance is less than 3 000 m, the measurement errors of both methods are below 3%; when the target distance exceeds 3 000 m, the average errors of the computer-vision dynamic-foreground ranging method and the traditional ranging method are 2.90% and 25.50%, respectively, so the proposed method achieves higher accuracy.
CLC Number: TP751.1 [Automation and Computer Technology - Detection Technology and Automatic Devices]
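The triangulation step described in the abstract above (two spatial lines, one per camera, intersecting at the target) can be illustrated with a minimal Python/NumPy sketch. It assumes that the mapping points of the target on the calibration board are already known in world coordinates; the function names triangulate and range_to_target and the sample coordinates are hypothetical and are not taken from the paper.

import numpy as np

def triangulate(cam1, p1, cam2, p2):
    # Each camera centre and its mapping point on the calibration board define a
    # spatial line; the target is estimated as the point closest to both lines
    # (their intersection when the lines actually meet).
    d1 = (p1 - cam1) / np.linalg.norm(p1 - cam1)   # direction of line 1
    d2 = (p2 - cam2) / np.linalg.norm(p2 - cam2)   # direction of line 2
    w = cam1 - cam2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b                          # zero only for parallel lines
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    q1 = cam1 + t1 * d1                            # closest point on line 1
    q2 = cam2 + t2 * d2                            # closest point on line 2
    return (q1 + q2) / 2.0

def range_to_target(cam, target):
    # Euclidean distance from a camera to the triangulated target.
    return float(np.linalg.norm(target - cam))

# Hypothetical example: two cameras 1.2 m apart, calibration board about 2 m away.
cam1 = np.array([0.0, 0.0, 0.0])
cam2 = np.array([1.2, 0.0, 0.0])
p1 = np.array([0.4, 0.1, 2.0])    # mapping point of the target seen through camera 1
p2 = np.array([0.8, 0.1, 2.0])    # mapping point of the target seen through camera 2
target = triangulate(cam1, p1, cam2, p2)
print(target, range_to_target(cam1, target))   # target is roughly (0.6, 0.15, 3.0)

Using the midpoint of the two closest points, rather than a strict intersection, keeps the sketch robust when measurement noise leaves the two lines slightly skew.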