Affiliations: [1] College of Computer Science and Technology, Jilin University, Changchun 130012, China; [2] Aviation University of Air Force, Changchun 130022, China; [3] Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China
Source: Optics and Precision Engineering, 2014, No. 5, pp. 1379-1387 (9 pages)
Funding: Supported by the Jilin Province Science and Technology Development Program (No. 201105016)
Abstract: A 3D reconstruction system based on binocular stereo vision was improved and extended. First, a disparity refinement step was introduced into the existing system: the matching costs at the original disparity and its two neighboring disparities are fitted with a quadratic curve, and a more accurate sub-pixel disparity is then read off from that curve. Second, a motion recovery step was added: the camera motion matrix of the current view is estimated, an energy function is constructed with the tracked points and the camera motion matrices as parameters, and this energy function is optimized to reduce errors effectively and recover an accurate motion matrix. Experimental results show that the new disparity refinement step improves the accuracy of the reconstructed point cloud, reducing the 3D reconstruction error by 16.3% on average and eliminating the sheet-like point-cloud artifact; the new motion recovery and optimization step recovers the camera motion matrices accurately, reducing the mean reprojection error of the reconstruction by 95.5% after optimization. Point clouds obtained from different viewpoints are no longer isolated, and the reconstructed model is stitched together naturally as a whole.
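The two improvements described in the abstract can be sketched numerically: sub-pixel disparity refinement by fitting a parabola through three adjacent matching costs, and measuring the reprojection error that the motion-recovery optimization minimizes. This is a minimal illustration under common formulations, not the authors' implementation; the function names, the convexity check, and the specific cost/error conventions are assumptions.

```python
import numpy as np

def refine_disparity(c_prev, c_at, c_next, d):
    """Sub-pixel disparity refinement: fit a parabola through the
    matching costs at disparities d-1, d, d+1 and return the
    disparity at the parabola's minimum."""
    denom = c_prev - 2.0 * c_at + c_next
    if denom <= 0.0:
        # Costs are not locally convex; keep the integer disparity.
        return float(d)
    offset = (c_prev - c_next) / (2.0 * denom)  # vertex in [-0.5, 0.5]
    return d + offset

def reprojection_error(P, X, x):
    """Mean reprojection error: project 3D points X (N x 3) with the
    3x4 camera matrix P and compare against observed pixels x (N x 2).
    Energy functions for motion recovery typically minimize this."""
    Xh = np.hstack([X, np.ones((len(X), 1))])   # homogeneous coords
    proj = (P @ Xh.T).T
    proj = proj[:, :2] / proj[:, 2:3]           # perspective divide
    return float(np.mean(np.linalg.norm(proj - x, axis=1)))
```

For example, costs (2, 1, 4) around integer disparity 10 shift the estimate to 9.75, since the lower cost on the left pulls the parabola's minimum toward d-1; in the system described above, applying this per pixel is what removes the sheet-like (integer-quantized) point-cloud layers.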
Keywords: stereo vision; binocular vision; 3D reconstruction; disparity refinement; motion recovery
Classification code: TP391.4 (Automation and Computer Technology — Computer Application Technology)