Affiliations: [1] School of Information Technology, Zhejiang Chinese Medical University, Hangzhou 310053, Zhejiang, China; [2] State Key Laboratory of Fluid Power Transmission and Control, Zhejiang University, Hangzhou 310027, Zhejiang, China
Source: 《浙江大学学报(工学版)》 (Journal of Zhejiang University: Engineering Science), 2012, No. 11, pp. 2061-2067 (7 pages)
Funding: Supported by the Zhejiang Provincial Natural Science Foundation of China (LQ12F01004)
Abstract: To address the low reliability and weak robustness of most three-dimensional reconstruction algorithms under complex backgrounds, a three-dimensional reconstruction algorithm suited to complex-background images is proposed. A general framework for the three-dimensional reconstruction of spatial points was investigated, and its specific implementation procedure was analyzed. To improve the reliability and robustness of three-dimensional reconstruction under complex backgrounds, a confidence filter, a left-right consistency filter and a uniqueness filter were applied, respectively, to the textureless or low-texture regions, the occluded regions and the depth-discontinuity regions of the initial disparity map, eliminating false matching points. Three-dimensional reconstruction of spatial points was then performed on the filtered dense disparity map to obtain the depth information of image points under a complex background. Experimental results show that the proposed method is computationally efficient and highly robust; satisfactory three-dimensional reconstruction accuracy is obtained even when the image background changes dynamically, and the method can provide reliable depth information for computer-assisted surgical diagnosis systems.
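The abstract describes a disparity-filtering pipeline (confidence, left-right consistency and uniqueness filters on an initial disparity map) followed by spatial-point reconstruction. The sketch below is only an illustration under assumptions that are not from the paper: a rectified stereo pair, OpenCV's SGBM for the initial disparity maps, a local-texture mask standing in for the confidence filter, a warp-based check standing in for the left-right consistency filter, SGBM's built-in uniquenessRatio standing in for the uniqueness filter, and depth recovered as Z = fB/d. File names, thresholds, focal length and baseline are placeholders.

```python
"""Illustrative sketch of a disparity-filtering + reconstruction pipeline.
Not the authors' implementation; all parameters below are placeholders."""
import numpy as np
import cv2


def initial_disparities(left, right, num_disp=64, block=9):
    """Initial left/right disparity maps in pixels (SGBM output is 16x fixed-point).
    The uniqueness test is delegated to SGBM's uniquenessRatio parameter."""
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=num_disp,
                                 blockSize=block, uniquenessRatio=10)
    disp_l = sgbm.compute(left, right).astype(np.float32) / 16.0
    # Right-view disparity via the mirrored pair, then flip the result back.
    rf = np.ascontiguousarray(right[:, ::-1])
    lf = np.ascontiguousarray(left[:, ::-1])
    disp_r = sgbm.compute(rf, lf)[:, ::-1].astype(np.float32) / 16.0
    return disp_l, disp_r


def texture_mask(gray, win=9, thresh=4.0):
    """Confidence proxy: reject textureless / low-texture pixels (low local std)."""
    g = gray.astype(np.float32)
    mean = cv2.blur(g, (win, win))
    sq_mean = cv2.blur(g * g, (win, win))
    std = np.sqrt(np.maximum(sq_mean - mean * mean, 0.0))
    return std >= thresh


def lr_consistency_mask(disp_l, disp_r, tol=1.0):
    """Left-right check: reject occluded pixels whose two disparities disagree."""
    h, w = disp_l.shape
    xs = np.tile(np.arange(w), (h, 1))
    xr = np.clip(xs - np.round(disp_l).astype(np.int32), 0, w - 1)
    disp_back = disp_r[np.arange(h)[:, None], xr]
    return np.abs(disp_l - disp_back) <= tol


def depth_from_disparity(disp, mask, focal_px, baseline_m):
    """Reconstruct depth Z = f * B / d on valid pixels; invalid pixels become NaN."""
    valid = mask & (disp > 0)
    depth = np.full(disp.shape, np.nan, dtype=np.float32)
    depth[valid] = focal_px * baseline_m / disp[valid]
    return depth


if __name__ == "__main__":
    # Placeholder input paths and camera parameters.
    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
    disp_l, disp_r = initial_disparities(left, right)
    mask = texture_mask(left) & lr_consistency_mask(disp_l, disp_r)
    depth = depth_from_disparity(disp_l, mask, focal_px=700.0, baseline_m=0.06)
```

The rejected pixels (occluded, textureless or inconsistent) are simply marked invalid here; the paper instead filters them out of the dense disparity map before reconstruction, which is the step this sketch approximates.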
Classification code: TP751.1 [Automation and Computer Technology / Detection Technology and Automatic Devices]