Affiliation: [1] School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an 710049, China
Source: Journal of Xi'an Jiaotong University (《西安交通大学学报》), 2008, No. 12, pp. 1476-1480 (5 pages)
Funding: National Natural Science Foundation of China (60502021); Specialized Research Fund for the Doctoral Program of Higher Education, Ministry of Education (20050698025)
Abstract: Stereo matching is inherently ambiguous and lacks robustness to occlusion and varying illumination. To address this, a new two-step multi-view reconstruction method is proposed. First, using the second fundamental property of the visual hull, the stereo matching problem is converted into a color-variance minimization problem, and a variance threshold is used to balance the number of sample points against sampling noise. Then, a cost functional is built by combining the position potential energy and the elastic potential energy of the moving surface; it is minimized using second-order central differences together with an upwind scheme for the Hamilton-Jacobi equation, so that the level-set function progressively approaches the object surface while filtering out sampling noise, and the object surface is reconstructed completely. The method avoids the inherent ambiguity of stereo matching and improves robustness to occlusion and complex illumination. Experimental results show that the method recovers object shape from real, complex image sets with better accuracy than Hoppe's classical algorithm, while reducing reconstruction time by 26% to 33%.
Classification code: TP391 [Automation and Computer Technology / Computer Application Technology]
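To make the two steps summarized in the abstract above more concrete, the sketch below illustrates (a) a photo-consistency test based on color variance across calibrated views and (b) one first-order upwind update of a level-set function. It is a minimal sketch under assumed conditions: calibrated pinhole cameras given as 3x4 projection matrices, a regular 2-D level-set grid, and a generic speed-driven evolution. The function names (`color_variance`, `select_surface_samples`, `upwind_step`), the thresholding logic, and the evolution equation are illustrative assumptions and do not reproduce the paper's actual cost functional or its second-order central-difference smoothness term.

```python
import numpy as np

def color_variance(point, cameras, images):
    """Project a 3D point into every calibrated view and return the variance
    of the sampled colors; low variance suggests a photo-consistent surface
    sample.  (Hypothetical helper; camera model and sampling are simplified.)"""
    samples = []
    for P, img in zip(cameras, images):           # P: 3x4 projection matrix
        x = P @ np.append(point, 1.0)             # homogeneous projection
        u, v = x[:2] / x[2]
        h, w = img.shape[:2]
        if 0 <= int(v) < h and 0 <= int(u) < w:   # skip views where the point falls outside the image
            samples.append(np.asarray(img[int(v), int(u)], dtype=float))
    if len(samples) < 2:
        return np.inf                             # unobserved point: treat as inconsistent
    return float(np.mean(np.var(np.stack(samples), axis=0)))

def select_surface_samples(candidates, cameras, images, var_threshold):
    """Keep candidate points whose color variance is below the threshold; the
    threshold trades off the number of samples against sampling noise."""
    return [p for p in candidates
            if color_variance(p, cameras, images) < var_threshold]

def upwind_step(phi, speed, dt, dx=1.0):
    """One explicit step of level-set evolution  phi_t + F*|grad(phi)| = 0
    with first-order Godunov upwinding on a 2-D grid (for brevity)."""
    dxm = (phi - np.roll(phi,  1, axis=0)) / dx   # backward differences
    dxp = (np.roll(phi, -1, axis=0) - phi) / dx   # forward differences
    dym = (phi - np.roll(phi,  1, axis=1)) / dx
    dyp = (np.roll(phi, -1, axis=1) - phi) / dx
    grad_pos = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                       np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
    grad_neg = np.sqrt(np.minimum(dxm, 0)**2 + np.maximum(dxp, 0)**2 +
                       np.minimum(dym, 0)**2 + np.maximum(dyp, 0)**2)
    return phi - dt * (np.maximum(speed, 0) * grad_pos +
                       np.minimum(speed, 0) * grad_neg)
```

In this sketch the variance threshold plays the role described in the abstract: a looser threshold admits more surface samples but also more noise, which the subsequent level-set evolution is then expected to smooth out.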