Affiliation: [1] School of Electronic Engineering, Xidian University, Xi'an, Shaanxi 710071, China
Source: Journal of Software (《软件学报》), 2009, No. 5, pp. 1185-1193 (9 pages)
Funding: National Natural Science Foundation of China (No. 60771068); National Basic Research Program of China (973 Program, No. 2006CB705700)
Abstract: To achieve full image segmentation, a relay (sequential) level set method is proposed on the basis of the level set method without re-initialization. The method automatically and alternately creates nested sub-regions and the corresponding initial level set functions in the image to be segmented, lets each level set function evolve to convergence within its sub-region, and repeats this process until the sub-region area reaches zero. Compared with the original method and a representative region-based level set method, the proposed method has the following advantages: 1) it runs automatically, without interactive initialization; 2) it segments the image multiple times and therefore detects more boundaries than the original method; 3) it achieves better segmentation of non-homogeneous images than the representative region-based level set method; 4) it provides an open segmentation framework in which the single level set method used here can be replaced, after minor modification, by other single level set methods. Experimental results show that the method fully segments synthetic and medical images without any interactive step and is more robust on non-homogeneous images.
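The abstract describes the relay procedure algorithmically: seed a level set inside the current sub-region, evolve it to convergence, record the detected boundary, then relay into the nested interior until the sub-region vanishes. The sketch below is an illustrative reconstruction of that loop, not the authors' implementation; the inner step is a simplified variant of a standard variational "evolution without re-initialization" formulation, and all names and parameter values (`evolve_without_reinit`, `mu`, `lam`, `nu`, `min_area`, the erosion-based seeding) are assumptions made for the example.

```python
# A minimal sketch of the relay level set loop described in the abstract.
# Not the paper's code; function names and parameters are illustrative.
import numpy as np
from scipy import ndimage


def edge_indicator(image, sigma=1.5):
    """g = 1 / (1 + |grad(G_sigma * I)|^2): close to 0 on strong edges."""
    smoothed = ndimage.gaussian_filter(image.astype(float), sigma)
    gy, gx = np.gradient(smoothed)
    return 1.0 / (1.0 + gx ** 2 + gy ** 2)


def evolve_without_reinit(phi, g, mu=0.2, lam=5.0, nu=1.5,
                          dt=1.0, n_iter=300, eps=1.5):
    """Simplified single level set evolution without re-initialization:
    distance-regularization term + weighted length + weighted area terms."""
    gy, gx = np.gradient(g)
    for _ in range(n_iter):
        py, px = np.gradient(phi)
        norm = np.sqrt(px ** 2 + py ** 2) + 1e-10
        nx, ny = px / norm, py / norm
        curvature = np.gradient(nx, axis=1) + np.gradient(ny, axis=0)
        delta = (eps / np.pi) / (eps ** 2 + phi ** 2)   # smoothed Dirac
        lap = ndimage.laplace(phi)
        # The (lap - curvature) term keeps |grad(phi)| near 1, which is what
        # removes the need for periodic re-initialization.
        phi = phi + dt * (mu * (lap - curvature)
                          + lam * delta * (gx * nx + gy * ny + g * curvature)
                          + nu * g * delta)
    return phi


def relay_segmentation(image, min_area=50):
    """Relay loop: evolve a level set in the current sub-region, collect its
    boundary, then recurse into the interior until the sub-region vanishes."""
    region = np.ones(image.shape, dtype=bool)      # start from the whole image
    g = edge_indicator(image)
    boundaries = []
    while region.sum() > min_area:                 # stop when sub-region vanishes
        # binary step initial function seeded inside the current sub-region
        phi = np.where(ndimage.binary_erosion(region, iterations=3), -2.0, 2.0)
        phi = evolve_without_reinit(phi, g)
        inside = phi < 0
        boundaries.append(inside ^ ndimage.binary_erosion(inside))
        if inside.sum() >= region.sum():           # no progress: stop
            break
        region = inside                            # relay into the nested sub-region
    return boundaries
```

Calling `relay_segmentation(img)` on a 2-D grayscale array returns the boundaries found in each relay pass; consistent with the open framework claimed in advantage 4), the `evolve_without_reinit` step could be swapped for any other single level set method with minor changes.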
Keywords: finite difference; geometric active contour; image segmentation; level set method; partial differential equation
Classification code: TP391 [Automation and Computer Technology / Computer Application Technology]