Authors: 林挺强 [1]; 高峰 [1]; 唐沐恩 [1]; 文贡坚 [1]
Affiliation: [1] ATR Key Laboratory, School of Electronic Science and Engineering, National University of Defense Technology, Changsha, Hunan 410073, China
Source: Journal of Signal Processing (《信号处理》), 2010, No. 12, pp. 1852-1857 (6 pages)
Funding: Natural Science Foundation project (No. 60872153)
Abstract: The CV model (the active contour model without edges) is an important variational model for image segmentation, but it converges slowly and is inefficient to solve. This paper proposes a new method for solving the CV model. First, the energy functional of the CV model is rewritten as a total variation formulation that has the same stable solution as the original model. The minimum of this total variation formulation is then computed by the dual method, into which a speed term is introduced to accelerate convergence. On one hand, the new method overcomes the drawbacks of gradient descent, which requires a small time step and many iterations: it converges after comparatively few iterations. On the other hand, the speed term shortens each iteration, reducing the total time needed to solve the model. The speed term also lessens the dependence on image gradients, improving robustness to noise, and tuning it yields different numbers of homogeneous regions, so the same image can be segmented to suit different tasks. In the first experiment, the partial differential equations are solved by gradient descent and by the new method; the new method is not only faster but also more robust to noise, giving more complete segmentation results and smoother edges. In the second experiment, the equations are solved by the new method with different speed terms; a smaller speed term gives slower convergence and a segmentation with more homogeneous regions. The experiments show that the proposed method is effective.
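For reference, the standard formulations this abstract builds on are sketched below; they follow Chan and Vese (2001) for the original energy and the Chan-Esedoglu-Nikolova / Bresson et al. line of work for the total variation relaxation and dual solver. The abstract does not give the paper's exact speed-term definition, so none appears in these formulas. The CV energy for an image u_0 with region means c_1, c_2 and contour C is

E_{CV}(c_1, c_2, C) = \mu \,\mathrm{Length}(C) + \lambda_1 \int_{\mathrm{inside}(C)} (u_0 - c_1)^2 \, dx + \lambda_2 \int_{\mathrm{outside}(C)} (u_0 - c_2)^2 \, dx .

Its total variation relaxation, which shares the same stable solutions, replaces the contour with a soft membership function u \in [0, 1]:

\min_{0 \le u \le 1} \int_\Omega |\nabla u| \, dx + \lambda \int_\Omega \big[ (u_0 - c_1)^2 - (u_0 - c_2)^2 \big] \, u \, dx .

The TV subproblem \min_u \mathrm{TV}(u) + \tfrac{1}{2\theta} \|u - v\|^2 is then solved by Chambolle's dual projection: with a dual field p and step \tau \le 1/8,

p^{n+1} = \frac{p^n + \tau \, \nabla(\operatorname{div} p^n - v/\theta)}{1 + \tau \, |\nabla(\operatorname{div} p^n - v/\theta)|}, \qquad u = v - \theta \operatorname{div} p .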
Classification code: TN911.7 [Electronic Engineering and Telecommunications: Communication and Information Systems]
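For concreteness, here is a minimal Python/NumPy sketch of the dual (Chambolle-type) iteration described above. It follows the standard fast-global-minimization scheme rather than the authors' exact implementation, and the `speed` parameter is a hypothetical relaxation weight standing in for the paper's unspecified speed term:

import numpy as np

def grad(u):
    """Forward-difference gradient with Neumann boundary conditions."""
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:, :-1] = u[:, 1:] - u[:, :-1]
    gy[:-1, :] = u[1:, :] - u[:-1, :]
    return gx, gy

def div(px, py):
    """Backward-difference divergence, the discrete adjoint of grad."""
    dx = np.zeros_like(px); dy = np.zeros_like(py)
    dx[:, 0] = px[:, 0]; dx[:, 1:-1] = px[:, 1:-1] - px[:, :-2]; dx[:, -1] = -px[:, -2]
    dy[0, :] = py[0, :]; dy[1:-1, :] = py[1:-1, :] - py[:-2, :]; dy[-1, :] = -py[-2, :]
    return dx + dy

def dual_tv_cv(u0, lam=1.0, theta=0.25, tau=0.125, n_iter=200, speed=1.0):
    """Dual-method solver for the TV form of the CV model.
    u0: 2-D float image scaled to [0, 1].
    speed: hypothetical relaxation weight (stands in for the paper's speed term).
    tau <= 1/8 is the usual stability bound for Chambolle's projection."""
    u = (u0 > u0.mean()).astype(float)              # rough initial partition
    px = np.zeros_like(u0); py = np.zeros_like(u0)  # dual field p = (px, py)
    for _ in range(n_iter):
        inside = u >= 0.5                           # current foreground mask
        c1 = u0[inside].mean() if inside.any() else u0.max()
        c2 = u0[~inside].mean() if (~inside).any() else u0.min()
        r = (u0 - c1) ** 2 - (u0 - c2) ** 2         # region-competition force
        v = np.clip(u - speed * theta * lam * r, 0.0, 1.0)
        # One Chambolle dual step for min_u TV(u) + (1/(2*theta))*||u - v||^2:
        gx, gy = grad(div(px, py) - v / theta)
        denom = 1.0 + tau * np.sqrt(gx ** 2 + gy ** 2)
        px = (px + tau * gx) / denom
        py = (py + tau * gy) / denom
        u = np.clip(v - theta * div(px, py), 0.0, 1.0)
    return u >= 0.5                                 # binary segmentation mask

Usage: with a 2-D grayscale array img scaled to [0, 1], mask = dual_tv_cv(img) returns a binary segmentation. In this sketch, raising speed strengthens the region force applied per iteration, loosely mirroring the abstract's observation that the speed term controls both the convergence rate and the granularity of the result.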