Authors: 阳振宇 (YANG Zhenyu); 潘建平 (PAN Jianping) [1]; 陈梦 (CHEN Meng) [2]
Affiliations: [1] Chongqing Jiaotong University, Chongqing 400074, China; [2] Shandong University of Science and Technology, Qingdao 266590, China
Source: Engineering of Surveying and Mapping (《测绘工程》), 2018, No. 8, pp. 60-65 (6 pages)
Funding: Chongqing Land and Housing Science and Technology Program Project (KJ2015006)
Abstract: Existing iterative threshold segmentation algorithms suffer from low accuracy on low-contrast images or images with large gray-level variation, and their tendency toward over-segmentation makes the target region difficult to identify. To address this, a mathematical morphology model is introduced and an improved algorithm is proposed in which the image is preprocessed with morphological top-hat and bottom-hat transforms before iterative threshold segmentation. The top-hat and bottom-hat transforms enlarge the dynamic range of the original image's gray levels and sharpen the image, making it clearer; iterative thresholding then extracts the target region, and for targets that remain difficult to segment, the bottom-hat transform can be applied again to highlight the target region. Finally, the connected regions of the binary image are labeled and noise regions are erased by area to complete the segmentation. Experimental results show that good segmentation is achieved for both small and large target objects, and that the improved algorithm gains in stability, adaptability and segmentation accuracy, giving it broad application prospects.
Classification Code: TP751 [Automation and Computer Technology: Detection Technology and Automatic Equipment]
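To make the workflow described in the abstract concrete, the Python sketch below assembles the three stages it names: top-hat/bottom-hat contrast enhancement, iterative threshold selection, and connected-component labeling with area-based noise removal. It is a minimal sketch assuming OpenCV (cv2) and NumPy; the structuring-element size, convergence tolerance, minimum region area, and the input file name are illustrative assumptions, not values reported in the paper.

```python
import cv2
import numpy as np

def enhance_top_bottom_hat(gray, ksize=15):
    """Stretch the gray-level dynamic range: image + top-hat - bottom-hat.

    Kernel size is a hypothetical choice; the paper does not specify one here.
    """
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (ksize, ksize))
    top_hat = cv2.morphologyEx(gray, cv2.MORPH_TOPHAT, kernel)
    bottom_hat = cv2.morphologyEx(gray, cv2.MORPH_BLACKHAT, kernel)
    enhanced = cv2.add(gray, top_hat)          # brighten small bright details
    enhanced = cv2.subtract(enhanced, bottom_hat)  # darken small dark details
    return enhanced

def iterative_threshold(gray, eps=0.5):
    """Classic iterative (isodata-style) threshold selection."""
    t = float(gray.mean())
    while True:
        low = gray[gray <= t]
        high = gray[gray > t]
        if low.size == 0 or high.size == 0:
            return t
        new_t = 0.5 * (low.mean() + high.mean())
        if abs(new_t - t) < eps:
            return new_t
        t = new_t

def segment(gray, min_area=50):
    """Enhance, threshold, then erase small (noise) connected regions by area."""
    enhanced = enhance_top_bottom_hat(gray)
    t = iterative_threshold(enhanced)
    _, binary = cv2.threshold(enhanced, t, 255, cv2.THRESH_BINARY)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
    out = np.zeros_like(binary)
    for i in range(1, n):  # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            out[labels == i] = 255
    return out

if __name__ == "__main__":
    img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input path
    cv2.imwrite("segmented.png", segment(img))
```

For targets that remain hard to separate, the abstract suggests applying the bottom-hat transform a second time before thresholding; in this sketch that would amount to calling cv2.morphologyEx with cv2.MORPH_BLACKHAT on the enhanced image and thresholding its output instead.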