Authors: Cai Chao [1]; Ding Mingyue [1]; Zhou Chengping [1]; Zhang Tianxu [1]
Affiliation: [1] Institute for Pattern Recognition and Artificial Intelligence, Key Laboratory of the Ministry of Education for Information Processing and Intelligent Control, Huazhong University of Science and Technology, Wuhan 430074, China
Source: Journal of Image and Graphics (Series A) 《中国图象图形学报(A辑)》, 2004, No. 2, pp. 134-138 (5 pages)
Funding: National Natural Science Foundation of China (60135020, FF030405); Open Fund of the Key Laboratory of the Ministry of Education for Image Information Processing and Intelligent Control (TKLJ0010)
Abstract: Owing to illumination, occlusion, and projection effects, the edges in many natural-scene images are not ideal step edges but combinations of several simple edge primitives (e.g., steps, pulses, and ramps); such points are termed composite edges, and their detection and accurate localization remain a difficulty in edge detection. Starting from a mathematical model of composite edges and a pair of multi-wavelet scaling functions that are respectively even- and odd-symmetric, this paper constructs a multi-wavelet edge detection operator with zero systematic localization error for composite edges; the corresponding wavelet functions remain even- and odd-symmetric as well. In theory, this operator can localize composite edges formed from steps and pulses with arbitrarily high precision. Experiments on simulated and real gray-level images show that, compared with the widely used Canny operator and the Mallat-Zhong wavelet-based detector, the proposed operator extracts edges with less distortion.
Keywords: edge detection; multi-wavelet scaling function; composite edge; localization error; image processing
Classification: TP391.41 (Automation and Computer Technology / Computer Application Technology)
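
Note: the record above gives only a summary; the paper's actual multi-wavelet construction is not reproduced here. As a rough illustration of the problem the abstract describes, the Python sketch below models a 1-D composite edge as a step plus a pulse and locates edge points as modulus maxima of a Gaussian-smoothed derivative, in the spirit of Canny and Mallat-Zhong detection. Everything in it (function names, the smoothing scale sigma, the signal layout) is an illustrative assumption, not the authors' operator; its only point is to show the systematic localization bias that a single-kernel detector incurs on composite edges, which is the error the paper's operator is designed to cancel.

import numpy as np
from scipy.ndimage import gaussian_filter1d

def composite_edge(n=512, pos=256, pulse_width=4, pulse_height=0.8):
    # Composite edge: an ideal step and a narrow pulse superimposed at
    # the same position (one of the step/pulse combinations the
    # abstract mentions). All parameters are illustrative assumptions.
    x = np.arange(n)
    step = (x >= pos).astype(float)
    pulse = ((x >= pos) & (x < pos + pulse_width)).astype(float)
    return step + pulse_height * pulse

def detect_edges(signal, sigma=3.0):
    # Edge points = local maxima of |d/dx(Gaussian-smoothed signal)|,
    # i.e., modulus maxima of a derivative-of-Gaussian response.
    mag = np.abs(gaussian_filter1d(signal, sigma, order=1))
    is_max = (mag[1:-1] > mag[:-2]) & (mag[1:-1] >= mag[2:])
    strong = mag[1:-1] > 0.1 * mag.max()  # suppress weak spurious maxima
    return np.where(is_max & strong)[0] + 1

sig = composite_edge()
print("true edge at 256, detected at:", detect_edges(sig))

For a pure step the modulus maximum coincides with the discontinuity, but the pulse component skews the response, so the detected position can drift from the true location, and the drift grows with the smoothing scale. In the paper this bias is driven to zero by the even/odd-symmetric multi-wavelet pair rather than by a single kernel.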