Affiliation: [1] Department of Computer Science and Technology, Tsinghua University, Beijing 100084, China
Source: Journal of Image and Graphics (《中国图象图形学报》), 2010, No. 1, pp. 109-115 (7 pages)
Funding: National High Technology Research and Development Program of China (863 Program) (2006AA01Z326); National Natural Science Foundation of China (60773144); Beijing Natural Science Foundation (4072015)
Abstract: The Hough transform is a classic algorithm in digital image processing and machine vision, used mainly to detect straight lines or line segments. Although some generalized Hough transforms can detect complex 2D shapes, they require voting in three or more dimensions and therefore suffer from large memory consumption, long computation time, and poor reliability; variants that rely on derivative or gradient information are also sensitive to image noise, which further reduces robustness. To detect planar regular shapes quickly and accurately, this paper builds on the classical line Hough transform and the geometric properties of planar regular shapes to propose a fast, unified method for detecting planar rectangles and circles. The algorithm first filters the image and applies the Canny operator to obtain closed edge contours, then uses a geometric invariant, the shape angle D_α, to coarsely classify the contours, and finally performs fine, accurate recognition within each class. Because the method needs only simple 1D and 2D (line) voting and no derivative information at all, it greatly improves the speed and robustness of circle detection. Experiments show that the method applies to the detection and recognition of various common planar regular shapes and is both fast and accurate.
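The abstract outlines a pipeline of filtering, Canny edge detection, closed-contour extraction, coarse classification, and low-dimensional voting. The sketch below illustrates that general pipeline in Python with OpenCV; it is not the paper's method. The record does not define the shape angle D_α, so a circularity measure stands in for the coarse classification, a 1D radius histogram stands in for the circle vote, and a minimum-area rectangle fit stands in for the rectangle branch. The function name detect_shapes and the file shapes.png are hypothetical.

```python
# Minimal sketch of the filtering -> Canny -> contour classification -> low-dimensional
# voting pipeline described in the abstract. Assumptions are noted in the comments.
import cv2
import numpy as np

def detect_shapes(gray):
    """Return ('circle', (cx, cy, r)) or ('rectangle', corner_points) per closed contour."""
    # 1. Filtering and Canny edge detection, as described in the abstract.
    blurred = cv2.GaussianBlur(gray, (5, 5), 1.5)
    edges = cv2.Canny(blurred, 50, 150)

    # 2. Extract closed contours from the edge map.
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)

    results = []
    for cnt in contours:
        area = cv2.contourArea(cnt)
        if area < 100:                     # skip tiny, noisy contours
            continue

        # 3. Coarse classification (stand-in for the shape angle D_alpha):
        #    circularity = 4*pi*area / perimeter^2 equals 1 for a perfect circle.
        perim = cv2.arcLength(cnt, True)
        circularity = 4 * np.pi * area / (perim * perim)

        pts = cnt.reshape(-1, 2).astype(np.float32)
        cx, cy = pts.mean(axis=0)

        if circularity > 0.85:
            # 4a. Circle branch: a simple 1D vote over the radius (distance of
            #     each edge point to the centroid), no gradient information used.
            radii = np.hypot(pts[:, 0] - cx, pts[:, 1] - cy)
            hist, bin_edges = np.histogram(radii, bins=32)
            r = bin_edges[np.argmax(hist)] + (bin_edges[1] - bin_edges[0]) / 2
            results.append(("circle", (float(cx), float(cy), float(r))))
        else:
            # 4b. Rectangle branch: fit the minimum-area rotated rectangle as a
            #     cheap proxy for grouping the four straight sides that a
            #     2D (line) Hough vote would find.
            box = cv2.boxPoints(cv2.minAreaRect(cnt))
            results.append(("rectangle", box))
    return results

if __name__ == "__main__":
    img = cv2.imread("shapes.png", cv2.IMREAD_GRAYSCALE)  # hypothetical test image
    if img is not None:
        for kind, params in detect_shapes(img):
            print(kind, params)
```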
Keywords: image recognition; Hough transform; planar regular shapes; geometric features
Classification (CLC): TP391.41 [Automation and Computer Technology - Computer Application Technology]