Authors: CHEN Xian-ming; WANG A-chuan [1]; WANG Chun-yan (College of Information and Computer Engineering, Northeast Forestry University, Harbin 150040, China; Zhalantun Vocational College, Zhalantun 162650, China)
Affiliations: [1] College of Information and Computer Engineering, Northeast Forestry University, Harbin 150040, China; [2] Zhalantun Vocational College, Zhalantun 162650, Inner Mongolia, China
Source: Chinese Journal of Liquid Crystals and Displays, 2019, No. 9, pp. 879-887 (9 pages)
Funding: Natural Science Foundation of Heilongjiang Province (No. C201414); Harbin Outstanding Academic Leader Foundation Project (No. 2014RFXXJ040)
Abstract: Aiming at the detection of wood defect images such as live knots, dead knots, and wormholes, a wood defect image detection method based on deep learning is proposed. First, a Faster-RCNN network is trained to obtain a detection model that can locate and recognize wood defects. Next, the image is denoised with the NL-Means method, and image enhancement is achieved through linear filtering and adjustment of contrast and brightness. The image is then binarized, and the edge feature point set of each defect is extracted according to differences in pixel values, achieving fine segmentation of the wood defects. Finally, an improved ellipse fitting method is used to fit ellipses to the wood defect edge point sets, providing a new wood defect machining scheme. Experimental results show that the algorithm locates and classifies wood defects well, yields good segmentation and fitting results, and can reduce the amount of wood filler by about 10% in the defect repair stage.
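The post-detection stages described in the abstract (NL-Means denoising, linear contrast/brightness enhancement, binarization, edge-point extraction, and ellipse fitting) map onto standard image-processing primitives. The sketch below is not the authors' code; it is a minimal illustration assuming OpenCV in Python, where the hypothetical helper segment_and_fit_defect uses cv2.fastNlMeansDenoising, cv2.convertScaleAbs, cv2.threshold, cv2.findContours, and cv2.fitEllipse for the corresponding steps, and the paper's improved ellipse-fitting method is replaced by OpenCV's standard least-squares fit.

import cv2

def segment_and_fit_defect(roi_gray):
    """Process a grayscale crop of a detected defect region (hypothetical helper)."""
    # NL-Means denoising; filter strength 10 and window sizes 7/21 are illustrative choices.
    denoised = cv2.fastNlMeansDenoising(roi_gray, None, 10, 7, 21)
    # Linear enhancement: out = alpha * in + beta (contrast and brightness adjustment).
    enhanced = cv2.convertScaleAbs(denoised, alpha=1.3, beta=10)
    # Otsu binarization separates the darker defect from the surrounding wood.
    _, binary = cv2.threshold(enhanced, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # Boundary point set of the largest connected region (OpenCV >= 4 return signature).
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None, None
    edge_points = max(contours, key=cv2.contourArea)
    # Standard least-squares ellipse fit; cv2.fitEllipse needs at least 5 points.
    ellipse = cv2.fitEllipse(edge_points) if len(edge_points) >= 5 else None
    return edge_points, ellipse

cv2.fitEllipse returns the ellipse center, axis lengths, and rotation angle, which is the kind of geometric description a repair-patch machining step would consume.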
Keywords: image detection; deep learning; wood defect; edge detection; ellipse fitting
Classification: TP391.41 [Automation and Computer Technology - Computer Application Technology]