Authors: Zhang Qing [1]; Zou Xiangjun [2]; Lin Guichao [1]; Sun Yanhui [1]
Affiliations: [1] Anhui Provincial Engineering Technology Research Center of Heat-Sensitive Materials Processing, Chuzhou University, Chuzhou 239000, China; [2] Key Laboratory of Key Technology on South Agricultural Machinery and Equipment, Ministry of Education, South China Agricultural University, Guangzhou 510642, China
Source: Journal of System Simulation, 2019, No. 1, pp. 7-15 (9 pages)
Funding: National Natural Science Foundation of China (31571568); Open Project of the Anhui Provincial Engineering Technology Research Center of Heat-Sensitive Materials Processing (2015RMZ03); Chuzhou University Institutional Planning Projects (2016GH10, 2016GH11)
Abstract: To address the inconsistent grading specifications and low efficiency of post-harvest strawberry grading, a machine-vision-based method for grading strawberries by weight and shape is proposed. The strawberry image is segmented by thresholding to extract the fruit; the area and perimeter of the fruit are then computed and used to build a weight grading model through multiple linear regression. Low-frequency elliptic Fourier coefficients of the fruit contour are extracted as shape features and used to train a support vector machine (SVM), which serves as the shape grading model. Tests on 200 strawberry samples show a weight grading accuracy of 89.5% and a shape grading accuracy of 96.7%, with average computation times of 64 ms and 39 ms, respectively, demonstrating that the method is robust and real-time capable.
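The abstract outlines a two-branch pipeline: threshold segmentation, then area and perimeter features fed into a multiple linear regression for weight, and low-frequency elliptic Fourier coefficients fed into an SVM for shape. The following is a minimal sketch of that pipeline, assuming OpenCV and scikit-learn as the toolchain; Otsu thresholding, the choice of 10 harmonics, and all helper names are illustrative placeholders, not specifics from the paper.

# Sketch of the grading pipeline described in the abstract.
# Assumed (not from the paper): OpenCV/scikit-learn, Otsu thresholding
# as the segmentation rule, and N_HARMONICS low-frequency harmonics.
import cv2
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVC

N_HARMONICS = 10  # number of low-frequency harmonics kept (assumed)

def segment_fruit(image_bgr):
    """Threshold the image and return the largest contour (the fruit)."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    return max(contours, key=cv2.contourArea)

def weight_features(contour):
    """Area and perimeter, the two predictors of the weight model."""
    return [cv2.contourArea(contour), cv2.arcLength(contour, closed=True)]

def elliptic_fourier_coeffs(contour, n_harmonics=N_HARMONICS):
    """Low-frequency elliptic Fourier coefficients of a closed contour
    (Kuhl-Giardina formulation), used as the shape feature vector."""
    pts = contour.reshape(-1, 2).astype(float)
    dxy = np.diff(np.vstack([pts, pts[:1]]), axis=0)   # close the contour
    dt = np.hypot(dxy[:, 0], dxy[:, 1])                # arc-length steps
    t = np.concatenate([[0.0], np.cumsum(dt)])
    T = t[-1]
    coeffs = []
    for n in range(1, n_harmonics + 1):
        c = T / (2 * n**2 * np.pi**2)
        d_cos = np.diff(np.cos(2 * n * np.pi * t / T))
        d_sin = np.diff(np.sin(2 * n * np.pi * t / T))
        for d in (0, 1):  # x components, then y components
            coeffs.append(c * np.sum(dxy[:, d] / dt * d_cos))
            coeffs.append(c * np.sum(dxy[:, d] / dt * d_sin))
    return np.array(coeffs)

# Training, given contours plus measured weights (grams) and shape labels:
#   X_w = [weight_features(c) for c in contours]
#   weight_model = LinearRegression().fit(X_w, grams)
#   X_s = [elliptic_fourier_coeffs(c) for c in contours]
#   shape_model = SVC().fit(X_s, labels)

The regression branch predicts a continuous weight that is then binned into grades, while the SVM assigns a discrete shape class directly; the two models run independently, which is consistent with the separate timing figures (64 ms and 39 ms) reported in the abstract.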
Keywords: machine vision; grading; strawberry; convex hull; elliptic Fourier descriptor; support vector machine
Classification: TP391.41 [Automation and Computer Technology: Computer Application Technology]