Authors: WANG Yonghui; CAO Hao
Affiliations: [1] College of Mechanical Engineering, Anhui Science and Technology University, Chuzhou 233100, Anhui, China; [2] College of Information and Network Engineering, Anhui Science and Technology University, Bengbu 233030, Anhui, China
Source: Journal of Qingdao Agricultural University (Natural Science), 2024, No. 4, pp. 301-305 (5 pages)
Funding: Major Natural Science Research Project of Anhui Higher Education Institutions (2022AH040235).
Abstract: Target recognition is vital to automating picking in the fruit industry, but traditional detection algorithms perform poorly at recognizing pears in natural environments. Based on the Mask R-CNN (mask region-convolutional neural network) model and a sample database of Dangshan pear images, image features were extracted with a feature pyramid network (FPN), the feature maps were processed by a region proposal network (RPN), and the detection performance on Dangshan pear targets was analyzed. The results show that the Mask R-CNN model achieved a detection accuracy of 95.54%, a recall of 92.79%, and a false detection rate of 4.45%, and that it accurately detected the complete outlines of pears whether the fruits were occluded by branches and leaves, unoccluded, or overlapping. This provides technical support for picking robots to detect pear targets.
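The pipeline the abstract describes (an FPN feature extractor feeding an RPN inside Mask R-CNN) corresponds to the standard off-the-shelf detector in torchvision. Below is a minimal inference sketch, assuming torchvision's COCO-pretrained maskrcnn_resnet50_fpn (torchvision >= 0.13) as a stand-in for the authors' model fine-tuned on their Dangshan pear image database; the image filename and the 0.5 score threshold are illustrative, not from the paper.

```python
import torch
import torchvision
from PIL import Image
from torchvision.transforms.functional import to_tensor

# maskrcnn_resnet50_fpn bundles the two stages named in the abstract:
# an FPN feature extractor over a ResNet-50 backbone, and an RPN that
# turns the multi-scale feature maps into region proposals.
# COCO-pretrained weights are a stand-in here; the paper fine-tunes
# on its own Dangshan pear sample database.
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = Image.open("dangshan_pear.jpg").convert("RGB")  # hypothetical sample image
with torch.no_grad():
    outputs = model([to_tensor(image)])[0]

# Keep detections above a confidence threshold; each kept detection has
# a bounding box, a class label, and a pixel-level instance mask (the
# "complete outline" the paper evaluates under occlusion and overlap).
keep = outputs["scores"] > 0.5
boxes = outputs["boxes"][keep]  # (N, 4) boxes in xyxy format
masks = outputs["masks"][keep]  # (N, 1, H, W) soft masks in [0, 1]
print(f"detected {len(boxes)} candidate fruits")
```

As a side note on the reported metrics: the detection accuracy (95.54%) and the false detection rate (4.45%) sum to roughly 100%, which is consistent with the usual per-detection definitions precision = TP / (TP + FP) and false detection rate = FP / (TP + FP).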
Keywords: Dangshan pear; target detection; deep learning; Mask R-CNN
Classification: TP391.4 [Automation and Computer Technology: Computer Application Technology]