Weed Classification and Recognition in Lily Fields Based on Improved YOLOv8


Authors: DUAN Chunyao; ZHAO Xia[1]; CHENG Hong[2]

Affiliations: [1] College of Information Science and Technology, Gansu Agricultural University, Lanzhou 730070, China; [2] Vegetable Research Institute, Gansu Academy of Agricultural Sciences, Lanzhou 730070, China

Source: Software Engineering, 2025, Issue 2, pp. 46-51 (6 pages)

Funding: Natural Science Foundation, Gansu Provincial Science and Technology Program (24JRRA656); 2022 industry-commissioned project "Statistical Analysis of Data on Agricultural Product Sales Models" (loonG20220201).

Abstract: To improve the efficiency and accuracy of automated weed detection in agriculture, this paper proposes a weed classification and recognition method for lily fields based on an improved YOLOv8 (You Only Look Once version 8). To address the challenges posed by the diverse morphology, complex color features, and low distinguishability of weeds in lily fields, a TransNeXt aggregated-attention module and a DCNv2 (Deformable ConvNets v2) mechanism are introduced to optimize the feature extraction and object recognition performance of the YOLOv8-n model. A data augmentation strategy further enhances the model's generalization ability and recognition accuracy. Experimental results show that the improved model reaches 90.1% accuracy on a self-built dataset, 6 percentage points higher than the original YOLOv8 model, demonstrating its potential and practical value for weed classification against complex, unstructured backgrounds.

Keywords: YOLOv8; weed recognition; deep learning; object classification

CLC number: TP391 [Automation and Computer Technology — Computer Application Technology]
