Authors: ZHANG Zong-chi, WANG Hua-wei, ZHOU Chang-wei (Nanjing University of Aeronautics and Astronautics, Nanjing 211000, China)
Source: Aeronautical Computing Technique, 2024, No. 5, pp. 64-68, 73 (6 pages)
Funding: National Natural Science Foundation of China (72271123).
Abstract: To address the low efficiency and insufficient accuracy of manual visual inspection of aircraft skin, a real-time segmentation algorithm for skin surface defects, MSA-YOLO, is proposed. A multi-scale attention (MSA) module replaces the C2f module in the YOLOv8-seg backbone, improving feature representation while making the network more lightweight. An eSE attention layer is added to the network's small-object detection layer to strengthen the detection of small defects. Finally, the Inner-CIoU loss function replaces the original CIoU loss, using auxiliary bounding boxes to accelerate sample convergence. A dataset containing five typical skin surface defects was built for validation. The results show that, compared with the original algorithm, MSA-YOLO improves the mean average precision (mAP) of bounding boxes (BOX) and masks (MASK) by 4.6% and 5.3% respectively, while detection speed increases by 9.1%. It also shows a performance advantage over other popular real-time segmentation algorithms, which is of practical value for automated segmentation of skin surface defects.
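The auxiliary-box idea behind the Inner-CIoU loss mentioned in the abstract can be illustrated with a minimal sketch: the IoU term is computed not on the original predicted and ground-truth boxes but on auxiliary boxes that share the same centres and are scaled by a ratio, which sharpens the gradient for high-overlap samples and speeds up convergence. This is a generic illustration of the Inner-IoU technique, not the paper's implementation; the `(cx, cy, w, h)` box format and the `ratio` value are assumptions.

```python
def inner_iou(box1, box2, ratio=0.7):
    """IoU of auxiliary (inner) boxes: each input box (cx, cy, w, h)
    is scaled about its own centre by `ratio` before the overlap is
    computed. ratio < 1 shrinks the boxes, accelerating convergence
    for samples that already overlap well."""
    (cx1, cy1, w1, h1), (cx2, cy2, w2, h2) = box1, box2
    # Corners of the scaled auxiliary boxes.
    l1, r1 = cx1 - w1 * ratio / 2, cx1 + w1 * ratio / 2
    t1, b1 = cy1 - h1 * ratio / 2, cy1 + h1 * ratio / 2
    l2, r2 = cx2 - w2 * ratio / 2, cx2 + w2 * ratio / 2
    t2, b2 = cy2 - h2 * ratio / 2, cy2 + h2 * ratio / 2
    # Intersection and union of the auxiliary boxes.
    inter_w = max(0.0, min(r1, r2) - max(l1, l2))
    inter_h = max(0.0, min(b1, b2) - max(t1, t2))
    inter = inter_w * inter_h
    union = (w1 * ratio) * (h1 * ratio) + (w2 * ratio) * (h2 * ratio) - inter
    return inter / union if union > 0 else 0.0
```

In the full Inner-CIoU loss this value replaces the plain IoU term inside CIoU, while the centre-distance and aspect-ratio penalties are kept unchanged.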