Authors: ZHU Yu; HAO Xiao-li [1]; NIU Bao-ning [1]; XUE Jin-dong (College of Computer Science and Technology, Taiyuan University of Technology, Jinzhong 030600, China; Taiyuan Power Supply Company, State Grid Shanxi Electric Power Company, Taiyuan 030000, China)
Affiliations: [1] College of Computer Science and Technology, Taiyuan University of Technology, Jinzhong 030600, Shanxi, China; [2] Taiyuan Power Supply Company, State Grid Shanxi Electric Power Company, Taiyuan 030000, Shanxi, China
Source: Computer Engineering and Design (《计算机工程与设计》), 2024, No. 11, pp. 3271-3278 (8 pages)
Funding: General Program of the National Natural Science Foundation of China (62072326).
Abstract: To address the problems of noise in the RetinaNet feature pyramid, insufficient feature fusion, information loss during fusion, inaccurate bounding-box regression, and imbalanced quality of training samples, an improved RetinaNet object detection algorithm was proposed. An attention module was added to the feature extraction module to filter noise, and the proposed multi-scale feature fusion module based on differential attention was added to the feature extraction module to fuse features adequately and strengthen the information that would otherwise be lost during fusion. The loss function of RetinaNet was replaced with CIoU Loss, and IoU was introduced as a weighting coefficient, so that the model is optimized toward a larger overlap area between the predicted box and the ground-truth box; this improves the accuracy and speed of regression and increases the contribution of high-quality samples. Experimental results show that the improved algorithm raises the average detection precision by 1.8%.
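For reference, the sketch below illustrates a CIoU loss with an optional IoU weighting factor, in the spirit of the modification described in the abstract. It is a minimal PyTorch-style example, not the paper's implementation: the function name `ciou_loss`, the `iou_weighted` flag, the (x1, y1, x2, y2) box format, and the exact way IoU is applied as a weight are all assumptions, since the abstract only states that CIoU Loss replaces the original loss and that IoU is used as a weighting coefficient to emphasize high-quality samples.

```python
import math
import torch

def ciou_loss(pred, target, eps=1e-7, iou_weighted=True):
    """CIoU loss for axis-aligned boxes in (x1, y1, x2, y2) format.

    iou_weighted scales each sample's loss by its IoU so that high-overlap
    (high-quality) samples contribute more, as described in the abstract.
    The paper's exact weighting scheme is not given there; this is a sketch.
    """
    # Intersection area
    x1 = torch.max(pred[:, 0], target[:, 0])
    y1 = torch.max(pred[:, 1], target[:, 1])
    x2 = torch.min(pred[:, 2], target[:, 2])
    y2 = torch.min(pred[:, 3], target[:, 3])
    inter = (x2 - x1).clamp(min=0) * (y2 - y1).clamp(min=0)

    # Union area and IoU
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_t = (target[:, 2] - target[:, 0]) * (target[:, 3] - target[:, 1])
    union = area_p + area_t - inter + eps
    iou = inter / union

    # Squared distance between box centers
    cx_p = (pred[:, 0] + pred[:, 2]) / 2
    cy_p = (pred[:, 1] + pred[:, 3]) / 2
    cx_t = (target[:, 0] + target[:, 2]) / 2
    cy_t = (target[:, 1] + target[:, 3]) / 2
    center_dist = (cx_p - cx_t) ** 2 + (cy_p - cy_t) ** 2

    # Squared diagonal of the smallest enclosing box
    ex1 = torch.min(pred[:, 0], target[:, 0])
    ey1 = torch.min(pred[:, 1], target[:, 1])
    ex2 = torch.max(pred[:, 2], target[:, 2])
    ey2 = torch.max(pred[:, 3], target[:, 3])
    diag = (ex2 - ex1) ** 2 + (ey2 - ey1) ** 2 + eps

    # Aspect-ratio consistency term of CIoU
    w_p = (pred[:, 2] - pred[:, 0]).clamp(min=eps)
    h_p = (pred[:, 3] - pred[:, 1]).clamp(min=eps)
    w_t = (target[:, 2] - target[:, 0]).clamp(min=eps)
    h_t = (target[:, 3] - target[:, 1]).clamp(min=eps)
    v = (4 / math.pi ** 2) * (torch.atan(w_t / h_t) - torch.atan(w_p / h_p)) ** 2
    with torch.no_grad():
        alpha = v / (1 - iou + v + eps)

    loss = 1 - iou + center_dist / diag + alpha * v
    if iou_weighted:
        # Weight each sample by its IoU (detached) to emphasize high-quality samples.
        loss = iou.detach() * loss
    return loss.mean()
```

As a design note, detaching the IoU weight keeps it from feeding gradients back through the weighting term itself, so it acts purely as a per-sample emphasis factor rather than changing the shape of the CIoU objective.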
Keywords: deep learning; object detection; noise filtering; multi-scale feature fusion; bounding-box regression optimization; attention mechanism; differential attention
Classification: TP183 [Automation and Computer Technology - Control Theory and Control Engineering]; TP391.41 [Automation and Computer Technology - Control Science and Engineering]