Authors: HUANG Jun-jie; HU Chang; BAO Jia-qi; CHANG Qing (School of Computer and Artificial Intelligence, Wuhan Textile University, Wuhan, Hubei 430200, China)
Affiliation: [1] School of Computer and Artificial Intelligence, Wuhan Textile University, Wuhan 430200, Hubei, China
Source: Computer Simulation (《计算机仿真》), 2024, No. 5, pp. 183-188 (6 pages)
Fund: Youth Science Fund Project (12001406).
Abstract: To address the problems of dense pedestrian detection, namely the many small, densely packed targets, low detection accuracy, and large parameter counts that hinder deployment, this paper proposes YOLO-GB, an improved lightweight dense pedestrian detection algorithm based on YOLOv5. The Ghost module is introduced to form a lightweight backbone network that reduces the number of parameters and computations and extracts image features at low cost. To handle large variations in target scale, an additional prediction head is added to detect targets at different scales, and a weighted bidirectional feature pyramid network (BiFPN) is introduced to strengthen feature fusion and improve multi-scale detection accuracy. Finally, Alpha-IoU replaces CIoU as the bounding-box regression loss function to further improve detection accuracy. Experiments on the dense-scene human detection dataset CrowdHuman show that YOLO-GB reaches 84.8% mAP50, 1.5 percentage points higher than YOLOv5s, while reducing the parameter count by 41.2% and the model size by 39.6%, achieving good detection accuracy and real-time performance.
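The Alpha-IoU loss mentioned in the abstract generalizes the IoU loss by raising the IoU term to a power alpha (alpha = 3 is the value commonly reported in the Alpha-IoU literature), which up-weights the gradient contribution of high-overlap boxes. The paper's actual implementation is not reproduced on this page; the following is only a minimal plain-Python sketch of the idea, with hypothetical function names.

```python
def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Intersection rectangle (empty intersections clamp to zero area).
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def alpha_iou_loss(box_a, box_b, alpha=3.0):
    """L = 1 - IoU^alpha; reduces to the plain IoU loss when alpha = 1."""
    return 1.0 - iou(box_a, box_b) ** alpha
```

Note that the full Alpha-IoU family also power-transforms the penalty terms of variants such as CIoU; this sketch shows only the base form.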
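The BiFPN cited in the abstract fuses multi-scale features with "fast normalized fusion": each input feature map gets a learnable scalar weight, the weights are clamped non-negative (ReLU) and normalized to sum to roughly one, and the inputs are blended accordingly. This is a toy plain-Python sketch of that weighting scheme on lists of numbers standing in for feature maps; it is not the paper's code, and the function name is an assumption.

```python
def fast_normalized_fusion(features, weights, eps=1e-4):
    """Blend equally-shaped feature vectors with ReLU-clamped,
    normalized scalar weights, as in BiFPN's fast normalized fusion."""
    # Clamp weights to be non-negative, mimicking a ReLU on learnable scalars.
    w = [max(0.0, wi) for wi in weights]
    total = sum(w) + eps  # eps keeps the division stable when all weights are 0
    length = len(features[0])
    return [sum(w[i] * f[j] for i, f in enumerate(features)) / total
            for j in range(length)]
```

The softmax-free normalization is the point of the "fast" variant: it avoids the exponential while keeping the fused output a convex-like combination of its inputs.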
Classification: TP391.41 [Automation and Computer Technology / Computer Application Technology]