Authors: GONG Shufeng (龚树凤) [1]; LIN Chao (林超); YAN Xinyue (闫鑫悦); LÜ Xiaozhe (吕晓哲); WU Zhefu (吴哲夫) [1]
Affiliation: [1] College of Information Engineering, Zhejiang University of Technology, Hangzhou 310012, Zhejiang, China
Source: Chinese Journal of Sensors and Actuators (《传感技术学报》), 2023, Issue 7, pp. 1055-1063 (9 pages)
Funding: Key Project of the Zhejiang Provincial Natural Science Foundation (LZ22F010005); Zhejiang Provincial Department of Education Research Project (Y201839636)
Abstract: Crowd counting is widely used in public security, video surveillance, and other fields, but factors such as target occlusion, background interference, and variation in crowd scale reduce the accuracy of crowd counting models. Building on a deep convolutional neural network architecture, a crowd counting method based on multi-scale perception and image association is proposed. The multi-scale perception model comprises a primary feature extraction network, a multi-scale feature extraction module, a feature fusion module, and a back-end architecture that together extract multi-scale image features, allowing the model to adapt to changes in scale. The image association model uses a feature association module and a fusion module to relate the input image to coherent images, improving the quality of the predicted density map by learning deep correlations between images. Experimental results on the public datasets ShanghaiTech Part_A, Part_B, and UCF_CC_50 show that the proposed method performs well on the MAE, RMSE, and SSIM metrics.
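The abstract names a multi-scale feature extraction module but gives no internals. Below is a minimal, hypothetical PyTorch sketch of what such a module commonly looks like in crowd counting networks: parallel convolution branches with different kernel sizes whose outputs are concatenated and fused, so the network can respond to heads appearing at different scales. The branch count, kernel sizes, and channel widths are illustrative assumptions, not the authors' actual configuration.

```python
# Hypothetical multi-scale feature extraction block; all hyperparameters are
# assumptions for illustration, not taken from the paper.
import torch
import torch.nn as nn

class MultiScaleBlock(nn.Module):
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        # Parallel branches with increasing receptive fields (3x3, 5x5, 7x7),
        # padded so all branches keep the same spatial resolution.
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch, kernel_size=k, padding=k // 2)
            for k in (3, 5, 7)
        ])
        # A 1x1 convolution fuses the concatenated branch outputs.
        self.fuse = nn.Conv2d(3 * out_ch, out_ch, kernel_size=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [self.act(b(x)) for b in self.branches]
        return self.act(self.fuse(torch.cat(feats, dim=1)))

# Example: fuse features from a VGG-style front end at three scales.
x = torch.randn(1, 64, 96, 128)
y = MultiScaleBlock(64, 64)(x)
print(y.shape)  # torch.Size([1, 64, 96, 128])
```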
CLC Numbers: TN911.73 [Electronics and Telecommunications—Communication and Information Systems]; TP183 [Electronics and Telecommunications—Information and Communication Engineering]