Authors: LI Wenbin, HE Ran (College of Information Engineering, Hebei GEO University, Shijiazhuang 050031, China)
Affiliation: [1] College of Information Engineering, Hebei GEO University, Shijiazhuang 050031
Source: Computer Engineering (《计算机工程》), 2020, Issue 7, pp. 268-276 (9 pages)
Funding: Natural Science Foundation of Hebei Province (F2016403055); Science Research Project of Higher Education Institutions of Hebei Province (ZD2016005).
Abstract: Airplane detection in remote sensing images is complicated by cluttered backgrounds and large variations in target scale. To address these problems, this paper proposes DC-DNN, a deep-neural-network model for aircraft detection in remote sensing images. Low-level image features are used to produce pixel-level labels for training a Fully Convolutional Network (FCN); the FCN output is combined with the DBSCAN density-clustering algorithm to select adaptive candidate regions for aircraft targets; high-level features of each candidate region are then extracted with a VGG-16 network to obtain aircraft detection boxes; and a detection-box suppression algorithm removes overlapping and falsely detected boxes to produce the final detection result. Experimental results show that DC-DNN achieves a precision of 95.78%, a recall of 98.98%, and an F1 score of 0.9735 for aircraft detection in remote sensing images, outperforming WS-DNN, R-FCN, and other models in both detection performance and generalization ability.
Keywords: remote sensing image; object detection; density clustering; convolutional neural network; pixel-level label
CLC Number: TP181 [Automation and Computer Technology; Control Theory and Control Engineering]
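As a rough, self-contained sketch of the two pipeline steps named in the abstract (candidate-region selection by density clustering, followed by detection-box suppression), the snippet below clusters foreground pixel coordinates with a minimal DBSCAN and prunes overlapping boxes with standard IoU-based non-maximum suppression. This is not the paper's implementation: the `eps`, `min_pts`, and `iou_thresh` values are illustrative assumptions, and plain NMS stands in for the paper's own detection-frame suppression algorithm.

```python
def dbscan(points, eps=2.0, min_pts=3):
    """Minimal DBSCAN over 2-D pixel coordinates.

    Returns one cluster label per point (-1 = noise).
    """
    n = len(points)
    labels = [None] * n

    def neighbors(i):
        xi, yi = points[i]
        return [j for j, (xj, yj) in enumerate(points)
                if (xi - xj) ** 2 + (yi - yj) ** 2 <= eps ** 2]

    cluster = -1
    for i in range(n):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1                 # provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        queue = list(nbrs)
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster        # noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = neighbors(j)
            if len(j_nbrs) >= min_pts:     # core point: keep expanding
                queue.extend(j_nbrs)
    return labels


def boxes_from_clusters(points, labels):
    """One bounding box (xmin, ymin, xmax, ymax) per cluster."""
    boxes = {}
    for (x, y), lab in zip(points, labels):
        if lab == -1:
            continue
        xmin, ymin, xmax, ymax = boxes.get(lab, (x, y, x, y))
        boxes[lab] = (min(xmin, x), min(ymin, y), max(xmax, x), max(ymax, y))
    return list(boxes.values())


def iou(a, b):
    """Intersection-over-union of two (xmin, ymin, xmax, ymax) boxes."""
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0


def nms(scored_boxes, iou_thresh=0.5):
    """Keep highest-scoring boxes; drop any box overlapping a kept one."""
    kept = []
    for score, box in sorted(scored_boxes, reverse=True):
        if all(iou(box, k) < iou_thresh for _, k in kept):
            kept.append((score, box))
    return kept


# Usage: two separated pixel blobs yield two candidate boxes;
# NMS then drops a lower-scoring box nested inside a higher-scoring one.
points = [(0, 0), (1, 0), (0, 1), (1, 1),
          (10, 10), (11, 10), (10, 11), (11, 11)]
labels = dbscan(points)
candidates = boxes_from_clusters(points, labels)
final = nms([(0.9, (0, 0, 10, 10)), (0.8, (1, 1, 9, 9)),
             (0.7, (20, 20, 30, 30))])
```

In the paper's pipeline, the FCN segmentation mask would supply the foreground pixel coordinates and the VGG-16 classifier would supply the per-box scores fed into the suppression step.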