Authors: WANG Shaojie [1], WU Wenbo [2], XU Qizhi [3]
Affiliations: [1] Beijing University of Chemical Technology, Beijing 100029, China; [2] Beijing Institute of Space Mechanics & Electricity, Beijing 100094, China; [3] Beijing Institute of Technology, Beijing 100081, China
Source: Spacecraft Recovery & Remote Sensing, 2021, No. 5, pp. 76-84 (9 pages)
Funding: National Natural Science Foundation of China, General Program (61972021, 61672076).
Abstract: Optical remote sensing imagery has high resolution, wide coverage, and many similar ground objects, so feature-point mismatches arise easily during image registration. Existing deep-network registration methods take the maxima of the feature map directly as the registration feature points; the accuracy of feature-point extraction and matching is poor, which lowers registration accuracy. To address this problem, this paper proposes a new method that combines the Difference of Gaussian (DoG) with the Visual Geometry Group (VGG) network to form a new network, the Difference of Gaussian with VGG (DVGG) network. Maxima are extracted from the difference-of-Gaussian images as the registration feature points, and the feature map extracted by the DVGG network serves as the feature description of those points, which is used to compute the matching similarity between the feature points of the two images. Finally, experiments were conducted on remote sensing images obtained from Google Earth, comparing the new method with Scale Invariant Feature Transform (SIFT) and Speeded Up Robust Features (SURF). The results show that the new method achieves higher registration accuracy than the comparison methods.
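The two-stage pipeline the abstract describes (DoG extrema as keypoints, deep feature maps as per-point descriptors matched by similarity) can be sketched as follows. This is a minimal single-scale NumPy/SciPy illustration, not the paper's DVGG implementation: the function names, the fixed sigma pair, the threshold, and the cosine-similarity matcher are all simplifying assumptions introduced here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def dog_keypoints(img, sigma1=1.0, sigma2=2.0, thresh=0.01):
    """Difference-of-Gaussian extrema as candidate feature points
    (single scale; the paper's method uses the full DoG pyramid)."""
    dog = gaussian_filter(img, sigma1) - gaussian_filter(img, sigma2)
    # local maxima in a 3x3 neighbourhood, above a response threshold
    local_max = (dog == maximum_filter(dog, size=3)) & (dog > thresh)
    return np.argwhere(local_max)          # (row, col) coordinates

def match_descriptors(feat_a, feat_b, pts_a, pts_b):
    """Match keypoints by cosine similarity between feature-map vectors
    sampled at their locations; feat_* has shape (channels, H, W)."""
    da = feat_a[:, pts_a[:, 0], pts_a[:, 1]].T   # (Na, C) descriptors
    db = feat_b[:, pts_b[:, 0], pts_b[:, 1]].T   # (Nb, C) descriptors
    da /= np.linalg.norm(da, axis=1, keepdims=True) + 1e-8
    db /= np.linalg.norm(db, axis=1, keepdims=True) + 1e-8
    sim = da @ db.T                              # pairwise cosine similarity
    return sim.argmax(axis=1), sim.max(axis=1)   # best match + its score
```

In the paper's setting, `feat_a`/`feat_b` would be feature maps produced by the VGG branch of the DVGG network; here any (channels, H, W) array stands in for them.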
Classification: TP751 [Automation & Computer Technology — Detection Technology and Automatic Equipment]