Authors: DUAN Yunshan; WU Xianwen; WANG Ruirui[1]; SHI Wei[3]; LI Yiran
Affiliations: [1] School of Forestry, Beijing Forestry University, Beijing 100083, China; [2] Guangdong Polytechnic of Industry and Commerce, Guangzhou 510510, China; [3] Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing 100101, China
Source: Bulletin of Surveying and Mapping (《测绘通报》), 2022, No. 12, pp. 131-135 (5 pages)
Abstract: Compared with stereo matching of homologous images, heterogeneous images acquired by UAV oblique photography and close-range photography differ greatly in spatial features, field of view, and resolution, which makes image matching difficult. In this paper, feature points are extracted by a convolutional neural network trained with homography transformations, and a graph neural network with a cross-attention mechanism is used to match the feature points in the matching stage. This method largely overcomes the poor matching caused by the large disparity and distortion between heterogeneous images. Taking the Majia ancestral hall (Majia Citang) in Dacheng County, Langfang City, Hebei Province as experimental data, the matching performance of the traditional SURF (speeded-up robust features) algorithm is compared with that of the deep learning algorithm. The results show that the deep learning algorithm achieves a higher matching rate on heterogeneous images with large viewing-angle differences.
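The homography transformation underlying the feature-extraction stage maps 2D image points through a 3x3 matrix in homogeneous coordinates. A minimal NumPy sketch (the function name and example matrix are illustrative, not from the paper's implementation):

```python
import numpy as np

def apply_homography(H, pts):
    """Map Nx2 image points through a 3x3 homography H.

    Points are lifted to homogeneous coordinates, multiplied by H,
    then divided by the last coordinate to return to the image plane.
    """
    pts_h = np.hstack([pts, np.ones((pts.shape[0], 1))])  # (x, y) -> (x, y, 1)
    mapped = pts_h @ H.T                                  # apply H to each point
    return mapped[:, :2] / mapped[:, 2:3]                 # perspective division

# Example: a homography combining a 2x scale with a translation of (5, 3)
H = np.array([[2.0, 0.0, 5.0],
              [0.0, 2.0, 3.0],
              [0.0, 0.0, 1.0]])
corners = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
print(apply_homography(H, corners))  # [[5. 3.] [7. 3.] [5. 5.]]
```

Training with pairs of images related by such synthetic warps lets the network learn feature points that remain repeatable under the large perspective differences found between oblique and close-range views.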
Keywords: SURF algorithm; heterogeneous images; homography matrix; graph neural network; feature matching
Classification: P237 [Astronomy and Earth Sciences: Photogrammetry and Remote Sensing]