Authors: ZHAI Dechao (翟德超); FAN Yanan (范亚男)[3]; ZHOU Yanan (周亚男) (Institute of Geographic Sciences and Natural Resources Research, University of Chinese Academy of Sciences, Beijing 100020, China; Department of Geographical Information Science, Hohai University, Nanjing 211100, China; Tianjin Institute of Surveying and Mapping, Tianjin 300381, China)
Affiliations: [1] Institute of Geographic Sciences and Natural Resources Research, University of Chinese Academy of Sciences, Beijing 100020, China; [2] Department of Geographical Information Science, Hohai University, Nanjing 211100, China; [3] Tianjin Institute of Surveying and Mapping, Tianjin 300381, China
Source: Remote Sensing for Land & Resources (《国土资源遥感》), 2019, No. 3, pp. 36-42 (7 pages)
Funding: Jointly supported by the National Natural Science Foundation of China project "'Data-knowledge'-driven parallel computing method for multi-scale segmentation of large-area high-resolution remote sensing images" (No. 41501453) and the Fundamental Research Funds for the Central Universities project "Parallel multi-scale segmentation method for large-area high-resolution images" (No. 2016B11414).
Abstract: Existing multi-scale segmentation methods for remote sensing images make little use of edge features. To address this, an edge-incorporated multi-scale image segmentation method by weighted aggregation (EIMSSWA) is proposed. First, image gradient features are detected to generate an edge map. Then, during primitive merging, several statistical features of the common boundary between adjacent primitives (such as the gradient strength and gradient direction along the shared edge) are computed and combined with the other regional features of the primitives, refining the inter-primitive similarity measure and improving the accuracy of the multi-scale segmentation results. Finally, the segmentation accuracy of the proposed method is verified in three experiments: multi-scale segmentation in eCognition, segmentation by weighted aggregation (SWA), and EIMSSWA. The results demonstrate that EIMSSWA produces more accurate and more reasonable segmentation results.
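To make the core idea more concrete, below is a minimal, hypothetical Python sketch (not the authors' implementation) of an edge-incorporated dissimilarity between two adjacent primitives: the spectral difference of the two regions is combined with the mean gradient strength measured along their common boundary, so that a strong edge response discourages merging. The function and parameter names (common_boundary_mask, merge_dissimilarity, alpha, labels, gradient) are illustrative assumptions.

```python
import numpy as np

def common_boundary_mask(labels, a, b):
    """Pixels of segment a that touch segment b (4-neighbourhood),
    used as a rough proxy for the common boundary between the two primitives."""
    mask_a = labels == a
    touches_b = np.zeros_like(mask_a)
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        touches_b |= np.roll(labels, shift, axis=(0, 1)) == b
    return mask_a & touches_b

def merge_dissimilarity(image, gradient, labels, a, b, alpha=0.5):
    """Edge-incorporated dissimilarity between adjacent primitives a and b:
    the regional (spectral) difference is combined with the mean gradient
    strength along their common boundary; a strong edge makes the pair
    less similar and therefore less likely to be merged."""
    boundary = common_boundary_mask(labels, a, b) | common_boundary_mask(labels, b, a)
    edge_strength = gradient[boundary].mean() if boundary.any() else 0.0
    spectral_diff = abs(image[labels == a].mean() - image[labels == b].mean())
    return (1 - alpha) * spectral_diff + alpha * edge_strength
```

In a weighted-aggregation framework, a score of this kind would replace the purely region-based similarity when deciding which neighbouring primitives to aggregate at each scale; the weight alpha controlling the contribution of the boundary statistics is an assumption of this sketch, not a parameter reported in the paper.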