Authors: DU Haoguo [1]; ZHANG Fanghao [1]; LU Yongkun [1]; LIN Xuchuan [2]; DENG Shurong [1]; CAO Yanbo [1]
Affiliations: [1] Yunnan Earthquake Agency, Kunming 650224, Yunnan, China; [2] Institute of Engineering Mechanics, China Earthquake Administration, Harbin 150080, Heilongjiang, China
Source: Journal of Seismological Research (《地震研究》), 2021, No. 3, pp. 490-498 (9 pages)
Funding: Jointly supported by the National Key R&D Program of China, "Research on Whole-Process Disaster Information Aggregation and Decision-Support Technology for Earthquake Emergency Response" (2018YFC1504505), and the Yunnan Earthquake Agency "Chuan-Bang-Dai" (mentoring) Program (CQ3-2021001).
Abstract: Rapid identification of earthquake damage to buildings in a stricken area is essential for the scientific and effective assessment of disaster losses. Based on high-resolution remote-sensing images of the post-earthquake area acquired by unmanned aerial vehicle (UAV) and the corresponding digital surface model (DSM), we propose a refined method for identifying earthquake damage to buildings from multi-source remote-sensing images. The method first performs multi-scale segmentation of the ground features in the images, removes non-building features to extract the buildings, and then identifies each building's damage state, structure type, and number of floors from its spectral, textural, and shape characteristics. We applied the method to building-damage identification in the area struck by the Yangbi, Yunnan M_S 6.4 earthquake of 21 May 2021, providing basic data for the disaster-loss assessment. The results show that, compared with traditional manual field investigation of earthquake damage, the proposed multi-source remote-sensing method identifies building information faster and more accurately.
Keywords: Yangbi M_S 6.4 earthquake; identification of earthquake damage to buildings; disaster loss assessment; digital surface model (DSM); high-resolution imagery
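The workflow summarized in the abstract (multi-scale segmentation of the UAV orthophoto, building extraction, then floor-count estimation from DSM heights) can be illustrated with a minimal sketch. This is not the authors' implementation: the file names, the Felzenszwalb segmenter standing in for the paper's multi-scale segmentation, the scale parameters, the 3 m storey height, and the percentile-based ground estimate are all illustrative assumptions.

```python
# Minimal sketch (assumptions noted inline), not the method from the paper.
import numpy as np
import rasterio
from skimage.segmentation import felzenszwalb

# Hypothetical inputs: a UAV orthophoto and a co-registered DSM.
with rasterio.open("orthophoto.tif") as src:
    image = src.read().transpose(1, 2, 0)[:, :, :3]  # (rows, cols, bands)
with rasterio.open("dsm.tif") as src:
    dsm = src.read(1)

# Multi-scale segmentation: run a graph-based segmenter at several scales.
# Coarser scales merge roof facets; finer scales separate adjacent buildings.
# The scale values here are assumed, not taken from the paper.
segmentations = {
    scale: felzenszwalb(image, scale=scale, sigma=0.8, min_size=50)
    for scale in (50, 100, 200)
}

def estimate_floors(dsm, segments, storey_height=3.0):
    """Estimate floors per segment from DSM height above ground.

    storey_height=3.0 m is an assumed average; the 5th-percentile ground
    estimate is a crude stand-in for proper terrain modelling.
    """
    ground = np.percentile(dsm, 5)
    floors = {}
    for seg_id in np.unique(segments):
        mask = segments == seg_id
        # Use a high percentile so roof-edge pixels do not bias the height down.
        height = np.percentile(dsm[mask], 90) - ground
        floors[seg_id] = max(1, int(round(height / storey_height)))
    return floors

floor_counts = estimate_floors(dsm, segmentations[100])
```

In the paper's pipeline, non-building segments would be removed (using spectral, textural, and shape features) before any per-building attributes are derived; the sketch omits that filtering step for brevity.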