Authors: 李祖传 (Li Zuchuan)[1,3]; 马建文 (Ma Jianwen)[1]; 张睿 (Zhang Rui)[2,3]; 李利伟 (Li Liwei)[1]
Affiliations: [1] Center for Earth Observation and Digital Earth, Chinese Academy of Sciences, Beijing 100190; [2] Institute of Remote Sensing Applications, Chinese Academy of Sciences, Beijing 100101; [3] Graduate University of Chinese Academy of Sciences, Beijing 100049
Source: Geomatics and Information Science of Wuhan University (武汉大学学报(信息科学版)), 2010, No. 12, pp. 1449-1452 (4 pages)
Funding: National 863 Program of China (2007AA12Z157); National Natural Science Foundation of China (40901234); Special Project for Frontier Research by Young Talents, Knowledge Innovation Program of the Chinese Academy of Sciences (O8S01100CX)
Abstract: In the classification of hyperspectral imagery, combining spectral and morphological information offers clear advantages and is an active research topic. The extended differential of morphological profiles (EDMP) is a feature that describes the morphological information of multi-channel imagery: it adopts a distance-based vector ordering to extend grey-level mathematical morphology to multivariate morphology and then constructs the corresponding morphological feature. EDMP has achieved satisfactory results in hyperspectral image classification, but it is time-consuming and ignores spectral information. To overcome these problems, an improved extended differential of morphological profiles (P-EDMP) and a classification method that fuses P-EDMP with spectral information are proposed. Comparative experiments on AVIRIS hyperspectral imagery against the method that fuses spectrum with extended morphological profiles (EMP) show that P-EDMP is comparable to EDMP in describing the morphological information of hyperspectral imagery while being less time-consuming, and that the proposed method is superior to the spectrum-plus-EMP method in terms of classification accuracy.
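For readers unfamiliar with the morphological-profile features the abstract refers to, the sketch below illustrates the standard differential morphological profile (DMP) computed on one principal component of a hyperspectral cube, the single-band building block that EMP- and EDMP-style features generalize. It is only an illustrative sketch, not the authors' P-EDMP: the helper names, structuring-element radii, and the synthetic 64x64x100 cube are assumptions made for the example.

```python
# Minimal sketch of a differential morphological profile (DMP) on the first
# principal component of a (synthetic) hyperspectral cube. Illustrative only;
# not the P-EDMP method described in the paper.
import numpy as np
from sklearn.decomposition import PCA
from skimage.morphology import erosion, dilation, disk, reconstruction

def opening_by_reconstruction(img, se):
    seed = erosion(img, se)                 # marker: eroded image (<= original)
    return reconstruction(seed, img, method='dilation')

def closing_by_reconstruction(img, se):
    seed = dilation(img, se)                # marker: dilated image (>= original)
    return reconstruction(seed, img, method='erosion')

def differential_morphological_profile(band, radii=(1, 2, 3)):
    """Absolute differences between successive opening/closing levels."""
    openings = [band] + [opening_by_reconstruction(band, disk(r)) for r in radii]
    closings = [band] + [closing_by_reconstruction(band, disk(r)) for r in radii]
    d_open = [np.abs(openings[i + 1] - openings[i]) for i in range(len(radii))]
    d_close = [np.abs(closings[i + 1] - closings[i]) for i in range(len(radii))]
    return np.stack(d_close[::-1] + d_open, axis=-1)

# Toy hyperspectral cube: 64x64 pixels, 100 bands of random data (assumption).
cube = np.random.rand(64, 64, 100)
pc1 = PCA(n_components=1).fit_transform(cube.reshape(-1, 100)).reshape(64, 64)
dmp = differential_morphological_profile(pc1)
print(dmp.shape)   # (64, 64, 6): 3 closing + 3 opening differential levels
```

In the EMP approach used as the comparison baseline in the paper, such profiles are computed on several leading principal components and stacked as features; EDMP instead orders the full spectral vectors directly (via a distance-based vector ordering) rather than working component by component.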
Classification codes: P237.4 [Astronomy and Earth Sciences: Photogrammetry and Remote Sensing]; TP753 [Astronomy and Earth Sciences: Surveying and Mapping Science and Technology]