Authors: 胡博[1], 鞠洪波[1], 刘华[1], 郝泷, 刘海[1]
Affiliation: [1] Research Institute of Forest Resource Information Techniques, Chinese Academy of Forestry, Beijing 100091, China
Source: Forest Research (《林业科学研究》), 2017, No. 2, pp. 194-199
Funding: National 863 Program of China (2012AA102001)
Abstract: [Objective] To exploit the timeliness and large-area coverage of remote sensing imagery and, by combining multiple classification rules on the basis of evidence theory, to classify vegetation over large regions quickly and efficiently. [Method] First, a classification system was designed following the frame-of-discernment concept, and training samples were extracted with a rapid sampling method suited to large areas. Second, the vegetation-type feature images produced by several single classification rules were normalized into basic probability assignments (BPAs), which served as evidence sources expressing the degree of belief in each type; the belief information from the different evidence sources was then combined according to the combination rules of evidence theory. Third, the vegetation type was determined from the combined result by the maximum-belief principle. Finally, validation samples were randomly placed in areas where the Vegetation Map of China and the Land Cover Map of China agree on type. [Result] The overall accuracy of each single classifier ranged from 60% to 70%, that of the pairwise rule combinations ranged from 70% to 80%, and the combination of all three rules reached an overall accuracy of 80.84%. [Conclusion] The evidence-theory method that combines multiple classification rules improves classification accuracy; the higher the accuracy of the participating single classifiers and the more related evidence sources involved, the higher the accuracy of the combined result.
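The combination step described in the abstract (normalizing classifier outputs into basic probability assignments, fusing them with evidence theory, then deciding by maximum belief) can be sketched as below. This is a minimal illustration of Dempster's rule restricted to singleton focal elements (one mass per vegetation class, no compound sets), not the authors' implementation; the class names and mass values are hypothetical.

```python
def combine(m1, m2):
    """Combine two basic probability assignments (BPAs) over the same
    set of classes using Dempster's rule, singleton focal elements only."""
    classes = m1.keys()
    # Conflict K: total mass that the two sources assign to disagreeing classes
    k = sum(m1[a] * m2[b] for a in classes for b in classes if a != b)
    if k >= 1.0:
        raise ValueError("Total conflict: sources cannot be combined")
    # Agreeing masses multiplied, then renormalized by 1 - K
    return {c: (m1[c] * m2[c]) / (1.0 - k) for c in classes}

# Hypothetical BPAs for one pixel from two single-rule classifiers
m_rule1 = {"forest": 0.6, "grassland": 0.3, "cropland": 0.1}
m_rule2 = {"forest": 0.5, "grassland": 0.2, "cropland": 0.3}

combined = combine(m_rule1, m_rule2)
# Maximum-belief decision: the class with the largest combined mass
label = max(combined, key=combined.get)
print(label)  # → forest
```

A third evidence source would be folded in the same way (Dempster's rule is associative for non-conflicting sources), which mirrors the paper's progression from pairwise combinations to the three-rule combination.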