Authors: ZHOU Wenfang [1]; YANG Yaoning
Affiliations: [1] Yangtze University College of Arts and Sciences, Jingzhou 434020, Hubei, China; [2] School of Architecture and Planning, Yunnan University, Kunming 650500, China; [3] Technische Universität Berlin, Berlin 10623, Germany
Source: Laser Journal, 2024, No. 2, pp. 124-128 (5 pages)
Fund: Yunnan Provincial Youth Science Foundation (No. 202001BB050070)
Abstract: Poor classification of spectrally similar images increases the redundancy of spectral information and reduces spectral detection efficiency in fields such as ground-feature exploration and military defense. To distinguish spectral information and spectral curves homogeneously across multiple elements, a classification method for spectrally similar images that takes the characteristics of associated bands into account is proposed. The method first uses spectral matching to eliminate white-light-source overexposure in the spectrally similar images. It then extracts the associated bands of the optimized images and feeds them into a support vector machine (SVM) as clustering features. Finally, classification is carried out according to the SVM output. Experimental results show that the proposed method produces high-definition classification maps with few classification errors or mis-colored pixel blocks, and that the rectangular blocks lying on the same row and column (the diagonal) of the confusion matrix achieve high classification accuracy.
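The pipeline the abstract describes (exposure correction, associated-band extraction, SVM classification) can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the paper's implementation: per-pixel normalization stands in for the spectral-matching overexposure correction, label correlation stands in for the associated-band criterion, and scikit-learn's `SVC` is assumed as the SVM. All thresholds, shapes, and names here are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic data: 200 pixels x 30 spectral bands, two spectrally similar classes.
n_pixels, n_bands = 200, 30
y = rng.integers(0, 2, n_pixels)
X = rng.normal(0.0, 1.0, (n_pixels, n_bands))
X[:, :5] += y[:, None] * 0.8  # only the first 5 bands carry class signal

# Step 1 (stand-in for overexposure correction): normalize each pixel spectrum.
X = X / np.linalg.norm(X, axis=1, keepdims=True)

# Step 2 (stand-in for associated-band extraction): rank bands by absolute
# correlation with the labels and keep the 5 strongest.
corr = np.array([abs(np.corrcoef(X[:, b], y)[0, 1]) for b in range(n_bands)])
selected = np.argsort(corr)[::-1][:5]

# Step 3: train an SVM on the selected bands and score held-out pixels.
split = 150
clf = SVC(kernel="rbf").fit(X[:split][:, selected], y[:split])
acc = clf.score(X[split:][:, selected], y[split:])
print(f"selected bands: {sorted(selected.tolist())}, test accuracy: {acc:.2f}")
```

Because only the first five bands were given class-dependent shifts, the correlation ranking tends to recover them, and the SVM separates the two classes well above chance on the held-out pixels.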
Keywords: spectrally similar images; spectral matching; associated bands; clustering features; support vector machine
Classification code: TN751 [Electronics and Telecommunications — Circuits and Systems]