Authors: 王立国[1] 商卉 石瑶[1] WANG Liguo; SHANG Hui; SHI Yao (College of Information and Communications Engineering, Harbin Engineering University, Harbin 150001, China)
Affiliation: [1] College of Information and Communications Engineering, Harbin Engineering University, Harbin 150001, Heilongjiang, China
Source: Journal of Harbin Engineering University (《哈尔滨工程大学学报》), 2020, No. 5, pp. 731-737 (7 pages)
Funding: National Natural Science Foundation of China (61675051).
Abstract: Compared with natural true-color images, hyperspectral images are high-dimensional and offer few labeled samples. Traditional classification methods rely mainly on spectral features and tend to ignore spatial information. To address these problems, this paper proposes a classification framework that fuses spatial-spectral information and combines active learning with a label propagation algorithm. The Breaking Ties (BT) strategy, based on a probabilistic model, selects the most informative unlabeled samples to enlarge the training set. The Label Propagation (LP) algorithm then predicts the class labels of the selected unlabeled samples, and the SVM classifier is retrained on the augmented training set. Experiments show that, when labeled samples are insufficient, the proposed method achieves a classification accuracy of 76.89% on the Indian Pines data set and 95.23% on the Pavia University data set, outperforming several existing classification algorithms. Under scarce labeled samples, the combination of semi-supervised learning and active learning makes full use of unlabeled data and effectively improves classification accuracy.
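The Breaking Ties (BT) query strategy described in the abstract ranks unlabeled samples by the gap between their two highest class-posterior probabilities: the smaller the gap, the more ambiguous (and informative) the sample. A minimal sketch of that selection step, assuming the per-class probabilities come from any probabilistic classifier (e.g. an SVM with probability calibration); the function name and array shapes are illustrative, not from the paper:

```python
import numpy as np

def breaking_ties(probabilities, n_select):
    """Select the n_select most ambiguous samples by the Breaking Ties criterion.

    probabilities: array of shape (n_samples, n_classes) with class posteriors.
    Returns the indices of the samples whose top-two class probabilities
    are closest, i.e. the most uncertain candidates for labeling.
    """
    sorted_probs = np.sort(probabilities, axis=1)        # ascending per row
    margins = sorted_probs[:, -1] - sorted_probs[:, -2]  # best minus second-best
    return np.argsort(margins)[:n_select]                # smallest margins first

# Example: 4 unlabeled samples, 3 classes
probs = np.array([
    [0.90, 0.05, 0.05],  # confident -> large margin, not selected
    [0.40, 0.35, 0.25],  # ambiguous -> small margin
    [0.50, 0.45, 0.05],  # ambiguous -> small margin
    [0.70, 0.20, 0.10],
])
idx = breaking_ties(probs, 2)  # picks samples 1 and 2
```

In the paper's framework, the samples returned by this step would then be passed to label propagation for pseudo-labeling before retraining the SVM.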
Keywords: hyperspectral image; semi-supervised classification; spatial-spectral information; active learning; label propagation; principal component analysis; Gabor filtering; support vector machine
Classification code: TP753 [Automation and Computer Technology: Detection Technology and Automation Devices]