Affiliations: [1] Department of Information Engineering, The Second Artillery Engineering University, Xi'an 710025, Shaanxi, China [2] Department of Electronic Engineering, Tsinghua University, Beijing 100084, China [3] Beijing Institute of Remote Sensing Information, Beijing 100192, China
Source: Optics and Precision Engineering (《光学精密工程》), 2015, No. 9, pp. 2708-2714 (7 pages)
Funding: National Natural Science Foundation of China (No. 61132007, No. 61202332); China Postdoctoral Science Foundation (No. 2012M521905)
Abstract: For the sparse representation of hyperspectral features, a spatially weighted algorithm based on multiscale segmentation is proposed for hyperspectral image classification. The algorithm uses a more reasonable neighborhood definition to mine spatial prior information and to optimize the sparse representation of class-edge pixels. First, spatial neighborhood constraints are obtained through multiscale segmentation, and a Laplacian Scale Mixture (LSM) prior is combined with them to compute a spatially weighted sparse representation of the pixels in each neighborhood group. Then, a probabilistic Support Vector Machine (SVM) is used for classification, providing both the classification label of each pixel and its confidence. Finally, with these confidences as weights, the multiscale classification maps are fused to generate the final classification map. Experiments show that the algorithm enhances the sparsity and robustness of the spectral feature representation and improves the overall classification accuracy; with small training samples, the per-class classification accuracy can be improved by about 30%, which demonstrates the practicality of the algorithm in hyperspectral applications.
Keywords: hyperspectral image classification; sparse representation of spectra; spatial prior fusion; multiscale strategy
Classification code: TP751 [Automation and Computer Technology / Detection Technology and Automatic Equipment]
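
To make the processing chain described in the abstract concrete, the sketch below illustrates steps 2 and 3 only: probabilistic SVM classification and confidence-weighted fusion of multiscale classification maps. It is a minimal sketch on toy data, assuming scikit-learn's SVC with Platt scaling; the multiscale segmentation and LSM-weighted sparse coding of step 1 are not reproduced, and all array names and shapes are illustrative assumptions rather than the authors' implementation.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy stand-ins: 3 classes, 10 spectral bands, 20 training spectra per class.
n_classes, n_bands = 3, 10
centers = np.eye(n_classes, n_bands) * 3.0
X_train = np.repeat(centers, 20, axis=0) + rng.normal(size=(60, n_bands))
y_train = np.repeat(np.arange(n_classes), 20)

# Step 2: probabilistic SVM (Platt scaling); predict_proba supplies the
# per-pixel confidence that later serves as the fusion weight.
svm = SVC(kernel="rbf", probability=True, random_state=0).fit(X_train, y_train)

h, w = 8, 8  # toy image size

def classify(pixels):
    # pixels: (h*w, n_bands) spectra; returns a label map and a confidence map
    proba = svm.predict_proba(pixels)
    return proba.argmax(1).reshape(h, w), proba.max(1).reshape(h, w)

# Step 3: confidence-weighted fusion of the per-scale classification maps.
def fuse(label_maps, conf_maps):
    votes = np.zeros((h, w, n_classes))
    ii, jj = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    for labels, conf in zip(label_maps, conf_maps):
        votes[ii, jj, labels] += conf  # accumulate each scale's confidence
    return votes.argmax(-1)            # class with the largest total wins

# In the paper, each scale would carry its own spatially weighted sparse
# representation; random spectra stand in for that here.
scale_results = [classify(rng.normal(size=(h * w, n_bands))) for _ in range(3)]
fused_map = fuse([r[0] for r in scale_results], [r[1] for r in scale_results])
print(fused_map)

Accumulating confidence per class, rather than taking a majority vote, lets a single high-confidence scale override several uncertain ones, which matches the confidence-weighted fusion the abstract describes.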