Authors: Zheng Xinwei (郑歆慰), Hu Yanfeng (胡岩峰)[1], Sun Xian (孙显)[1,2], Wang Hongqi (王宏琦)[1,2]
Affiliations: [1] Institute of Electronics, Chinese Academy of Sciences, Beijing 100190; [2] Key Laboratory of Technology in Geo-spatial Information Processing and Application System, Chinese Academy of Sciences, Beijing 100190; [3] University of Chinese Academy of Sciences, Beijing 100190
Source: Journal of Electronics & Information Technology (《电子与信息学报》), 2014, No. 8, pp. 1891-1898 (8 pages)
Funding: supported by the Academic Exchange Project of the High-Resolution Earth Observation Program (GFZX04060103)
Abstract: To address the problem that sparse representation-based classifiers do not adapt well to multi-feature settings, this paper proposes a spatially constrained multi-feature joint sparse coding model and applies it to automatic annotation of remote sensing images. The model imposes an l1,2 mixed-norm regularization on the coding coefficients of multiple features, encouraging the coefficients to share a common sparsity pattern; this preserves cross-feature associations without imposing overly rigid constraints. Dictionary learning is also extended to the multi-feature setting: by constraining the transformation matrix used in dictionary updates, the model avoids losing cross-feature associations during learning. In addition, since spatial dependencies between patches of remote sensing images are useful for annotation but usually ignored or insufficiently exploited, a spatial-relation-constrained classification rule is designed that incorporates spatial coherence into multi-feature joint sparse coding, improving annotation performance. Experiments on a public remote sensing dataset and on large satellite images demonstrate the discriminative power and effectiveness of the proposed framework.
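The core idea of the abstract's l1,2 mixed-norm regularization can be sketched numerically. The following is a minimal illustrative example, not the authors' implementation: it solves a joint sparse coding problem for F features with proximal gradient descent (ISTA), where the penalty is the sum of the l2 norms of the rows of the coefficient matrix, so the same dictionary atoms are selected (or zeroed) across all features. All names (`joint_sparse_coding`, `features`, `dicts`, `lam`) are hypothetical.

```python
import numpy as np

def joint_sparse_coding(features, dicts, lam=0.1, n_iter=200):
    """Illustrative l1,2-regularized joint sparse coding (not the paper's code).

    features : list of F vectors x_f with shape (d_f,)
    dicts    : list of F dictionaries D_f with shape (d_f, K), sharing K atoms
    Solves   min_A  sum_f 0.5*||x_f - D_f A[:, f]||^2 + lam * sum_k ||A[k, :]||_2
    The l1,2 penalty zeroes entire rows of A, so all features share one
    sparsity pattern over the K atoms.
    """
    F = len(features)
    K = dicts[0].shape[1]
    A = np.zeros((K, F))
    # ISTA step size: 1 / (largest per-feature Lipschitz constant ||D_f||_2^2)
    L = max(np.linalg.norm(D, 2) ** 2 for D in dicts)
    for _ in range(n_iter):
        # gradient of the data-fit term, one column per feature
        G = np.column_stack([D.T @ (D @ A[:, f] - x)
                             for f, (x, D) in enumerate(zip(features, dicts))])
        B = A - G / L
        # proximal step: row-wise group soft-thresholding enforces the
        # shared (joint) sparsity pattern across features
        norms = np.linalg.norm(B, axis=1, keepdims=True)
        A = np.maximum(0.0, 1.0 - (lam / L) / np.maximum(norms, 1e-12)) * B
    return A
```

With this formulation, a row of `A` is either entirely zero or active for every feature, which is the "common sparsity pattern" the abstract refers to; the paper additionally learns the dictionaries and adds a spatial-coherence term, which this sketch omits.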
Keywords: remote sensing image annotation; multi-feature joint sparse coding; multi-feature dictionary learning; spatial information
Classification: TP751 (Automation and Computer Technology: Detection Technology and Automatic Devices)