Authors: ZHAO Chunhui [1]; MA Bobo (College of Information and Communication Engineering, Harbin Engineering University, Harbin 150001, China)
Affiliation: [1] College of Information and Communication Engineering, Harbin Engineering University, Harbin 150001, Heilongjiang, China
Source: Journal of Shenyang University (Natural Science), 2020, No. 3, pp. 224-232 (9 pages)
Funding: National Natural Science Foundation of China (61971153, 61571145); Key Project of the Natural Science Foundation of Heilongjiang Province (ZD201216).
Abstract: In traditional scene classification algorithms, low- and mid-level features cannot adequately express the scene semantics of high-resolution remote sensing images, which leads to low classification accuracy. To address this, a scene classification method for high-resolution remote sensing images is proposed that combines the bag-of-visual-words (BOVW) model with a convolutional neural network (CNN). First, local hand-crafted features extracted from the remote sensing image are encoded with the BOVW model to obtain mid-level features; a CNN is then used to extract high-level features. The mid- and high-level features are fused, and the fused features are fed into support vector machines (SVMs) with different kernel functions for classification. Experimental results show that the fused features represent remote sensing images more effectively than any single feature, and that the proposed method achieves higher scene classification accuracy than existing remote sensing scene classification methods, verifying its effectiveness.
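The abstract only outlines the pipeline (BOVW mid-level features, CNN high-level features, feature-level fusion, SVM classification with different kernels). The following is a minimal, hypothetical sketch of that fusion strategy in Python with scikit-learn; the descriptor extractor, CNN backbone, codebook size, and kernel choices are assumptions for illustration and are not taken from the paper.

```python
# Hypothetical sketch of the fusion pipeline described in the abstract:
# BOVW mid-level features + CNN high-level features, concatenated and classified by an SVM.
# Inputs (local descriptors per image, CNN feature vectors, labels) are assumed to be
# computed elsewhere; the paper's exact descriptors, network, and hyperparameters are not given here.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def bovw_features(descriptor_sets, n_words=200):
    """Encode each image's local descriptors as a visual-word histogram (mid-level feature)."""
    # Build the visual codebook from all local descriptors, then histogram each image.
    codebook = KMeans(n_clusters=n_words, random_state=0).fit(np.vstack(descriptor_sets))
    hists = []
    for desc in descriptor_sets:
        words = codebook.predict(desc)
        hist, _ = np.histogram(words, bins=n_words, range=(0, n_words), density=True)
        hists.append(hist)
    return np.array(hists)

def fuse_and_classify(bovw_feats, cnn_feats, labels, kernel="rbf"):
    """Concatenate mid-level (BOVW) and high-level (CNN) features, then train an SVM."""
    fused = np.hstack([bovw_feats, cnn_feats])  # simple feature-level fusion by concatenation
    clf = SVC(kernel=kernel)                    # kernel can be 'linear', 'rbf', etc.
    clf.fit(fused, labels)
    return clf
```

In this sketch the fusion is plain concatenation before the SVM; the abstract does not specify the fusion operator or feature weighting, so those details would need to be taken from the full paper.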
Keywords: high-resolution remote sensing image; scene classification; BOVW algorithm; CNN; feature fusion
Classification Code: TP751.1 [Automation and Computer Technology: Detection Technology and Automatic Equipment]