Affiliations: [1] School of Mechatronics Engineering, Harbin Institute of Technology, Harbin 150001, Heilongjiang, China [2] Space Control and Inertial Technology Research Center, Harbin Institute of Technology, Harbin 150001, Heilongjiang, China
Source: Journal of Harbin Engineering University, 2011, No. 5, pp. 643-649 (7 pages)
Funding: National Natural Science Foundation of China (70971030); Harbin Special Fund for Scientific and Technological Innovation Talent Research (2009RFQXG212); Fundamental Research Funds for the Central Universities (HIT.NSRIF.2010074)
Abstract: To classify moving objects in a surveillance scene accurately at the semantic level, a moving-object classification method is proposed based on the clustered kernel principal component analysis (CKPCA) of histograms of oriented gradients (HOG) and a binary-decision-tree support vector machine (SVM). First, foreground regions are extracted by background subtraction, and potential candidate moving objects are identified among them. Second, features of each candidate are extracted with the proposed CKPCA-HOG descriptor, which captures the objects' salient characteristics at a lower data dimension. Finally, the feature vectors are fed into a binary SVM decision tree, which produces accurate multi-class classification results. Experiments on different video sequences show that the algorithm classifies moving targets well and, compared with traditional classification methods, clearly improves computation speed. The experimental results demonstrate that the proposed target classification algorithm has good accuracy, reliability, and robustness.
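The feature pipeline in the abstract can be illustrated with a minimal numpy sketch: a simplified global HOG descriptor (no cell/block structure) followed by kernel PCA with an RBF kernel. This is not the paper's implementation — the clustering step of CKPCA and the binary-decision-tree SVM stage are omitted, and all function names, the `gamma` parameter, and the bin count are illustrative assumptions.

```python
import numpy as np

def hog_descriptor(image, n_bins=9):
    """Simplified global HOG: one orientation histogram over the whole patch.
    (The paper's descriptor uses a richer cell/block layout; this is a sketch.)"""
    gy, gx = np.gradient(image.astype(float))
    mag = np.hypot(gx, gy)                      # gradient magnitude
    ang = np.mod(np.arctan2(gy, gx), np.pi)     # unsigned orientation in [0, pi)
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    return hist / (np.linalg.norm(hist) + 1e-9)  # L2-normalize

def kpca_project(X, n_components=3, gamma=1.0):
    """Kernel PCA with an RBF kernel: center the Gram matrix, eigendecompose,
    and return the training samples projected onto the top components."""
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one   # double-centering
    vals, vecs = np.linalg.eigh(Kc)              # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]  # pick the largest ones
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return Kc @ alphas                           # lower-dimensional features

# Usage: describe candidate foreground patches, then reduce dimension.
rng = np.random.default_rng(0)
patches = [rng.random((32, 32)) for _ in range(8)]        # stand-in foreground patches
X = np.stack([hog_descriptor(p) for p in patches])        # (8, 9) HOG features
Z = kpca_project(X, n_components=3)                       # (8, 3) reduced features
```

The reduced vectors `Z` would then be passed to the multi-class SVM stage; swapping in per-cell histograms and the clustering refinement would recover something closer to the CKPCA-HOG descriptor the paper describes.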
Keywords: object classification; histogram of oriented gradients; kernel principal component analysis; binary decision tree support vector machine
Classification code: TP391 [Automation and Computer Technology — Computer Application Technology]