Fast incremental learning method for one-class support vector machine (cited by: 6)



Authors: 王洪波 [1], 赵光宙 [1], 齐冬莲 [1], 卢达 [1]

Affiliation: [1] College of Electrical Engineering, Zhejiang University, Hangzhou, Zhejiang 310027, China

Source: Journal of Zhejiang University: Engineering Science (《浙江大学学报(工学版)》), 2012, Issue 7, pp. 1327-1332 (6 pages)

Funding: National Natural Science Foundation of China (60872070); Zhejiang Provincial Science and Technology Plan projects (2008C21141, 2010C33044); Zhejiang Provincial Major Science and Technology Project (2010C11069)

Abstract: A fast incremental learning method for the one-class support vector machine (OCSVM) is proposed. A delta function is added to the initial OCSVM classifier to form a new decision function, which realizes the incremental learning process. By analyzing the geometric properties of the delta function, an objective function similar in form to that of OCSVM is constructed to solve for the delta function's parameters. This optimization problem can be further converted into a standard quadratic programming (QP) problem, but the Karush-Kuhn-Tucker (KKT) conditions change substantially during the optimization. A modified sequential minimal optimization (SMO) method is therefore proposed for the QP problem according to the new KKT conditions. The whole learning procedure operates directly on the initial classifier and trains only the newly added samples, avoiding retraining on the original samples, and thus saves a large amount of learning time and storage space. Experimental results show that the proposed fast incremental learning method outperforms other incremental learning methods in both time and accuracy.
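The abstract describes the new decision function as the initial OCSVM classifier plus a delta function whose parameters are obtained from a dedicated QP solved with a modified SMO under changed KKT conditions; those details are not given in this record. The sketch below is only a rough illustration of that decision-function structure, using scikit-learn's OneClassSVM and crudely approximating the delta term with a second model fitted on the new samples alone, which is an assumption for illustration and not the paper's actual solver.

```python
# Illustrative sketch only: shows the "initial classifier + delta correction"
# structure described in the abstract. It does NOT reproduce the paper's
# delta-function QP or its modified SMO; the delta term is approximated here
# by a second OCSVM trained only on the newly added samples.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_init = rng.normal(loc=0.0, scale=1.0, size=(200, 2))  # original training samples
X_new = rng.normal(loc=0.5, scale=1.0, size=(50, 2))    # newly arrived samples

# Initial OCSVM classifier: f_init(x) = sign(sum_i alpha_i k(x_i, x) - rho)
f_init = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1).fit(X_init)

# Hypothetical stand-in for the delta function, fitted on the new data only
# (the paper instead solves a QP for the delta function's parameters).
delta = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1).fit(X_new)

def incremental_decision(X):
    """New decision function: initial classifier output plus delta correction."""
    return np.sign(f_init.decision_function(X) + delta.decision_function(X))

print(incremental_decision(X_new[:5]))
```

Note that only the new samples are touched when forming the correction term, which mirrors the abstract's claim that the original samples never need to be retrained or stored.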

Keywords: one-class support vector machine; incremental learning; delta function; quadratic programming (QP); sequential minimal optimization (SMO); KKT conditions

Classification code: TP181 [Automation and Computer Technology / Control Theory and Control Engineering]
