Non-sparse Multiple Kernel Learning Approach for Support Vector Regression  (cited by: 2)


Authors: 胡庆辉[1,2,3]; 丁立新[1,3]; 刘晓刚[2]; 李照奎[1,3]

Affiliations: [1] School of Computer Science, Wuhan University, Wuhan 430072, Hubei, China; [2] Guangxi Colleges and Universities Key Laboratory Breeding Base of Robotics and Welding Technology, Guilin University of Aerospace Technology, Guilin 541004, Guangxi, China; [3] State Key Laboratory of Software Engineering, Wuhan University, Wuhan 430072, Hubei, China

Source: Journal of Sichuan University (Engineering Science Edition), 2015, No. 4, pp. 91-97 (7 pages)

Funding: National Natural Science Foundation of China (11301106); Guangxi Natural Science Foundation (2014GXNSFAA1183105); Key Research Project of Guangxi Colleges and Universities (ZD2014147); Research Foundation of Guilin University of Aerospace Technology (Y12Z028)

Abstract: To improve the performance of support vector regression machines, a multiple kernel learning algorithm for regression (NS-MKR) is proposed. The algorithm imposes an Lp-norm constraint (p > 1) on the combination coefficients of the base kernels, so that the learned coefficients are non-sparse, and solves the resulting problem with a two-step optimization method. First, a standard support vector regression problem with the weighted combination kernel is solved to learn the Lagrange multipliers; second, the combination coefficients of the base kernels are updated with a simple closed-form calculation. The two steps alternate until a predefined convergence criterion is met. Experiments on artificial and real datasets show that, compared with traditional single-kernel and sparse multiple kernel support vector regression methods, the proposed algorithm achieves better generalization performance.
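The abstract only outlines the two-step scheme, so the sketch below fills in the details with assumptions: the weight update uses the closed-form Lp-norm rule known from the non-sparse MKL literature, and scikit-learn's precomputed-kernel SVR serves as the inner solver. Function names, base-kernel bandwidths (gammas), and hyperparameters are illustrative, not the authors' implementation.

import numpy as np
from sklearn.svm import SVR

def rbf_kernel(X, Y, gamma):
    # Gaussian (RBF) base kernel matrix between rows of X and Y.
    d2 = (np.sum(X ** 2, axis=1)[:, None]
          + np.sum(Y ** 2, axis=1)[None, :] - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

def nonsparse_mkl_svr(X, y, gammas=(0.1, 1.0, 10.0), p=2.0,
                      C=10.0, eps=0.1, n_iter=20, tol=1e-4):
    # Two-step alternating optimization for non-sparse MKL SVR (sketch).
    kernels = [rbf_kernel(X, X, g) for g in gammas]       # base kernels K_m
    M = len(kernels)
    d = np.full(M, M ** (-1.0 / p))                       # uniform start, ||d||_p = 1
    svr = None
    for _ in range(n_iter):
        # Step 1: standard SVR with the weighted combination kernel.
        K = sum(dm * Km for dm, Km in zip(d, kernels))
        svr = SVR(kernel="precomputed", C=C, epsilon=eps).fit(K, y)
        beta = np.zeros(len(y))
        beta[svr.support_] = svr.dual_coef_.ravel()        # alpha - alpha*
        # Step 2: closed-form weight update under the Lp-norm constraint
        # (assumed from the non-sparse MKL literature, not given in the abstract):
        # ||w_m||^2 = d_m^2 * beta^T K_m beta, and d_m is proportional to ||w_m||^(2/(p+1)).
        w2 = np.array([dm ** 2 * beta @ Km @ beta for dm, Km in zip(d, kernels)])
        d_new = (w2 + 1e-12) ** (1.0 / (p + 1))
        d_new /= np.sum(d_new ** p) ** (1.0 / p)           # renormalize to ||d||_p = 1
        if np.linalg.norm(d_new - d) < tol:                # predefined convergence test
            d = d_new
            break
        d = d_new
    return svr, d

# Illustrative usage on a hypothetical toy 1-D regression problem.
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sinc(X).ravel() + 0.05 * rng.randn(100)
model, weights = nonsparse_mkl_svr(X, y, p=2.0)
print(weights)    # with p > 1 all base kernels typically keep a nonzero weight

Because the constraint is an Lp-norm with p > 1 rather than an L1-norm, the weight update spreads mass across all base kernels instead of zeroing most of them out, which is what makes the combination non-sparse.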

Keywords: multiple kernel learning; support vector regression; non-sparse kernel combination; two-step optimization

CLC Number: TP301.6 [Automation and Computer Technology — Computer System Architecture]

 
