Authors: Hu Qinghui (胡庆辉)[1,2,3], Ding Lixin (丁立新)[1,3], Liu Xiaogang (刘晓刚)[2], Li Zhaokui (李照奎)[1,3]
Affiliations: [1] School of Computer Science, Wuhan University, Wuhan 430072, Hubei, China; [2] Guangxi Colleges and Universities Key Laboratory Breeding Base of Robot and Welding Technology, Guilin University of Aerospace Technology, Guilin 541004, Guangxi, China; [3] State Key Laboratory of Software Engineering, Wuhan University, Wuhan 430072, Hubei, China
Source: Journal of Sichuan University (Engineering Science Edition), 2015, No. 4, pp. 91-97 (7 pages)
Funding: National Natural Science Foundation of China (11301106); Natural Science Foundation of Guangxi (2014GXNSFAA1183105); Key Scientific Research Project of Guangxi Higher Education Institutions (ZD2014147); Research Foundation of Guilin University of Aerospace Technology (Y12Z028)
Abstract: To improve the performance of support vector regression, a non-sparse multiple kernel learning algorithm for regression (NS-MKR) is proposed. The algorithm imposes an Lp-norm constraint (p > 1) on the combination coefficients of the base kernels to obtain a non-sparse solution, and adopts a two-step optimization procedure: first, a standard support vector regression problem based on the weighted combination kernel is solved to learn the Lagrange multipliers; then the combination coefficients of the base kernels are obtained by a simple closed-form calculation. The two steps are executed alternately until a predefined convergence criterion is satisfied. Experiments on artificial and real datasets show that the proposed algorithm achieves better generalization performance than traditional single-kernel and sparse multiple kernel support vector regression methods.
CLC number: TP301.6 [Automation and Computer Technology - Computer System Architecture]
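Below is a minimal Python sketch of the two-step alternating procedure the abstract describes, built on scikit-learn's SVR with a precomputed kernel. The choice of RBF base kernels, the hyperparameter values, the function name nonsparse_mkr, and the specific closed-form weight update (the standard non-sparse MKL update projected onto the Lp unit ball) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel

def nonsparse_mkr(X, y, gammas=(0.1, 1.0, 10.0), p=2.0, C=1.0,
                  epsilon=0.1, max_iter=50, tol=1e-4):
    """Sketch of non-sparse multiple-kernel SVR: alternate between a
    standard SVR on the weighted combined kernel and a closed-form
    Lp-norm update of the kernel weights."""
    # Base kernels: a few RBF kernels with different widths (an assumption;
    # the paper's set of base kernels may differ).
    kernels = [rbf_kernel(X, X, gamma=g) for g in gammas]
    M = len(kernels)
    theta = np.full(M, M ** (-1.0 / p))  # uniform start with ||theta||_p = 1

    for _ in range(max_iter):
        # Step 1: standard SVR on the weighted combination of base kernels,
        # which yields the Lagrange multipliers (dual coefficients).
        K = sum(t * Km for t, Km in zip(theta, kernels))
        svr = SVR(kernel="precomputed", C=C, epsilon=epsilon).fit(K, y)
        beta = np.zeros(len(y))
        beta[svr.support_] = svr.dual_coef_.ravel()  # alpha - alpha*

        # Step 2: closed-form update of the combination coefficients under
        # the Lp-norm constraint (assumed standard non-sparse MKL update).
        w_norms = np.array([theta[m] ** 2 * beta @ kernels[m] @ beta
                            for m in range(M)])
        w_norms = np.maximum(w_norms, 1e-12)
        new_theta = w_norms ** (1.0 / (p + 1))
        new_theta /= np.linalg.norm(new_theta, ord=p)  # ||theta||_p = 1

        # Stop once the weights change less than the tolerance.
        if np.linalg.norm(new_theta - theta) < tol:
            theta = new_theta
            break
        theta = new_theta

    return theta, svr
```

To predict on new points under these assumptions, one would form the test Gram matrix with the learned weights, e.g. K_test = sum(theta[m] * rbf_kernel(X_test, X, gamma=g) for m, g in enumerate(gammas)), and call svr.predict(K_test).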