Authors: REN Shengbing [1]; XIE Ruliang (School of Software, Central South University, Changsha 410075, China)
Affiliation: [1] School of Software, Central South University
Source: Computer Engineering (《计算机工程》), 2019, Issue 10, pp. 189-195 (7 pages)
Fund: Graduate Independent Exploration and Innovation Project of Central South University (1053320170432)
Abstract: In regularized multiple kernel learning, sparse kernel weights can discard useful information and degrade generalization performance, whereas selecting all kernel functions with a non-sparse model introduces redundant information and is sensitive to noise. To address these problems, an elastic-net regularized multiple kernel learning algorithm based on the AdaBoost framework is proposed. When a base classifier is selected at each iteration, the kernel weights are subject to an elastic-net regularization constraint, that is, a mixture of L1-norm and Lp-norm constraints; a base classifier built on the optimal convex combination of multiple base kernels is then constructed and integrated into the final strong classifier. Experimental results show that the proposed algorithm balances the sparsity and non-sparsity of the kernel weights while retaining the advantages of ensemble learning, and that, compared with the L1-MKL and Lp-MKL algorithms, it obtains classifiers with higher classification accuracy in fewer iterations.
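As a rough illustrative sketch only (not taken from the paper): the elastic-net constraint on the kernel weights described in the abstract can be written, for M base kernels K_1, ..., K_M with weight vector β and assumed trade-off parameters λ > 0 and mixing coefficient μ ∈ [0, 1] (neither named in the abstract), as

\min_{\beta \ge 0} \; \hat{R}\!\left(\sum_{m=1}^{M} \beta_m K_m\right) + \lambda \left( \mu \, \lVert \beta \rVert_1 + (1 - \mu) \, \lVert \beta \rVert_p^p \right), \qquad p > 1,

where \hat{R} denotes the empirical risk of the base classifier built on the combined kernel. At the extremes, μ = 1 recovers the sparse L1-MKL setting and μ = 0 the non-sparse Lp-MKL setting, which is how the mixed constraint trades off the two behaviors.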
Keywords: ensemble learning; multiple kernel learning; elastic-net regularization; weak classifier; sparsity
Classification: TP18 [Automation and Computer Technology / Control Theory and Control Engineering]