Generalization of the AdaBoost Algorithm: A Family of Ensemble Learning Algorithms (Cited by: 9)

Ensemble Learning Algorithms:Generalization of AdaBoost


Authors: Fu Zhongliang [1,2], Zhao Xianghui [1,2], Miao Qing [1,2], Yao Yu [1,2]

Affiliations: [1] Chengdu Institute of Computer Applications, Chinese Academy of Sciences, Chengdu 610041, Sichuan, China; [2] Graduate University of Chinese Academy of Sciences, Beijing 100049, China

Source: Journal of Sichuan University (Engineering Science Edition), 2010, No. 6, pp. 91-98 (8 pages)

Funding: National High Technology Research and Development Program of China project 2008AAO1Z402; Sichuan Province Key Science and Technology Program projects 2007Z01-024 and 2009SZ0214

Abstract: To overcome the limitation that AdaBoost is suitable only for unstable learning algorithms, a method of adjusting the class centers of samples according to their sample weights was proposed, based on the idea that each newly added classifier should reduce the training error of the ensemble. With this method, AdaBoost can be combined with stable learning algorithms to form new ensemble learning algorithms, such as one that dynamically adjusts the centers of sample attributes, one that classifies by a weighted distance measure, and one that dynamically combines sample attributes, greatly extending the range of applicability of AdaBoost. Moreover, because the combination coefficients and the sample-weight update strategy of AdaBoost pursue the goal of reducing the training error only indirectly, a directly goal-oriented ensemble learning algorithm was also proposed. Experiments and analysis on UCI datasets show that the proposed generalized AdaBoost algorithms are effective, and that some of them outperform the original AdaBoost algorithm.
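The core idea summarized in the abstract, making a stable nearest-class-center learner sensitive to sample weights so that it can serve as a boostable base classifier, can be sketched as follows. This is a minimal hypothetical illustration in Python/NumPy, not the paper's actual algorithm; the function names, the binary-label restriction, and the stopping rule are assumptions.

```python
import numpy as np

def weighted_centroid_adaboost(X, y, n_rounds=10):
    """AdaBoost-style ensemble whose base learner is a nearest-centroid
    classifier with class centers recomputed from the current sample
    weights (a sketch of "adjusting sample centers by sample weights").
    Labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # sample weights, start uniform
    ensemble = []                        # list of (alpha, class centers)
    for _ in range(n_rounds):
        # Weighted class centers: this is what makes the otherwise
        # stable centroid learner respond to the boosting weights.
        centers = {c: np.average(X[y == c], axis=0, weights=w[y == c])
                   for c in (-1, 1)}
        # Nearest-centroid prediction for every sample.
        pred = np.where(
            np.linalg.norm(X - centers[1], axis=1)
            < np.linalg.norm(X - centers[-1], axis=1), 1, -1)
        err = w[pred != y].sum()         # weighted training error
        if err == 0:                     # perfect round: keep it and stop
            ensemble.append((1.0, centers))
            break
        if err >= 0.5:                   # no longer a weak learner: stop
            break
        alpha = 0.5 * np.log((1 - err) / err)   # AdaBoost coefficient
        ensemble.append((alpha, centers))
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified samples
        w /= w.sum()
    return ensemble

def predict(ensemble, X):
    """Weighted vote of all rounds' nearest-centroid classifiers."""
    score = np.zeros(len(X))
    for alpha, centers in ensemble:
        pred = np.where(
            np.linalg.norm(X - centers[1], axis=1)
            < np.linalg.norm(X - centers[-1], axis=1), 1, -1)
        score += alpha * pred
    return np.where(score >= 0, 1, -1)
```

A plain nearest-centroid classifier ignores sample weights entirely, so repeated boosting rounds would produce identical classifiers; recomputing the centers as weighted averages is the minimal change that lets the ensemble concentrate on hard samples.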

Keywords: ensemble learning; AdaBoost; classifier combination; weak learning theorem

Classification: TP391 [Automation and Computer Technology: Computer Application Technology]

 
