Affiliation: [1] School of Computer Science, Xidian University, Xi'an 710071, Shaanxi, China
Source: Journal of Software (《软件学报》), 2013, Issue 11, pp. 2584-2596 (13 pages)
Funding: National Natural Science Foundation of China (61072109, 61272280, 41271447, 61272195); Program for New Century Excellent Talents in University (NCET-12-0919); Xi'an Science and Technology Bureau Project (CXY1341(6)); Fundamental Research Funds for the Central Universities (K5051203020, K5051203001, K5051303016, K5051303018, K50513100006)
Abstract: AdaBoost is an important ensemble learning meta-algorithm, and its core property, "Boosting", is also an effective tool for cost-sensitive learning. However, existing cost-sensitive Boosting algorithms, such as AdaCost, the AdaC series (AdaC1, AdaC2, AdaC3), and the CSB series (CSB0, CSB1, CSB2), are heuristic: they insert cost parameters into AdaBoost's voting-weight formula or its sample-weight update rule so as to force the algorithm to focus on samples with higher misclassification costs. These heuristic modifications lack theoretical justification, and the changes break AdaBoost's most important theoretical property, Boosting itself. Whereas AdaBoost converges to the optimal Bayes decision rule, these cost-sensitive variants do not converge to the cost-sensitive Bayes decision rule. To address this problem, this paper studies cost-sensitive Boosting algorithms that strictly follow the Boosting theoretical framework. First, two new loss functions are constructed by making the exponential loss and the logit loss of the classification margin cost sensitive; the new losses can be proved Fisher consistent in the cost-sensitive sense, so that, in the ideal case, optimizing them converges to the cost-sensitive Bayes decision rule. Second, performing gradient descent in function space on the new losses within the Boosting framework yields two new cost-sensitive Boosting algorithms, AsyB and AsyBL. Experimental results on two-dimensional synthetic Gaussian data show that, compared with existing cost-sensitive Boosting algorithms, AsyB and AsyBL effectively approximate the cost-sensitive Bayes decision rule; results on UCI datasets further verify that AsyB and AsyBL generate cost-sensitive classifiers with lower misclassification costs, and that the misclassification cost decreases exponentially with the number of iterations.
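The record carries no formulas, but the cost-sensitive construction the abstract describes can be illustrated with a standard cost-sensitive variant of the exponential loss. The form below, with per-class costs C1 and C2, is a common choice in the cost-sensitive boosting literature and is given here as an assumption, not necessarily the paper's exact definition.

```latex
% Assumed cost-sensitive exponential loss with class costs C_1, C_2 > 0
% (a standard form from the literature; the paper's definition may differ).
\[
  L\bigl(y, f(x)\bigr) \;=\;
  \mathbb{I}[y = +1]\, e^{-C_1 f(x)} \;+\; \mathbb{I}[y = -1]\, e^{C_2 f(x)}
\]
% Pointwise minimization over f, with p(x) = P(y = +1 \mid x), gives
\[
  f^{*}(x) \;=\; \frac{1}{C_1 + C_2}\,
  \ln\frac{C_1\, p(x)}{C_2\,\bigl(1 - p(x)\bigr)},
\]
% so sign(f^*) predicts +1 exactly when C_1 p(x) > C_2 (1 - p(x)),
% i.e. the cost-sensitive Bayes decision rule -- which is what Fisher
% consistency in the cost-sensitive sense requires.
```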
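Likewise, the Boosting-by-functional-gradient-descent procedure the abstract outlines can be sketched in a few lines. This is a minimal illustration assuming the loss above, decision stumps as base learners, and an AdaBoost-style step size; the names asymmetric_boost, C_pos, and C_neg are hypothetical, and the paper's actual AsyB update (in particular its step-size computation) may differ.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def asymmetric_boost(X, y, C_pos, C_neg, n_rounds=100):
    """Minimal sketch of cost-sensitive boosting by functional gradient
    descent on an assumed loss exp(-C_i * y_i * f(x_i)), where C_i is
    C_pos for positive samples and C_neg for negative ones; y in {-1, +1}."""
    C = np.where(y == 1, C_pos, C_neg)   # per-sample misclassification cost
    F = np.zeros(len(y))                 # current ensemble score f(x)
    learners, alphas = [], []
    for _ in range(n_rounds):
        # The negative functional gradient of the loss at F is
        # C * y * exp(-C * y * F); fitting a classifier to y with these
        # weights approximates a step in that descent direction.
        w = C * np.exp(-C * y * F)
        w /= w.sum()
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        h = stump.predict(X)
        # AdaBoost-style step size as a stand-in for the paper's line search.
        err = np.clip(w[h != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)
        F += alpha * h
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def predict(learners, alphas, X):
    """Sign of the weighted vote of the boosted stumps."""
    score = sum(a * s.predict(X) for s, a in zip(learners, alphas))
    return np.sign(score)
```

For example, asymmetric_boost(X, y, C_pos=5.0, C_neg=1.0) would train an ensemble that penalizes missed positives five times as heavily as false alarms.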
Keywords: cost-sensitive learning; Bayes decision; Fisher consistency; AdaBoost; binary classification
Classification Code: TP181 [Automation and Computer Technology—Control Theory and Control Engineering]