Source: Computer Science (《计算机科学》), 2006, No. 4, pp. 159-161 (3 pages)
Funding: Supported by the Chongqing Natural Science Foundation (Grant No. 2005BB2224)
Abstract: When data is generated continuously, both Boosting and Bagging must buffer large amounts of data to train an ensemble of classifiers, which is impractical for large sample sets. This paper presents BEPOL, an online learning algorithm for Bayesian ensembles. It retains Boosting's weighted-sampling idea for determining the training sets, yet requires only a single pass over the data to update the ensemble online. To address the long training time and poor member diversity of serial training, BEPOL adopts parallel learning based on negative correlation: each Bayesian component is mapped onto a parallel computing structure, improving the efficiency of ensemble learning. Experiments on several UCI data sets show that BEPOL achieves classification performance comparable to batch learning (AdaBoost) with lower running time, which makes it particularly suitable for applications with time and space constraints, such as large or continuously generated data sets.
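The abstract does not give BEPOL's pseudocode, but the core idea it describes — a one-pass, Boosting-style reweighting scheme over incrementally updatable Bayesian ensemble members — can be illustrated generically. The sketch below is an assumption-laden illustration, not the authors' actual algorithm: it uses naive Bayes members with count-based incremental updates (naive Bayes trains from sufficient statistics, so a single pass suffices), and a hypothetical weight-doubling/halving rule standing in for the paper's weighted-sampling step; the class and method names are invented for this example.

```python
import math

class OnlineNB:
    """Naive Bayes over discrete features, updated incrementally from counts."""
    def __init__(self):
        self.class_counts = {}   # class -> total weight
        self.feat_counts = {}    # (class, feature index) -> {value: weight}
        self.total = 0.0

    def update(self, x, y, w=1.0):
        # Absorb one weighted sample into the sufficient statistics.
        self.class_counts[y] = self.class_counts.get(y, 0.0) + w
        self.total += w
        for i, v in enumerate(x):
            d = self.feat_counts.setdefault((y, i), {})
            d[v] = d.get(v, 0.0) + w

    def predict(self, x, classes):
        # Laplace-smoothed log-posterior; return the argmax class.
        best, best_lp = None, -math.inf
        for c in classes:
            cc = self.class_counts.get(c, 0.0)
            lp = math.log((cc + 1.0) / (self.total + len(classes)))
            for i, v in enumerate(x):
                fc = self.feat_counts.get((c, i), {})
                lp += math.log((fc.get(v, 0.0) + 1.0) / (cc + 2.0))
            if lp > best_lp:
                best, best_lp = c, lp
        return best

class OnlineBayesEnsemble:
    """One-pass Boosting-style ensemble: every member sees each sample once;
    samples misclassified by earlier members carry more weight downstream."""
    def __init__(self, n_members=5, classes=(0, 1)):
        self.members = [OnlineNB() for _ in range(n_members)]
        self.classes = list(classes)

    def learn_one(self, x, y):
        w = 1.0
        for m in self.members:
            pred = m.predict(x, self.classes)
            m.update(x, y, w)
            # Hypothetical reweighting rule: emphasize hard samples.
            w = w * 2.0 if pred != y else w * 0.5

    def predict(self, x):
        votes = {}
        for m in self.members:
            p = m.predict(x, self.classes)
            votes[p] = votes.get(p, 0) + 1
        return max(votes, key=votes.get)
```

Because each member stores only counts, no sample needs to be cached after it is seen, matching the single-scan constraint the abstract emphasizes; the members could also be updated concurrently, since each processes the stream independently given its incoming weight.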