Affiliations: [1] Department of Computer Science, Chongqing University of Posts and Telecommunications, Chongqing 400065 [2] Department of Computer Science, Chongqing Jiaotong University, Chongqing 400074
Source: 《计算机仿真》 (Computer Simulation), 2006, No. 10, pp. 92-94, 150 (4 pages)
Abstract: The naive Bayes classifier is a widely used classification algorithm with excellent computational efficiency and classification performance. However, because its underlying assumption (the "naive Bayes assumption") deviates from reality, it can produce poor results on some datasets. Many existing methods try to improve the classifier by relaxing this assumption, but usually at a sharply increased computational cost. This paper enhances the naive Bayes classifier with feature weighting. The weighting coefficients are derived directly from the data and can be interpreted as the degree to which each attribute influences the computation of a class's posterior probability. Numerical experiments show that the feature-weighted naive Bayes classifier (FWNB) performs comparably to other common classification algorithms such as tree-augmented naive Bayes (TAN) and naive Bayes trees (NBTree), with mean error rates around 17%; in speed, FWNB is close to NB and at least an order of magnitude faster than TAN and NBTree.

Abstract (English, as published): Naive Bayes classifiers are widely used in machine learning due to their computational efficiency and competitive accuracy. However, their conditional attribute independence assumption can result in poor performance on real-world problems. A number of techniques have explored relaxations of the attribute independence assumption to increase accuracy, but usually at a much higher computational cost. In this paper, we investigate enhancing the naive Bayes classifier with a feature weighting technique. The feature weighting coefficients are induced directly from the dataset and can be regarded as the significance of each attribute when evaluating the posterior probability of a particular class value. Experimental results show that the new algorithm, Feature-Weighted Naive Bayes (FWNB), reaches the same classification performance as state-of-the-art classifiers such as TAN and NBTree, with mean error rates around 18 percent, while the training time of FWNB is reduced by at least one order of magnitude.
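The abstract describes a feature-weighted variant of naive Bayes in which each attribute's contribution to the class posterior is scaled by a per-attribute weight, i.e. P(c|x) ∝ P(c) · ∏ₐ P(xₐ|c)^wₐ. The abstract does not give the paper's exact weight-derivation formula, so in the sketch below the weights are simply passed in as parameters (they could, for example, be computed from per-attribute information gain). A minimal illustration in Python, assuming categorical attributes and Laplace smoothing; all function and variable names are hypothetical:

```python
import math
from collections import Counter, defaultdict

def train_fwnb(X, y, weights, alpha=1.0):
    """Fit a feature-weighted naive Bayes model on categorical data.

    weights[a] scales attribute a's log-likelihood contribution;
    weight 1.0 for every attribute recovers standard naive Bayes.
    """
    classes = Counter(y)                 # class counts for the prior
    cond = defaultdict(Counter)          # cond[(a, c)][v] = count of value v
    values = defaultdict(set)            # distinct values seen per attribute
    for xi, c in zip(X, y):
        for a, v in enumerate(xi):
            cond[(a, c)][v] += 1
            values[a].add(v)
    return {"classes": classes, "n": len(y), "cond": cond,
            "values": values, "weights": weights, "alpha": alpha}

def predict_fwnb(model, x):
    """Return the class maximizing the weighted log-posterior."""
    best, best_lp = None, -math.inf
    for c, nc in model["classes"].items():
        lp = math.log(nc / model["n"])   # log prior P(c)
        for a, v in enumerate(x):
            k = len(model["values"][a])
            cnt = model["cond"][(a, c)][v]
            # Laplace-smoothed conditional P(v|c), raised to weight w_a
            p = (cnt + model["alpha"]) / (nc + model["alpha"] * k)
            lp += model["weights"][a] * math.log(p)
        if lp > best_lp:
            best, best_lp = c, lp
    return best
```

For example, on a toy weather dataset `train_fwnb(X, y, weights=[1.0, 0.5])` down-weights the second attribute, and `predict_fwnb` then combines the weighted log-likelihoods with the log prior. Because the weights only multiply log-probabilities already computed by standard NB counting, training and prediction costs stay essentially the same as plain naive Bayes, which matches the abstract's claim that FWNB's speed is close to NB.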
Classification code: TP391.4 [Automation and Computer Technology / Computer Application Technology]