Authors: YIN Ru (尹儒), MEN Changqian (门昌骞)[1], WANG Wenjian (王文剑)[2], LIU Shuze (刘澍泽)
Affiliations: [1] School of Computer and Information Technology, Shanxi University, Taiyuan 030006; [2] Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education, Shanxi University, Taiyuan 030006; [3] Department of Computer Science, Rensselaer Polytechnic Institute, Troy, NY 12180
Source: Pattern Recognition and Artificial Intelligence (《模式识别与人工智能》), 2018, No. 7, pp. 643-652 (10 pages)
Funding: National Natural Science Foundation of China (No. 61673249); Shanxi Province Returned Overseas Scholars Research Fund (No. 2016-004); CERNET Next Generation Internet Technology Innovation Project (No. NGIL20170601)
Abstract: The decision tree (DT) algorithm is constructed recursively, which yields low training efficiency, and an over-grown decision tree may overfit. Therefore, an algorithm called model decision tree (MDT) is proposed in this paper. First, an incomplete decision tree is generated recursively on the training dataset using the Gini index. Then, a simple classification model is applied to each impure pseudo leaf node (a non-leaf node whose samples do not all belong to the same class) to produce the final MDT. Compared with the original decision tree algorithm, MDT improves training efficiency with little or no loss of classification accuracy. Experimental results on benchmark datasets show that MDT is considerably faster than DT and has a certain ability to resist overfitting.
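The two-stage idea in the abstract can be sketched in plain Python: grow a Gini-based tree only to a limited depth (the "incomplete" tree), then, instead of splitting further, fit a simple classifier at each impure pseudo leaf node. This is a minimal illustrative sketch, not the authors' implementation; the nearest-centroid model standing in for the "simple classification model", the toy dataset, and all function names are assumptions.

```python
from collections import Counter

def gini(labels):
    # Gini index of a label list: 1 - sum_k p_k^2
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(X, y):
    # Exhaustive search for the (feature, threshold) pair minimizing
    # the weighted Gini index of the two children.
    n, best = len(y), None
    for f in range(len(X[0])):
        for t in sorted(set(row[f] for row in X)):
            left = [y[i] for i in range(n) if X[i][f] <= t]
            right = [y[i] for i in range(n) if X[i][f] > t]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / n
            if best is None or score < best[0]:
                best = (score, f, t)
    return best

def centroid_model(X, y):
    # Stand-in for the "simple classification model" placed at an impure
    # pseudo leaf: a nearest-centroid classifier (an illustrative choice).
    cents = {}
    for c in set(y):
        pts = [X[i] for i in range(len(y)) if y[i] == c]
        cents[c] = [sum(col) / len(pts) for col in zip(*pts)]
    def predict(x):
        return min(cents, key=lambda c: sum((a - b) ** 2 for a, b in zip(x, cents[c])))
    return predict

def build_mdt(X, y, depth, max_depth):
    if gini(y) == 0.0:                      # pure node: an ordinary leaf
        return ("leaf", y[0])
    if depth == max_depth:                  # impure pseudo leaf: fit a model
        return ("model", centroid_model(X, y))
    _, f, t = best_split(X, y)
    li = [i for i in range(len(y)) if X[i][f] <= t]
    ri = [i for i in range(len(y)) if X[i][f] > t]
    return ("split", f, t,
            build_mdt([X[i] for i in li], [y[i] for i in li], depth + 1, max_depth),
            build_mdt([X[i] for i in ri], [y[i] for i in ri], depth + 1, max_depth))

def predict(node, x):
    if node[0] == "leaf":
        return node[1]
    if node[0] == "model":
        return node[1](x)
    _, f, t, left, right = node
    return predict(left if x[f] <= t else right, x)

# Toy data: three classes that one split cannot fully separate, so the
# shallow tree leaves one impure pseudo leaf for the simple model.
X = [[0, 0], [0, 1], [1, 0], [1, 1], [5, 5], [5, 6], [6, 5], [9, 1], [9, 0], [8, 1]]
y = [0, 0, 0, 0, 1, 1, 1, 2, 2, 2]
tree = build_mdt(X, y, depth=0, max_depth=1)
acc = sum(predict(tree, xi) == c for xi, c in zip(X, y)) / len(y)
```

The depth cap replaces most of the recursive splitting with a single model fit per impure pseudo leaf, which is the source of the training speedup the abstract claims.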
CLC Number: TP391 [Automation and Computer Technology - Computer Application Technology]