Improved Ordinal Decision Tree Algorithm Based on Rank Entropy


Authors: Chen Jiankai (陈建凯) [1], Wang Xizhao (王熙照) [1], Gao Xianghui (高相辉) [1]

Affiliation: [1] Hebei Province Key Laboratory of Machine Learning and Computational Intelligence, College of Mathematics and Computer Science, Hebei University, Baoding 071002, China

Source: Pattern Recognition and Artificial Intelligence (模式识别与人工智能), 2014, No. 2, pp. 134-140 (7 pages)

Funding: Supported by the National Natural Science Foundation of China (No. 61170040) and the Natural Science Foundation of Hebei Province (No. F2013201110, F2013201220)

Abstract: When the expanded attribute is selected in decision tree learning based on rank entropy, the rank mutual information must be computed at every cut-point of each continuous-valued attribute, and the attribute whose cut-point yields the maximum value is chosen as the expanded attribute. The computational complexity is therefore high. To address this problem, the cut-points are divided into stable and unstable cut-points, and a mathematical model is established which proves theoretically that the rank mutual information function cannot attain its maximum at a stable cut-point, only at an unstable one. Consequently, the algorithm needs to traverse only the unstable cut-points instead of also evaluating the stable ones, which considerably improves the computational efficiency of building the decision tree. Numerical experiments confirm this result.
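To make the cut-point reduction concrete, below is a minimal Python sketch; it is not the authors' implementation. It assumes, by analogy with the classical boundary-point result for Shannon entropy, that the unstable cut-points are the midpoints between adjacent sorted samples whose class labels differ (the paper's formal definition of stable/unstable cut-points may differ), and it uses the dominance-set form of rank mutual information from rank-entropy-based decision trees. The function names (candidate_cut_points, rank_mutual_information, best_cut) and the toy data are illustrative only.

import numpy as np

def rank_mutual_information(a, y):
    # Rank mutual information between attribute column a and ordinal labels y,
    # using ascending dominance sets [x_i] = {x_j : value_j >= value_i} (sketch).
    n = len(y)
    rmi = 0.0
    for i in range(n):
        dom_a = np.sum(a >= a[i])                      # |[x_i]_A|
        dom_y = np.sum(y >= y[i])                      # |[x_i]_D|
        dom_both = np.sum((a >= a[i]) & (y >= y[i]))   # |[x_i]_A intersect [x_i]_D|
        rmi += -np.log(dom_a * dom_y / (n * dom_both)) / n
    return rmi

def candidate_cut_points(values, labels):
    # Keep only the assumed "unstable" cut-points: midpoints between adjacent
    # sorted samples whose class labels differ.
    order = np.argsort(values)
    v, y = values[order], labels[order]
    return [(v[i] + v[i + 1]) / 2.0
            for i in range(len(v) - 1)
            if v[i] != v[i + 1] and y[i] != y[i + 1]]

def best_cut(values, labels):
    # Evaluate rank mutual information only at the candidate cut-points and
    # return the cut with the largest value.
    best_c, best_score = None, -np.inf
    for c in candidate_cut_points(values, labels):
        a = (values > c).astype(int)    # binary ordinal attribute induced by cut c
        score = rank_mutual_information(a, labels)
        if score > best_score:
            best_c, best_score = c, score
    return best_c, best_score

# Example usage on a toy monotone data set (NumPy arrays assumed).
values = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
labels = np.array([0, 0, 1, 1, 2, 2])
print(best_cut(values, labels))

Under these assumptions, only the cut-points where the label sequence changes are scored, which is the source of the efficiency gain described in the abstract.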

Keywords: ordinal classification, ordinal decision tree, unstable cut-point

CLC Number: TP18 [Automation and Computer Technology / Control Theory and Control Engineering]

 
