Authors: LYU Shen-Huan, CHEN Yi-He, ZHOU Zhi-Hua
Affiliation: [1] National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China
Source: Chinese Journal of Electronics (电子学报(英文版)), 2022, Issue 6, pp. 1072-1080 (9 pages)
Funding: Supported by the National Natural Science Foundation of China (61921006).
Abstract: Deep forest is a tree-based deep model made up of non-differentiable modules that are trained without backpropagation. Although deep forests have achieved considerable success in a variety of tasks, feature concatenation, the key ingredient of forest representation learning, still lacks theoretical understanding. In this paper, we aim to understand the influence of feature concatenation on predictive performance. To enable such theoretical studies, we present the first mathematical formulation of feature concatenation based on the two-stage structure, which regards the splits along new features and raw features as a region selector and a region classifier, respectively. Furthermore, we prove a region-based generalization bound for feature concatenation, which reveals the trade-off between the Rademacher complexities of the two-stage structure and the fraction of instances that are correctly classified in the selected region. As a consequence, we show that, compared with prediction-based feature concatenation (PFC), interaction-based feature concatenation (IFC) obtains more abundant regions through distributed representation and alleviates the overfitting risk in local regions. Experiments confirm the correctness of our theoretical results.
Keywords: Deep forest; Overfitting; Generalization bound; Representation learning
Classification: TP181 [Automation and Computer Technology / Control Theory and Control Engineering]
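
To make the mechanism the abstract analyzes concrete, here is a minimal sketch of one layer of prediction-based feature concatenation (PFC): a forest's class-probability outputs (the "new features") are concatenated to the raw features before the next layer is trained. This is an illustrative sketch, not the authors' implementation; the function name `pfc_layer` and all parameter choices are assumptions for the example.

```python
# Hedged sketch of prediction-based feature concatenation (PFC),
# one layer of a deep-forest-style cascade.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict

def pfc_layer(X, y, n_estimators=50, random_state=0):
    """Augment raw features X with out-of-fold class-probability
    predictions of a random forest (the concatenated 'new features')."""
    forest = RandomForestClassifier(n_estimators=n_estimators,
                                    random_state=random_state)
    # Out-of-fold probabilities avoid leaking training labels
    # into the concatenated representation.
    proba = cross_val_predict(forest, X, y, cv=3,
                              method="predict_proba")
    return np.hstack([X, proba])

X, y = make_classification(n_samples=300, n_features=10,
                           n_classes=3, n_informative=6,
                           random_state=0)
X_aug = pfc_layer(X, y)
print(X_aug.shape)  # 10 raw features + 3 class probabilities per instance
```

In the paper's two-stage view, a next-layer tree that splits on the appended probability columns acts as a region selector, while its splits on the raw columns act as a region classifier within the selected region.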