Authors: LAI Yuping, PING Yuan, HE Wenda, WANG Baocheng, WANG Jingzhong, ZHANG Xiufeng
Affiliations: [1] Department of Information Security, North China University of Technology, Beijing 100144, China; [2] School of Information Engineering, Xuchang University, Xuchang 461000, China; [3] National Research Center for Rehabilitation Technical Aids, Beijing 100176, China
Source: Chinese Journal of Electronics (电子学报英文版), 2018, Issue 3, pp. 603-610 (8 pages)
Funding: the National Natural Science Foundation of China (No. 51335004, No. 61363085, No. 61303232); the Project of Action Plan Powerful School with Talents in North China University of Technology (No. XN018022); the Project of Science and Technology Innovation Service Capacity Building (No. PXM2017-014212-000002); the Program for Science & Technology Innovation Talents in Universities of Henan Province (No. 18HASTIT022); the Foundation of Henan Educational Committee (No. 16A520025, No. 18A520047)
Abstract: As a variant of the finite mixture model (FMM), the finite inverted Dirichlet mixture model (IDMM) cannot avoid the conventional challenges, such as how to select the appropriate number of mixture components based on the observed data. To ease these issues, we propose a variational inference framework for learning the IDMM, which has proved to be an efficient tool for modeling vectors with positive elements. Compared with the conventional expectation-maximization (EM) algorithm commonly used for learning FMMs, the proposed approach prevents over-fitting well. Furthermore, it simultaneously performs automatic determination of the number of mixture components and parameter estimation. Experimental results on both synthetic and real object-detection data confirm significant improvements in flexibility and efficiency.
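For context beyond the record itself: the inverted Dirichlet density underlying the IDMM has the standard form below (the notation here is assumed for illustration, not taken from the paper). For a D-dimensional vector x with strictly positive components and parameters alpha_1, ..., alpha_{D+1} > 0,

\[
\mathrm{IDir}(\mathbf{x}\mid\boldsymbol{\alpha})
= \frac{\Gamma\!\left(\sum_{d=1}^{D+1}\alpha_d\right)}{\prod_{d=1}^{D+1}\Gamma(\alpha_d)}
\prod_{d=1}^{D} x_d^{\alpha_d-1}
\left(1+\sum_{d=1}^{D} x_d\right)^{-\sum_{d=1}^{D+1}\alpha_d},
\qquad x_d>0,
\]

and a finite mixture with M components and mixing weights \(\pi_m\) takes the usual form

\[
p(\mathbf{x}) = \sum_{m=1}^{M} \pi_m\,\mathrm{IDir}(\mathbf{x}\mid\boldsymbol{\alpha}_m),
\qquad \pi_m \ge 0,\quad \sum_{m=1}^{M}\pi_m = 1.
\]

The variational approach described in the abstract learns the weights and component parameters jointly, pruning superfluous components rather than fixing M in advance as EM would require.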
Keywords: Bayesian estimation; mixture models; inverted Dirichlet distribution; variational inference; object detection
Classification: O212.8 [Natural Sciences: Probability Theory and Mathematical Statistics]