Authors: ZHANG Mei-yu [1]; XIANG Xiao-yu; JIANG Chen; JIAN Cheng-feng [1]
Affiliation: [1] Institute of Digital Media Technology, College of Computer Science and Technology, Zhejiang University of Technology, Hangzhou 310023, China
Source: Journal of Chinese Computer Systems (《小型微型计算机系统》), 2018, No. 7, pp. 1574-1578 (5 pages)
Funding: National Natural Science Foundation of China General Program (61672461, 61672463)
Abstract: Efficient and accurate hand detection in the presence of noise and skin-like backgrounds is a major problem in hand-detection research. To address it, this paper proposes a detection method based on an improved Aggregated Channel Features (ACF) descriptor. The ACF feature is extended with a multi-colorspace skin model and a histogram of oriented edges (HOE): the skin model highlights the difference between skin-colored and non-skin-colored objects, while the edge histogram describes boundary information between objects. To improve performance, the object-detection framework itself is also revised. On the one hand, the feature-computation step is restructured so that features are computed only once per image; on the other, the Edge Boxes algorithm is used to generate proposals, reducing the number of candidate windows. The features of each candidate window are then classified with XGBoost. Experiments show that the method detects hands effectively under Gaussian noise and interference from faces.
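The abstract's multi-colorspace skin model is not specified in detail, but its role can be illustrated with a single-colorspace version. The sketch below uses the widely cited YCbCr chrominance thresholds (77 ≤ Cb ≤ 127, 133 ≤ Cr ≤ 173); the actual paper combines several color spaces, and its exact model and thresholds are assumptions here, not taken from the source.

```python
def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 full-range RGB -> YCbCr conversion."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b):
    """Pixel-wise skin test on the chrominance channels only.

    Luminance (Y) is ignored, which is what makes this kind of rule
    robust to lighting changes but vulnerable to skin-like backgrounds
    (e.g. faces) -- the problem the paper's improved ACF targets.
    """
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return 77 <= cb <= 127 and 133 <= cr <= 173

# A light skin tone passes the chrominance test; saturated blue does not.
print(is_skin(200, 150, 120))  # True
print(is_skin(0, 0, 255))      # False
```

A full channel would apply `is_skin` per pixel to produce a binary skin mask, which would then be aggregated alongside the other ACF channels.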
Keywords: improved ACF; edge histogram; multi-colorspace skin model; Edge Boxes; XGBoost
Classification: TP391 [Automation and Computer Technology: Computer Application Technology]