Authors: 许良凤[1], 王家勇[1], 胡敏[1], 林辉[2], 侯登永, 崔婧楠
Affiliations: [1] Anhui Province Key Laboratory of Affective Computing and Advanced Intelligent Machine, Hefei University of Technology, Hefei 230009; [2] School of Electronic Science and Applied Physics, Hefei University of Technology, Hefei 230009; [3] South China University of Technology, Guangzhou 510006
Source: Journal of Electronic Measurement and Instrumentation, 2017, No. 4, pp. 522-529 (8 pages)
Funding: National Natural Science Foundation of China Young Scientists Fund (61300119); Key Program of the National Natural Science Foundation of China (61432004); Anhui Provincial Natural Science Foundation (1408085MKL16)
Abstract: A facial expression recognition method based on the local gradient dual-tree complex wavelet transform (DT-CWT) dominant direction pattern (DDP) is proposed. First, a four-level DT-CWT is applied to the normalized expression image; each level yields DT-CWT feature images in eight directions, comprising six high-frequency and two low-frequency directions. A new dominant direction pattern (IDDP) is constructed and used to encode each DT-CWT feature image. Then, the IDDP-coded feature images at each level are fused according to a gradient-direction-based fusion rule, and each fused image is divided into several non-overlapping, equal-sized blocks; the histogram of each block is computed, and the block histograms are concatenated to form the feature vector of the facial expression image. Finally, a nearest-neighbor classifier based on the Fisher-weighted chi-square statistic is used for classification. Extensive experiments show that the algorithm offers advantages in both recognition rate and recognition time.
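The pipeline described in the abstract can be sketched roughly in Python. The sketch below is a minimal illustration, not the authors' implementation: it uses the open-source `dtcwt` package for the transform, and substitutes a simple magnitude-maximum fusion, plain block histograms, and caller-supplied per-dimension weights for the paper's IDDP coding, gradient-direction fusion rule, and Fisher weighting, which the abstract does not specify in reproducible detail. All function names, grid sizes, and bin counts are illustrative assumptions.

```python
import numpy as np
import dtcwt  # open-source DT-CWT implementation (pip install dtcwt)

def dtcwt_block_histograms(image, levels=4, grid=(8, 8), bins=16):
    """Sketch of the feature pipeline: multi-level DT-CWT, per-level
    fusion of directional subbands, non-overlapping block histograms,
    concatenated into one feature vector."""
    pyramid = dtcwt.Transform2d().forward(image.astype(float), nlevels=levels)
    features = []
    for level in range(levels):
        # highpasses[level] is an H x W x 6 complex array (6 directional
        # subbands). The paper additionally folds in 2 low-frequency
        # directions and an IDDP code; magnitude-maximum fusion across
        # directions is used here as a stand-in assumption.
        mags = np.abs(pyramid.highpasses[level])
        fused = mags.max(axis=2)
        h, w = fused.shape
        bh, bw = h // grid[0], w // grid[1]
        for i in range(grid[0]):
            for j in range(grid[1]):
                block = fused[i*bh:(i+1)*bh, j*bw:(j+1)*bw]
                hist, _ = np.histogram(block, bins=bins)
                features.append(hist / max(hist.sum(), 1))  # normalize
    return np.concatenate(features)

def weighted_chi2_nn(query, gallery, labels, weights):
    """Nearest neighbor under a weighted chi-square distance; the
    per-dimension `weights` stand in for the paper's Fisher weighting."""
    d = weights * (gallery - query) ** 2 / (gallery + query + 1e-12)
    return labels[np.argmin(d.sum(axis=1))]
```

As a usage sketch, `gallery` would be a matrix whose rows are `dtcwt_block_histograms` features of training expression images, `labels` their expression classes, and `weights` a vector of per-dimension discriminability scores (uniform weights reduce this to the ordinary chi-square nearest neighbor).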