Authors: CHEN Chang-wei; ZHOU Xiao-feng
Affiliations: [1] College of Computer and Information, Hohai University, Nanjing 210098, China; [2] College of Information Engineering, Nanjing Xiaozhuang University, Nanjing 211171, China
Source: Computer Science (《计算机科学》), 2021, No. 9, pp. 208-215 (8 pages)
Funding: National Natural Science Foundation of China (11101216); Nanjing Xiaozhuang University research project (2019NXY25).
Abstract: To address the high computational time complexity of the collaborative representation based classifier (CRC), this paper exploits the positive correlation between the magnitude of the reconstruction coefficients and the sample labels, and proposes a local fast collaborative representation based classifier for face recognition. First, the linear regression problem under an L2-norm constraint is solved by least squares. The reconstruction coefficients are then screened, and negative coefficients that are unhelpful for classification are discarded. Finally, the sample-reconstruction step of the original CRC is dropped, and a maximum similarity criterion is used to determine the class of the test sample. By exploiting the local similarity of samples, the method improves recognition accuracy, and since no sample reconstruction is required, its solution complexity is greatly reduced. Experimental results on the AR and CMU PIE datasets show that the proposed method is far faster than CRC and achieves higher recognition rates than other existing related algorithms under varying illumination, expression, and pose conditions.
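The three steps described in the abstract (a closed-form L2-regularized least-squares solve, pruning of negative reconstruction coefficients, and a class decision without per-class reconstruction) can be illustrated with a short sketch. This is only an illustration under stated assumptions, not the authors' code: the function name lfcrc_classify, the regularization value, and the choice of summing each class's positive coefficients as the "maximum similarity" score are assumptions made here for concreteness.

```python
import numpy as np

def lfcrc_classify(X, labels, y, lam=1e-3):
    """Sketch of the local fast CRC idea described in the abstract.

    X      : (d, n) training matrix, one face image per column (assumed normalized)
    labels : (n,) class label of each training column
    y      : (d,) test sample
    lam    : L2 regularization weight (illustrative value, not from the paper)
    """
    # Step 1: linear regression under an L2-norm constraint, solved in
    # closed form by (regularized) least squares.
    n = X.shape[1]
    alpha = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)

    # Step 2: screen the reconstruction coefficients and discard the negative
    # ones, which the paper treats as unhelpful for classification.
    alpha = np.where(alpha > 0.0, alpha, 0.0)

    # Step 3: no per-class sample reconstruction; assign the label whose
    # training samples accumulate the largest coefficient mass (one plausible
    # reading of the "maximum similarity criterion" -- an assumption).
    classes = np.unique(labels)
    scores = np.array([alpha[labels == c].sum() for c in classes])
    return classes[np.argmax(scores)]
```

In this sketch the only step whose cost grows with the training set is the n-by-n linear solve; the per-class reconstruction and residual computation of standard CRC is replaced by a simple coefficient aggregation, which is consistent with the reduced per-query cost described in the abstract.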
Classification code: TP183 [Automation and Computer Technology / Control Theory and Control Engineering]