Author: LIANG Ying (College of Basic Education, Xi'an Siyuan University, Xi'an 710038, China)
Source: Automation & Instrumentation (《自动化与仪表》), 2023, No. 6, pp. 10-13, 28 (5 pages)
Funding: 2019 university-level scientific research program project (1125KY01).
Abstract: To avoid the influence of lighting, pose, expression, and other factors on hyperspectral face recognition, an automatic hyperspectral face recognition control system based on MTCNN is designed. The hyperspectral face acquisition module collects hyperspectral face image samples through the camera of an artificial-intelligence machine and a CMOS image sensor. Block local binary patterns (LBP) are used to extract local hyperspectral face features, which are fed into an AdaBoost support vector machine to obtain the optimal spectral bands for different regions of the face. A multi-task convolutional neural network (MTCNN) then extracts features from the optimal bands, and a Softmax function completes hyperspectral face classification and recognition. Recognition results are stored in the face-library construction module and displayed on a mobile smart terminal. Experimental results show that the system performs well in hyperspectral face image acquisition and LBP feature extraction; it can effectively recognize hyperspectral faces under different lighting conditions and illumination angles, with an average recognition rate of 99.36%.
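The block-LBP feature extraction step described in the abstract can be sketched as follows. This is a generic illustration of block local binary patterns, not the paper's implementation: the 8-neighbour LBP variant, the 4×4 block grid, and the function names are assumptions.

```python
import numpy as np

def lbp_codes(img):
    """Compute the basic 8-neighbour LBP code for each interior pixel.

    img: 2-D grayscale array. Returns an (H-2, W-2) array of codes in 0..255,
    one bit per neighbour indicating whether it is >= the centre pixel.
    """
    centre = img[1:-1, 1:-1]
    # Neighbours in clockwise order starting from the top-left corner.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros(centre.shape, dtype=np.uint8)
    h, w = img.shape
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neighbour >= centre).astype(np.uint8) << bit
    return codes

def block_lbp_features(img, grid=(4, 4)):
    """Split the LBP code map into grid blocks and concatenate the
    normalised 256-bin histogram of each block into one feature vector."""
    codes = lbp_codes(img)
    gh, gw = grid
    h, w = codes.shape
    feats = []
    for i in range(gh):
        for j in range(gw):
            block = codes[i * h // gh:(i + 1) * h // gh,
                          j * w // gw:(j + 1) * w // gw]
            hist, _ = np.histogram(block, bins=256, range=(0, 256))
            feats.append(hist / max(block.size, 1))  # per-block normalisation
    return np.concatenate(feats)
```

Per the pipeline in the abstract, a vector like this would be computed per spectral band and passed to an AdaBoost-SVM stage for band selection; that stage is not shown here.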
Keywords: artificial intelligence machine; hyperspectral; face recognition; automatic control system; image sensor; spectral band selection
Classification: TP31 [Automation & Computer Technology — Computer Software and Theory]