Authors: YUAN Yan-xin; SUN Li[1]; ZHANG Qun[1] (Information and Navigation College, Air Force Engineering University, Xi'an, Shaanxi 710077, China)
Affiliation: [1] Information and Navigation College, Air Force Engineering University, Xi'an, Shaanxi 710077, China
Source: Journal of Signal Processing (《信号处理》), 2018, No. 5, pp. 602-609 (8 pages)
Funding: National Natural Science Foundation of China (61701531); Shaanxi Province Coordinated Innovation Project, Characteristic Industry Innovation Chain (2015KTTSGY04-06)
Abstract: Important military facilities, transportation hubs, security agencies, and similar sensitive sites face security risks, so authenticating and identifying human targets at such sites is of great importance. Using radar to collect human echo signals for feature extraction and target classification is an effective way to secure these sites, but the amount of human radar-echo data that can be acquired in a limited time is small, and training a model on such small samples is prone to over-fitting, which degrades classification accuracy. This paper proposes an identity-authentication method based on a convolutional neural network (CNN) and micro-motion features, applying the idea of transfer learning. First, the CNN classification model is pre-trained on the MNIST data set so that it acquires the ability to abstract features; human micro-motion samples are then fed into the model to obtain their features. Next, the model's classifier is further trained on the human micro-motion data set for classification and recognition. Experimental results show that the method achieves a high recognition rate on the walking-sample test set and a marked improvement in accuracy over other approaches.
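The transfer-learning scheme described in the abstract (pre-train a CNN on MNIST, then freeze the feature extractor and retrain only the classifier head on the small micro-motion data set) can be sketched as follows. This is a minimal illustration in PyTorch: the layer sizes, the `SmallCNN` architecture, and the number of enrolled identities are assumptions for demonstration, not the authors' actual model.

```python
import torch
import torch.nn as nn

# Illustrative CNN: a convolutional feature extractor plus a linear
# classifier head. The architecture is an assumption; the paper does
# not publish its exact layer configuration.
class SmallCNN(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 28x28 -> 14x14
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 14x14 -> 7x7
        )
        self.classifier = nn.Linear(16 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Step 1: pre-train on MNIST (10 digit classes) so the feature
# extractor learns generic abstraction ability. (Training loop omitted.)
model = SmallCNN(num_classes=10)

# Step 2: transfer -- freeze the feature extractor, swap in a new
# classifier head for the identity classes, and train only that head
# on the small radar micro-motion data set.
for p in model.features.parameters():
    p.requires_grad = False
num_identities = 4  # assumed number of enrolled subjects
model.classifier = nn.Linear(16 * 7 * 7, num_identities)

trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=1e-3)

# Sanity check: micro-motion spectrograms resized to 28x28 pass through.
x = torch.randn(2, 1, 28, 28)
out = model(x)
print(out.shape)  # torch.Size([2, 4])
```

Freezing the convolutional layers is what makes small-sample training viable here: only the head's parameters are optimized, so far fewer labeled micro-motion samples are needed and the over-fitting risk the abstract mentions is reduced.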
CLC number: TN957.51 [Electronics and Telecommunications: Signal and Information Processing]