Authors: Wang Lijun (王丽君) [1]; Liu Yanrong (刘彦戎) [1]; Wang Lijing (王丽静) (Shaanxi Institute of International Trade & Commerce, Xianyang 712046, China; Xi'an Shiyou University, Xi'an 710300, China)
Affiliations: [1] Shaanxi Institute of International Trade & Commerce, Xianyang 712046; [2] Xi'an Shiyou University, Xi'an 710300
Source: Journal of Electronic Measurement and Instrumentation (《电子测量与仪器学报》), 2020, No. 9, pp. 160-166 (7 pages)
Funding: Special Scientific Research Project of the Shaanxi Provincial Department of Education (18JK1041); Internet of Things and Intelligent Technology Science and Technology Innovation Team Construction Project (SSY18TD05).
Abstract: Traditional machine-learning approaches to human activity recognition (HAR) classify activities using hand-crafted features extracted from inertial data. Because such features capture only shallow, low-level information, the achievable recognition accuracy is limited. Deep-learning methods avoid this drawback by learning high-level features from labeled data. This paper adopts convolutional long short-term deep neural networks (CLDNN) for HAR, and replaces the conventional LSTM gated cell with the lighter GRU cell to improve network efficiency. Combining the strengths of convolutional neural networks (CNN) and recurrent neural networks (RNN), this architecture can both extract multi-level features from the inertial data and exploit the correlations in the time series. Experiments on an open-source dataset show that the method improves recognition accuracy by about 3% and 7% over a conventional CNN and an LSTM-gated RNN, respectively; after replacing the LSTM cells with GRU cells, training time and forward recognition (inference) time decrease by 14% and 10%, respectively.
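The abstract's claim that a GRU cell makes the network "lighter" than an LSTM cell can be illustrated by counting trainable parameters: an LSTM has four gate blocks (input, forget, output, candidate) while a GRU has only three (update, reset, candidate). The sketch below is an illustration only, not the authors' implementation; the 9-channel input and 128-unit hidden size are hypothetical values chosen for the example (9 channels would correspond to a typical 3-axis accelerometer + gyroscope + magnetometer IMU).

```python
def lstm_params(input_dim: int, hidden_dim: int) -> int:
    # An LSTM cell has 4 gate blocks; each block has an input weight
    # matrix (hidden x input), a recurrent weight matrix (hidden x hidden),
    # and a bias vector (hidden).
    return 4 * (hidden_dim * input_dim + hidden_dim * hidden_dim + hidden_dim)

def gru_params(input_dim: int, hidden_dim: int) -> int:
    # A GRU cell has only 3 gate blocks with the same per-block shape,
    # so it carries 25% fewer recurrent parameters than an LSTM.
    return 3 * (hidden_dim * input_dim + hidden_dim * hidden_dim + hidden_dim)

# Hypothetical example: 9-channel inertial input, 128 hidden units.
lstm = lstm_params(9, 128)   # 70656
gru = gru_params(9, 128)     # 52992
print(lstm, gru, 1 - gru / lstm)  # GRU saves 25% of the cell's parameters
```

The fixed 3/4 parameter ratio is consistent with the reported reductions in training and inference time, though the measured 14% and 10% savings are smaller than 25% because the convolutional layers' cost is unchanged by the cell swap.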