Authors: GU Liang, YU Lianzhi[1] (School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China)
Affiliation: [1] School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China
Source: Acta Metrologica Sinica (《计量学报》), 2024, No. 6, pp. 795-805 (11 pages)
Fund: National Natural Science Foundation of China (61603257).
Abstract: A DSConvBiGRU network model, suited to embedded systems and combining a depthwise separable convolutional neural network with a bidirectional gated recurrent unit, is proposed for classifying dynamic gesture sequences. A dynamic gesture recognition solution based on a low-resolution thermopile array sensor is designed and implemented, a dynamic gesture dataset is constructed and published on an open website, and the pre-trained network model is deployed on a Raspberry Pi edge device. The system preprocesses 20 consecutive temperature matrices output by the sensor with interval mapping, background subtraction, Lanczos interpolation, and Otsu binarization to obtain a single dynamic gesture sequence, which is then classified by the pre-trained DSConvBiGRU network. Experimental results show that the network model achieves a recognition accuracy of 99.291% on the test set, and that preprocessing and inference on the edge device take 5.513 ms and 8.231 ms respectively; the system meets the design requirements of low power consumption, high accuracy, and real-time performance.
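To make the pipeline described in the abstract more concrete, the following is a minimal Python sketch, not the authors' released code. The sensor frame size, the temperature mapping range, the upsampled 32×32 resolution, the layer widths, the number of classes, and every function and class name (preprocess_frame, build_sequence, DepthwiseSeparableConv, DSConvBiGRU) are illustrative assumptions; only the sequence of steps (interval mapping, background subtraction, Lanczos interpolation, Otsu binarization, then depthwise separable convolution features fed to a bidirectional GRU over 20 frames) follows the abstract.

```python
import numpy as np
import cv2
import torch
import torch.nn as nn

def preprocess_frame(raw, background, t_min=20.0, t_max=40.0, out_size=(32, 32)):
    """Preprocess one low-resolution thermopile frame (all parameters are assumed)."""
    # Interval mapping: scale raw temperatures (deg C) into 0-255
    mapped = np.clip((raw - t_min) / (t_max - t_min), 0.0, 1.0) * 255.0
    bg = np.clip((background - t_min) / (t_max - t_min), 0.0, 1.0) * 255.0
    # Background subtraction: suppress the static thermal background
    diff = np.clip(mapped - bg, 0, 255).astype(np.uint8)
    # Lanczos interpolation: upsample the coarse sensor frame
    up = cv2.resize(diff, out_size, interpolation=cv2.INTER_LANCZOS4)
    # Otsu binarization: separate the hand region from the background
    _, binary = cv2.threshold(up, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return binary

def build_sequence(frames, background):
    """Stack 20 consecutive preprocessed frames into one gesture sequence."""
    return np.stack([preprocess_frame(f, background) for f in frames], axis=0)

class DepthwiseSeparableConv(nn.Module):
    """Depthwise 3x3 convolution followed by a 1x1 pointwise convolution."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))

class DSConvBiGRU(nn.Module):
    """Per-frame depthwise separable CNN features fed to a bidirectional GRU."""
    def __init__(self, num_classes, hidden=64):
        super().__init__()
        self.features = nn.Sequential(
            DepthwiseSeparableConv(1, 16), nn.MaxPool2d(2),
            DepthwiseSeparableConv(16, 32), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),          # -> (N*T, 32, 1, 1)
        )
        self.gru = nn.GRU(32, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, x):                     # x: (N, T, 1, H, W), T = 20 frames
        n, t = x.shape[:2]
        f = self.features(x.flatten(0, 1)).flatten(1)  # (N*T, 32)
        out, _ = self.gru(f.view(n, t, -1))             # (N, T, 2*hidden)
        return self.fc(out[:, -1])                      # class logits

# Usage sketch (number of gesture classes assumed):
# seq = build_sequence(frames, background)              # frames: 20 raw temperature matrices
# x = torch.from_numpy(seq).float()[None, :, None] / 255.0   # (1, 20, 1, 32, 32)
# logits = DSConvBiGRU(num_classes=10)(x)
```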