Authors: GAO Hailing, WANG Xiaodong[1], ZHANG Lianjun[1], ZHAO Shenhao, JIN Jianguo
Affiliations: [1] Faculty of Electrical Engineering and Computer Science, Ningbo University, Ningbo 315211, Zhejiang, China; [2] Zhejiang DTCT Co., Ltd., Ningbo 315048, Zhejiang, China
Source: Journal of Ningbo University (Natural Science and Engineering Edition), 2023, No. 3, pp. 16-21 (6 pages)
Funding: Zhejiang Provincial Natural Science Foundation (LY20F010005); Ningbo "Science and Technology Innovation 2025" Major Project (2022T005)
Abstract: Research on video-based human action recognition has broad application potential, but modeling quality is strongly affected by movement type, environmental variation, and other factors. Most existing 3D-convolution methods for video human action recognition cannot distinguish important from unimportant features along each dimension of the input. To address this problem, a spatio-temporal feature processing network is built from a gated recurrent unit (GRU) and a spatial attention enhancement module, and the network is constructed with multi-level feature fusion and multi-group channel-attention feature selection, improving the ResNet3D base model for video human action recognition. The improved model achieves recognition accuracies of 96.42% and 71.08% on the two public datasets UCF101 and HMDB51, respectively, outperforming C3D, Two-Stream, and other generic network models, which indicates the effectiveness of the proposed approach.
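The abstract describes the architecture only at a high level: a ResNet3D backbone, channel-attention feature selection, a spatial attention enhancement module, and a GRU over the temporal dimension. Below is a minimal PyTorch sketch of how such components can be wired together; the module designs, layer sizes, and names (ChannelAttention, SpatialAttention, AttnGRUNet) are assumptions for illustration, not the paper's reported implementation.

import torch
import torch.nn as nn
import torchvision

class ChannelAttention(nn.Module):
    # Squeeze-and-excitation style channel reweighting; an assumed analogue
    # of the paper's "multi-group channel attention feature selection".
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                        # x: (B, C, T, H, W)
        w = x.mean(dim=(2, 3, 4))                # global average pool -> (B, C)
        return x * self.fc(w).view(x.size(0), -1, 1, 1, 1)

class SpatialAttention(nn.Module):
    # Single spatial attention map shared across channels (CBAM-style).
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv3d(2, 1, kernel_size=(1, 7, 7), padding=(0, 3, 3))

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)        # (B, 1, T, H, W)
        mx, _ = x.max(dim=1, keepdim=True)
        return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

class AttnGRUNet(nn.Module):
    # ResNet3D features -> channel + spatial attention -> GRU over time.
    def __init__(self, num_classes=101):
        super().__init__()
        backbone = torchvision.models.video.r3d_18(weights=None)
        self.features = nn.Sequential(*list(backbone.children())[:-2])  # drop pool/fc
        self.ca = ChannelAttention(512)          # r3d_18 layer4 outputs 512 channels
        self.sa = SpatialAttention()
        self.gru = nn.GRU(input_size=512, hidden_size=256, batch_first=True)
        self.fc = nn.Linear(256, num_classes)

    def forward(self, clip):                     # clip: (B, 3, T, H, W)
        f = self.sa(self.ca(self.features(clip)))
        f = f.mean(dim=(3, 4)).permute(0, 2, 1)  # one 512-d token per time step
        out, _ = self.gru(f)
        return self.fc(out[:, -1])               # classify from the last step

model = AttnGRUNet(num_classes=101)              # UCF101 has 101 action classes
logits = model(torch.randn(2, 3, 16, 112, 112))  # -> torch.Size([2, 101])

Pooling the attended 3D features per time step before the GRU is one common way to fuse convolutional spatial features with recurrent temporal modeling; the paper's multi-level fusion across backbone stages is omitted here for brevity.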
Classification: TP391.4 [Automation and Computer Technology - Computer Application Technology]