Author(s): ZHANG Heng-xin (张恒鑫); YE Ying-shi (叶颖诗); CAI Xian-zi (蔡贤资)[1]; WEI Fu-yi (魏福义)[1]
Affiliation(s): [1] College of Mathematics and Informatics, South China Agricultural University, Guangzhou 510642, China; [2] School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu 610054, China
Source: Computer Engineering and Design (《计算机工程与设计》), 2020, No. 11, pp. 3168-3174 (7 pages)
Funding: Guangdong Province Joint Graduate Training Demonstration Base Fund Project (2800-218178); China Agricultural Science and Education Foundation Project of the Ministry of Agriculture (NKJ201803050); South China Agricultural University 2018 Provincial-level Undergraduate Innovation and Entrepreneurship Program Project (201810564099).
Abstract: To ensure the efficiency of human detection and action recognition in video applications, an efficient, high-precision recognition algorithm combining a traditional action recognition method with a deep learning method is proposed. Video frames are converted to grayscale and processed with the Sobel operator; on this basis, local binary pattern (LBP) descriptors are obtained seamlessly and a cascaded support vector machine (SVM) rapidly detects the human body region. OpenPose is then used to obtain the 2D coordinates of the joint points within the detected region to build a skeleton model, several joint-vector-based spatio-temporal action features are extracted, and a kNN model performs action classification. Experiments on the MIT and Weizmann databases show that the proposed detection algorithm is 16 times faster than the traditional approach and that the action recognition algorithm achieves an 88.93% recognition rate, verifying the efficiency and accuracy of the algorithm.
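The pipeline described in the abstract (grayscale and Sobel preprocessing, LBP descriptors scored by an SVM for human detection, OpenPose joint coordinates turned into joint-vector spatio-temporal features, and kNN classification) can be sketched roughly as below. This is a minimal illustration assuming OpenCV, scikit-image and scikit-learn; the function names, the single SVM standing in for the paper's cascaded SVM, and the (T, J, 2) joint-array format are assumptions for illustration, not the authors' implementation.

    # Hedged sketch of the pipeline summarised in the abstract. All names and
    # data layouts here are illustrative assumptions, not the authors' code.
    import numpy as np
    import cv2
    from skimage.feature import local_binary_pattern
    from sklearn.svm import SVC
    from sklearn.neighbors import KNeighborsClassifier

    def preprocess(frame):
        """Grayscale the frame and emphasise edges with the Sobel operator."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
        gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
        return cv2.magnitude(gx, gy)

    def lbp_descriptor(window, p=8, r=1):
        """Uniform LBP histogram for one candidate detection window."""
        lbp = local_binary_pattern(window, P=p, R=r, method="uniform")
        hist, _ = np.histogram(lbp, bins=p + 2, range=(0, p + 2), density=True)
        return hist

    def joint_vector_features(joints):
        """Spatio-temporal features from 2D joint coordinates.

        `joints` is assumed to be a (T, J, 2) array of per-frame joint
        positions (e.g. as produced by OpenPose): joint-to-joint vectors per
        frame plus frame-to-frame joint displacements.
        """
        spatial = joints[:, :, None, :] - joints[:, None, :, :]   # joint vectors
        temporal = np.diff(joints, axis=0)                        # joint motion
        return np.concatenate([spatial.reshape(-1), temporal.reshape(-1)])

    # Detection stage: an SVM over LBP descriptors of candidate windows
    # (a single SVM here stands in for the cascaded SVM of the paper).
    detector = SVC(kernel="linear")
    # detector.fit(train_lbp_histograms, train_labels)  # 1 = human, 0 = background

    # Recognition stage: kNN over joint-vector features of whole sequences.
    recognizer = KNeighborsClassifier(n_neighbors=5)
    # recognizer.fit(train_sequence_features, train_action_labels)

As a usage note, each video would be preprocessed frame by frame, candidate windows scored with `detector`, and the joint sequences extracted from the detected regions classified with `recognizer`; the fit calls are left commented out because no training data accompanies this record.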
Keywords: action recognition; human detection; local binary pattern; joint vector; efficiency
Classification: TP391.4 [Automation and Computer Technology / Computer Application Technology]