Author: 李香凝 LI Xiangning (College of Information Engineering, Changchun College of Finance and Economics, Changchun 130112, Jilin, China)
Affiliation: [1] College of Information Engineering, Changchun College of Finance and Economics, Changchun 130112, Jilin, China
Source: Fluid Measurement & Control, 2023, No. 6, pp. 22-25 and 32 (5 pages in total)
Fund: Education and Teaching Research Project of Changchun College of Finance and Economics (XY202202).
Abstract: Artificial intelligence is integrated into traditional classroom practice by applying machine-vision-based behavior recognition to the classroom, so that students' learning states can be tracked in time and the efficiency of both teachers' teaching and students' learning can be improved. Using a machine-vision approach on a dataset of college students' classroom behavior, a temporal shift module for efficient video understanding (TSM) [1] is integrated into the temporal branch to capture richer temporal features, while a coordinate attention (CA) mechanism [2] is added to the spatial branch to capture richer spatial features. Compared with the baseline TSM model, the TOP-1 accuracy of college-student classroom behavior recognition is improved by 1.8%. In addition, compared with the efficient convolutional network for online video understanding (ECO) and the TSM model on the UCF-101 dataset, the proposed model achieves higher accuracy, especially when only a small fraction of frames is observed: when only the first 10.0% of frames are observed, it reaches 91.0% accuracy, which is 7.6% higher than ECO and 1.0% higher than TSM.
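The abstract describes coupling a temporal shift operation (TSM) in the temporal branch with coordinate attention (CA) in the spatial branch of a video recognition backbone. The following is a minimal PyTorch sketch of those two building blocks only, not the authors' implementation: the fold ratio of the shift, the reduction factor of the attention block, the toy tensor sizes, and where the modules would sit inside the backbone are all assumptions made for illustration.

```python
# Illustrative sketch only; hyperparameters and placement are assumptions.
import torch
import torch.nn as nn


def temporal_shift(x, n_segment, fold_div=8):
    """TSM-style temporal shift: 1/fold_div of the channels are taken from
    the next frame, another 1/fold_div from the previous frame, and the
    rest stay in place. x has shape (N*T, C, H, W) with T = n_segment."""
    nt, c, h, w = x.size()
    n = nt // n_segment
    x = x.view(n, n_segment, c, h, w)
    fold = c // fold_div
    out = torch.zeros_like(x)
    out[:, :-1, :fold] = x[:, 1:, :fold]                   # channels taken from the next frame
    out[:, 1:, fold:2 * fold] = x[:, :-1, fold:2 * fold]   # channels taken from the previous frame
    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]              # channels left unshifted
    return out.view(nt, c, h, w)


class CoordinateAttention(nn.Module):
    """Coordinate attention: factorize spatial attention into a height-wise
    and a width-wise component, then reweight the feature map."""

    def __init__(self, channels, reduction=32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn1 = nn.BatchNorm2d(mid)
        self.act = nn.ReLU(inplace=True)
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x):
        n, c, h, w = x.size()
        x_h = x.mean(dim=3, keepdim=True)                        # pool over width  -> (N, C, H, 1)
        x_w = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)    # pool over height -> (N, C, W, 1)
        y = self.act(self.bn1(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                    # (N, C, H, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (N, C, 1, W)
        return x * a_h * a_w                                     # broadcast over W and H


if __name__ == "__main__":
    # Toy example: 2 clips of 8 frames, 64 channels, 32x32 feature maps.
    feats = torch.randn(2 * 8, 64, 32, 32)
    shifted = temporal_shift(feats, n_segment=8)
    attended = CoordinateAttention(64)(shifted)
    print(attended.shape)  # torch.Size([16, 64, 32, 32])
```

In a TSM-style network these operations are usually applied inside residual blocks, with the shift on the block input and the attention on the spatial feature maps; the exact insertion points used in the paper are not specified in this record.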
Keywords: machine vision; classroom behavior recognition; temporal shift module for efficient video understanding (TSM); coordinate attention (CA)
Classification code: G42 [Culture and Science / Curriculum and Teaching Theory]