Authors: WANG Wei[1]; ZHAO Minrui; GAO Hongni[1]; ZHU Shuai; QU Jue[1,2]
Affiliations: [1] Air and Missile Defense College, Air Force Engineering University, Xi'an 710051, China; [2] School of Aeronautics, Northwestern Polytechnical University, Xi'an 710072, China
Source: Acta Aeronautica et Astronautica Sinica, 2021, No. 2, pp. 286-296 (11 pages)
Funding: National Natural Science Foundation of China (51675530)
Abstract: Intention recognition has received extensive attention in the field of Human-Computer Interaction (HCI). Traditional HCI intention recognition methods rely solely on electroencephalogram (EEG) or eye-movement data, without making full use of the advantages of the two modalities. This paper proposes an HCI intention recognition method that fuses EEG and eye-movement data: EEG and eye-movement signals are collected, features are extracted and fed into a machine-learning pattern-recognition network for intention recognition, and decision-level fusion based on Dempster-Shafer (D-S) evidence theory yields the final recognition result. Twenty valid subjects were recruited for interactive intention recognition experiments. The results show that the recognition accuracy of the fused EEG and eye-movement method is higher than that of methods relying on EEG or eye-movement data alone, providing theoretical and technical support for the adaptive design of HCI interfaces in aircraft and weapon equipment systems.
Classification: V7 [Aerospace Science and Technology]; R857.14 [Medicine and Health — Aviation, Space and Marine Medicine]
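The abstract describes decision-level fusion of the EEG and eye-movement classifier outputs via Dempster-Shafer evidence theory. The paper itself does not give its implementation details here; the following is a minimal sketch of Dempster's rule of combination, assuming a two-hypothesis frame of discernment ({intent, no_intent}) and illustrative mass values — the variable names and numbers are hypothetical, not taken from the study.

```python
from itertools import product

def ds_combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments (BPAs), each a dict mapping frozenset hypotheses
    to mass values that sum to 1."""
    fused = {}
    conflict = 0.0  # K: total mass assigned to contradictory pairs
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            fused[inter] = fused.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("total conflict; BPAs cannot be combined")
    # Normalize by 1 - K so the fused masses again sum to 1
    return {h: m / (1.0 - conflict) for h, m in fused.items()}

# Hypothetical per-modality outputs (illustrative numbers only):
INTENT = frozenset({"intent"})
NO_INTENT = frozenset({"no_intent"})
THETA = INTENT | NO_INTENT  # whole frame = residual uncertainty

m_eeg = {INTENT: 0.6, NO_INTENT: 0.2, THETA: 0.2}
m_eye = {INTENT: 0.7, NO_INTENT: 0.1, THETA: 0.2}

fused = ds_combine(m_eeg, m_eye)
decision = max(fused, key=fused.get)  # hypothesis with highest mass
```

When both modalities lean toward "intent", the combined mass on that hypothesis exceeds either individual mass, which is the behavior that motivates fusing the two classifiers at the decision level.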