Authors: Ming-Hao YANG, Jian-Hua TAO
Affiliations: [1] National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China; [2] The Center for Excellence in Brain Science and Intelligent Technology, Chinese Academy of Sciences, Beijing 100190, China; [3] University of Chinese Academy of Sciences, Beijing 100049, China
Source: Virtual Reality & Intelligent Hardware (虚拟现实与智能硬件(中英文)), 2019, No. 1, pp. 21-38 (18 pages)
Funding: the National Natural Science Foundation of China (61873269, 61425017, 61332017, 61831022); the National Key Research & Development Plan of China (2017YFB1002804).
Abstract: In multimodal human-computer dialog, non-verbal channels such as facial expression, posture, and gesture, combined with spoken information, are also important in the dialog procedure. Nowadays, despite the high performance of single-channel behavior computing, it is still a great challenge to understand users' intention accurately from their multimodal behaviors. One reason for this challenge is that multimodal information fusion still needs to be improved in theories, methodologies, and practical systems. This paper presents a review of data fusion methods in multimodal human-computer dialog. We first introduce the cognitive assumption of single-channel processing and then discuss its implementation methods in human-computer dialog; for the task of multimodal information fusion, several computing models are presented after we introduce the principle description of multimodal data fusion. Finally, some practical examples of multimodal information fusion methods are introduced, and the possible and important breakthroughs of data fusion methods in future multimodal human-computer interaction applications are discussed.
Keywords: Intention understanding; Multimodal human-computer dialog
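
The abstract above discusses multimodal information fusion in general terms without committing to a specific algorithm. As a purely illustrative sketch (not a method taken from the paper), the Python snippet below shows one common decision-level ("late") fusion scheme: each channel (speech, facial expression, gesture) is assumed to have already produced a probability distribution over the same set of user intents, and the per-channel distributions are combined by a weighted sum. The channel names, intent labels, and weights are all hypothetical.

# Illustrative sketch only: decision-level (late) fusion of per-channel
# intent scores. Assumes each channel outputs a probability distribution
# over the same intent labels; weights and labels are hypothetical.
from typing import Dict

def late_fusion(channel_scores: Dict[str, Dict[str, float]],
                channel_weights: Dict[str, float]) -> Dict[str, float]:
    """Combine per-channel intent distributions by a weighted sum, then renormalize."""
    fused: Dict[str, float] = {}
    for channel, scores in channel_scores.items():
        w = channel_weights.get(channel, 0.0)
        for intent, p in scores.items():
            fused[intent] = fused.get(intent, 0.0) + w * p
    total = sum(fused.values()) or 1.0
    return {intent: p / total for intent, p in fused.items()}

if __name__ == "__main__":
    scores = {
        "speech":  {"confirm": 0.7, "reject": 0.3},
        "facial":  {"confirm": 0.4, "reject": 0.6},
        "gesture": {"confirm": 0.8, "reject": 0.2},
    }
    weights = {"speech": 0.5, "facial": 0.2, "gesture": 0.3}
    print(late_fusion(scores, weights))  # approximately {'confirm': 0.67, 'reject': 0.33}

In this toy setup the spoken channel carries the largest weight, so its intent estimate dominates the fused decision; feature-level (early) fusion and model-level fusion, also covered by review literature on this topic, would instead combine the channels before or inside the classifier rather than at the score level.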