Authors: Chen Yiqiang [1]; Gao Wen [1]; Liu Junfa [1]; Yang Changshui [1]
Source: Chinese Journal of Computers, 2006, No. 5, pp. 822-827 (6 pages)
Funding: Jointly supported by the National Natural Science Foundation of China (60303018, 60403037), the Beijing Nova Program (2005B54), and the Open Project Fund of the Multimedia and Intelligent Software Technology Laboratory, Beijing University of Technology.
Abstract: This paper proposes a multi-modal behavior synchronizing prosody model and applies it to Chinese sign language synthesis. Using a large corpus of real multi-modal behavior data, the authors learn a prosody model for each single-channel behavior (sign language, speech, facial expression, lip movement, and so on) together with a model of the synchronization relations among the channels, and present a framework for multi-modal synchronization in virtual human synthesis. A formal description of the multi-modal prosody model is given in detail. Compared with traditional rule-based approaches, this learning-based approach better captures the complexity of cross-modal synchronization and synthesizes the virtual human's multi-modal behavior more realistically. As an example, a synchronizing prosody model that fuses sign-language prosody parameters with speech prosody features is presented, together with a method for computing the prosody parameters, and it is applied to the synchronized control of multi-modal behavior. Experiments on the Coss corpus (the "863" speech material library) and a Chinese sign language library show that the model works well, raising the recognition rate of synthesized sign language by 5.94%.
Classification: TP391 [Automation and Computer Technology: Computer Application Technology]
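
Note: the abstract describes learning per-channel prosody models and a cross-channel synchronization model, then using fused prosody parameters to drive the virtual human. The Python sketch below is purely illustrative and is not the authors' implementation; the Gesture type, the 1:1 pairing of gestures with speech units, and all durations are hypothetical assumptions. It shows, in the simplest possible form, what it means to synchronize sign-language prosody parameters to speech prosody features: each gesture's duration is warped to match its paired speech unit.

    # Minimal illustrative sketch (hypothetical; not the paper's method).
    # It time-warps sign-language gesture durations to match speech unit
    # durations, a toy stand-in for the learned synchronization model.
    from dataclasses import dataclass

    @dataclass
    class Gesture:
        label: str          # sign-language gloss (hypothetical)
        duration_ms: float  # duration from a sign-language prosody model

    def synchronize(gestures: list[Gesture],
                    speech_durations_ms: list[float]) -> list[Gesture]:
        """Rescale each gesture so its duration matches the paired speech unit.

        The paper learns this mapping from data; a direct 1:1 warp is used
        here only to make the idea of synchronized prosody concrete.
        """
        if len(gestures) != len(speech_durations_ms):
            raise ValueError("expected one speech unit per gesture")
        return [Gesture(g.label, ms)
                for g, ms in zip(gestures, speech_durations_ms)]

    if __name__ == "__main__":
        gestures = [Gesture("HELLO", 420.0), Gesture("WORLD", 510.0)]
        speech = [380.0, 560.0]  # hypothetical syllable durations
        for g in synchronize(gestures, speech):
            print(f"{g.label}: {g.duration_ms:.0f} ms")

In the paper's setting, the warp would instead be predicted by a model trained on real multi-modal recordings, so that timing, not just duration, reflects the learned cross-channel correlations.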