Authors: Zhang Ruisi; Pan Ye
Affiliation: School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
Source: Journal of Computer-Aided Design & Computer Graphics, 2022, No. 5, pp. 675-682 (8 pages)
Funding: National Natural Science Foundation of China Young Scientists Fund (62102255); National Key R&D Program of China (2019YFC1521104); Major Project of the National Social Science Fund of China (18ZD22); Shanghai Sailing Program of the Shanghai Science and Technology Commission (20YF1421200).
Abstract: Animating 3D character rigs from human faces requires both geometric features and facial-expression information. Traditional capture approaches such as ARKit track expressions from facial geometry alone, so the character's expression changes are hard for the audience to perceive. Recent emotion-based motion-capture techniques such as ExprGen instead drive capture from facial emotion, but they struggle to reproduce fine facial detail. This paper proposes a method that combines facial geometry with expression information to control animated characters. First, a neural network is trained to recognize human and character expressions and to match images across the human and character datasets. Then an end-to-end network is trained to extract the character's expression and produce its rig (bone) parameters. Finally, facial geometry is used to refine the rig parameters at key facial landmarks. Qualitative analysis of the character expressions generated from different face inputs, and quantitative analysis of the attractiveness and intensity of the character motion driven by videos of four actors, demonstrate the method's accuracy and real-time performance.
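The final refinement step described in the abstract (correcting emotion-driven rig parameters with facial geometry at key landmarks) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's actual method: the function name `refine_rig_parameters`, the weighted-blend correction rule, and the `alpha` weight are all hypothetical, since the abstract does not specify how the geometric correction is computed.

```python
import numpy as np

def refine_rig_parameters(emotion_rig, geometry_rig, key_indices, alpha=0.6):
    """Blend emotion-network rig parameters with geometry-derived values.

    emotion_rig:  rig (bone) parameters predicted by the end-to-end
                  emotion network, one value per rig control.
    geometry_rig: rig parameters estimated from facial-geometry landmarks.
    key_indices:  indices of rig controls tied to key facial landmarks
                  (e.g. mouth corners, eyelids) that get corrected.
    alpha:        blend weight pulling key controls toward geometry.
    """
    refined = np.asarray(emotion_rig, dtype=float).copy()
    geometry_rig = np.asarray(geometry_rig, dtype=float)
    for i in key_indices:
        # Weighted blend: keep the expression semantics from the emotion
        # network, but pull landmark-critical controls toward geometry.
        refined[i] = alpha * geometry_rig[i] + (1.0 - alpha) * refined[i]
    return refined
```

With `alpha=0.5`, a control listed in `key_indices` lands halfway between the two estimates, while all other controls keep their emotion-network values unchanged.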
CLC Number: TP391.41 (Automation and Computer Technology — Computer Application Technology)