Authors: GUO Yingchun; YAN Shuaishuai; LIU Yi (School of Artificial Intelligence, Hebei University of Technology, Tianjin 300401, China)
Affiliation: [1] School of Artificial Intelligence and Data Science, Hebei University of Technology, Tianjin 300401, China
Source: Journal of Hebei University of Technology, 2021, No. 1, pp. 20-27 (8 pages)
Funding: Hebei Province Postgraduate Innovation Funding Project (CXZZSS2018031).
Abstract: Aiming at the problems of facial expression distortion and noticeable brightness differences between frames in current expression generation networks, a facial expression generation framework based on a recursive dual adversarial network model is proposed. First, deep facial features are extracted to construct expression feature maps, which serve as supervisory signals to generate seed images of facial expressions. Then, the generated seed image and the original target face are used together as input to generate a feature-preserving image, which is output as the current frame and is also fed back as input for generating the next frame's seed image. Finally, the seed-image generation network and the feature-preserving image generation network are applied recursively to generate subsequent frames, yielding a feature-preserving facial expression video sequence whose expressions are consistent with the original input. Experimental results on the CK+ and MMI databases show that the proposed method can generate clear and natural facial expression video frames, and that it is robust when there is a large shape difference between the target face and the driving expression feature image.
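The recursive two-stage pipeline described in the abstract can be sketched as follows. This is a minimal illustration only: the two generator functions are hypothetical stand-ins (simple pixel averaging over flat lists), not the paper's actual adversarial networks, and a "face" here is just a list of pixel intensities.

```python
# Sketch of the recursive dual-network generation loop from the abstract.
# Both generators below are hypothetical stand-ins, not the paper's models.

def seed_generator(feature_map, prev_frame):
    # Stand-in for the seed-image network: produces a seed image
    # conditioned on the expression feature map and the previous frame.
    return [(f + p) / 2 for f, p in zip(feature_map, prev_frame)]

def feature_preserving_generator(seed, target_face):
    # Stand-in for the feature-preserving network: pulls the seed image
    # back toward the identity of the original target face.
    return [(s + t) / 2 for s, t in zip(seed, target_face)]

def synthesize_sequence(target_face, feature_maps):
    """Generate one output frame per expression feature map, recursively."""
    frames = []
    prev = target_face
    for fmap in feature_maps:
        seed = seed_generator(fmap, prev)                        # stage 1
        frame = feature_preserving_generator(seed, target_face)  # stage 2
        frames.append(frame)
        prev = frame  # recursion: current output drives the next seed
    return frames

# Toy example: a 3-pixel "face" driven by two expression feature maps.
face = [0.0, 0.5, 1.0]
maps = [[1.0, 1.0, 1.0], [0.0, 0.0, 0.0]]
video = synthesize_sequence(face, maps)
```

The key structural point is the feedback in the loop: each feature-preserving frame becomes the previous-frame input of the next seed generation, which is what lets the sequence stay temporally coherent.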
Classification: TP391 [Automation and Computer Technology: Computer Application Technology]