Authors: 董兰芳 [1], 王建富 [1], 夏泽举 [1], 倪奎 [1], 王亚涛 [1], 吴献 [1], 覃景繁 [2]
Affiliations: [1] Visual Computing and Visualization Laboratory, University of Science and Technology of China, Hefei 230027; [2] Huawei Technologies Co., Ltd., Shenzhen, Guangdong 518129
Source: Journal of Chinese Computer Systems (《小型微型计算机系统》), 2015, No. 12, pp. 2754-2759 (6 pages)
Funding: Supported by the Youth Innovation Fund of the University of Science and Technology of China (WK0110000013) and a cooperative project with Huawei Technologies Co., Ltd. (ES2100110024)
Abstract: Image-based face morphing is widely used in computer animation. However, across different face images, complex physiological features and large color differences degrade the quality of the morphing animation and limit the practicality of related algorithms. To achieve more natural and fluent morphing between different face images, this paper proposes a new scheme. First, the input face images are preprocessed: they are re-sorted by the size of the detected face regions, and hue processing reduces or eliminates the color differences among the images. Next, the facial feature points are located, and the number of in-between frames is determined from the degree of difference between the feature points of adjacent images. Finally, the in-between frames are generated by an image-warping technique based on feature points and scan lines, yielding the face morphing animation. The method successfully morphs between multiple face images, and experimental results show that the generated animations are realistic, natural, and fluent.
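The pipeline described in the abstract can be sketched in code. The snippet below is a minimal illustration, not the paper's actual implementation: the frame-count heuristic (`frames_per_unit`, `min_frames`) is a hypothetical stand-in for the paper's unstated formula, and a simple cross-dissolve replaces the feature-point/scan-line warping.

```python
import numpy as np

def frame_count(src_pts, dst_pts, frames_per_unit=0.5, min_frames=5):
    """Choose the number of in-between frames in proportion to the
    mean feature-point displacement between two adjacent face images.
    (Hypothetical heuristic; the abstract only says the count depends
    on the difference of the feature points.)"""
    mean_disp = np.linalg.norm(dst_pts - src_pts, axis=1).mean()
    return max(min_frames, int(round(mean_disp * frames_per_unit)))

def morph_sequence(img_a, img_b, n_frames):
    """Generate in-between frames by linear cross-dissolve, a simple
    stand-in for the paper's feature-point and scan-line warping."""
    frames = []
    for i in range(1, n_frames + 1):
        t = i / (n_frames + 1)  # interpolation weight for frame i
        frames.append(((1 - t) * img_a + t * img_b).astype(img_a.dtype))
    return frames
```

With the sorted, hue-adjusted images, one would call `frame_count` on each adjacent pair of feature-point sets, then generate that many in-between frames per pair and concatenate them into the final animation.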
Classification: TP391 [Automation and Computer Technology — Computer Application Technology]