Multi-style video stylization based on texture advection (cited by 1)

Authors: TANG Ying, ZHANG Yan, SHI XiaoYing, FAN Jing

Affiliations: [1] School of Computer Science and Technology, Zhejiang University of Technology; [2] Key Laboratory of Visual Media Intelligent Process Technology of Zhejiang Province; [3] School of Software Engineering, Hangzhou Dianzi University

Source: Science China (Information Sciences), 2015, Issue 11, pp. 86-98 (13 pages)

Funding: supported by the National Natural Science Foundation of China (Grant Nos. 61003265, 61173097) and the Zhejiang Provincial Natural Science Foundation of China (Grant No. LY14F020021)

Abstract: Artistic video stylization, which is widely used in multimedia entertainment, transforms a given video into different artistic styles. Most existing video stylization algorithms can simulate only a single style or a limited set of styles. Although some algorithms achieve multi-style video processing, they are complex and difficult to implement. To solve this problem, we propose a multi-style video stylization algorithm based on texture advection, in which different artistic styles are synthesized and transferred from user-specified texture samples of the desired styles. We use direction-field-guided texture synthesis to compute the texture layer that represents the artistic style. Painterly directional video styles are effectively simulated by orientation changes in the synthesized anisotropic textures. Locally distorted regions of the texture layer appear during texture advection under the optical flow field. To address this issue, we propose a texture inpainting step that re-synthesizes the limited distorted regions and keeps the stylized video temporally coherent. We also accelerate the video stylization with the CUDA parallel computing framework, which computes the morphological operations used for video abstraction in parallel. Finally, we produce stylized videos in multiple artistic styles with satisfactory experimental results, including oil painting, watercolor painting, and stylized line drawing.
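The most concrete implementation detail in the abstract is the acceleration step: morphological operations for video abstraction are computed in parallel under the CUDA framework. This page carries no code, so the block below is only a minimal sketch of what such a pass could look like, assuming a grayscale frame stored as a flat byte array; the kernel name dilate3x3, the frame size, and the launch configuration are illustrative assumptions, not taken from the paper.

    // Minimal sketch (not the authors' implementation): one CUDA thread per
    // pixel computes a 3x3 grayscale dilation, the kind of morphological pass
    // the abstract says is parallelized for video abstraction.
    #include <cuda_runtime.h>
    #include <stdio.h>

    __global__ void dilate3x3(const unsigned char* in, unsigned char* out,
                              int width, int height)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= width || y >= height) return;

        unsigned char best = 0;
        // Take the maximum over the 3x3 neighborhood, clamping at the borders.
        for (int dy = -1; dy <= 1; ++dy) {
            for (int dx = -1; dx <= 1; ++dx) {
                int nx = min(max(x + dx, 0), width - 1);
                int ny = min(max(y + dy, 0), height - 1);
                unsigned char v = in[ny * width + nx];
                if (v > best) best = v;
            }
        }
        out[y * width + x] = best;
    }

    int main()
    {
        const int width = 640, height = 360;   // assumed frame size
        size_t bytes = (size_t)width * height;

        unsigned char *d_in, *d_out;
        cudaMalloc((void**)&d_in, bytes);
        cudaMalloc((void**)&d_out, bytes);
        cudaMemset(d_in, 0, bytes);             // stand-in for a real video frame

        dim3 block(16, 16);
        dim3 grid((width + block.x - 1) / block.x,
                  (height + block.y - 1) / block.y);
        dilate3x3<<<grid, block>>>(d_in, d_out, width, height);
        cudaDeviceSynchronize();

        cudaFree(d_in);
        cudaFree(d_out);
        printf("dilation pass finished\n");
        return 0;
    }

Erosion is the symmetric case (take the neighborhood minimum instead of the maximum). One thread per output pixel with clamped borders keeps the kernel branch-light and maps directly onto the per-pixel independence of morphological filters, which is what makes this stage a natural candidate for GPU parallelization.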

Keywords: non-photorealistic rendering; video stylization; texture synthesis; direction field; CUDA

Classification: TP391.41 [Automation and Computer Technology / Computer Application Technology]; TP317.4 [Automation and Computer Technology / Computer Science and Technology]

 
