AOGAN: A generative adversarial network for screen space ambient occlusion  (Cited by: 2)


Authors: Lei Ren, Ying Song

Affiliations: [1] Zhejiang Sci-Tech University, Hangzhou 310018, China; [2] 2011 Collaborative Innovation Center for Garment Personal Customization of Zhejiang Province, China; [3] Key Lab of Silk and Culture Heritage and Products Design Digital Technology, Ministry of Culture and Tourism, China

Source: Computational Visual Media, 2022, Issue 3, pp. 483-494 (12 pages)

Funding: National Natural Science Foundation of China (No. 61602416); Shaoxing Science and Technology Bureau Key Project (No. 2020B41006); Opening Fund (No. 2020WLB10) of the Key Laboratory of Silk Culture Heritage and Product Design Digital Technology.

Abstract: Ambient occlusion (AO) is a widely used real-time rendering technique that estimates light intensity on visible scene surfaces. Recently, a number of learning-based AO approaches have been proposed, offering a new angle on screen-space shading through a unified learning framework with competitive quality and speed. However, most such methods show high error on complex scenes or tend to ignore details. We propose an end-to-end generative adversarial network for producing realistic AO, and explore how a perceptual loss in the generative model affects AO accuracy. We also describe an attention mechanism that improves the accuracy of details; its effectiveness is demonstrated on a wide variety of scenes.
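The abstract does not give implementation details, so the following is only a minimal PyTorch sketch of the kind of pipeline it describes: an encoder-decoder generator over screen-space G-buffer inputs with a channel-attention block, a patch-style discriminator, and a combined adversarial + L1 + perceptual (VGG-feature) loss. All layer sizes, channel counts, loss weights, the choice of VGG-16 features, and the squeeze-and-excitation style of attention are illustrative assumptions, not the authors' actual architecture.

```python
# Hedged sketch only: assumed depth+normal input (4 channels), assumed layer
# sizes and loss weights. Not the AOGAN architecture from the paper.
import torch
import torch.nn as nn
import torchvision

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style attention: reweight feature channels."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1), nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1), nn.Sigmoid())
    def forward(self, x):
        return x * self.fc(x)

class Generator(nn.Module):
    """Encoder-decoder mapping G-buffer inputs (depth + normals) to an AO map."""
    def __init__(self, in_ch=4):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            ChannelAttention(64))
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1), nn.Sigmoid())
    def forward(self, x):
        return self.dec(self.enc(x))

class Discriminator(nn.Module):
    """PatchGAN-style discriminator over (G-buffer, AO) pairs."""
    def __init__(self, in_ch=5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(128, 1, 4, padding=1))  # per-patch real/fake logits
    def forward(self, x, ao):
        return self.net(torch.cat([x, ao], dim=1))

class PerceptualLoss(nn.Module):
    """L1 distance between frozen VGG-16 feature maps of predicted and reference AO."""
    def __init__(self):
        super().__init__()
        vgg = torchvision.models.vgg16(weights=torchvision.models.VGG16_Weights.DEFAULT)
        self.features = vgg.features[:16].eval()   # downloads pretrained weights
        for p in self.features.parameters():
            p.requires_grad = False
    def forward(self, pred, target):
        # AO maps are single-channel; repeat to 3 channels for VGG.
        return nn.functional.l1_loss(self.features(pred.repeat(1, 3, 1, 1)),
                                     self.features(target.repeat(1, 3, 1, 1)))

# One illustrative generator update combining adversarial, L1, and perceptual terms.
G, D, percep = Generator(), Discriminator(), PerceptualLoss()
bce = nn.BCEWithLogitsLoss()
x = torch.rand(2, 4, 64, 64)        # dummy depth + normal input
ao_gt = torch.rand(2, 1, 64, 64)    # dummy ground-truth AO
ao_pred = G(x)
logits = D(x, ao_pred)
loss_g = bce(logits, torch.ones_like(logits)) \
       + 10.0 * nn.functional.l1_loss(ao_pred, ao_gt) \
       + 1.0 * percep(ao_pred, ao_gt)
loss_g.backward()
```

The relative weighting of the three loss terms shown above is an arbitrary placeholder; the paper's contribution is precisely in studying how the perceptual term and the attention block affect AO accuracy, so those settings would need to follow the published results rather than this sketch.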

Keywords: ambient occlusion (AO); attention mechanism; generative adversarial network (GAN); perceptual loss

Classification: TP183 (Automation and Computer Technology: Control Theory and Control Engineering); TP391.41 (Automation and Computer Technology: Control Science and Engineering)

 
