Variational inference in neural functional prior using normalizing flows: application to differential equation and operator learning problems  


Author: Xuhui MENG

Affiliation: [1] Institute of Interdisciplinary Research for Mathematics and Applied Science, School of Mathematics and Statistics, Huazhong University of Science and Technology, Wuhan 430074, China

Source: Applied Mathematics and Mechanics (English Edition), 2023, No. 7, pp. 1111-1124 (14 pages)

Funding: Project supported by the National Natural Science Foundation of China (No. 12201229).

Abstract: Physics-informed deep learning has recently emerged as an effective tool for leveraging both observational data and available physical laws. Physics-informed neural networks (PINNs) and deep operator networks (DeepONets) are two such models. The former encodes the physical laws via automatic differentiation, while the latter learns the hidden physics from data. Generally, the noisy and limited observational data as well as the over-parameterization of neural networks (NNs) result in uncertainty in the predictions of deep learning models. In the paper "MENG, X., YANG, L., MAO, Z., FERRANDIS, J. D., and KARNIADAKIS, G. E. Learning functional priors and posteriors from data and physics. Journal of Computational Physics, 457, 111073 (2022)", a Bayesian framework based on generative adversarial networks (GANs) was proposed as a unified model to quantify uncertainties in the predictions of both PINNs and DeepONets. Specifically, the approach proposed in that work has two stages: (i) prior learning, and (ii) posterior estimation. At the first stage, the GANs are used to learn a functional prior, either from a prescribed function distribution, e.g., a Gaussian process, or from historical data and available physics. At the second stage, the Hamiltonian Monte Carlo (HMC) method is used to estimate the posterior in the latent space of the GANs. However, the vanilla HMC does not support mini-batch training, which limits its applicability to problems with big data. In the present work, we propose to use normalizing flow (NF) models in the context of variational inference (VI), which naturally enables mini-batch training, as an alternative to HMC for posterior estimation in the latent space of the GANs. A series of numerical experiments, including a nonlinear differential equation problem and a 100-dimensional (100D) Darcy problem, is conducted to demonstrate the performance of the NFs with full-/mini-batch training.
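The core idea described in the abstract, estimating a posterior over a latent variable with a normalizing flow trained by variational inference, where mini-batching rescales the likelihood term, can be illustrated with a minimal sketch. This is not the paper's implementation: it uses a single planar-flow layer, a toy Gaussian "decoder" likelihood in place of the GAN generator, and hypothetical names (`planar_flow`, `log_target`, `elbo_estimate`) throughout.

```python
import numpy as np

rng = np.random.default_rng(0)

def planar_flow(z, u, w, b):
    """One planar-flow layer: z' = z + u * tanh(z @ w + b).

    Returns the transformed samples and the log|det Jacobian| needed
    for the change-of-variables formula. z has shape (n, d).
    """
    a = np.tanh(z @ w + b)                    # (n,)
    z_new = z + np.outer(a, u)                # (n, d)
    psi = (1.0 - a**2)[:, None] * w           # (n, d), tanh' * w
    log_det = np.log(np.abs(1.0 + psi @ u))   # (n,)
    return z_new, log_det

def log_target(z, x_batch, n_total):
    """Toy unnormalized log-posterior over the latent z.

    Standard-normal prior plus a mini-batch likelihood rescaled by
    n_total / batch_size -- the rescaling is what makes mini-batch
    training a valid stochastic estimate of the full-data objective.
    The 'decoder' here (data mean = first latent coordinate) is a
    stand-in for the pre-trained GAN generator in the paper.
    """
    log_prior = -0.5 * np.sum(z**2, axis=1)
    resid = x_batch[None, :] - z[:, :1]       # (n, m) residuals
    log_lik = -0.5 * np.sum(resid**2, axis=1)
    return log_prior + (n_total / len(x_batch)) * log_lik

def elbo_estimate(params, x_batch, n_total, n_samples=256):
    """Monte Carlo ELBO: E_q[log p(z, x) - log q(z)]."""
    u, w, b = params
    z0 = rng.standard_normal((n_samples, len(u)))
    log_q0 = -0.5 * np.sum(z0**2, axis=1)     # base density (up to const)
    zk, log_det = planar_flow(z0, u, w, b)
    log_qk = log_q0 - log_det                 # change of variables
    return np.mean(log_target(zk, x_batch, n_total) - log_qk)

# A VI training loop would draw a fresh mini-batch each step and ascend
# this ELBO in (u, w, b), e.g. with Adam; gradients are omitted here.
u = np.array([0.5, -0.3])
w = np.array([1.0, 0.2])
b = 0.1
x = rng.standard_normal(100)
elbo = elbo_estimate((u, w, b), x[:10], n_total=100)
```

Because `w @ u = 0.44 > -1`, the planar layer is invertible and the log-determinant is well defined for every sample, which is the condition a practical flow parameterization must enforce.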

Keywords: uncertainty quantification (UQ); physics-informed neural network (PINN)

Classification codes: TP18 [Automation and Computer Technology - Control Theory and Control Engineering]; O175 [Automation and Computer Technology - Control Science and Engineering]

 
