Authors: Alice E. A. Allen, Nicholas Lubbers, Sakib Matin, Justin Smith, Richard Messerly, Sergei Tretiak, Kipton Barros
Affiliations: [1] Center for Nonlinear Studies, Los Alamos National Laboratory, Los Alamos, NM 87545, USA; [2] Theoretical Division, Los Alamos National Laboratory, Los Alamos, NM 87545, USA; [3] Computer, Computational, and Statistical Sciences Division, Los Alamos National Laboratory, Los Alamos, NM 87545, USA; [4] Nvidia Corporation, Santa Clara, CA 9505, USA; [5] Center for Integrated Nanotechnologies, Los Alamos National Laboratory, Los Alamos, NM 87545, USA
Source: npj Computational Materials, 2024, No. 1, pp. 1654-1662, 9 pages (Chinese title: 计算材料学(英文))
Funding: Supported by the United States Department of Energy (US DOE), Office of Science, Basic Energy Sciences, Chemical Sciences, Geosciences, and Biosciences Division under Triad National Security, LLC ('Triad') contract grant no. 89233218CNA000001 (FWP: LANLE3F2). A. E. A. Allen and S. Matin also acknowledge the Center for Nonlinear Studies. Computer time was provided by the CCS-7 Darwin cluster at LANL. LAUR-23-27568.
Abstract: The development of machine learning models has led to an abundance of datasets containing quantum mechanical (QM) calculations for molecular and material systems. However, traditional training methods for machine learning models are unable to leverage the plethora of data available, as they require that each dataset be generated using the same QM method. Taking machine learning interatomic potentials (MLIPs) as an example, we show that meta-learning techniques, a recent advancement from the machine learning community, can be used to fit multiple levels of QM theory in the same training process. Meta-learning changes the training procedure to learn a representation that can be easily re-trained to new tasks with small amounts of data. We then demonstrate that meta-learning enables simultaneously training to multiple large organic molecule datasets. As a proof of concept, we examine the performance of an MLIP refit to a small drug-like molecule and show that pre-training potentials to multiple levels of theory with meta-learning improves performance. This difference in performance can be seen both in the reduced error and in the improved smoothness of the potential energy surface produced. We therefore show that meta-learning can utilize existing datasets with inconsistent QM levels of theory to produce models that are better at specializing to new datasets. This opens new routes for creating pre-trained, foundation models for interatomic potentials.
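The workflow the abstract describes (pre-train one model across several datasets, each labelled at a different QM level of theory, then refit it to a small new dataset) can be illustrated with a minimal Reptile-style meta-learning sketch. This is an assumption-laden illustration, not the authors' implementation: the toy regression model, synthetic "levels of theory", and all hyperparameters below are placeholders chosen for brevity.

```python
# Minimal Reptile-style meta-learning sketch (illustrative only, not the paper's code).
# Each "task" stands in for a dataset computed at a different QM level of theory.
import copy
import torch
from torch import nn

torch.manual_seed(0)

def make_model():
    # Toy surrogate "potential": maps a 16-dim descriptor to a scalar energy.
    return nn.Sequential(nn.Linear(16, 64), nn.Tanh(), nn.Linear(64, 1))

def sample_task_batch(task_id, n=64):
    # Synthetic tasks: each level of theory rescales and shifts the same
    # underlying function, mimicking datasets whose energy labels disagree.
    x = torch.randn(n, 16)
    base = torch.sin(x.sum(dim=1, keepdim=True))
    return x, (1.0 + 0.1 * task_id) * base + 0.05 * task_id

meta_model = make_model()
loss_fn = nn.MSELoss()
meta_lr, inner_lr, inner_steps = 0.1, 1e-2, 5

for meta_step in range(200):
    task_id = meta_step % 3                     # cycle over three "levels of theory"
    learner = copy.deepcopy(meta_model)
    opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)
    for _ in range(inner_steps):                # inner loop: adapt to one task
        x, y = sample_task_batch(task_id)
        opt.zero_grad()
        loss_fn(learner(x), y).backward()
        opt.step()
    # Reptile outer update: move meta-parameters toward the adapted parameters.
    with torch.no_grad():
        for p_meta, p_task in zip(meta_model.parameters(), learner.parameters()):
            p_meta += meta_lr * (p_task - p_meta)

# Fine-tune the meta-initialisation on a small "new" dataset, analogous to
# refitting the pre-trained MLIP to a small drug-like molecule.
finetuned = copy.deepcopy(meta_model)
opt = torch.optim.SGD(finetuned.parameters(), lr=inner_lr)
x_small, y_small = sample_task_batch(task_id=2, n=16)
for _ in range(50):
    opt.zero_grad()
    loss_fn(finetuned(x_small), y_small).backward()
    opt.step()
```

After meta-training, the shared parameters act as an initialisation that adapts to a new level of theory from only a few samples, which is the behaviour the abstract reports when the potential is refit to a small drug-like molecule dataset.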
Keywords: LEARNING; utilize; SMOOTHNESS