Efficient Top-Down Parameterization of Machine Learning-Based Models
Description
Molecular modeling has become a cornerstone of many disciplines, including materials science. However, the quality of predictions depends critically on the employed potential energy model. Neural network (NN) potentials are a class of models that have seen tremendous success in recent years due to their flexibility and capacity to learn many-body interactions. Traditionally, these models are trained bottom-up on quantum mechanical data. Top-down approaches that learn NN potentials directly from experimental data have received less attention, as they typically face numerical and computational challenges when backpropagating through molecular dynamics (MD) simulations. We recently developed the Differentiable Trajectory Reweighting (DiffTRe) method, which bypasses differentiation through the MD simulation for time-independent observables. By leveraging thermodynamic perturbation theory, we avoid exploding gradients and achieve a speed-up of about two orders of magnitude in gradient computation for top-down learning. We showcase the effectiveness of DiffTRe on atomistic and coarse-grained models using diverse experimental observables, including thermodynamic, structural, and mechanical properties. Our approach opens the way to high-fidelity molecular models, particularly when bottom-up data is unavailable or insufficiently accurate.
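The reweighting idea can be sketched in a few lines of JAX. The snippet below is a minimal illustration under stated assumptions, not the authors' DiffTRe implementation: the toy `potential`, the toy `observable`, and the pre-sampled reference `states` are all hypothetical stand-ins. It shows how an ensemble average under new parameters is estimated by reweighting snapshots from a reference trajectory, so gradients flow through the perturbation weights rather than through the MD integrator.

```python
import jax
import jax.numpy as jnp

def potential(params, positions):
    # Toy stand-in for a NN potential U_theta(s); 'params' is just a
    # scalar stiffness here (an assumption for illustration only).
    return params * jnp.sum(positions ** 2)

def observable(positions):
    # Toy time-independent observable of a single snapshot.
    return jnp.mean(positions ** 2)

def reweighted_average(params, ref_params, states, beta):
    # Energies of the reference snapshots under new and reference parameters.
    u_new = jax.vmap(lambda s: potential(params, s))(states)
    u_ref = jax.vmap(lambda s: potential(ref_params, s))(states)
    # Thermodynamic perturbation weights w_i ~ exp(-beta * (U_new - U_ref)),
    # normalized stably via softmax (log-sum-exp).
    weights = jax.nn.softmax(-beta * (u_new - u_ref))
    obs = jax.vmap(observable)(states)
    return jnp.sum(weights * obs)

def loss(params, ref_params, states, beta, target):
    # Match the reweighted ensemble average to an experimental target value.
    return (reweighted_average(params, ref_params, states, beta) - target) ** 2

# Gradients differentiate the reweighting, not the MD trajectory generation.
key = jax.random.PRNGKey(0)
states = jax.random.normal(key, (100, 32, 3))  # 100 snapshots, 32 particles, 3D
grad = jax.grad(loss)(1.0, 1.0, states, beta=1.0, target=0.5)
```

Because the loss depends on the parameters only through the per-snapshot energies, no gradients need to be propagated through the integration steps of the simulation, which is where the speed-up mentioned in the abstract comes from.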
Time
Tuesday, June 28, 11:00 - 11:30 CEST
Location
Nairobi Room
Event Type
Minisymposium
Chemistry and Materials
Physics