
mbGDML

Create, use, and analyze machine learning potentials within the many-body expansion framework.


Motivation · Approach · Features · Installation · License

Motivation

Machine learning potentials (i.e., force fields) often rely on local descriptors for size transferability. These descriptors partition total properties into atomic contributions; however, they inherently neglect complicated long-range interactions by enforcing atomic radial cutoffs. Global descriptors encode the entire structure with no cutoffs and can capture interactions at all scales. However, they are restricted to systems with the same number of atoms.

Gradient-domain machine learning (GDML) is one example of an ML potential with a global descriptor. GDML is unique because it trains directly on forces and recovers the total energy through analytical integration. This provides substantially more information about the potential energy surface (PES) and allows for better interpolation between training points. As a result, GDML typically needs only about 1,000 structures to accurately learn energies and forces.

To date, GDML has been limited to the exact system it was trained on. This makes simulations on arbitrarily sized systems, such as bulk solvents, impractical.

Approach

Many-body expansions (MBEs) rigorously decompose total (i.e., supersystem) energies into fundamental n-body interactions. The expansion is formally exact when all N-body interactions are accounted for. In practice, however, it is typically truncated at third order. One can then model any system by summing up 1-, 2-, and 3-body contributions.
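
For reference, the third-order truncation can be written compactly as follows (standard MBE notation, not specific to this package; E_I is the energy of monomer I and ΔE denotes the interaction corrections):

E_total ≈ Σ_I E_I + Σ_{I<J} ΔE_IJ + Σ_{I<J<K} ΔE_IJK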

Driving MBEs with GDML potentials trained on n-body interactions is a promising route to size-transferable potentials. Furthermore, the remarkable data efficiency of GDML models enables training on data from highly accurate quantum chemical methods.

Features

Train

  • Train GDML models using grid searches, Bayesian optimization, or both on CPUs.
  • Custom loss functions.
  • Iterative training procedure for automated curation of optimal training sets.
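
As a rough sketch of how these training features could be combined: the module, class, and method names below (mbgdml.data.DataSet, mbgdml.train.mbGDMLTrain, grid_search) are assumptions for illustration only, not a verified API; consult the package documentation for the actual interface.

# Hypothetical sketch: names below are illustrative, not the verified mbGDML API.
from mbgdml.data import DataSet        # assumed dataset loader
from mbgdml.train import mbGDMLTrain   # assumed trainer class

# Load a curated n-body data set (e.g., water dimers for a 2-body model).
dset = DataSet("2body-water.npz")

# Grid search over the kernel length scale (sigma) on CPUs.
trainer = mbGDMLTrain()
model = trainer.grid_search(dset, n_train=1000, sigmas=range(50, 400, 50))
model.save("2body-water.model.npz")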

Predict

  • Many-body predictions with GDML, SchNet, and GAP potentials (see the sketch after this list).
  • Parallel GDML predictions with Ray, scaling from a laptop to multiple nodes.
  • Periodic structures with the minimum-image convention.
  • Alchemical predictions by tuning out 2- or 3-body contributions of specific entities.
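
To make the many-body bookkeeping concrete, here is a minimal, self-contained numpy illustration of how 1- and 2-body contributions are assembled into a supersystem energy. It is not the mbGDML API: the inverse-distance function is only a stand-in for a trained GDML, SchNet, or GAP model, and 3-body terms follow the same supersystem-minus-lower-order pattern.

# Conceptual illustration of a many-body expansion; not the mbGDML API.
from itertools import combinations
import numpy as np

def model_energy(R):
    # Placeholder "potential": sum of inverse pairwise distances. In practice this
    # would be a trained GDML, SchNet, or GAP predictor.
    return sum(1.0 / np.linalg.norm(R[i] - R[j]) for i, j in combinations(range(len(R)), 2))

def mbe_energy(monomers):
    # 1-body: energies of the isolated monomers.
    E1 = sum(model_energy(R) for R in monomers)
    # 2-body: dimer energy minus the two monomer energies.
    E2 = 0.0
    for a, b in combinations(range(len(monomers)), 2):
        dimer = np.vstack((monomers[a], monomers[b]))
        E2 += model_energy(dimer) - model_energy(monomers[a]) - model_energy(monomers[b])
    return E1 + E2  # 3-body terms would be added analogously.

# Three toy 3-atom fragments, shifted apart so they do not overlap.
monomers = [np.random.rand(3, 3) + 3.0 * k for k in range(3)]
print(mbe_energy(monomers))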

Analysis

  • Prediction sets that store decomposed predictions for further analysis.
  • Radial distribution functions.
  • Cluster and identify problematic (i.e., high-error) structures using sklearn (see the sketch after this list).
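
One way to group high-error structures with sklearn is sketched below; the descriptors, errors, and threshold are placeholders for this example, and in practice they would come from a prediction set.

# Illustration only: placeholder arrays stand in for real descriptors and errors.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
descriptors = rng.random((500, 10))   # per-structure structural features (placeholder)
abs_errors = rng.random(500)          # per-structure |prediction - reference| (placeholder)

# Keep the worst 10% of structures, then cluster them to look for common motifs.
mask = abs_errors > np.percentile(abs_errors, 90)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(descriptors[mask])
print(np.bincount(labels))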

Interfaces

Installation

You can install mbGDML from PyPI with pip install mbGDML. Alternatively, the latest development version can be installed directly from the GitHub repository or from TestPyPI.

git clone https://github.com/keithgroup/mbGDML
cd mbGDML
pip install .
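
If you prefer pip for the development version, the following standard pip commands should also work (the TestPyPI project name is assumed to match the PyPI one):

# Install straight from the GitHub repository (standard pip VCS syntax).
pip install git+https://github.com/keithgroup/mbGDML.git

# Or install from TestPyPI, pulling dependencies from PyPI.
pip install -i https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ mbGDML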

Citing this work

If you find this code helpful in your research or project, please consider citing the following paper:

Maldonado, A. M.; Poltavsky, I.; Vassilev-Galindo, V.; Tkatchenko, A.; Keith, J. A. Modeling molecular ensembles with gradient-domain machine learning force fields. Digital Discovery 2023, 2 (3), 871-880. DOI: 10.1039/D3DD00011G.

@article{maldonado2023modeling,
  title={Modeling molecular ensembles with gradient-domain machine learning force fields},
  author={Maldonado, Alex M and Poltavsky, Igor and Vassilev-Galindo, Valentin and Tkatchenko, Alexandre and Keith, John A},
  journal={Digital Discovery},
  volume={2},
  number={3},
  pages={871--880},
  year={2023},
  publisher={Royal Society of Chemistry},
  doi={10.1039/D3DD00011G}
}

Citing the paper helps acknowledge the effort put into developing and maintaining this codebase, and it provides a way to support further research and development. Thank you for your support!

License

Distributed under the MIT License. See LICENSE for more information.