
MatGL (Materials Graph Library) is a framework for graph deep learning for materials science.


Introduction

MatGL (Materials Graph Library) is a graph deep learning library for materials. Mathematical graphs are a natural representation for a collection of atoms (e.g., molecules or crystals). Graph deep learning models have been shown to consistently deliver exceptional performance as surrogate models for the prediction of materials properties.
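
As a concrete illustration of this representation, the short sketch below builds a toy graph for a crystal directly with pymatgen: nodes are atoms featurized by atomic number, and edges connect atoms within a cutoff radius. This uses pymatgen's neighbor-list utilities rather than matgl's own graph converters, and the 4 Å cutoff is an arbitrary choice for the example.

from pymatgen.core import Lattice, Structure

# Rocksalt NaCl as a toy example.
struct = Structure.from_spacegroup(
    "Fm-3m", Lattice.cubic(5.64), ["Na", "Cl"], [[0, 0, 0], [0.5, 0.5, 0.5]]
)

# Nodes: one entry per atom, featurized here by atomic number.
node_features = [site.specie.Z for site in struct]

# Edges: all (periodic) atom pairs within a 4 Å cutoff.
centers, neighbors, _offsets, distances = struct.get_neighbor_list(r=4.0)
edges = list(zip(centers.tolist(), neighbors.tolist()))

print(f"{len(node_features)} nodes, {len(edges)} edges")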

In this repository, we have reimplemented the MatErials 3-body Graph Network (M3GNet) and its predecessor, MEGNet, using the Deep Graph Library (DGL). The goal is to improve the usability, extensibility, and scalability of these models. The original M3GNet and MEGNet were implemented in TensorFlow.

This effort is a collaboration between the Materials Virtual Lab and Intel Labs (Santiago Miret, Marcel Nassar, Carmelo Gonzales).

Status

  • Apr 26 2023: Pre-trained MEGNet models now available for formation energies and band gaps!
  • Feb 16 2023: Both initial implementations of M3GNet and MEGNet architectures have been completed. Expect bugs!

Architectures

MEGNet

The MatErials Graph Network (MEGNet) is an implementation of DeepMind's graph networks for universal machine learning in materials science. We have demonstrated its success in achieving very low prediction errors across a broad array of properties in both molecules and crystals (see "Graph Networks as a Universal Machine Learning Framework for Molecules and Crystals"). New releases have included our recent work on multi-fidelity materials property modeling (see "Learning properties of ordered and disordered materials from multi-fidelity data").

Briefly, the graph network performs sequential update steps in which bond, atom, and global state attributes are updated using information from each other, generating an output graph.
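
To make that update sequence concrete, here is a minimal, self-contained PyTorch sketch of one such block. The module name ToyGraphNetBlock, the single linear layers, and the mean-based aggregations are illustrative simplifications, not matgl's actual implementation.

import torch
import torch.nn as nn

class ToyGraphNetBlock(nn.Module):
    """Illustrative MEGNet-style block: sequential bond -> atom -> state updates."""

    def __init__(self, dim: int):
        super().__init__()
        self.phi_e = nn.Linear(4 * dim, dim)  # bond (edge) update function
        self.phi_v = nn.Linear(3 * dim, dim)  # atom (node) update function
        self.phi_u = nn.Linear(3 * dim, dim)  # global state update function

    def forward(self, atoms, bonds, state, src, dst):
        # atoms: (N, dim), bonds: (E, dim), state: (1, dim), src/dst: (E,) bond end indices.
        # 1. Update each bond from its two end atoms, its own features, and the global state.
        u = state.expand(bonds.size(0), -1)
        bonds = self.phi_e(torch.cat([atoms[src], atoms[dst], bonds, u], dim=-1))
        # 2. Update each atom from the mean of its incident bonds, itself, and the state.
        agg = torch.zeros_like(atoms).index_add_(0, dst, bonds)
        deg = torch.zeros(atoms.size(0), 1).index_add_(0, dst, torch.ones(bonds.size(0), 1)).clamp(min=1)
        atoms = self.phi_v(torch.cat([agg / deg, atoms, state.expand(atoms.size(0), -1)], dim=-1))
        # 3. Update the global state from pooled bond and atom features.
        state = self.phi_u(torch.cat([bonds.mean(0, keepdim=True), atoms.mean(0, keepdim=True), state], dim=-1))
        return atoms, bonds, state

# Toy usage: 3 atoms, 4 directed bonds, 16-dimensional features.
block = ToyGraphNetBlock(16)
atoms, bonds, state = torch.randn(3, 16), torch.randn(4, 16), torch.randn(1, 16)
src, dst = torch.tensor([0, 1, 1, 2]), torch.tensor([1, 0, 2, 1])
atoms, bonds, state = block(atoms, bonds, state, src, dst)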

M3GNet

M3GNet is a new materials graph neural network architecture that incorporates 3-body interactions into MEGNet. Another key difference is the inclusion of atomic coordinates and the 3×3 lattice matrix for crystals, which are necessary for obtaining tensorial quantities such as forces and stresses via auto-differentiation.
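
To illustrate that last point, the sketch below uses a toy pairwise energy function in place of a trained M3GNet model (the exact matgl call is not shown here): forces are obtained as the negative gradient of the predicted energy with respect to the Cartesian coordinates, and stresses follow analogously from the derivative with respect to a lattice strain.

import torch

def toy_energy(coords: torch.Tensor) -> torch.Tensor:
    # Stand-in for a learned energy model: sum of inverse pairwise distances.
    diff = coords.unsqueeze(0) - coords.unsqueeze(1)       # (N, N, 3) displacement vectors
    dist = diff.norm(dim=-1) + torch.eye(coords.size(0))   # pad the diagonal to avoid 1/0
    return (1.0 / dist).triu(diagonal=1).sum()

coords = torch.randn(4, 3, requires_grad=True)             # 4 atoms in Cartesian coordinates
energy = toy_energy(coords)
forces = -torch.autograd.grad(energy, coords)[0]           # F = -dE/dR via auto-differentiation
print(energy.item(), forces.shape)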

As a framework, M3GNet has diverse applications, including:

  • Interatomic potential development. With the same training data, M3GNet performs similarly to state-of-the-art machine learning interatomic potentials (ML-IAPs). However, a key feature of a graph representation is its flexibility to scale to diverse chemical spaces. One of the key accomplishments of M3GNet is the development of a universal IAP that can work across the entire periodic table of the elements by training on relaxations performed in the Materials Project.
  • Surrogate models for property predictions. Like the previous MEGNet architecture, M3GNet can be used to develop surrogate models for property predictions, achieving in many cases accuracies that are better than or similar to those of other state-of-the-art ML models.

For detailed performance benchmarks, please refer to the publication in the References section.

Installation

MatGL can be installed via pip for the latest stable version:

pip install matgl

For the latest dev version, please clone this repo and install using:

pip install -e .

Usage

The pre-trained MEGNet models for the Materials Project formation energy and multi-fidelity band gap are now available. The following is an example of a prediction of the formation energy for CsCl.

from pymatgen.core import Structure, Lattice
from matgl.models._megnet import MEGNet

# Load the pre-trained MEGNet formation energy model.
model = MEGNet.load("MEGNet-MP-2018.6.1-Eform")
# This is the structure obtained from the Materials Project.
struct = Structure.from_spacegroup("Pm-3m", Lattice.cubic(4.14), ["Cs", "Cl"], [[0, 0, 0], [0.5, 0.5, 0.5]])
eform = model.predict_structure(struct)
print(f"The predicted formation energy for CsCl is {float(eform.numpy()):5f} eV/atom.")

A full example is available in the repository.

Additional information

References

Please cite the following works:

  • MEGNet
    Chen, C.; Ye, W.; Zuo, Y.; Zheng, C.; Ong, S. P. Graph Networks as a Universal Machine Learning Framework for
    Molecules and Crystals. Chem. Mater. 2019, 31 (9), 3564–3572. https://doi.org/10.1021/acs.chemmater.9b01294.
    
  • Multi-fidelity MEGNet
    Chen, C.; Zuo, Y.; Ye, W.; Li, X.; Ong, S. P. Learning Properties of Ordered and Disordered Materials from
    Multi-Fidelity Data. Nature Computational Science 2021, 1, 46–53. https://doi.org/10.1038/s43588-020-00002-x.
    
  • M3GNet
    Chen, C.; Ong, S. P. A Universal Graph Deep Learning Interatomic Potential for the Periodic Table. Nature
    Computational Science 2022, 2, 718–728. https://doi.org/10.1038/s43588-022-00349-3.
    

Acknowledgements

This work was primarily supported by the Materials Project, funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Materials Sciences and Engineering Division under contract no. DE-AC02-05-CH11231: Materials Project program KC23MP. This work used the Expanse supercomputing cluster at the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1548562.
