
JAX-based Interatomic Potential


Description

Jaxip is an optimized Python library, built on Google JAX, that enables the development of machine learning interatomic potentials for computational physics, chemistry, and materials science. Such potentials are necessary for conducting large-scale molecular dynamics simulations of complex materials with ab initio accuracy.

See the documentation for more information.

Main features

  • The design of Jaxip is simple and flexible, making it easy to incorporate new atomic descriptors and potentials.

  • It uses automatic differentiation (autograd) to make defining new descriptors straightforward.

  • Jaxip is written purely in Python and optimized with just-in-time (JIT) compilation.

  • It also supports GPU-accelerated computing, which can significantly speed up preprocessing and model training.
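The JIT and autograd features above come directly from JAX. As a minimal illustration of the idea (a toy pairwise-energy kernel, not Jaxip code), a function written in plain Python can be compiled with `jax.jit` and differentiated with `jax.grad`:

```python
import jax
import jax.numpy as jnp

# Toy Lennard-Jones-like energy over an array of pair distances
# (illustration only; Jaxip's kernels are more involved).
@jax.jit
def pair_energy(r):
    return jnp.sum(4.0 * (r**-12 - r**-6))

r = jnp.array([1.0, 1.5, 2.0])
e = pair_energy(r)                # first call compiles; later calls reuse the kernel
de_dr = jax.grad(pair_energy)(r)  # derivative w.r.t. distances, via autograd
```

On a machine with a supported GPU, the same code runs on the accelerator without modification.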

Installation

To install Jaxip, run this command in your terminal:

$ pip install jaxip

For machines with an NVIDIA GPU, please follow the installation instructions in the documentation.

Examples

Defining an atomic descriptor

This script demonstrates how to define an array of atom-centered symmetry functions (ACSF) for a specific element, which can then be used to evaluate descriptor values for any structure. The resulting values can be used to construct a machine learning potential.

from jaxip.datasets import RunnerDataset
from jaxip.descriptors import ACSF
from jaxip.descriptors.acsf import CutoffFunction, G2, G3

# Read atomic structure dataset (e.g. water molecules)
structures = RunnerDataset('input.data')
structure = structures[0]

# Define ACSF descriptor for hydrogen element
descriptor = ACSF(element='H')

# Add radial and angular symmetry functions
cfn = CutoffFunction(r_cutoff=12.0, cutoff_type='tanh')
descriptor.add(G2(cfn, eta=0.5, r_shift=0.0), 'H')
descriptor.add(G3(cfn, eta=0.001, zeta=2.0, lambda0=1.0, r_shift=12.0), 'H', 'O')

# Compute descriptor values
descriptor(structure)

# Compute gradient
descriptor.grad(structure, atom_index=0)
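For intuition on what the gradient call computes: the standard G2 radial symmetry function has the form G2(r) = exp(−η(r − r_s)²)·f_c(r), and with JAX its derivative comes for free. A standalone toy sketch in plain JAX (not Jaxip's internal implementation; the tanh cutoff is assumed to be the common f_c(r) = tanh³(1 − r/r_c) form, with the same parameter values as above):

```python
import jax
import jax.numpy as jnp

R_CUTOFF, ETA, R_SHIFT = 12.0, 0.5, 0.0  # mirrors the G2 parameters above

def cutoff_tanh(r):
    # Assumed 'tanh' cutoff: f_c(r) = tanh^3(1 - r/r_c), zero beyond r_c
    return jnp.where(r < R_CUTOFF, jnp.tanh(1.0 - r / R_CUTOFF) ** 3, 0.0)

def g2(r):
    # Radial ACSF: exp(-eta * (r - r_shift)^2) * f_c(r)
    return jnp.exp(-ETA * (r - R_SHIFT) ** 2) * cutoff_tanh(r)

value = g2(2.0)
derivative = jax.grad(g2)(2.0)  # d(G2)/dr via automatic differentiation
```

This is why no hand-written derivative code is needed when adding a new symmetry function: defining the forward expression is enough.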

Training a machine learning potential

This example illustrates how to quickly create a high-dimensional neural network potential (HDNNP) instance from an input settings file and train it on input structures. The trained potential can then be used to evaluate the energy and force components for new structures.

from jaxip.datasets import RunnerDataset
from jaxip.potentials import NeuralNetworkPotential

# Read atomic data
structures = RunnerDataset("input.data")
structure = structures[0]

# Instantiate potential from input settings file
nnp = NeuralNetworkPotential.create_from_file("input.nn")

# Fit descriptor scaler and model weights
nnp.fit_scaler(structures)
nnp.fit_model(structures)
nnp.save()

# Or load a previously saved potential
# nnp.load()

# Total energy
nnp(structure)

# Force components
nnp.compute_force(structure)
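Conceptually, an HDNNP sums per-atom energies predicted by a small neural network from each atom's descriptor vector, and forces follow from differentiating the total energy with respect to atomic positions. A minimal plain-JAX sketch of that idea (toy descriptor and toy network, not Jaxip's actual model):

```python
import jax
import jax.numpy as jnp

def descriptors(positions):
    # Toy per-atom 'descriptor': sum of inverse distances to the other atoms
    n = positions.shape[0]
    diff = positions[:, None, :] - positions[None, :, :]
    dist = jnp.sqrt(jnp.sum(diff**2, axis=-1) + jnp.eye(n))  # eye avoids sqrt(0) on diagonal
    inv = jnp.where(jnp.eye(n, dtype=bool), 0.0, 1.0 / dist)
    return jnp.sum(inv, axis=1, keepdims=True)               # shape (n_atoms, 1)

def total_energy(params, positions):
    w1, b1, w2, b2 = params
    h = jnp.tanh(descriptors(positions) @ w1 + b1)  # hidden layer
    atomic_e = h @ w2 + b2                          # per-atom energy contributions
    return jnp.sum(atomic_e)                        # HDNNP total energy

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = (jax.random.normal(k1, (1, 8)), jnp.zeros(8),
          jax.random.normal(k2, (8, 1)), jnp.zeros(1))

pos = jnp.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.2, 0.0]])
energy = total_energy(params, pos)
forces = -jax.grad(total_energy, argnums=1)(params, pos)  # F = -dE/dr
```

Because the whole energy expression is differentiable, forces are exact derivatives of the model rather than finite-difference approximations.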

License

This project is licensed under the GNU General Public License (GPL) version 3 - see the LICENSE file for details.

