
A Python package for developing machine learning interatomic potentials, based on JAX.

Project description

Badges: PyPI version (https://img.shields.io/pypi/v/pantea.svg), tests (https://github.com/hghcomphys/pantea/actions/workflows/tests.yml/badge.svg), Documentation Status.

Description

Pantea is an optimized Python library, based on Google JAX, that enables the development of machine learning interatomic potentials for use in computational materials science. These potentials are particularly useful for conducting large-scale molecular dynamics simulations of complex materials with ab initio accuracy.

See the documentation for more information.

Key features

  • The design of Pantea is simple and flexible, which makes it easy to incorporate atomic descriptors and potentials.

  • It uses automatic differentiation, which makes defining new descriptors straightforward (see the sketch after this list).

  • Pantea is written purely in Python and optimized with just-in-time (JIT) compilation.

  • It also supports GPU computing, which can significantly speed up preprocessing and model training.
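
As a rough illustration of what automatic differentiation and JIT compilation provide in practice, the snippet below uses plain JAX (not the Pantea API) to compile a toy pairwise energy function and obtain its gradient. The function and its numbers are made up purely for illustration.

import jax
import jax.numpy as jnp

# Toy pairwise "energy": sum of inverse distances between all atom pairs.
# A stand-in for a real descriptor or potential, not part of Pantea.
def toy_energy(positions):
    diffs = positions[:, None, :] - positions[None, :, :]
    # Add the identity matrix so the diagonal (self-distances) stays non-zero.
    distances = jnp.sqrt(jnp.sum(diffs ** 2, axis=-1) + jnp.eye(positions.shape[0]))
    return jnp.sum(jnp.triu(1.0 / distances, k=1))

energy_fn = jax.jit(toy_energy)            # just-in-time compiled on first call
grad_fn = jax.jit(jax.grad(toy_energy))    # gradient via automatic differentiation

positions = jnp.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
print(energy_fn(positions))
print(-grad_fn(positions))                 # negative gradient, i.e. forces on the atoms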

Installation

To install Pantea, run this command in your terminal:

$ pip install pantea

For machines with an NVIDIA GPU, please follow the installation instructions in the documentation.
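
As an illustration only (the exact command depends on your CUDA driver and JAX version, and the documentation has the authoritative instructions), installing a CUDA-enabled JAX typically looks like:

$ pip install --upgrade "jax[cuda12]"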

Examples

I. Descriptor (ACSF)

The Atom-centered Symmetry Function (ACSF) descriptor captures information about the distribution of neighboring atoms around a central atom by considering both radial (two-body) and angular (three-body) symmetry functions. The resulting values form a fingerprint of the local atomic environment and can be used in various machine learning potentials.
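
For reference, radial and angular symmetry functions of this kind are commonly written in the Behler-Parrinello form below; the exact variant and normalization implemented by Pantea's G2 and G3 may differ, so consult the documentation for the precise convention.

G^2_i = \sum_{j \ne i} \exp(-\eta (r_{ij} - r_s)^2) \, f_c(r_{ij})

G^3_i = 2^{1-\zeta} \sum_{j \ne i} \sum_{k \ne i, j} (1 + \lambda \cos\theta_{ijk})^{\zeta} \exp(-\eta (r_{ij}^2 + r_{ik}^2 + r_{jk}^2)) \, f_c(r_{ij}) f_c(r_{ik}) f_c(r_{jk})

where f_c is the cutoff function, r_s corresponds to r_shift, and \lambda (lambda0) is typically ±1.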

The script below demonstrates how to define multiple symmetry functions for an element, which can then be used to evaluate the descriptor values for any structure.

from pantea.datasets import Dataset
from pantea.descriptors import ACSF
from pantea.descriptors.acsf import CutoffFunction, NeighborElements, G2, G3

# Read atomic structure dataset (e.g. water molecules)
structures = Dataset.from_runner("input.data")
structure = structures[0]
print(structure)
# >> Structure(natoms=12, elements=('H', 'O'), dtype=float64)

# Define an ACSF descriptor for hydrogen atoms
# It includes two radial (G2) and angular (G3) symmetry functions
cfn = CutoffFunction.from_type("tanh", r_cutoff=12.0)
g2 = G2(cfn, eta=0.5, r_shift=0.0)
g3 = G3(cfn, eta=0.001, zeta=2.0, lambda0=1.0, r_shift=12.0)

descriptor = ACSF(
        central_element='H',
        radial_symmetry_functions=(
                (g2, NeighborElements('H')),
        ),
        angular_symmetry_functions=(
                (g3, NeighborElements('H', 'O')),
        ),
)

print(descriptor)
# >> ACSF(central_element='H', num_symmetry_functions=2)

values = descriptor(structure)
print("Descriptor values:\n", values)
# >> Descriptor values:
# [[0.01952943 1.13103234]
#  [0.01952756 1.04312263]
# ...
#  [0.00228752 0.41445455]]

gradient = descriptor.grad(structure)
print("Descriptor gradient:\n", gradient)
# >> Descriptor gradient:
# [[[ 4.64523585e-02 -5.03786078e-02 -6.14621389e-02]
#   [-1.04818547e-01 -1.84170755e-02  4.76021411e-02]]
#  [[-9.67003098e-03 -5.45498827e-02  6.32422634e-03]
#   [-1.59613454e-01 -5.94085256e-02  1.72978932e-01]]
# ...
#  [[-1.36223042e-03 -8.02832759e-03 -6.08306094e-05]
#   [ 1.29199076e-02 -9.58762344e-03 -9.12714216e-02]]]
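
Once defined, the same descriptor object can be evaluated over the whole dataset. The loop below is a minimal sketch that assumes Dataset supports len() and integer indexing, as the structures[0] access above suggests; check the documentation for the exact interface.

# Evaluate the descriptor for every structure in the dataset (usage sketch).
all_values = []
for index in range(len(structures)):
    # per-atom descriptor values for the central element ('H') of each structure
    all_values.append(descriptor(structures[index]))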

II. Potential (NNP)

This example illustrates how to quickly create a high-dimensional neural network potential (HDNNP) instance from an input settings file.

from pantea.datasets import Dataset
from pantea.potentials import NeuralNetworkPotential

# Dataset: reading structures from RuNNer input data file
structures = Dataset.from_runner("input.data")
structure = structures[0]

# Potential: creating a NNP from the RuNNer potential file
nnp = NeuralNetworkPotential.from_runner("input.nn")
nnp.load()  # requires the scaler and model parameter files to be present

total_energy = nnp(structure)
print(total_energy)

forces = nnp.compute_forces(structure)
print(forces)
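
The returned arrays can be inspected with regular jax.numpy operations. The following small sketch assumes forces holds one 3-component force vector per atom, which is the usual convention; check the documentation for the exact shapes.

import jax.numpy as jnp

print(forces.shape)                     # expected: (natoms, 3)
print(jnp.linalg.norm(forces, axis=1))  # magnitude of the force on each atom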

Download example input files from here.

License

This project is licensed under the GNU General Public License (GPL) version 3 - see the LICENSE file for details.
