
Fit a convex sum of Gaussian functions to any probability distribution.

Project description

BFit

This project supports Python 3.9+ · GitHub Actions CI · Tox Status · GPLv3 License · Binder

BFit is a Python library for fitting a convex sum of Gaussian functions to any probability distribution. It is primarily intended for quantum chemistry applications, where the basis functions are Gaussians and the fitted probability distribution is a scalar function like the electron density.
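For intuition, the fitted model is a convex combination of normalized Gaussians; for s-type functions this is f(r) = sum_i c_i * (a_i / pi)^(3/2) * exp(-a_i * r^2) with c_i >= 0. The snippet below is a minimal, NumPy-only sketch of such a sum; the coefficients and exponents are made-up illustration values, not BFit output.

import numpy as np

# Made-up coefficients (non-negative, summing to one) and exponents of three s-type Gaussians.
coeffs = np.array([0.5, 0.3, 0.2])
exps = np.array([0.1, 1.0, 10.0])

r = np.linspace(0.0, 5.0, 200)  # radial points

# Normalized s-type Gaussian: (alpha / pi)^(3/2) * exp(-alpha * r^2)
norms = (exps / np.pi) ** 1.5
model = np.sum(coeffs[:, None] * norms[:, None] * np.exp(-exps[:, None] * r**2.0), axis=0)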

See the example section below, the interactive Jupyter Binder, or the files in the example folder for specific examples of how to fit using the different algorithms and objective functions. For further information about the API, please visit the BFit Documentation.

Instructions for accessing the results of fitting the atomic densities with the KL-FPI method are given in a section below.

To report any issues or ask questions, either open an issue or email qcdevs@gmail.com.

Citation

Please use the following citation in any publication using the BFit library:

Tehrani, A., Anderson, J. S. M., Chakraborty, D., Rodriguez-Hernandez, J. I., Thompson, D. C., Verstraelen, T., Ayers, P. W., & Heidar-Zadeh, F. An information-theoretic approach to basis-set fitting of electron densities and other non-negative functions. Journal of Computational Chemistry. https://doi.org/10.1002/jcc.27170

Dependencies

BFit requires Python 3.9+; the library and the examples below rely on NumPy, and the SLSQP optimization is provided by SciPy.

Installation

Three options to install BFit:

# install from source
git clone https://github.com/theochem/bfit.git
cd bfit
pip install .

 # or install using conda.
conda install -c theochem qc-bfit

# or install using pip.
pip install qc-bfit

# run tests to make sure BFit was installed properly
pytest -v .

Features

The features of this software are:

  • Gaussian basis-set model:

    • Construct s-type and p-type Gaussian functions,
    • Compute atomic or molecular densities (see the sketch after this list).
  • Fitting measures:

    • Least-squares,
    • Kullback-Leibler divergence,
    • Tsallis divergence.
  • Optimization procedures:

    • Optimize using SLSQP from scipy.optimize.minimize,
    • Optimize the Kullback-Leibler divergence using a self-consistent, fixed-point iterative method (see paper),
    • Greedy method for optimizing the Kullback-Leibler and least-squares measures (see paper).
  • Read/parse Hartree-Fock wavefunctions for atomic systems:

    • Includes anions, cations, and heavy elements (see data page).
    • Compute:
      • Atomic density, including core and valence densities,
      • Positive-definite kinetic energy density.
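As a quick illustration of the model classes listed above, the following sketch builds a small atomic density model with both s-type and p-type Gaussians and evaluates it on a uniform radial grid (both classes appear again in the examples further down). The coefficients and exponents are arbitrary illustration values, and the s-type parameters are assumed to come before the p-type ones, as in the data-loading example below.

import numpy as np
from bfit.grid import UniformRadialGrid
from bfit.model import AtomicGaussianDensity

# Uniform radial grid with 200 points between 0 and 20.
grid = UniformRadialGrid(num_pts=200, min_radii=0., max_radii=20.)

# Model with 3 s-type and 2 p-type normalized Gaussians centered at the origin.
model = AtomicGaussianDensity(grid.points, num_s=3, num_p=2, normalize=True)

# Arbitrary (non-fitted) parameters: s-type entries first, then p-type.
coeffs = np.array([1.0, 0.5, 0.25, 0.1, 0.05])
exps = np.array([0.5, 2.0, 10.0, 1.0, 5.0])

density = model.evaluate(coeffs, exps)
print(grid.integrate(density * 4.0 * np.pi * grid.points**2.0))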

Final Models of Fitting Atomic Densities

The final model from fitting the atomic densities with the Kullback-Leibler (KL) divergence fixed-point iteration (KL-FPI) method can be accessed by opening the file ./bfit/data/kl_fpi_results.npz with NumPy. Similarly, the results from optimizing KL with the SLSQP method, using kl_fpi_results.npz as the initial guess, can be accessed by opening the file ./bfit/data/kl_slsqp_results.npz with NumPy. In general, we recommend the KL-SLSQP results over the KL-FPI results.

import numpy as np

element = "be"
# For the recommended KL-SLSQP results, load "./bfit/data/kl_slsqp_results.npz" instead.
results = np.load("./bfit/data/kl_fpi_results.npz")
num_s = results[f"{element}_num_s"]  # number of s-type Gaussian functions
num_p = results[f"{element}_num_p"]  # number of p-type Gaussian functions
coefficients = results[f"{element}_coeffs"]
exponents = results[f"{element}_exps"]

print("s-type exponents")
print(exponents[:num_s])
print("p-type exponents")
print(exponents[num_s:])

Alternatively, one can load these results from the JSON file.

import json
import numpy as np

element = "be"
with open("./bfit/data/kl_fpi_results.json") as file:
    data = json.load(file)
    data_element = data[element]

    num_s = data_element["num_s"]
    num_p = data_element["num_p"]
    coefficients = np.array(data_element["coeffs"])
    exponents = np.array(data_element["exps"])

The normalized Gaussian model can then be evaluated on a given set of points and, as a sanity check, integrated spherically (for beryllium the integral should be close to the number of electrons, 4):

from bfit.grid import ClenshawRadialGrid
from bfit.model import AtomicGaussianDensity

grid = ClenshawRadialGrid(4, num_core_pts=10000, num_diffuse_pts=899, extra_pts=[50, 75, 100])  # 4 = atomic number of beryllium
model = AtomicGaussianDensity(grid.points, num_s=num_s, num_p=num_p, normalize=True)
model_pts = model.evaluate(coefficients, exponents)

print("Numerical integral (spherically) of the model %f." %
      grid.integrate(model_pts * 4.0 * np.pi * grid.points**2.0)
)

Example

There are four steps to using BFit.

1. Specify the Grid Object.

The grid is a uniform one-dimensional radial grid with 100 points from 0. to 50.

import numpy as np
from bfit.grid import UniformRadialGrid
grid = UniformRadialGrid(num_pts=100, min_radii=0., max_radii=50.)

See grid.py for an assortment of different grids.
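For instance, the Clenshaw-type radial grid used in the data example earlier on this page can be built as follows (shown with the same arguments; interpreting the first argument as the atomic number is an assumption based on that example, where it is 4 for beryllium). The rest of this example keeps using the uniform grid from above.

from bfit.grid import ClenshawRadialGrid

# Same construction as in the beryllium example above; points are denser near the nucleus.
clenshaw_grid = ClenshawRadialGrid(4, num_core_pts=10000, num_diffuse_pts=899, extra_pts=[50, 75, 100])
print(clenshaw_grid.points.shape)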

2. Specify the Model Object.

Here, the model is a sum of 5 normalized s-type Gaussian functions centered at the origin.

from bfit.model import AtomicGaussianDensity
model = AtomicGaussianDensity(grid.points, num_s=5, num_p=0, normalize=True)

See model.py for more Gaussian model options.

3. Specify the error measure.

Here, the Kullback-Leibler divergence is minimized with the fixed-point iteration (FPI) algorithm described in the paper.

from bfit.fit import KLDivergenceFPI

# The density you want to fit must be evaluated on `grid.points`.
density = np.array([...])
fit = KLDivergenceFPI(grid, density, model)

See fit.py for the available fitting algorithms.
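To turn this step into something runnable, the placeholder density above can be any non-negative function sampled on grid.points; the exponential form below is made up purely for illustration.

# A made-up, non-negative target density sampled on grid.points (illustration only).
density = np.exp(-2.0 * grid.points)
fit = KLDivergenceFPI(grid, density, model)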

4. Run the optimization procedure.

Initial guesses for the coefficients and exponents of the 5 s-type Gaussians must be provided.

# Provide Initial Guesses
c0 = np.array([1., 1., 1., 1., 1.])
e0 = np.array([0.001, 0.1, 1., 5., 100.])

# Optimize both coefficients and exponents and print while running.
result = fit.run(c0, e0, opt_coeffs=True, opt_expons=True, maxiter=1000, disp=True)

print("Was it successful? ", result["success"])
print("Optimized coefficients are: ", result["coeffs"])
print("Optimized exponents are: ", result["exps"])
print("Final performance measures are: ", result["fun"][-1])

See the example directory for more examples, or launch the interactive Binder.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

qc_bfit-0.0.1a2.tar.gz (695.6 kB)

Uploaded Source

Built Distribution

qc_BFit-0.0.1a2-py3-none-any.whl (520.5 kB)

Uploaded Python 3

File details

Details for the file qc_bfit-0.0.1a2.tar.gz.

File metadata

  • Download URL: qc_bfit-0.0.1a2.tar.gz
  • Upload date:
  • Size: 695.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for qc_bfit-0.0.1a2.tar.gz:

  • SHA256: 008cf2d3b660a17f4833a5ea9d1cc2bc9e4a2086788838214a2ec3efb9da2460
  • MD5: 60a1b1481f175f216b0f7c979e324bcd
  • BLAKE2b-256: 2ef512ec0238014c271800bc3a7c62ebc3dc599427a9bd71d7e21718a5b31581

See more details on using hashes here.

Provenance

The following attestation bundles were made for qc_bfit-0.0.1a2.tar.gz:

Publisher: pypi_release.yaml on theochem/BFit

Attestations:

File details

Details for the file qc_BFit-0.0.1a2-py3-none-any.whl.

File metadata

  • Download URL: qc_BFit-0.0.1a2-py3-none-any.whl
  • Upload date:
  • Size: 520.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for qc_BFit-0.0.1a2-py3-none-any.whl:

  • SHA256: 81c5662d3164d9a68ad7abf378387ecf3e479ee72520731c582ec6819586e148
  • MD5: 503cb0a07c9f1a4cf5b238860dd10bcd
  • BLAKE2b-256: 9e848de92846c8d8e53fce96bc0d0d0f1f71c116b236b31578e97c7b6fe37a47

See more details on using hashes here.

Provenance

The following attestation bundles were made for qc_BFit-0.0.1a2-py3-none-any.whl:

Publisher: pypi_release.yaml on theochem/BFit

Attestations:
