
scalable pythonic model fitting for high energy physics

Project description

zfit logo

zfit: scalable pythonic fitting

Scikit-HEP affiliated | DOI 10.1016/j.softx.2020.100508 | PyPI | conda-forge | build status

zfit is a highly scalable and customizable model manipulation and likelihood fitting library. It uses TensorFlow as its computational backend and is optimised for simple and direct manipulation of probability density functions. The project is affiliated with and well integrated into Scikit-HEP, the HEP Python ecosystem.

If you use zfit in research, please consider citing.

N.B.: zfit is currently in beta stage: most core parts are established, but some features may still be missing and bugs may be encountered. It is, however, mostly ready for production and is being used in analysis projects. If you want to use it for your project and you are not sure whether all the needed functionality is there, feel free to contact us.

Installation

zfit is available on PyPI and conda-forge. To install it (recommended: use a virtual/conda environment!) with all the dependencies (minimizers, uproot, …), use

pip install -U zfit[all]

(the -U simply upgrades zfit in case it is already installed) or, for minimal dependencies,

pip install zfit

For conda/mamba, use

conda install -c conda-forge zfit
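
Once installed, a quick sanity check is to import the package and print its version:

import zfit

# confirm the installation by printing the installed version
print(zfit.__version__)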

How to use

While the zfit library provides a model fitting and sampling framework for a broad range of applications, we illustrate its main features with a simple example: fitting a Gaussian distribution with an unbinned likelihood fit and estimating the parameter uncertainties.

Example in short

import numpy as np
import zfit

obs = zfit.Space('x', -10, 10)

# create the model
mu    = zfit.Parameter("mu"   , 2.4, -1, 5)
sigma = zfit.Parameter("sigma", 1.3,  0, 5)
gauss = zfit.pdf.Gauss(obs=obs, mu=mu, sigma=sigma)

# load the data
data_np = np.random.normal(size=10000)
data = zfit.Data(obs=obs, data=data_np)
# or sample from model
data = gauss.sample()

# build the loss
nll = zfit.loss.UnbinnedNLL(model=gauss, data=data)

# minimize (20+ interchangeable minimizers available!)
minimizer = zfit.minimize.Minuit()
result = minimizer.minimize(nll).update_params()

# calculate errors
sym_errors = result.hesse()
asym_errors = result.errors()

This follows the zfit workflow:

zfit workflow

Full explanation

The default space (e.g. normalization range) of a PDF is defined by an observable space, which is created using the zfit.Space class:

obs = zfit.Space('x', -10, 10)
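
A space can also span more than one observable. As a rough sketch (assuming, as in the zfit documentation, that multiplying spaces combines them; the second observable here is purely illustrative):

# hypothetical second observable 'y'; multiplying spaces combines them
obs_x = zfit.Space('x', -10, 10)
obs_y = zfit.Space('y', 0, 5)
obs_2d = obs_x * obs_y  # combined two-dimensional space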

To create a simple Gaussian PDF, we define its parameters and their limits using the zfit.Parameter class.

# syntax: zfit.Parameter("any_name", value, lower, upper)
mu    = zfit.Parameter("mu"   , 2.4, -1, 5)
sigma = zfit.Parameter("sigma", 1.3,  0, 5)
gauss = zfit.pdf.Gauss(obs=obs, mu=mu, sigma=sigma)
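
The same pattern extends to composite models. As an illustrative sketch only (the background and fraction parameters below are hypothetical, assuming zfit.pdf.Exponential and zfit.pdf.SumPDF):

# hypothetical signal-plus-background model, for illustration
lam  = zfit.Parameter("lam", -0.1, -1, 0)      # exponential slope
frac = zfit.Parameter("sig_frac", 0.5, 0, 1)   # signal fraction
background = zfit.pdf.Exponential(lam, obs=obs)
model = zfit.pdf.SumPDF([gauss, background], fracs=frac)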

For simplicity, we create the dataset to be fitted from a NumPy array, but zfit also accepts other sources such as ROOT files:

mu_true = 0
sigma_true = 1
data_np = np.random.normal(mu_true, sigma_true, size=10000)
data = zfit.Data(obs=obs, data=data_np)
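
Other in-memory formats work similarly; for instance, a pandas DataFrame can be loaded via zfit.Data.from_pandas (a sketch, with a hypothetical DataFrame whose column matches the observable name):

import pandas as pd

# illustrative DataFrame with a column named like the observable 'x'
df = pd.DataFrame({'x': data_np})
data_df = zfit.Data.from_pandas(df, obs=obs)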

Fits are performed in three steps:

  1. Creation of a loss function, in our case a negative log-likelihood.

  2. Instantiation of our minimiser of choice, in this example Minuit.

  3. Minimisation of the loss function.

# Stage 1: create an unbinned likelihood with the given PDF and dataset
nll = zfit.loss.UnbinnedNLL(model=gauss, data=data)

# Stage 2: instantiate a minimiser (in this case a basic minuit)
minimizer = zfit.minimize.Minuit()

# Stage 3: minimise the given negative log-likelihood
result = minimizer.minimize(nll).update_params()

The .update_params() call changes the default values of the parameters to the fitted values (currently this also happens by default, but it won't in future versions).
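
If the defaults should not be changed, the fitted values can also be pushed to the parameters explicitly; a sketch using Parameter.set_value together with the value entries of result.params shown further below:

# illustrative: set each parameter to its fitted value by hand
for param in [mu, sigma]:
    param.set_value(result.params[param]['value'])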

Symmetric errors are calculated with a further function call to avoid running potentially expensive operations if not needed. Asymmetric errors using a profiling method can also be obtained:

sym_errors = result.hesse()
asym_errors = result.errors()

Once we’ve performed the fit and obtained the corresponding uncertainties, we can examine the fit result by printing it or by looking at its individual parts:

print(result)  # nice representation of a whole result

print("Function minimum:", result.fmin)
print("Converged:", result.converged)

# Information on all the parameters in the fit
params = result.params
print(params)

# Printing information on specific parameters, e.g. mu
print("mu={}".format(params[mu]['value']))

And that’s it! For more details and information on what you can do with zfit, check out the latest documentation.

Why?

The basic idea behind zfit is to offer a Python-oriented alternative to the very successful RooFit library from the ROOT data analysis package, one that integrates with the other packages of the scientific Python ecosystem. Contrary to the monolithic approach of ROOT/RooFit, the aim of zfit is to be light and flexible enough to integrate with any state-of-the-art tools and to scale to larger datasets.

These core ideas are supported by two basic pillars:

  • The skeleton and extension of the code are minimalist, simple and finite: the zfit library is exclusively designed for the purpose of model fitting and sampling, with no attempt to extend its functionality to features such as statistical methods or plotting.

  • zfit is designed for optimal parallelisation and scalability by making use of TensorFlow as its backend. The use of TensorFlow provides crucial features in the context of model fitting like taking care of the parallelisation and analytic derivatives.

Prerequisites

zfit works with Python versions 3.9 and above. The main dependency is TensorFlow; zfit keeps close version compatibility with TensorFlow.

For a full list of all dependencies, check the requirements.

Contributing

Any idea of how to improve the library? Or interested in writing some code? Contributions are always welcome; please have a look at the Contributing guide.

Contact

You can contact us directly:

Original Authors

Jonas Eschle <jonas.eschle@cern.ch>
Albert Puig <albert.puig@cern.ch>
Rafael Silva Coutinho <rsilvaco@cern.ch>

See here for all authors and contributors

Acknowledgements

zfit has been developed with support from the University of Zurich and the Swiss National Science Foundation (SNSF) under contracts 168169 and 174182.

The idea of zfit is inspired by the TensorFlowAnalysis framework developed by Anton Poluektov and by TensorProb by Chris Burr and Igor Babuschkin, both built on the TensorFlow open source library, among others.


