
OmniFold, a library to perform unbinned and high-dimensional unfolding for HEP.


OmniFold: A Method to Simultaneously Unfold All Observables

This repository contains the implementation and examples of the OmniFold algorithm originally described in Phys. Rev. Lett. 124 (2020) 182001, 1911.09107 [hep-ph]. The code for the original paper can be found at this repository, which includes a binder demo. This repository was created to maintain the pip-installable version of OmniFold, with additional functionality compared to the original package.

Installation

pip install omnifold

Getting Started

Examples for tabular data and for Point Cloud-like inputs are provided in the notebooks OmniFold_example.ipynb and OmniFold_example_pc.ipynb, respectively. To unfold your own data you can follow these steps:

Creating a DataLoader to hold your data

import numpy as np
from omnifold import DataLoader

# Load your own dataset, possibly with weights.
mock_dataset = np.zeros((100, 3, 4))
mock_dataloader = DataLoader(
    reco=mock_dataset,
    gen=mock_dataset,
    normalize=True)

The DataLoader class automatically normalizes the weights when normalize=True. To estimate the statistical uncertainty using the bootstrap method, you can use the optional flag bootstrap=True.
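To illustrate the idea behind the bootstrap flag, here is a minimal, self-contained numpy sketch of bootstrap uncertainty estimation. The Poisson reweighting below is a common unbinned variant of the bootstrap; it is an illustration of the statistical technique, not OmniFold's internal code:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=0.5, size=1000)

# Poisson bootstrap: each replica reweights every event by an
# independent Poisson(1) draw instead of resampling indices,
# which is convenient for weighted, unbinned datasets.
n_replicas = 200
means = []
for _ in range(n_replicas):
    weights = rng.poisson(lam=1.0, size=data.size)
    means.append(np.average(data, weights=weights))

# The spread of the replica results estimates the statistical
# uncertainty of the nominal measurement.
stat_unc = float(np.std(means))
```

The same replica-weight idea extends to unfolded results: each bootstrap replica is unfolded independently, and the spread across replicas gives the statistical uncertainty.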

Creating your own Keras model to be used for Unfolding

In the MultiFold class, we provide simple neural network models that you can use. For a Multilayer Perceptron, you can load

from omnifold import MLP

ndim = 3  # The number of features present in your dataset
reco_model = MLP(ndim)
gen_model = MLP(ndim)

to create the models used for the reconstruction-level and generator-level trainings of OmniFold. If your data is better described by a point cloud, we also provide an implementation of the Point-Edge Transformer (PET) model, which can be used similarly to the MLP:

from omnifold import PET

ndim = 3   # The number of features present in your dataset
npart = 5  # Maximum number of particles present in the dataset

reco_model = PET(ndim, num_part=npart)
gen_model = PET(ndim, num_part=npart)
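For point-cloud inputs, the expected array layout can be sketched as follows. This assumes the (events, particles, features) layout suggested by the 3D mock dataset used with the DataLoader above; the zero-padding convention is an assumption for illustration, not a documented requirement:

```python
import numpy as np

# Assumed layout: (number of events, max particles per event,
# features per particle), mirroring the 3D mock dataset above.
n_events, npart, ndim = 100, 5, 3

# Placeholder point clouds; in practice, events with fewer than
# npart particles would be zero-padded up to this fixed shape
# (an assumed convention for this sketch).
reco_events = np.zeros((n_events, npart, ndim))
gen_events = np.zeros((n_events, npart, ndim))
```

Arrays shaped this way can then be passed to the DataLoader as the reco and gen datasets, with the PET models consuming per-particle features.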

