OmniFold, a library for unbinned, high-dimensional unfolding in HEP.


OmniFold: A Method to Simultaneously Unfold All Observables

This repository contains the implementation and examples of the OmniFold algorithm, originally described in Phys. Rev. Lett. 124 (2020) 182001, 1911.09107 [hep-ph]. The code for the original paper can be found at this repository, which includes a binder demo. The present repository was created to maintain the pip-installable version of OmniFold, with additional functionality compared to the original package.

Installation

pip install omnifold

Getting Started

Examples for tabular data and for Point Cloud-like inputs are provided in the notebooks OmniFold_example.ipynb and OmniFold_example_pc.ipynb, respectively. To unfold your own data you can follow these steps:

Creating a DataLoader to hold your data

import numpy as np

from omnifold import DataLoader

# Load your own dataset, possibly with weights.
mock_dataset = np.zeros((100, 3, 4))
mock_dataloader = DataLoader(
    reco=mock_dataset,
    gen=mock_dataset,
    normalize=True)

The DataLoader class will automatically normalize the weights when normalize is set to True. To estimate the statistical uncertainty with the bootstrap method, pass the optional flag bootstrap = True.
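The bootstrap estimates the statistical uncertainty by repeating the unfolding with resampled event weights, commonly Poisson-distributed factors with mean 1. A minimal numpy sketch of that resampling idea (independent of the DataLoader internals, which may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
n_events = 100

# Nominal (normalized) event weights.
weights = np.ones(n_events) / n_events

# One bootstrap replica: multiply each weight by a Poisson(1) factor,
# then renormalize so replicas remain comparable to the nominal weights.
replica = weights * rng.poisson(lam=1.0, size=n_events)
replica = replica / replica.sum()
```

Running the unfolding once per replica and taking the spread of the results gives the statistical uncertainty band.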

Creating your own Keras model to be used for Unfolding

In the MultiFold class, we provide simple neural network models that you can use. For a multilayer perceptron, you can load

from omnifold import MLP

ndim = 3  # Number of features present in your dataset
reco_model = MLP(ndim)
gen_model = MLP(ndim)

to create the models used in the reconstruction-level and generator-level trainings of OmniFold. If your data is better described by a point cloud, we also provide an implementation of the Point-Edge Transformer (PET) model, which can be used in the same way as the MLP:

from omnifold import PET

ndim = 3   # Number of features present in your dataset
npart = 5  # Maximum number of particles present in the dataset

reco_model = PET(ndim, num_part=npart)
gen_model = PET(ndim, num_part=npart)

You can also provide your own custom keras.Model to be used by OmniFold.
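As a sketch of what a custom model might look like, here is a small network built with the Keras functional API; the layer sizes and activations are arbitrary illustrative choices, not part of the omnifold API:

```python
import keras

ndim = 3  # number of features, matching your dataset

# A simple fully connected classifier body with a single output logit.
inputs = keras.Input(shape=(ndim,))
x = keras.layers.Dense(64, activation="relu")(inputs)
x = keras.layers.Dense(64, activation="relu")(x)
outputs = keras.layers.Dense(1)(x)

custom_model = keras.Model(inputs=inputs, outputs=outputs)
```

A model like this can then be passed as reco_model or gen_model in place of the built-in MLP.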

Creating the MultiFold Object

Now that we have the dataset and the models, we can create the MultiFold object that performs the unfolding and the reweighting of new datasets:

omnifold = MultiFold(
    "Name_of_experiment",
    reco_model,
    gen_model,
    data,  # a DataLoader instance containing the measured data
    mc,    # a DataLoader instance containing the simulation
)

The last step is to finally run the unfolding!

omnifold.Unfold()

Evaluating the Unfolded Results

We can evaluate the reweighting function learned by OmniFold by using the reweight function

unfolded_weights = omnifold.reweight(validation_data, omnifold.model2, batch_size=1000)

These weights can be applied directly to the simulation used during the unfolding to produce the unfolded results.
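Applying the learned weights amounts to filling histograms of the simulated events with unfolded_weights as per-event weights. A minimal numpy sketch, with random stand-ins for the generator-level feature and the weights:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for one generator-level feature and the learned weights.
gen_feature = rng.normal(size=1000)
unfolded_weights = rng.uniform(0.5, 1.5, size=1000)

# The weighted histogram of the simulation is the unfolded distribution.
bins = np.linspace(-3, 3, 31)
unfolded_hist, _ = np.histogram(gen_feature, bins=bins, weights=unfolded_weights)
```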

Plotting the Results of the Unfolding

The omnifold package also provides histogram utilities for plotting the results. You can use the plotting code as:

from omnifold import SetStyle, HistRoutine
SetStyle()

#Create a dictionary containing the data to plot
data_dict = {
    'Distribution A': data_a, 
    'Distribution B': data_b,
}
HistRoutine(data_dict, 'Name of the feature to plot',
            reference_name='Name of the dataset to calculate the ratio plot')

The function will create histograms for the datasets passed as inputs. Specific binnings for the histograms can be passed as numpy arrays in the binning argument, or calculated directly by the routine.

Weights can be added to the histograms by passing an additional dictionary with the same key entries and the weights to be used for each distribution. For example:

weight_dict = {
    'Distribution A': weight_a, 
    'Distribution B': weight_b,
}
HistRoutine(data_dict, 'Name of the feature to plot',
            reference_name='Name of the dataset to calculate the ratio plot',
            weights=weight_dict)
