
Upstream optimization of a neural net summary statistic with respect to downstream inference goals.

Project description

neos

neural end-to-end optimized statistics


About

Run the Binder demo :)

Leverages the shoulders of giants (jax and pyhf) to differentiate through a high-energy physics analysis workflow, including the construction of the frequentist profile likelihood.

If you're more of a video person, see this talk given by Nathan on the broader topic of differentiable programming in high-energy physics, which also covers neos.

Example usage -- train a neural network to optimize an expected p-value

Setup

In a Python 3 environment, run the following:

pip install --upgrade pip setuptools wheel
pip install neos
pip install git+http://github.com/scikit-hep/pyhf.git@make_difffable_model_ctor

With this, you should be able to run the demo notebook demo.ipynb on your PC :)

The workflow is as follows:

  • From a set of normal distributions with different means, we'll generate four blobs of (x,y) points, corresponding to a signal process, a nominal background process, and two variations of the background from varying the background distribution's mean up and down.
  • We'll then feed each blob of points into the neural network defined earlier in the notebook and construct a histogram of its output using kernel density estimation (see the sketch after this list). The difference between the two background variations is used as a systematic uncertainty on the nominal background.
  • We can then leverage the magic of pyhf to construct an event-counting statistical model from the histogram yields.
  • Finally, we calculate the p-value of a test between the nominal signal and background-only hypotheses. This uses the familiar profile likelihood-based test statistic.
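
The first two steps (generating the blobs and turning network outputs into a differentiable histogram) can be sketched in a few lines of jax. This is a minimal illustration only -- the function names, blob means, and bandwidth below are made up for the example and are not the neos API:

import jax
import jax.numpy as jnp

# Illustrative only: draw four blobs of (x, y) points from 2D normals whose
# means differ between signal, nominal background, and the two background
# variations.
def generate_blobs(rng, n_points=1000):
    keys = jax.random.split(rng, 4)
    cov = jnp.eye(2)
    sig      = jax.random.multivariate_normal(keys[0], jnp.array([-1.0, 1.0]), cov, (n_points,))
    bkg_nom  = jax.random.multivariate_normal(keys[1], jnp.array([2.5, 2.0]), cov, (n_points,))
    bkg_up   = jax.random.multivariate_normal(keys[2], jnp.array([2.5, 2.5]), cov, (n_points,))
    bkg_down = jax.random.multivariate_normal(keys[3], jnp.array([2.5, 1.5]), cov, (n_points,))
    return sig, bkg_nom, bkg_up, bkg_down

# Differentiable histogram via a kernel density estimate: each network output
# contributes a Gaussian integrated over every bin, so the bin yields remain
# smooth functions of the network parameters (unlike hard binning).
def kde_hist(nn_outputs, bin_edges, bandwidth=0.1):
    cdf = jax.scipy.stats.norm.cdf(
        (bin_edges.reshape(-1, 1) - nn_outputs.reshape(1, -1)) / bandwidth
    )
    return (cdf[1:] - cdf[:-1]).sum(axis=1)  # one smooth yield per bin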

This counts as one forward pass of the workflow -- we then optimize the neural network by gradient descent, backpropagating through the whole analysis!
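
Because every stage of that forward pass is differentiable, the outer training loop is ordinary jax gradient descent. Here is a minimal sketch, where expected_pvalue(params, data) is a stand-in for the full pipeline above (network, KDE histograms, pyhf model, profile-likelihood p-value), not a function provided by neos:

import jax

def make_update_step(expected_pvalue, lr=1e-3):
    # expected_pvalue(params, data) is a placeholder for the full forward pass
    # described above; jax backpropagates through all of it.
    @jax.jit
    def update(params, data):
        loss, grads = jax.value_and_grad(expected_pvalue)(params, data)
        # plain gradient descent on the network parameters (a pytree)
        return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads), loss
    return update

# usage sketch:
# update = make_update_step(expected_pvalue)
# for step in range(num_steps):
#     params, loss = update(params, data)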

Thanks

A big thanks to the teams behind jax, fax, jaxopt and pyhf for their software and support.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

neos-0.3.0.tar.gz (3.1 MB)

Uploaded Source

Built Distribution

neos-0.3.0-py3-none-any.whl (7.0 kB)

Uploaded Python 3

File details

Details for the file neos-0.3.0.tar.gz.

File metadata

  • Download URL: neos-0.3.0.tar.gz
  • Upload date:
  • Size: 3.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.13

File hashes

Hashes for neos-0.3.0.tar.gz
  • SHA256: 7b30c44fbd388a3c2fd0cd614e66192a65824a270aac1735033fd01dc80cdeb8
  • MD5: 72cc939c853ff97be25ed4a8bd75f951
  • BLAKE2b-256: 63b2bb50a117f3d181090b29aae46205670420de4b612ad5004689df671f4d19

See more details on using hashes here.

File details

Details for the file neos-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: neos-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 7.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.13

File hashes

Hashes for neos-0.3.0-py3-none-any.whl
  • SHA256: abd8bf07476dbdbb23d870fa5d2aa3eb3c8001815c0d10c9a06c0f2adf894818
  • MD5: a1ad3867fb161afe5eb14bdeb92e4adf
  • BLAKE2b-256: 35f34bd54f218b8281c137c84cce152cc06da5a73b440f1e06f8774835804e75

See more details on using hashes here.
