

Project description

UNcertaInty QUantification bEnchmark: a Python library for benchmarking uncertainty estimation and quantification methods for Machine Learning models' predictions.


Introduction

UNIQUE provides methods for quantifying and evaluating the uncertainty of Machine Learning (ML) model predictions. The library lets users combine and benchmark multiple uncertainty quantification (UQ) methods simultaneously, generates intuitive visualizations, evaluates the quality of the UQ methods against established metrics, and, overall, gives users a comprehensive overview of their ML model's performance from an uncertainty quantification perspective.

UNIQUE is a model-agnostic tool: it does not depend on any specific ML model-building platform, nor does it provide any ML model training functionality. It is lightweight, as it only requires the user to input their model's inputs and predictions.

UNIQUE High Level Schema
High-level schema of UNIQUE's components.

Installation


UNIQUE is currently compatible with Python 3.8 through 3.12.1. To install the latest release and use the package as is, run the following in a compatible environment of choice:

pip install unique-uncertainty

Check out the docs for more installation instructions.

Getting Started

Check out the docs for a complete set of instructions on how to prepare your data and the possible configurations offered by UNIQUE.
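As a purely hypothetical illustration of the kind of tabular data a UQ benchmark consumes (the column names below are assumptions made for this sketch, not UNIQUE's actual schema; the docs define the expected format), one might assemble a dataset like this:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 100

# Illustrative columns only -- consult the UNIQUE docs for the real schema.
data = pd.DataFrame({
    "labels": rng.normal(size=n),               # ground-truth target values
    "predictions": rng.normal(size=n),          # your model's predictions
    "ensemble_variance": rng.uniform(0, 1, n),  # a per-sample UQ input, e.g. ensemble variance
})

# Persist to disk so the file path can be referenced from the config file
data.to_csv("my_dataset.csv", index=False)
```

The key point is that UNIQUE consumes precomputed model inputs and predictions rather than the model itself, consistent with its model-agnostic design.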

Usage

Finally, once the data and configuration files have been prepared, you can run UNIQUE in the following way:

from unique import Pipeline

# Prepare UNIQUE pipeline
pipeline = Pipeline.from_config("/path/to/config.yaml")

# Run UNIQUE pipeline
uq_methods_outputs, uq_evaluation_outputs = pipeline.fit()
# Returns: (Dict[str, np.ndarray], Dict[str, pd.DataFrame])

Fitting the Pipeline will return two dictionaries:

  • uq_methods_outputs: contains each UQ method's name (as in "UQ_Method_Name[Input_Name(s)]") and computed UQ values.
  • uq_evaluation_outputs: contains, for each evaluation type (ranking-based, proper scoring rules, and calibration-based), the evaluation metrics computed for all the corresponding UQ methods, organized as pd.DataFrames.
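The two returned dictionaries can then be inspected as plain Python mappings. In this sketch the method and metric names are placeholder stand-ins for whatever your configured pipeline produces, not actual UNIQUE identifiers:

```python
import numpy as np
import pandas as pd

# Placeholder stand-ins for Pipeline.fit()'s return values
# (keys and metric names here are illustrative only).
uq_methods_outputs = {
    "EnsembleVariance[predictions]": np.array([0.12, 0.45, 0.33]),
}
uq_evaluation_outputs = {
    "ranking_based": pd.DataFrame(
        {"spearman_corr": [0.81]},
        index=["EnsembleVariance[predictions]"],
    ),
}

# Each UQ method name maps to its per-sample uncertainty estimates
for name, values in uq_methods_outputs.items():
    print(name, values.shape)

# Each evaluation type maps to a DataFrame of metrics, one row per UQ method
for eval_type, df in uq_evaluation_outputs.items():
    print(eval_type, df.columns.tolist())
```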

Additionally, UNIQUE generates graphical outputs in the form of tables and evaluation plots (if display_outputs is enabled and the code is running in a Jupyter Notebook cell).

Examples

For more hands-on examples and detailed usage, check out some of the examples in the docs.

Deep Dive

Check out the docs for an in-depth overview of UNIQUE's concepts, functionalities, outputs, and references.

Contributing

Any and all contributions and suggestions from the community are more than welcome and highly appreciated. If you wish to help us out in making UNIQUE even better, please check out our contributing guidelines.

Please note that we have a Code of Conduct in place to ensure a positive and inclusive community environment. By participating in this project, you agree to abide by its terms.

License

UNIQUE is licensed under the BSD 3-Clause License. See the LICENSE file.

Cite Us

If you find UNIQUE helpful for your work and/or research, please consider citing our work:

@misc{lanini2024unique,
  title={UNIQUE: A Framework for Uncertainty Quantification Benchmarking},
  author={Lanini, Jessica and Huynh, Minh Tam Davide and Scebba, Gaetano and Schneider, Nadine and Rodr{\'\i}guez-P{\'e}rez, Raquel},
  year={2024},
  doi={10.26434/chemrxiv-2024-fmbgk},
}

Contacts & Acknowledgements

For any questions or further details about the project, please get in touch with the authors.

Download files

Download the file for your platform.

Source Distribution

unique_uncertainty-0.2.1.tar.gz (4.3 MB)

Built Distribution

unique_uncertainty-0.2.1-py3-none-any.whl (87.3 kB)

File details

Details for the file unique_uncertainty-0.2.1.tar.gz.

File metadata

  • Download URL: unique_uncertainty-0.2.1.tar.gz
  • Size: 4.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.6

File hashes

Hashes for unique_uncertainty-0.2.1.tar.gz:

  • SHA256: f9053ff251ac31b946302532c581326b2edfe212da0e0d1cdcf3585e23cfc64c
  • MD5: b19e819c3f35b3b4706eaab18522ec7d
  • BLAKE2b-256: 3e9f3ca62172ccefad4f8514229c490a13554c4a8de61866f33810f9a148e7c4

See more details on using hashes here.

File details

Details for the file unique_uncertainty-0.2.1-py3-none-any.whl.

File hashes

Hashes for unique_uncertainty-0.2.1-py3-none-any.whl:

  • SHA256: b90a32849d40425d22c5aa898834db4ed5181cba6c4272396dc3cc20c0b18bbe
  • MD5: a6b125c3db41072df6743c66de84de81
  • BLAKE2b-256: 25720b50575e16c6bf604ff7e10363bc1394ac85e6dfac95cbb66547c44f5ddb

