
A scikit-learn-compatible module for estimating prediction intervals.

Project description


MAPIE logo: https://github.com/scikit-learn-contrib/MAPIE/raw/master/doc/images/mapie_logo_nobg_cut.png

MAPIE - Model Agnostic Prediction Interval Estimator

MAPIE is an open-source Python library for quantifying uncertainties and controlling the risks of machine learning models. It is a scikit-learn-contrib project that allows you to:

  • Easily compute conformal prediction intervals (or prediction sets) with controlled (or guaranteed) marginal coverage rate for regression [3,4,8], classification (binary and multi-class) [5-7] and time series [9].

  • Easily control risks of more complex tasks such as multi-label classification, semantic segmentation in computer vision (probabilistic guarantees on recall, precision, …) [10-12].

  • Easily wrap any model (scikit-learn, tensorflow, pytorch, …) with, if needed, a scikit-learn-compatible wrapper for the purposes just mentioned.

Here’s a quick instantiation of MAPIE models for regression and classification problems related to uncertainty quantification (more details in the Quickstart section):

# Uncertainty quantification for regression problem
from mapie.regression import MapieRegressor
mapie_regressor = MapieRegressor(estimator=regressor, method='plus', cv=5)
# Uncertainty quantification for classification problem
from mapie.classification import MapieClassifier
mapie_classifier = MapieClassifier(estimator=classifier, method='score', cv=5)

Implemented methods in MAPIE respect three fundamental pillars:

  • They are model and use case agnostic,

  • They possess theoretical guarantees under minimal assumptions on the data and the model,

  • They are based on peer-reviewed algorithms and respect programming standards.

MAPIE relies notably on the field of Conformal Prediction and Distribution-Free Inference.

🔗 Requirements

  • MAPIE runs on Python 3.7+.

  • MAPIE stands on the shoulders of giants. Its only dependencies are scikit-learn and numpy>=1.21.

🛠 Installation

MAPIE can be installed in different ways:

$ pip install mapie  # installation via `pip`
$ conda install -c conda-forge mapie  # or via `conda`
$ pip install git+https://github.com/scikit-learn-contrib/MAPIE  # or directly from the github repository
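
To check that the installation succeeded, you can query the installed version from Python. This is a minimal sketch, not part of the official instructions; it uses the standard library importlib.metadata (available from Python 3.8):

# Check that MAPIE is installed and print its version.
from importlib.metadata import version

print(version("mapie"))  # e.g. 0.8.5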

⚡ Quickstart

Here we propose two basic uncertainty quantification problems for regression and classification tasks with scikit-learn.

As MAPIE is compatible with the standard scikit-learn API, the few lines of code below show:

  • How easy it is to wrap MAPIE around your favorite scikit-learn-compatible model.

  • How easy it is to follow the standard sequential fit and predict process like any scikit-learn estimator.

# Uncertainty quantification for regression problem
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

from mapie.regression import MapieRegressor


X, y = make_regression(n_samples=500, n_features=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5)

regressor = LinearRegression()

mapie_regressor = MapieRegressor(estimator=regressor, method='plus', cv=5)

mapie_regressor = mapie_regressor.fit(X_train, y_train)
y_pred, y_pis = mapie_regressor.predict(X_test, alpha=[0.05, 0.32])
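
As a quick sanity check (not part of the official example), the empirical coverage of the returned intervals can be computed with numpy, assuming the documented output shape (n_samples, 2, n_alpha) for y_pis:

# Minimal sketch: empirical coverage of the prediction intervals.
# y_pis[:, 0, i] and y_pis[:, 1, i] are the lower and upper bounds
# for the i-th requested alpha.
for i, alpha in enumerate([0.05, 0.32]):
    lower, upper = y_pis[:, 0, i], y_pis[:, 1, i]
    coverage = np.mean((y_test >= lower) & (y_test <= upper))
    print(f"alpha={alpha}: target coverage {1 - alpha:.2f}, empirical coverage {coverage:.2f}")
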
# Uncertainty quantification for classification problem
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split

from mapie.classification import MapieClassifier


X, y = make_blobs(n_samples=500, n_features=2, centers=3)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5)

classifier = LogisticRegression()

mapie_classifier = MapieClassifier(estimator=classifier, method='score', cv=5)

mapie_classifier = mapie_classifier.fit(X_train, y_train)
y_pred, y_pis = mapie_classifier.predict(X_test, alpha=[0.05, 0.32])
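
Similarly, and again only as an illustrative sketch, the prediction sets can be inspected directly, assuming y_pis is a boolean array of shape (n_samples, n_classes, n_alpha) indicating which classes belong to each set:

# Minimal sketch: average prediction-set size and empirical coverage.
for i, alpha in enumerate([0.05, 0.32]):
    sets = y_pis[:, :, i]
    avg_size = sets.sum(axis=1).mean()
    coverage = np.mean(sets[np.arange(len(y_test)), y_test])
    print(f"alpha={alpha}: mean set size {avg_size:.2f}, empirical coverage {coverage:.2f}")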

📘 Documentation

The full documentation can be found at this link.

📝 Contributing

You are welcome to propose and contribute new ideas. We encourage you to open an issue so that we can align on the work to be done. It is generally a good idea to have a quick discussion before opening a pull request that is potentially out-of-scope. For more information on the contribution process, please go here.

🤝 Affiliations

MAPIE has been developed through a collaboration between Quantmetry, Michelin, and ENS Paris-Saclay, with financial support from Région Ile de France and Confiance.ai.


🔍 References

[1] Vovk, Vladimir, Alexander Gammerman, and Glenn Shafer. Algorithmic Learning in a Random World. Springer Nature, 2022.

[2] Angelopoulos, Anastasios N., and Stephen Bates. “Conformal prediction: A gentle introduction.” Foundations and Trends® in Machine Learning 16.4 (2023): 494-591.

[3] Rina Foygel Barber, Emmanuel J. Candès, Aaditya Ramdas, and Ryan J. Tibshirani. “Predictive inference with the jackknife+.” Ann. Statist., 49(1):486–507, (2021).

[4] Kim, Byol, Chen Xu, and Rina Barber. “Predictive inference is free with the jackknife+-after-bootstrap.” Advances in Neural Information Processing Systems 33 (2020): 4138-4149.

[5] Sadinle, Mauricio, Jing Lei, and Larry Wasserman. “Least ambiguous set-valued classifiers with bounded error levels.” Journal of the American Statistical Association 114.525 (2019): 223-234.

[6] Romano, Yaniv, Matteo Sesia, and Emmanuel Candes. “Classification with valid and adaptive coverage.” Advances in Neural Information Processing Systems 33 (2020): 3581-3591.

[7] Angelopoulos, Anastasios, et al. “Uncertainty sets for image classifiers using conformal prediction.” International Conference on Learning Representations (2021).

[8] Romano, Yaniv, Evan Patterson, and Emmanuel Candes. “Conformalized quantile regression.” Advances in neural information processing systems 32 (2019).

[9] Xu, Chen, and Yao Xie. “Conformal prediction interval for dynamic time-series.” International Conference on Machine Learning. PMLR, (2021).

[10] Bates, Stephen, et al. “Distribution-free, risk-controlling prediction sets.” Journal of the ACM (JACM) 68.6 (2021): 1-34.

[11] Angelopoulos, Anastasios N., Stephen Bates, Adam Fisch, Lihua Lei, and Tal Schuster. “Conformal Risk Control.” (2022).

[12] Angelopoulos, Anastasios N., Stephen Bates, Emmanuel J. Candès, et al. “Learn Then Test: Calibrating Predictive Algorithms to Achieve Risk Control.” (2022).

📝 License

MAPIE is free and open-source software licensed under the 3-clause BSD license.

📚 Citation

If you use MAPIE in your research, please cite it using the citation file in our repository.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

MAPIE-0.8.5.tar.gz (173.5 kB)

Uploaded Source

Built Distribution

MAPIE-0.8.5-py3-none-any.whl (143.2 kB)

Uploaded Python 3

File details

Details for the file MAPIE-0.8.5.tar.gz.

File metadata

  • Download URL: MAPIE-0.8.5.tar.gz
  • Upload date:
  • Size: 173.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.9.19

File hashes

Hashes for MAPIE-0.8.5.tar.gz

  • SHA256: b33e7f4fab1725254066639fb7bc412790a07754a93f617ac5547f899feb7112
  • MD5: 9246708cdc566d96a6086e5311587b85
  • BLAKE2b-256: 4f0078a0f10f1d4c1329b6f82b0eea5c83be4bbdbe764042d44fc4cd9a6863d9

See more details on using hashes here.
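
As an illustration (assuming the archive has been downloaded to the current directory), the published SHA256 digest can be recomputed locally with Python's hashlib and compared with the value above:

# Recompute the SHA256 digest of the downloaded archive and compare it
# with the digest published on this page.
import hashlib

expected = "b33e7f4fab1725254066639fb7bc412790a07754a93f617ac5547f899feb7112"
with open("MAPIE-0.8.5.tar.gz", "rb") as f:  # path to the downloaded file
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "MISMATCH")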

File details

Details for the file MAPIE-0.8.5-py3-none-any.whl.

File metadata

  • Download URL: MAPIE-0.8.5-py3-none-any.whl
  • Upload date:
  • Size: 143.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.9.19

File hashes

Hashes for MAPIE-0.8.5-py3-none-any.whl

  • SHA256: d80cd0fa1f003822dd78d8934091ccfe2ce1d1d3f3776d90fc36371132fb9de2
  • MD5: b810c8bcf0c0d2c5f7fe1e345433e295
  • BLAKE2b-256: 37ee6447a94017e76712596125bb37c1fa651ef0d3f26a2f47266b213dda6ced

See more details on using hashes here.
