
A package with utility functions for evaluating conformal predictors

Project description

Conformal-eval

A Python library for simplifying the evaluation of conformal predictors.

Installation

Install Conformal-eval with:

pip install conformal-eval

or, to include the report extras:

pip install conformal-eval[report]
# or, depending on your shell you might need:
pip install "conformal-eval[report]"

Examples

Examples of using the package can be found in our example notebooks.

Package dependencies

Package dependencies can be found in pyproject.toml under [tool.poetry.dependencies]; note that the Jinja2 package is only required for the report extras (i.e., when installing conformal-eval[report]).
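Declaring a dependency as optional and tying it to an extra follows Poetry's standard pattern. A hypothetical sketch of how the Jinja2 entry might look in pyproject.toml (version constraint illustrative, not the project's actual file):

```toml
[tool.poetry.dependencies]
# Jinja2 is optional: only pulled in when the "report" extra is requested
jinja2 = { version = "^3.0", optional = true }

[tool.poetry.extras]
report = ["jinja2"]
```

With this layout, `pip install conformal-eval` skips Jinja2, while `pip install "conformal-eval[report]"` includes it.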

API

Data format

The code internally uses numpy ndarrays for matrices and vectors, but tries to be agnostic about input, accepting lists, arrays, or Pandas equivalents. Most functions convert their input to numpy internally, so for performance it is recommended to convert to numpy format once up front when using several of the methods in this library.
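For instance, converting once with numpy.asarray is cheap and is a no-op when the input is already an ndarray (a minimal sketch, not library code):

```python
import numpy as np

# Plain Python list converted once, up front
y_true = np.asarray([0, 1, 1, 0])
assert y_true.shape == (4,)

# Already an ndarray: asarray returns the same object, no copy is made
arr = np.arange(4)
assert np.asarray(arr) is arr
```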

For regression we require predictions in the same format as used in nonconformist: 2D or 3D numpy ndarrays of shape (num_examples, 2) or (num_examples, 2, num_significance_levels), where the second dimension contains the lower and upper limits of the prediction intervals.
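To illustrate the shape convention with synthetic numbers (this is just the array layout, not a library call):

```python
import numpy as np

# Prediction intervals for 4 examples at a single significance level:
# column 0 = lower bound, column 1 = upper bound.
preds_2d = np.array([
    [1.0, 3.0],
    [0.5, 2.5],
    [2.0, 4.0],
    [1.5, 3.5],
])
assert preds_2d.shape == (4, 2)

# The same 4 examples at 3 significance levels:
# shape (num_examples, 2, num_significance_levels).
preds_3d = np.stack([preds_2d + w for w in (0.0, 0.5, 1.0)], axis=2)
assert preds_3d.shape == (4, 2, 3)

# Interval widths (upper minus lower) per example and significance level
widths = preds_3d[:, 1, :] - preds_3d[:, 0, :]
assert widths.shape == (4, 3)
```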

Rendering backend

Plotting is based on the matplotlib library, with Seaborn used for the function plot_confusion_matrix_heatmap. To easily set a nicer plotting style (in our opinion), you can use the function conf_eval.plot.update_plot_settings, which uses Seaborn to configure matplotlib's default settings and, e.g., easily scales the font size.
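Since update_plot_settings changes matplotlib's global defaults, its effect is similar in spirit to editing rcParams yourself. The mechanism can be sketched with plain matplotlib (this is an illustration, not the library's actual implementation):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt

# Style tweaks of this kind boil down to rcParams updates, which
# affect every subsequent figure created in the same session.
plt.rcParams.update({"font.size": 14, "figure.figsize": (6, 4)})
assert plt.rcParams["font.size"] == 14.0
```

Because the settings are global, figures produced by other code in the same session will also pick them up.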

The conf_eval.cpsign module

This module and its submodules are intended for easily loading results from CPSign in the format required by Conformal-eval.

Extras report

This extra includes functionality to generate a "report" in HTML format for a model generated by CPSign. The report is currently in beta testing and contains only rudimentary information; this may change in the future. Further note that installing conformal-eval[report] also installs a CLI script called cpsign-report; to see its usage, simply run cpsign-report --help in your terminal.

Supported plots

Classification

  • Calibration plot
  • Label ratio plot, showing ratio of single/multi/empty predictions for each significance level
  • p-value distribution plot: plot p-values as a scatter plot
  • "Bubble plot" confusion matrix
  • Heatmap confusion matrix

Regression

  • Calibration plot
  • Efficiency plot (mean or median prediction interval width vs significance)
  • Prediction intervals (for a given significance level)

Developer notes

Testing

All Python tests are located in the tests folder and are meant to be run using pytest. Tests should be started from the python folder and can be run all at once (python -m pytest), per file (python -m pytest tests/conf_eval/metrics/clf_metrics_test.py), or as a single test function (python -m pytest tests/conf_eval/metrics/clf_metrics_test.py::TestConfusionMatrix::test_with_custom_labels).

  • Note 1: The invocation python -m pytest [opt args] is preferred here, as the current directory is added to the Python path and the application code is resolved automatically. Simply running pytest instead requires manually setting PYTHONPATH.
  • Note 2: The plotting tests generate images saved in the test_output directory; these should be checked manually (there is no good way of automating plotting tests).

Before running:

For the report module there are CLI tests that require the package to be installed before running them (otherwise the cpsign-report program is not available, or not updated). To do this, use the following:

# Set up a venv to run in
poetry shell
# Install dependencies from pyproject.toml
poetry install
# Run all (or a subset of) the tests
python -m pytest

TODOs:

Add/finish the following plots:

  • calibration plot - Staffan
  • 'area plot' with label-distributions - Staffan
  • bubble plot - Jonathan
  • heatmap - Staffan
  • p0-p1 plot - Staffan
  • Add regression metrics
  • Add regression plots

Change log:

  • 1.0.0b3:
    • Allow configuring the 'static' output directory, e.g. to avoid path clashes with other files.
    • Change links in README to absolute urls to github, so they work at PyPI when deployed.
  • 1.0.0b2:
    • Added the png-logo files for the cpsign-report utility (gitignore had removed them before).
  • 1.0.0b1:
    • Forked the plot_utils library to make an extensive refactor of module names.
    • Added conf_eval.cpsign.report module for generating model-reports in HTML format. The extras report is needed to run this code.
    • Fixed bugs when loading data from CPSign, e.g. missing standard-deviation columns in validation metrics.
    • Added a new flag (skip_inf) when loading regression predictions and results, for the case where a too-high confidence level leads to infinitely large prediction intervals; this simply filters out the rows with infinitely large outputs.
    • Added new function to load confidence-independent metrics from CPSign validation output (cpsign.load_conf_independent_metrics).
    • Added type hints in a few places (not finished everywhere yet).
  • 0.1.0:
    • Added pharmbio.cpsign package with loading functionality for CPSign generated files, loading calibration statistics, efficiency statistics and predictions.
    • Updated plotting functions in order to use pre-computed metrics where applicable (e.g. when computed by CPSign).
    • Added the possibility to add shading for +/- standard deviation where applicable, e.g. in the calibration curve.
    • Updated calibration curve plotting to have a general plotting.plot_calibration acting on pre-computed values or for classification using plotting.plot_calibration_clf where true labels and p-values can be given.
    • Updated parameter order to make it consistent across plotting functions, e.g. ordered as x, y (significance vs error rate) in the plots.
    • Added a utility function for setting the seaborn theme and context using plotting.update_plot_settings which updates the matplotlib global settings. Note this will have effect on all other plots generated in the same python session if those rely on matplotlib.

Download files

Source Distribution: conformal_eval-1.0.0b3.tar.gz (111.7 kB)

Built Distribution: conformal_eval-1.0.0b3-py3-none-any.whl (115.2 kB)
