# model-diagnostics
Tools for diagnostics and assessment of (machine learning) models
Highlights:
- All common point predictions covered: mean, median, quantiles, expectiles.
- Assess model calibration with identification functions (generalized residuals) and `compute_bias`.
- Assess calibration and bias graphically:
  - reliability diagrams for auto-calibration
  - bias plots for conditional calibration
- Assess the predictive performance of models:
  - strictly consistent, homogeneous scoring functions
  - score decomposition into miscalibration, discrimination and uncertainty
- Choose your plot backend, either matplotlib or plotly, e.g., via `set_config`.
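The idea behind identification functions and bias assessment can be sketched in plain numpy. The function names below are illustrative only, not the package's API: a model is (auto-)calibrated for a given functional when the average of the corresponding identification function, evaluated at the predictions, is close to zero.

```python
import numpy as np

def identification_mean(pred, y):
    # Identification function (generalized residual) for the mean: V(z, y) = z - y.
    return pred - y

def identification_quantile(pred, y, level):
    # Identification function for the `level`-quantile: V(z, y) = 1{y <= z} - level.
    return (y <= pred).astype(float) - level

rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.0, size=10_000)

# Predicting the true mean gives an average generalized residual close to 0.
bias_mean = identification_mean(np.full_like(y, 2.0), y).mean()

# Predicting the true median (= 2.0 for this symmetric distribution) at
# level 0.5 likewise gives an average residual close to 0.
bias_median = identification_quantile(np.full_like(y, 2.0), y, level=0.5).mean()
```

A systematic deviation of these averages from zero is exactly the kind of bias that `compute_bias` and the bias plots are designed to surface, including per subgroup of features.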
:rocket: To our knowledge, this is the first Python package to offer reliability diagrams for quantiles and expectiles as well as a score decomposition, both made possible by an internal implementation of isotonic quantile/expectile regression. :rocket:
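For mean predictions, the score decomposition can be sketched with scikit-learn's `IsotonicRegression`; this is a simplified illustration under squared error, not the package's implementation (which also covers quantiles and expectiles via isotonic quantile/expectile regression). Recalibrating the predictions with the best monotone transform splits the score into miscalibration, discrimination and uncertainty terms.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def squared_error(pred, y):
    # Mean squared error as the scoring function S.
    return float(np.mean((pred - y) ** 2))

rng = np.random.default_rng(1)
x = rng.uniform(size=5_000)
y = x + rng.normal(scale=0.2, size=x.size)
pred = x ** 2  # deliberately miscalibrated mean predictions

# Isotonic recalibration: best monotone transform of the predictions.
recal = IsotonicRegression(out_of_bounds="clip").fit(pred, y).predict(pred)

score = squared_error(pred, y)                                # S of the model
score_recal = squared_error(recal, y)                         # S after recalibration
score_marginal = squared_error(np.full_like(y, y.mean()), y)  # S of a constant

mcb = score - score_recal           # miscalibration (>= 0)
dsc = score_marginal - score_recal  # discrimination (>= 0)
unc = score_marginal                # uncertainty of the response
# Exact identity: score == mcb - dsc + unc
```

The identity `score = mcb - dsc + unc` holds by construction, and both `mcb` and `dsc` are non-negative because the identity and the constant transform are themselves monotone candidates for the isotonic fit.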
Read more in the documentation.
This package stands on the shoulders of giants, among others polars, matplotlib, scipy and scikit-learn.
## Installation

```shell
pip install model-diagnostics
```
## Contributions
Contributions are warmly welcome! When contributing, you agree that your contributions will be subject to the MIT License.
Hashes for `model_diagnostics-1.1.1-py3-none-any.whl`:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 92e21794f44fe3d7c59becbb2e5baf0b5836fc3a571fa72f4caba3f072be42f0 |
| MD5 | a434212d15e985c16c2f96c6b26f1af8 |
| BLAKE2b-256 | f6e8c97bf32a0b0bc13c65a083b3de325f1f22928c928b680857c558d3ee061a |