Tools for diagnostics and assessment of (machine learning) models
model-diagnostics
Highlights:

- All common point predictions covered: mean, median, quantiles, expectiles.
- Assess model calibration with identification functions (generalized residuals) and `compute_bias`.
- Assess calibration and bias graphically:
  - reliability diagrams for auto-calibration
  - bias plots for conditional calibration
- Assess the predictive performance of models:
  - strictly consistent, homogeneous scoring functions
  - score decomposition into miscalibration, discrimination and uncertainty
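To illustrate the concepts behind the highlights above, here is a minimal numpy sketch of a strictly consistent scoring function (the pinball loss for the alpha-quantile) and its identification function, i.e. the generalized residual whose mean is close to zero for a calibrated model. The function names below are illustrative only, not the package's API:

```python
import numpy as np

def pinball_loss(y_pred, y_obs, alpha):
    """Strictly consistent scoring function for the alpha-quantile."""
    return np.mean((np.where(y_obs <= y_pred, 1.0, 0.0) - alpha) * (y_pred - y_obs))

def identification(y_pred, y_obs, alpha):
    """Generalized residual for the alpha-quantile; its mean is ~0
    for an (unconditionally) calibrated prediction."""
    return np.where(y_obs <= y_pred, 1.0, 0.0) - alpha

rng = np.random.default_rng(42)
y_obs = rng.normal(size=100_000)
alpha = 0.8
q = np.quantile(y_obs, alpha)  # empirical 0.8-quantile as "prediction"

# The empirical quantile has (near) zero mean identification ...
bias_true = identification(np.full_like(y_obs, q), y_obs, alpha).mean()
# ... and a lower pinball loss than a shifted prediction.
loss_true = pinball_loss(np.full_like(y_obs, q), y_obs, alpha)
loss_off = pinball_loss(np.full_like(y_obs, q + 0.5), y_obs, alpha)
```

Strict consistency means the expected score is minimized exactly at the true quantile, which is why the shifted prediction scores worse.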
:rocket: To our knowledge, this is the first Python package to offer reliability diagrams for quantiles and expectiles as well as a score decomposition, both made possible by an internal implementation of isotonic quantile/expectile regression. :rocket:
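The score decomposition mentioned above can be sketched for the simplest case, mean predictions under squared error, using scikit-learn's `IsotonicRegression` as the recalibration step (the package itself uses its own isotonic quantile/expectile regression; this is only an analogous illustration):

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
y_pred = rng.uniform(0.0, 1.0, 500)
y_obs = y_pred + rng.normal(0.0, 0.1, 500)  # a roughly calibrated model

def mean_score(pred, obs):
    return np.mean((pred - obs) ** 2)

# Recalibrate predictions with isotonic regression of y_obs on y_pred.
iso = IsotonicRegression(out_of_bounds="clip")
y_recal = iso.fit_transform(y_pred, y_obs)

score = mean_score(y_pred, y_obs)
mcb = score - mean_score(y_recal, y_obs)                      # miscalibration >= 0
unc = mean_score(np.full_like(y_obs, y_obs.mean()), y_obs)    # uncertainty
dsc = unc - mean_score(y_recal, y_obs)                        # discrimination >= 0
# The decomposition is an exact identity: score = mcb - dsc + unc
```

Both `mcb` and `dsc` are non-negative by construction, since the isotonic fit minimizes the score among all monotone recalibrations, which includes the identity and the constant mean.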
Read more in the documentation.
This package stands on the shoulders of giants, among them polars, matplotlib, scipy and scikit-learn.
Installation

```
pip install model-diagnostics
```
Contributions
Contributions are warmly welcome! When contributing, you agree that your contributions will be subject to the MIT License.