
Project description

torch-crps


PyTorch-based implementations of the Continuous Ranked Probability Score (CRPS) as well as its locally scale-invariant version (SCRPS)

Background

Continuous Ranked Probability Score (CRPS)

The CRPS is a strictly proper scoring rule. It assesses how well a distribution, given by the cumulative distribution function $F$ of the estimate $X$ (a random variable), explains an observation $y$:

$$ \text{CRPS}(F, y) = \int_{\mathbb{R}} \left( F(x) - \mathbb{1}(x \geq y) \right)^{2} \, dx $$

where $\mathbb{1}$ denotes the indicator function.
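As a sanity check, this integral can be approximated numerically by truncating it to a wide finite grid. The following is a minimal PyTorch sketch (not part of this package's API) that compares such an approximation against the known closed-form CRPS of a Gaussian forecast (see Gneiting & Raftery, 2007):

```python
import math

import torch

# Gaussian forecast N(mu, sigma) and a scalar observation y (toy values).
mu, sigma, y = 0.0, 1.0, 0.5
dist = torch.distributions.Normal(mu, sigma)

# Truncate the integral over R to a wide finite grid around the forecast.
x = torch.linspace(mu - 10 * sigma, mu + 10 * sigma, 100_001)
integrand = (dist.cdf(x) - (x >= y).float()) ** 2
crps_numeric = torch.trapezoid(integrand, x)

# Closed form for a normal forecast:
# sigma * (z * (2 * Phi(z) - 1) + 2 * phi(z) - 1 / sqrt(pi)) with z = (y - mu) / sigma.
std_normal = torch.distributions.Normal(0.0, 1.0)
z = torch.tensor((y - mu) / sigma)
crps_exact = sigma * (
    z * (2 * std_normal.cdf(z) - 1)
    + 2 * std_normal.log_prob(z).exp()
    - 1 / math.sqrt(math.pi)
)
print(crps_numeric.item(), crps_exact.item())  # the two values should agree closely
```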

In Section 2 of their 2018 paper, Zamo & Naveau list three different formulations of the CRPS. One of them is

$$ \text{CRPS}(F, y) = E[|X - y|] - 0.5 E[|X - X'|] = E[|X - y|] + E[X] - 2 E[X F(X)] $$

where $X'$ is an independent copy of $X$. This can be shortened to

$$ \text{CRPS}(F, y) = A - 0.5 D $$

where $A = E[|X - y|]$ is called the accuracy term and $D = E[|X - X'|]$ is called the dispersion term (at least that is how I name them in this repo).
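To make the $A - 0.5 D$ form concrete, here is a Monte Carlo sketch that estimates both terms from ensemble samples; the function name `crps_mc` and its signature are made up for this example and are not this repo's API:

```python
import torch

def crps_mc(samples: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Estimate CRPS(F, y) = A - 0.5 * D from samples of F (a sketch, not this repo's API)."""
    accuracy = (samples - y).abs().mean()  # A = E[|X - y|]
    # D = E[|X - X'|] via all m^2 pairwise differences (simple but O(m^2)).
    dispersion = (samples.unsqueeze(0) - samples.unsqueeze(1)).abs().mean()
    return accuracy - 0.5 * dispersion

torch.manual_seed(0)
samples = torch.randn(1000)  # ensemble drawn from the forecast N(0, 1)
print(crps_mc(samples, torch.tensor(0.5)))
```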

Scaled Continuous Ranked Probability Score (SCRPS)

The SCRPS is a locally scale-invariant version of the CRPS. In their paper, Bolin & Wallin define it in a positively-oriented way, i.e., higher is better. In contrast, I implement the SCRPS in this repo negatively-oriented, just like a loss function.

Oversimplifying the notation, the (negatively-oriented) SCRPS can be written as

$$ \text{SCRPS}(F, y) = \frac{E[|X - y|]}{E[|X - X'|]} + 0.5 \log \left( E[|X - X'|] \right) $$

which can be shortened to

$$ \text{SCRPS}(F, y) = \frac{A}{D} + 0.5 \log(D) $$

The (local) scale-invariance comes from dividing the accuracy term $A$ by $D$: rescaling the whole problem by a factor $c$ leaves $A / D$ unchanged and only shifts the score by the constant $0.5 \log(c)$.

Note that the SCRPS can, in contrast to the CRPS, yield negative values.
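Under the same assumptions as the Monte Carlo sketch above, the negatively-oriented SCRPS is just $A / D + 0.5 \log(D)$; the loop below additionally illustrates the constant shift under rescaling (again a sketch, not this repo's API):

```python
import torch

def scrps_mc(samples: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Negatively-oriented SCRPS = A / D + 0.5 * log(D) (a sketch, not this repo's API)."""
    a = (samples - y).abs().mean()  # accuracy term A
    d = (samples.unsqueeze(0) - samples.unsqueeze(1)).abs().mean()  # dispersion term D
    return a / d + 0.5 * d.log()

torch.manual_seed(0)
samples, y = torch.randn(1000), torch.tensor(0.5)
for c in (1.0, 10.0, 100.0):
    # Scaling samples and observation by c leaves A / D untouched and only
    # adds the constant 0.5 * log(c) to the score.
    print(scrps_mc(c * samples, c * y))
```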

Incomplete list of sources that I came across while researching the CRPS

  • Hersbach, "Decomposition of the Continuous Ranked Probability Score for Ensemble Prediction Systems"; 2000
  • Gneiting et al.; "Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation"; 2004
  • Gneiting & Raftery; "Strictly Proper Scoring Rules, Prediction, and Estimation"; 2007
  • Zamo & Naveau; "Estimation of the Continuous Ranked Probability Score with Limited Information and Applications to Ensemble Weather Forecasts"; 2018
  • Jordan et al.; "Evaluating Probabilistic Forecasts with scoringRules"; 2019
  • Bolin & Wallin; "Local scale invariance and robustness of proper scoring rules"; 2023
  • Olivares, Négiar, Ma, et al.; "CLOVER: Probabilistic Forecasting with Coherent Learning Objective Reparameterization"; 2023
  • Vermorel & Tikhonov; "Continuously-Ranked Probability Score (CRPS)" blog post; 2024
  • NVIDIA; "PhysicsNeMo Framework" source code; 2025
  • Zheng & Sun; "MVG-CRPS: A Robust Loss Function for Multivariate Probabilistic Forecasting"; 2025

Application to Machine Learning

The CRPS, as well as the SCRPS, can be used as a loss function in machine learning, just like the well-known negative log-likelihood loss, which corresponds to the logarithmic scoring rule.

The parametrized model outputs a distribution $q(x)$. The CRPS loss evaluates how well $q(x)$ explains the observation $y$. This is a distribution-to-point evaluation, which fits machine learning well, as the ground truth $y$ almost always comes as fixed values.

For processes over time and/or space, we need to estimate the CRPS for every point in time/space separately.

There is work on multivariate CRPS estimation, but it is not part of this repo.
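As an illustration of this distribution-to-point training setup, here is a hedged sketch of one optimization step; the network, the Gaussian output head, and all hyperparameters are made up for the example, and this repo's actual API may look different:

```python
import torch

model = torch.nn.Linear(8, 2)  # stand-in network predicting (mean, log_std) per point
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

inputs = torch.randn(32, 8)  # dummy features, one row per time/space point
targets = torch.randn(32)    # one fixed ground-truth value y per point

# The model outputs a distribution q(x) per point; rsample() keeps gradients
# flowing through the samples (reparameterization trick).
mean, log_std = model(inputs).unbind(dim=-1)
q = torch.distributions.Normal(mean, log_std.exp())
samples = q.rsample((100,))  # shape (100, 32): 100 draws per point

# Per-point CRPS = A - 0.5 * D, then averaged over all points as the loss.
accuracy = (samples - targets).abs().mean(dim=0)
dispersion = (samples.unsqueeze(0) - samples.unsqueeze(1)).abs().mean(dim=(0, 1))
loss = (accuracy - 0.5 * dispersion).mean()

opt.zero_grad()
loss.backward()
opt.step()
```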

Implementation

A direct implementation of the integral formulation is not suited for evaluation on a computer due to the integration over the entire (unbounded) domain of the random variable $X$. Nevertheless, this repository includes such an implementation to verify the others.

The normalization-by-observation variants are improper solutions for normalizing the CRPS values. The goal is to use the CRPS as a loss function in machine learning tasks. For that, it is highly beneficial if the loss does not depend on the scale of the problem. However, dividing by the absolute maximum of the observations is a bad proxy for achieving this. I plan on removing these methods once I have gained trust in my SCRPS implementation.

I found NVIDIA's implementation of the CRPS for ensemble predictions in $M \log(M)$ time inspiring to read.
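The key trick is that, after sorting the $M$ ensemble members, the dispersion term can be computed in closed form from the order statistics, so the whole estimate costs $O(M \log(M))$ instead of $O(M^2)$. Below is a sketch of that idea (my own reconstruction, not NVIDIA's code):

```python
import torch

def crps_sorted(samples: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """CRPS = A - 0.5 * D in O(M log M) time (a sketch of the sorting trick)."""
    m = samples.numel()
    x, _ = samples.sort()
    accuracy = (x - y).abs().mean()  # A = E[|X - y|]
    # For sorted x: sum_{i,j} |x_i - x_j| = 2 * sum_k (2k - M - 1) * x_(k),
    # which turns the O(M^2) pairwise sum for D into an O(M) weighted sum.
    k = torch.arange(1, m + 1, dtype=x.dtype)
    dispersion = 2.0 * ((2.0 * k - m - 1.0) * x).sum() / (m * m)
    return accuracy - 0.5 * dispersion

torch.manual_seed(0)
samples = torch.randn(1000)
print(crps_sorted(samples, torch.tensor(0.5)))  # matches the O(M^2) estimator above
```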

:point_right: Please have a look at the documentation to get started.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torch_crps-2.0.0.tar.gz (116.7 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

torch_crps-2.0.0-py3-none-any.whl (14.1 kB view details)

Uploaded Python 3

File details

Details for the file torch_crps-2.0.0.tar.gz.

File metadata

  • Download URL: torch_crps-2.0.0.tar.gz
  • Upload date:
  • Size: 116.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for torch_crps-2.0.0.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 5878f2431a6144662a5d86f5181a5229851eac137b84b0c58b9c2d3cee144b82 |
| MD5 | 22349fd61653e03f35fdb2172cc57b69 |
| BLAKE2b-256 | 3136c135652b113c99520dc2c2b86df65a3fd957affc440e6796e8e5f7a566f7 |

See more details on using hashes here.

Provenance

The following attestation bundles were made for torch_crps-2.0.0.tar.gz:

Publisher: cd.yaml on famura/torch-crps

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file torch_crps-2.0.0-py3-none-any.whl.

File metadata

  • Download URL: torch_crps-2.0.0-py3-none-any.whl
  • Upload date:
  • Size: 14.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for torch_crps-2.0.0-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | ba1f61739a22867bd5b8850afb1e584400029f9f103b42998288cd0bc4e689e3 |
| MD5 | b14cd0b67406b49c96153d715532517a |
| BLAKE2b-256 | ad3922565c20b9a894906b7e055d89525be167e496e29d18e56b7ce643f6fcf8 |

See more details on using hashes here.

Provenance

The following attestation bundles were made for torch_crps-2.0.0-py3-none-any.whl:

Publisher: cd.yaml on famura/torch-crps

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
