
Implementations of the CRPS using PyTorch

Project description

torch-crps

License: CC-BY-4.0 python Docs CD Coverage Tests mkdocs-material mypy pre-commit pytest Ruff uv

Implementations of the Continuously-Ranked Probability Score (CRPS) using PyTorch

Background

The Continuously-Ranked Probability Score (CRPS) is a strictly proper scoring rule. It assesses how well a distribution with cumulative distribution function $F$ explains an observation $y$:

$$ \text{CRPS}(F,y) = \int_{\mathbb{R}} \bigl(F(x) - \mathbb{1}(x \geq y)\bigr)^{2}\,dx \qquad (\text{integral formulation}) $$

where $\mathbb{1}$ denotes the indicator function.
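As a sanity check, the integral formulation can be approximated numerically when the CDF is cheap to evaluate. The sketch below (hypothetical helper, not part of this package) integrates the squared difference with the trapezoidal rule for a standard normal and compares against the known closed form at $y = 0$, which is $(\sqrt{2} - 1)/\sqrt{\pi} \approx 0.2337$.

```python
import math
import torch

def crps_by_quadrature(cdf, y, lo=-10.0, hi=10.0, n=100_001):
    """Approximate CRPS(F, y) = ∫ (F(x) - 1{x >= y})^2 dx via the trapezoidal rule.

    The integrand decays quickly for well-behaved F, so a wide finite
    interval [lo, hi] is enough in practice.
    """
    x = torch.linspace(lo, hi, n)
    indicator = (x >= y).to(x.dtype)
    integrand = (cdf(x) - indicator) ** 2
    return torch.trapezoid(integrand, x)

# Standard normal: CRPS at y = 0 has the closed form (sqrt(2) - 1) / sqrt(pi).
std_normal = torch.distributions.Normal(0.0, 1.0)
crps = crps_by_quadrature(std_normal.cdf, y=0.0)
closed = (math.sqrt(2) - 1) / math.sqrt(math.pi)
```

This is only a reference check; a quadrature over a dense grid is far too slow to use as a training loss.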

In Section 2 of their 2018 paper, Zamo & Naveau list three different formulations of the CRPS.

An incomplete list of sources I came across while researching the CRPS:

  • Hersbach; "Decomposition of the Continuous Ranked Probability Score for Ensemble Prediction Systems"; 2000
  • Gneiting et al.; "Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation"; 2004
  • Gneiting & Raftery; "Strictly Proper Scoring Rules, Prediction, and Estimation"; 2007
  • Zamo & Naveau; "Estimation of the Continuous Ranked Probability Score with Limited Information and Applications to Ensemble Weather Forecasts"; 2018
  • Jordan et al.; "Evaluating Probabilistic Forecasts with scoringRules"; 2019
  • Olivares, Négiar, Ma, et al.; "CLOVER: Probabilistic Forecasting with Coherent Learning Objective Reparameterization"; 2023
  • Vermorel & Tikhonov; "Continuously-Ranked Probability Score (CRPS)" blog post; 2024
  • Nvidia; "PhysicsNeMo Framework" source code; 2025
  • Zheng & Sun; "MVG-CRPS: A Robust Loss Function for Multivariate Probabilistic Forecasting"; 2025

Application to Machine Learning

The CRPS can be used as a loss function in machine learning, just like the well-known negative log-likelihood loss, which corresponds to the logarithmic scoring rule.

The parametrized model outputs a distribution $q(x)$. The CRPS loss evaluates how well $q(x)$ explains the observation $y$. This is a distribution-to-point evaluation, which fits machine learning well, as the ground truth $y$ almost always comes as fixed values.

For processes over time and/or space, we need to estimate the CRPS for every point in time/space separately.

There is work on multi-variate CRPS estimation, but it is not part of this repo.
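To illustrate the loss-function use, here is a minimal sketch (not this package's API) of a CRPS loss for a Gaussian predictive distribution. It uses the well-known closed form $\text{CRPS}(\mathcal{N}(\mu,\sigma^2), y) = \sigma\bigl(z(2\Phi(z)-1) + 2\varphi(z) - 1/\sqrt{\pi}\bigr)$ with $z = (y-\mu)/\sigma$, which is differentiable in $\mu$ and $\sigma$, so gradients flow back to the model's distribution parameters.

```python
import math
import torch

def crps_gaussian(mu, sigma, y):
    """Closed-form CRPS of N(mu, sigma^2) at observation y (elementwise).

    CRPS = sigma * ( z*(2*Phi(z) - 1) + 2*phi(z) - 1/sqrt(pi) ),  z = (y - mu)/sigma
    """
    z = (y - mu) / sigma
    pdf = torch.exp(-0.5 * z**2) / math.sqrt(2 * math.pi)   # phi(z)
    cdf = 0.5 * (1 + torch.erf(z / math.sqrt(2)))           # Phi(z)
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))

# Usage as a training loss: the model predicts (mu, sigma) per sample,
# the observations y are fixed points.
mu = torch.zeros(4, requires_grad=True)
sigma = torch.ones(4)
y = torch.tensor([0.0, 0.5, -1.0, 2.0])
loss = crps_gaussian(mu, sigma, y).mean()
loss.backward()  # gradients reach the distribution parameters
```

For processes over time and/or space, the same function is simply applied per point and averaged, matching the per-point estimation mentioned above.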

Implementation

The integral formulation is infeasible to evaluate naively on a computer, as it integrates over the entire real line in $x$.

I found Nvidia's implementation of the CRPS for ensemble predictions in $O(M \log M)$ time inspiring to read.
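To sketch the idea behind such estimators (this is neither Nvidia's code nor this package's API): in the energy form, $\text{CRPS} \approx \frac{1}{M}\sum_i |x_i - y| - \frac{1}{2M^2}\sum_{i,j} |x_i - x_j|$, and the naively $O(M^2)$ pairwise term collapses to a single weighted sum over the sorted ensemble, so sorting dominates the cost at $O(M \log M)$.

```python
import torch

def crps_ensemble(ens, y):
    """CRPS estimate for an ensemble of M members in O(M log M).

    Energy form: mean_i |x_i - y| - 0.5 * mean_{i,j} |x_i - x_j|.
    For sorted members x_(1) <= ... <= x_(M):
        sum_{i,j} |x_i - x_j| = 2 * sum_i (2i - M - 1) * x_(i)   (1-based i)

    ens: (..., M) ensemble members; y: (...) observations.
    """
    m = ens.shape[-1]
    abs_err = (ens - y.unsqueeze(-1)).abs().mean(dim=-1)
    sorted_ens, _ = torch.sort(ens, dim=-1)
    i = torch.arange(1, m + 1, dtype=ens.dtype, device=ens.device)
    pairwise = 2 * ((2 * i - m - 1) * sorted_ens).sum(dim=-1) / (m * m)
    return abs_err - 0.5 * pairwise
```

For the two-member ensemble $\{0, 1\}$ and observation $y = 0$, this gives $0.5 - 0.25 = 0.25$, matching the hand computation of the energy form.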

:point_right: Please have a look at the documentation to get started.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torch_crps-1.0.0.tar.gz (112.5 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

torch_crps-1.0.0-py3-none-any.whl (10.3 kB view details)

Uploaded Python 3

File details

Details for the file torch_crps-1.0.0.tar.gz.

File metadata

  • Download URL: torch_crps-1.0.0.tar.gz
  • Upload date:
  • Size: 112.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for torch_crps-1.0.0.tar.gz
Algorithm Hash digest
SHA256 6fca0e46f317675e76f37ba45644409203491533662395303adac2377d634390
MD5 c31b7e34e44fbafbe5bff0d6617a4b9c
BLAKE2b-256 221d1efedf969dd794bf619e2cf7a335ce307924dc7120885d9b21bf84ed8c13

See more details on using hashes here.

File details

Details for the file torch_crps-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: torch_crps-1.0.0-py3-none-any.whl
  • Upload date:
  • Size: 10.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for torch_crps-1.0.0-py3-none-any.whl
Algorithm Hash digest
SHA256 cff4f797323bb92f072cbab2e531b632c9966bcdda10cec4aa6356d88b8b80f7
MD5 e274adf5a44851bf80e29fd0d664cd36
BLAKE2b-256 dae35dbab64e7ba881b923ab2817743d2ff78063e91c70f509ce2c33f66b3a52

See more details on using hashes here.
