Framework for conditional density estimation

Project description

Conditional Density Estimation (CDE)

Update: Conditional Density Estimation now runs on PyTorch (ported with the help of Cursor + GPT-5.1 Codex Mini). The legacy TensorFlow implementation lives in the tensorflow branch. All core estimators, runners, and examples are tested with Python 3.12.1 and PyTorch 2.9.1. Logging is now done via wandb (offline/online modes, with additional CSV/pandas DataFrame outputs).

Description

Python/PyTorch implementations of various methods for conditional density estimation

  • Parametric neural network based methods
    • Mixture Density Network (MDN)
    • Kernel Mixture Network (KMN)
    • Normalizing Flows (NF)
  • Nonparametric methods
    • Conditional Kernel Density Estimation (CKDE)
    • Neighborhood Kernel Density Estimation (NKDE)
  • Semiparametric methods
    • Least Squares Conditional Density Estimation (LSCDE)
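
The nonparametric estimators build on the classic conditional-KDE idea: estimate the joint density p(x, y) and the marginal p(x) with kernel density estimators and take their ratio, p(y|x) = p(x, y) / p(x). A minimal, purely illustrative sketch (the names and helper below are not the package's API):

```python
import numpy as np
from scipy.stats import gaussian_kde

# Simulate a simple conditional relationship: y ≈ 2x with Gaussian noise.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = 2.0 * x + rng.normal(scale=0.5, size=500)

joint_kde = gaussian_kde(np.vstack([x, y]))  # estimates p(x, y)
marginal_kde = gaussian_kde(x)               # estimates p(x)

def cond_pdf(x_cond, y_query):
    """Conditional density via the ratio p(x, y) / p(x)."""
    joint = joint_kde(np.array([[x_cond], [y_query]]))[0]
    marginal = marginal_kde(np.array([x_cond]))[0]
    return joint / marginal

# The density of y near the regression line (y ≈ 2 * x_cond) should be
# much larger than far away from it.
near = cond_pdf(0.0, 0.0)
far = cond_pdf(0.0, 5.0)
```

The package's CKDE/NKDE estimators refine this idea with bandwidth selection and neighborhood weighting; the snippet only conveys the density-ratio construction.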

Beyond estimating conditional probability densities, the package features extensive functionality for computing:

  • Centered moments: mean, covariance, skewness and kurtosis
  • Statistical divergences: KL-divergence, JS-divergence, Hellinger distance
  • Percentiles and expected shortfall
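
As a rough intuition for the risk measures in the list above, both the percentile (value-at-risk) and the expected shortfall of a conditional density can be approximated from samples: the VaR is the empirical alpha-quantile, and the expected shortfall is the mean of the samples below it. A hedged Monte Carlo sketch (not the package's implementation):

```python
import numpy as np

# Stand-in for samples y_i ~ p(y | x); here a standard normal.
rng = np.random.default_rng(42)
samples = rng.normal(loc=0.0, scale=1.0, size=100_000)

alpha = 0.05
var = np.quantile(samples, alpha)      # 5%-percentile (value-at-risk)
es = samples[samples <= var].mean()    # expected shortfall: mean of the tail

# For N(0, 1), var is roughly -1.64 and es roughly -2.06.
```

The library computes these quantities from the fitted conditional density model itself, which is more accurate than raw Monte Carlo on small sample sizes.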

For the parametric models (MDN, KMN, NF), we recommend using noise regularization, which our implementation supports. For details, see the paper Noise Regularization for Conditional Density Estimation.
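
In essence, noise regularization perturbs both inputs and targets with small Gaussian noise in every training batch, which smooths the learned density. A purely illustrative sketch of that idea (the helper name is hypothetical; in the package the behavior is controlled via the x_noise_std / y_noise_std constructor arguments):

```python
import numpy as np

def noisy_batch(X, Y, x_noise_std=0.2, y_noise_std=0.1, rng=None):
    """Return a copy of the batch with Gaussian noise added to X and Y."""
    rng = np.random.default_rng() if rng is None else rng
    X_noisy = X + rng.normal(scale=x_noise_std, size=X.shape)
    Y_noisy = Y + rng.normal(scale=y_noise_std, size=Y.shape)
    return X_noisy, Y_noisy

X = np.zeros((4, 1))
Y = np.zeros((4, 1))
Xn, Yn = noisy_batch(X, Y, rng=np.random.default_rng(0))
```

Because fresh noise is drawn per batch, the model cannot overfit to individual training points, acting much like a data-dependent smoothness prior.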

Dependencies

  • Python 3.12
  • PyTorch 2.9.1
  • NumPy
  • pandas
  • scipy
  • scikit-learn
  • wandb

Installation

Clone the repository and run the provided script to create the cde-pytorch Conda environment (Python 3.12 with CPU PyTorch, plus the pinned NumPy/SciPy versions that are tested with CDE):

bash scripts/setup_pytorch_env.sh

After you activate the environment, install the local package in editable mode:

pip install --break-system-packages -e .

Before running experiments, set your Weights & Biases API key so wandb logging works (export WANDB_API_KEY=<your-key>, or run wandb login for online runs). You can store the key in a .env file (WANDB_API_KEY=…) and source it before launching. When enabled, the tracking helpers also write CSV files and pandas DataFrames into wandb/.

If you already have a PyTorch environment, you can install the package with pip install cde; the runtime expects the usual scientific stack (numpy, scipy, pandas, matplotlib) and wandb.

Documentation and paper

See the documentation here. A paper on best practices and benchmarks on conditional density estimation with neural networks that makes extensive use of this library can be found here.

Usage

The following code snippet is a short example that demonstrates how to use the cde package.

from cde.density_simulation import SkewNormal
from cde.density_estimator import KernelMixtureNetwork
import numpy as np

""" simulate some data """
density_simulator = SkewNormal(random_seed=22)
X, Y = density_simulator.simulate(n_samples=3000)

""" fit density model """
model = KernelMixtureNetwork("KDE_demo", ndim_x=1, ndim_y=1, n_centers=50,
                             x_noise_std=0.2, y_noise_std=0.1, random_seed=22)
model.fit(X, Y)

""" query the conditional pdf and cdf """
x_cond = np.zeros((1, 1))
y_query = np.ones((1, 1)) * 0.1
prob = model.pdf(x_cond, y_query)
cum_prob = model.cdf(x_cond, y_query)

""" compute conditional moments & VaR  """
mean = model.mean_(x_cond)[0][0]
std = model.std_(x_cond)[0][0]
skewness = model.skewness(x_cond)[0]

Citing

If you use our CDE implementation in your research, you can cite it as follows:

@article{rothfuss2019conditional,
  title={Conditional Density Estimation with Neural Networks: Best Practices and Benchmarks},
  author={Rothfuss, Jonas and Ferreira, Fabio and Walther, Simon and Ulrich, Maxim},
  journal={arXiv:1903.00954},
  year={2019}
}

If you use noise regularization for regularizing the MDN, KMN or NF conditional density model, please cite

@article{rothfuss2019noisereg,
    title={Noise Regularization for Conditional Density Estimation},
    author={Jonas Rothfuss and Fabio Ferreira and Simon Boehm and Simon Walther 
            and Maxim Ulrich and Tamim Asfour and Andreas Krause},
    year={2019},
    journal={arXiv:1907.08982},
}

Todo

  • track configuration for the new PyTorch branch and keep the legacy TensorFlow branch discoverable

Download files

Download the file for your platform.

Source Distribution

cde-1.0.1.tar.gz (96.7 kB view details)

Uploaded Source

Built Distribution

cde-1.0.1-py3-none-any.whl (153.2 kB view details)

Uploaded Python 3

File details

Details for the file cde-1.0.1.tar.gz.

File metadata

  • Download URL: cde-1.0.1.tar.gz
  • Upload date:
  • Size: 96.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.12

File hashes

Hashes for cde-1.0.1.tar.gz
Algorithm Hash digest
SHA256 1b21c85ce69d0c7b27015682a6ee7824f4dff9c9de43185ce0c743324dd8f255
MD5 19d715cf524dcd6cf00707c8c6726e60
BLAKE2b-256 bc3a455b72d22b63503317c7725c39ecf784eb8df7cb393c43d71a587e066c3f

File details

Details for the file cde-1.0.1-py3-none-any.whl.

File metadata

  • Download URL: cde-1.0.1-py3-none-any.whl
  • Upload date:
  • Size: 153.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.12

File hashes

Hashes for cde-1.0.1-py3-none-any.whl
Algorithm Hash digest
SHA256 065cb6d515896d0928539dfd9162261bef0bf3670738777f0d00e124579b9903
MD5 95e7c32f6efde27d5da2da5217ade504
BLAKE2b-256 a5dc33c61a7d80dcc6a57ff8682b4b55ab8b5c3da0db7b689261b8df3010cbfe
