
CaliPy: Calibration Library in Python


Pre-Alpha: in active development; not yet production-ready


Introduction

CaliPy (Calibration library Python) is designed to build and solve probabilistic models of measurement instruments. CaliPy lets you declare probabilistic models for measurement instruments (e.g., temperature sensors, geodetic instruments, optical devices, ...) by chaining modular "effects" such as drifts, noise, and nonlinear transformations. Inference is then handled by the autodiff backbone. CaliPy is built for scientists, engineers, and calibration specialists who need flexible Bayesian or maximum-likelihood inference without manual derivations.

Powered by PyTorch + Pyro · Dimension-aware · Modular Effects · Research-proven

While many real-world analyses rely on classical least squares (LS) estimation or maximum-likelihood solutions, these approaches can fail for:

  • Nonlinear systems
  • Non-Gaussian noise
  • Latent or unobserved variables
  • Large-scale or sub-batched data

CaliPy addresses these limitations by:

  • Leveraging Pyro for flexible Bayesian inference (e.g., Stochastic Variational Inference (Blei et al., 2017)).
  • Allowing chainable “stochastic effects” to reflect real-world complexities in measurement instruments.
  • Encouraging a “declare-then-solve” approach: once the user sets up a model, CaliPy automatically handles inference—whether you want a posterior distribution or a maximum-likelihood point estimate.

Originally developed for geodetic measurement instruments, CaliPy can also apply to other domains requiring advanced calibration or hierarchical Bayesian workflows under a friendly, composable interface.


Key Features

  1. Chainable Instrument Models
    Construct models from small, pre-built node classes (like UnknownParameter, NoiseAddition) that capture aspects such as bias, drift, or axis misalignments in measurement instruments.

  2. Seamless Integration with PyTorch & Pyro
    CaliPy builds on PyTorch (Paszke et al., 2019) for automatic differentiation and Pyro (Bingham et al., 2018) for advanced Bayesian inference algorithms.

  3. Bayesian or Classical

    • For linear Gaussian models, maximum-likelihood solutions coincide with least squares (LS).
    • For more general models, Stochastic Variational Inference or other Pyro-based inference can handle non-Gaussian, nonlinear, or latent-variable scenarios automatically.
  4. Dimension-Aware Data Structures
    CaliPy offers dimension-handling (inspired by functorch.torchdims), letting you specify shapes, subbatching strategies, and independence assumptions with minimal overhead.

  5. Rapid Prototyping & Extendability

    • The library’s architecture is modular; you can add new “effects” for specialized phenomena (e.g., scanning-laser sensor errors, environment-driven drifts).
    • A “declare-then-solve” approach means minimal user code from model definition to inference result.

Architecture Overview

  1. Node Classes
    Each effect (e.g., UnknownParameter, NoiseAddition) inherits from a base node class, providing a forward() method for random draws or deterministic transformations.

  2. Indexing & Subbatching

    • CalipyIndexer: centralizes creation of local/global index tensors, block indices for subbatching, and index naming.
    • CalipyObservation: aggregates data plus metadata about batch/event dims, bridging user data with inference routines.
  3. CalipyProbModel

    • Users typically write a model() (and optionally a guide()) capturing the generative structure of their problem.
    • This class compiles your code with Pyro’s back-end to perform SVI or other inference algorithms seamlessly.
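The node pattern described above can be illustrated in plain Python. The following is a hedged sketch of the idea only, not CaliPy's actual base class: the class names `Node` and `UnknownOffset` and the single-argument `forward()` signature are hypothetical simplifications of the library's `UnknownParameter`/`NoiseAddition` effects.

```python
# Illustrative sketch of the node pattern -- NOT CaliPy's actual
# implementation; class and method names here are hypothetical.
import random

class Node:
    """Base class: every effect exposes a forward() transformation."""
    def forward(self, x):
        raise NotImplementedError

class UnknownOffset(Node):
    """Deterministic effect: adds a (to-be-estimated) offset."""
    def __init__(self, theta):
        self.theta = theta
    def forward(self, x):
        return x + self.theta

class NoiseAddition(Node):
    """Stochastic effect: adds zero-mean Gaussian noise."""
    def __init__(self, sigma):
        self.sigma = sigma
    def forward(self, x):
        return x + random.gauss(0.0, self.sigma)

# Chain effects to form a simple instrument model:
# true value -> biased reading -> noisy reading
chain = [UnknownOffset(0.5), NoiseAddition(0.1)]
reading = 20.0
for node in chain:
    reading = node.forward(reading)
```

In CaliPy itself, the forward passes are additionally registered with Pyro so that the offset becomes a learnable parameter and the noise a proper likelihood term.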

Bayesian vs. Classical (LS) Estimation

  1. Classical Approach

    • Least Squares (LS) solves a maximum-likelihood problem in Gaussian, linear scenarios by minimizing \( \sum_i \bigl(\text{prediction}_i - \text{observation}_i\bigr)^2 \).
    • It has closed-form solutions for linear, Gaussian assumptions (Ghilani and Wolf, 2006).
  2. Bayesian Extension

    • Instead of a single estimate \(\hat{\theta}\), we specify a full model \( p(\theta, \mathrm{data}) = p(\mathrm{data}\mid\theta)\,p(\theta) \), then infer the posterior \( p(\theta\mid \mathrm{data}) = \frac{p(\mathrm{data}\mid\theta)\,p(\theta)}{p(\mathrm{data})} \).
    • Stochastic Variational Inference (SVI) approximates the posterior with an ELBO objective, bridging complex or large-scale problems.
    • This approach gracefully handles non-Gaussian noise, nonlinear relationships, or latent variables—cases where LS is no longer straightforward.
  3. Connection

    • In purely Gaussian, linear scenarios with non-informative priors, Bayesian approaches reduce to classical LS solutions.
    • For more complex or large-scale problems, the same code in CaliPy can use SVI or MCMC to produce approximate posteriors or ML solutions.
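The connection in point 3 can be checked with a few lines of plain Python (no CaliPy involved): for i.i.d. Gaussian data with unknown mean, the maximum-likelihood estimate coincides with the least-squares estimate, which is simply the sample mean. The data values below are illustrative only.

```python
# Toy check: the sample mean minimizes the least-squares objective,
# and for Gaussian noise it is also the maximum-likelihood estimate.
data = [9.8, 10.1, 10.3, 9.9, 10.2]

def sse(theta, xs):
    """Least-squares objective: sum of squared residuals."""
    return sum((x - theta) ** 2 for x in xs)

mean = sum(data) / len(data)

# The sample mean beats any perturbed candidate on the LS objective.
for delta in (-0.5, -0.1, 0.1, 0.5):
    assert sse(mean, data) < sse(mean + delta, data)
```

Once the model leaves this linear-Gaussian comfort zone (non-Gaussian noise, nonlinearities, latent variables), no such closed form exists, which is where SVI or MCMC take over.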

Installation

Project Status: Pre-alpha

CaliPy is published on PyPI and can be installed via:

pip install calipy

If you want an editable, bleeding-edge version, clone the GitHub repository and install it manually:

git clone https://github.com/atlasoptimization/calipy.git
cd calipy
pip install -e .

Quick Example

Below is a toy snippet demonstrating how you might declare a simple bias-plus-noise model:

import pyro
import matplotlib.pyplot as plt

from calipy.base import NodeStructure, CalipyProbModel
from calipy.effects import UnknownParameter, NoiseAddition
from calipy.utils import dim_assignment
from calipy.tensor import CalipyTensor

# Simulate data
n_meas = 20
mu_true, sigma_true = 0.0, 0.1
data = pyro.distributions.Normal(mu_true, sigma_true).sample([n_meas])

# Define dimensions
batch_dims = dim_assignment(['batch'], [n_meas])
single_dims = dim_assignment(['single'], [])

# Set up model nodes
mu_ns = NodeStructure(UnknownParameter)
mu_ns.set_dims(batch_dims=batch_dims, param_dims=single_dims)
mu_node = UnknownParameter(mu_ns, name='mu')

noise_ns = NodeStructure(NoiseAddition)
noise_ns.set_dims(batch_dims=batch_dims, event_dims=single_dims)
noise_node = NoiseAddition(noise_ns, name='noise')

# Define probabilistic model
class DemoProbModel(CalipyProbModel):
    def model(self, input_vars=None, observations=None):
        mu = mu_node.forward()
        return noise_node.forward({'mean': mu, 'standard_deviation': sigma_true}, observations)

    def guide(self, input_vars=None, observations=None):
        pass

# Train model
demo_probmodel = DemoProbModel()
data_cp = CalipyTensor(data, dims=batch_dims)
optim_results = demo_probmodel.train(None, data_cp, optim_opts={})

# Plot results
plt.plot(optim_results)
plt.xlabel('Epoch'); plt.ylabel('ELBO loss'); plt.title('Training Progress')
plt.show()

This snippet shows how you might define a node-based approach to an unknown parameter (\mu) and noise, letting CaliPy handle the inference behind the scenes.


Use Cases

The following three examples were presented at JISDM 2025 in Karlsruhe; documented code can be found in the examples folder.

  1. Tape Bias Estimation

    • Classic scenario: measure a known rod length with a tape that has an unknown offset (\theta).
    • Equivalent to linear maximum-likelihood in simplest form, but easily extended in CaliPy for more complex error structures or prior knowledge.
  2. Two-Peg Test (Level Collimation)

    • Solve for the collimation angle (\alpha) in a leveling instrument.
    • Chain multiple observations across distinct geometric setups. For more complicated geometry or weighting, SVI seamlessly generalizes the solution.
  3. Axis Errors in Total Stations

    • Model collimation or trunnion axis misalignments, even under strongly nonlinear geometry or face configurations.
    • Simple to incorporate discrete “face” variables, e.g. Face I/Face II, in a single forward pass.
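To make use case 1 concrete, here is a hedged pure-Python sketch of the tape-bias setup, without CaliPy: a rod of known length is measured repeatedly with a tape carrying an unknown offset, and under Gaussian noise the ML/LS estimate of the offset is just the mean residual. All variable names and numbers below are illustrative assumptions, not values from the library or the JISDM examples.

```python
# Tape bias estimation, closed-form sketch (no CaliPy):
# readings = known rod length + unknown offset + Gaussian noise.
import random

random.seed(42)
rod_length = 30.0      # known reference length [m], assumed
theta_true = 0.012     # unknown tape offset [m], assumed
sigma = 0.002          # measurement noise std [m], assumed

readings = [rod_length + theta_true + random.gauss(0.0, sigma)
            for _ in range(200)]

# ML / LS estimate of the offset: mean residual w.r.t. the known length
theta_hat = sum(r - rod_length for r in readings) / len(readings)
```

In CaliPy the same estimation is declared with an `UnknownParameter` plus `NoiseAddition` chain (as in the Quick Example), which then extends naturally to priors on the offset or non-Gaussian error structures where this closed form no longer applies.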

References

  • Probabilistic Programming

    • Bingham, E., et al. (2018). Pyro: Deep Universal Probabilistic Programming. arXiv:1810.09538.
    • Blei, D. M., Kucukelbir, A., & McAuliffe, J. D. (2017). Variational Inference: A Review for Statisticians. JASA, 112(518), 859–877.
    • Gelman, A., et al. (2020). Bayesian Workflow. arXiv:2011.01808.
    • Paszke, A., et al. (2019). PyTorch: an imperative style, high-performance deep learning library. NeurIPS.
  • Calibration, Geodesy & LS

    • Ghilani, C. D., & Wolf, P. R. (2006). Adjustment Computations: Spatial Data Analysis. Wiley.
    • Uren, J. & Price, W. F. (2006). Surveying for Engineers. Palgrave Macmillan.
    • Phillips, S. D. et al. (2001). A Careful Consideration of the Calibration Concept. J. Res. NIST.
    • Deumlich, F. (1980). Instrumentenkunde der Vermessungstechnik. VEB Verlag fuer Bauwesen.
    • Hastie, T., Tibshirani, R., & Friedman, J. (2009). The Elements of Statistical Learning. Springer.
    • Boyd, S. & Vandenberghe, L. (2004). Convex Optimization. Cambridge University Press.

License

CaliPy is released under the Prosperity Public License. Free for non-commercial use. Commercial licensing available – contact info@atlasoptimization.com. See LICENSE for details.


Contributing

The project is currently Pre-Alpha, so expect changes. We welcome bug reports, new effect classes, and general improvements. To contribute:

  1. Fork the GitHub repo.
  2. Make your changes or propose new classes/effects.
  3. Open a pull request to discuss your contribution.

Thank you for your interest in CaliPy! We hope it accelerates your research or engineering in advanced instrument calibration and beyond.
