
Amplitude analysis made short and sweet


laddu (/ˈlʌduː/) is a library for analysis of particle physics data. It is intended to be a simple and efficient alternative to some of the other tools out there. laddu is written in Rust with bindings to Python via PyO3 and maturin and is the spiritual successor to rustitude, one of my first Rust projects. The goal of this project is to allow users to perform complex amplitude analyses (like partial-wave analyses) without complex code or configuration files.

[!CAUTION] This crate is still in an early development phase, and the API is not stable. It can (and likely will) be subject to breaking changes before the 1.0.0 version release (and hopefully not many after that).

Key Features

  • A simple interface focused on combining Amplitudes into models which can be evaluated over Datasets.
  • A single Amplitude trait which makes it easy to write new amplitudes and integrate them into the library.
  • Easy interfaces to precompute and cache values before the main calculation to speed up model evaluations.
  • Efficient parallelism using rayon.
  • Python bindings to allow users to write quick, easy-to-read code that just works.

Installation

laddu can be added to a Rust project with cargo:

cargo add laddu

The Python bindings are published in a package of the same name, which can be installed with your favorite Python package manager:

pip install laddu

Quick Start

Rust

Writing a New Amplitude

While it is probably easier for most users to skip to the Python section, there is currently no way to implement a new amplitude directly from Python. At the time of writing, Rust is not a common language in particle physics, but this tutorial should hopefully convince the reader that they don't have to know the intricacies of Rust to write performant amplitudes. As an example, here is how one might write a Breit-Wigner, parameterized as follows:

I_{\ell}(m; m_0, \Gamma_0, m_1, m_2) =  \frac{1}{\pi}\frac{m_0 \Gamma_0 B_{\ell}(m, m_1, m_2)}{(m_0^2 - m^2) - \imath m_0 \Gamma}

where

\Gamma = \Gamma_0 \frac{m_0}{m} \frac{q(m, m_1, m_2)}{q(m_0, m_1, m_2)} \left(\frac{B_{\ell}(m, m_1, m_2)}{B_{\ell}(m_0, m_1, m_2)}\right)^2

is the relativistic width correction, $q(m_a, m_b, m_c)$ is the breakup momentum of a particle with mass $m_a$ decaying into two particles with masses $m_b$ and $m_c$, $B_{\ell}(m_a, m_b, m_c)$ is the Blatt-Weisskopf barrier factor for the same decay assuming particle $a$ has angular momentum $\ell$, $m_0$ is the mass of the resonance, $\Gamma_0$ is the nominal width of the resonance, $m_1$ and $m_2$ are the masses of the decay products, and $m$ is the "input" mass.
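
For reference, the breakup momentum referred to above takes the standard two-body form (this is what laddu's breakup_momentum helper is expected to compute):

q(m_a, m_b, m_c) = \frac{\sqrt{\left(m_a^2 - (m_b + m_c)^2\right)\left(m_a^2 - (m_b - m_c)^2\right)}}{2 m_a}

so $q(m_0, m_1, m_2)$ is the value at the nominal resonance mass, while $q(m, m_1, m_2)$ varies event by event with the measured mass $m$.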

Although this particular amplitude is already included in laddu, let's assume it isn't and imagine how we would write it from scratch:

use laddu::prelude::*;
use laddu::utils::functions::{blatt_weisskopf, breakup_momentum};

#[derive(Clone)]
pub struct MyBreitWigner {
    name: String,
    mass: ParameterLike,
    width: ParameterLike,
    pid_mass: ParameterID,
    pid_width: ParameterID,
    l: usize,
    daughter_1_mass: Mass,
    daughter_2_mass: Mass,
    resonance_mass: Mass,
}
impl MyBreitWigner {
    pub fn new(
        name: &str,
        mass: ParameterLike,
        width: ParameterLike,
        l: usize,
        daughter_1_mass: &Mass,
        daughter_2_mass: &Mass,
        resonance_mass: &Mass,
    ) -> Box<Self> {
        Self {
            name: name.to_string(),
            mass,
            width,
            pid_mass: ParameterID::default(),
            pid_width: ParameterID::default(),
            l,
            daughter_1_mass: daughter_1_mass.clone(),
            daughter_2_mass: daughter_2_mass.clone(),
            resonance_mass: resonance_mass.clone(),
        }
        .into()
    }
}

impl Amplitude for MyBreitWigner {
    fn register(&mut self, resources: &mut Resources) -> Result<AmplitudeID, LadduError> {
        self.pid_mass = resources.register_parameter(&self.mass);
        self.pid_width = resources.register_parameter(&self.width);
        resources.register_amplitude(&self.name)
    }

    fn compute(&self, parameters: &Parameters, event: &Event, _cache: &Cache) -> Complex<Float> {
        // Event-dependent masses and the free parameters registered in `register`
        let mass = self.resonance_mass.value(event);
        let mass0 = parameters.get(self.pid_mass);
        let width0 = parameters.get(self.pid_width);
        let mass1 = self.daughter_1_mass.value(event);
        let mass2 = self.daughter_2_mass.value(event);
        // Breakup momenta and barrier factors at the nominal and event masses
        let q0 = breakup_momentum(mass0, mass1, mass2);
        let q = breakup_momentum(mass, mass1, mass2);
        let f0 = blatt_weisskopf(mass0, mass1, mass2, self.l);
        let f = blatt_weisskopf(mass, mass1, mass2, self.l);
        // Mass-dependent (relativistic) width correction and the complex Breit-Wigner
        let width = width0 * (mass0 / mass) * (q / q0) * (f / f0).powi(2);
        let n = Float::sqrt(mass0 * width0 / PI);
        let d = Complex::new(mass0.powi(2) - mass.powi(2), -(mass0 * width));
        Complex::from(f * n) / d
    }
}

Calculating a Likelihood

We could then write some code to use this amplitude. For demonstration purposes, let's just calculate an extended unbinned negative log-likelihood, assuming we have some data and Monte Carlo in the proper parquet format:

let ds_data = open("test_data/data.parquet").unwrap();
let ds_mc = open("test_data/mc.parquet").unwrap();

let resonance_mass = Mass::new([2, 3]);
let p1_mass = Mass::new([2]);
let p2_mass = Mass::new([3]);
let mut manager = Manager::default();
let bw = manager.register(MyBreitWigner::new(
    "bw",
    parameter("mass"),
    parameter("width"),
    2,
    &p1_mass,
    &p2_mass,
    &resonance_mass,
)).unwrap();
let mag = manager.register(Scalar::new("mag", parameter("magnitude"))).unwrap();
let model = (mag * bw).norm_sqr();

let nll = NLL::new(&manager, &ds_data, &ds_mc);
println!("Parameters names and order: {:?}", nll.parameters());
let result = nll.evaluate(&[1.27, 0.120, 100.0], &model);
println!("The extended negative log-likelihood is {}", result);

In practice, amplitudes can also be added together and have their real and imaginary parts taken; evaluators will generally use the real part of whatever complex value comes out of the model.

Python

Fitting Data

While we cannot (yet) implement new amplitudes within the Python interface alone, it does contain all the functionality required to analyze data. Here's an example to show some of the syntax. This model includes three partial waves described by the $Z_{\ell}^m$ amplitude listed in Equation (D13) of Mathieu et al.[^1] Since we take the squared norm of each individual coherent sum, each sum is invariant up to a total phase, so the S-wave was arbitrarily picked to be purely real.
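
Written out, the model constructed below corresponds to the intensity

I = \left|s_0^+ \,\mathrm{Re}\, Z_0^{0(+)} + d_2^+ \,\mathrm{Re}\, Z_2^{2(+)}\right|^2 + \left|s_0^+ \,\mathrm{Im}\, Z_0^{0(+)} + d_2^+ \,\mathrm{Im}\, Z_2^{2(+)}\right|^2 + \left|s_0^- \,\mathrm{Re}\, Z_0^{0(-)}\right|^2 + \left|s_0^- \,\mathrm{Im}\, Z_0^{0(-)}\right|^2

where $s_0^{\pm}$ are real and $d_2^+$ is complex, matching the Scalar and ComplexScalar parameters registered in the code.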

import laddu as ld
import matplotlib.pyplot as plt
import numpy as np
from laddu import constant, parameter

def main():
    ds_data = ld.open("path/to/data.parquet")
    ds_mc = ld.open("path/to/accmc.parquet")
    angles = ld.Angles(0, [1], [2], [2, 3], "Helicity")
    polarization = ld.Polarization(0, [1])
    res_mass = ld.Mass([2, 3])  # invariant mass of particles 2 and 3, used for plotting below
    manager = ld.Manager()
    z00p = manager.register(ld.Zlm("z00p", 0, 0, "+", angles, polarization))
    z00n = manager.register(ld.Zlm("z00n", 0, 0, "-", angles, polarization))
    z22p = manager.register(ld.Zlm("z22p", 2, 2, "+", angles, polarization))
    
    s0p = manager.register(ld.Scalar("s0p", parameter("s0p")))
    s0n = manager.register(ld.Scalar("s0n", parameter("s0n")))
    d2p = manager.register(ld.ComplexScalar("d2p", parameter("d2 re"), parameter("d2 im")))

    pos_re = (s0p * z00p.real() + d2p * z22p.real()).norm_sqr()
    pos_im = (s0p * z00p.imag() + d2p * z22p.imag()).norm_sqr()
    neg_re = (s0n * z00n.real()).norm_sqr()
    neg_im = (s0n * z00n.imag()).norm_sqr()
    model = pos_re + pos_im + neg_re + neg_im

    nll = ld.NLL(manager, model, ds_data, ds_mc)
    status = nll.minimize([1.0] * len(nll.parameters))
    print(status)
    fit_weights = nll.project(status.x)
    s0p_weights = nll.project_with(status.x, ["z00p", "s0p"])
    s0n_weights = nll.project_with(status.x, ["z00n", "s0n"])
    d2p_weights = nll.project_with(status.x, ["z22p", "d2p"])
    masses_mc = res_mass.value_on(ds_mc)
    masses_data = res_mass.value_on(ds_data)
    weights_data = ds_data.weights
    plt.hist(masses_data, weights=weights_data, bins=80, range=(1.0, 2.0), label="Data", histtype="step")
    plt.hist(masses_mc, weights=fit_weights, bins=80, range=(1.0, 2.0), label="Fit", histtype="step")
    plt.hist(masses_mc, weights=s0p_weights, bins=80, range=(1.0, 2.0), label="$S_0^+$", histtype="step")
    plt.hist(masses_mc, weights=s0n_weights, bins=80, range=(1.0, 2.0), label="$S_0^-$", histtype="step")
    plt.hist(masses_mc, weights=d2p_weights, bins=80, range=(1.0, 2.0), label="$D_2^+$", histtype="step")
    plt.legend()
    plt.savefig("demo.svg")


if __name__ == "__main__":
    main()

This example would probably make the most sense for a binned fit, since there isn't actually any mass dependence in any of these amplitudes (so it will just plot the relative amount of each wave over the entire dataset).

Other Examples

You can find other Python examples in the python_examples folder. They should each have a corresponding requirements_[#].txt file.

Example 1

The first example script uses data generated with gen_amp. The data file contains two resonances: an $f_0(1500)$, modeled as a Breit-Wigner with a mass of $1506\text{ MeV}/c^2$ and a width of $112\text{ MeV}/c^2$, and an $f_2'(1525)$, also modeled as a Breit-Wigner, with a mass of $1517\text{ MeV}/c^2$ and a width of $86\text{ MeV}/c^2$, as per the PDG. These resonances were generated to decay to pairs of $K_S^0$s and are produced via photoproduction off a proton target (as in the GlueX experiment). The beam photon is polarized at an angle of $0$ degrees relative to the production plane with a polarization magnitude of $0.3519$ (out of unity). The configuration file used to generate the corresponding data and Monte Carlo files can also be found in the python_examples folder; the datasets contain $100,000$ data events and $1,000,000$ Monte Carlo events (generated with the -f argument to create a Monte Carlo file without resonances). The result of this fit can be seen in the following image (using the default 50 bins):

Data Format

The data format for laddu is a bit different from some of the alternatives like AmpTools. Since ROOT doesn't yet have bindings to Rust and projects to read ROOT files are still largely works in progress (although I hope to use oxyroot in the future when I can figure out a few bugs), the primary interface for data in laddu is Parquet files. These are easily accessible from almost any other language and don't take up much more space than ROOT files. In the interest of future compatibility with any number of experimental setups, the data format consists of an arbitrary number of columns containing the four-momenta of each particle, the polarization vector of each particle (optional), and a single column for the weight. These columns all have standardized names. For example, the following columns would describe a dataset with four particles, the first of which is a polarized photon beam, as in the GlueX experiment:

| Column name | Data Type | Interpretation |
| ----------- | --------- | -------------- |
| p4_0_E | Float32 | Beam Energy |
| p4_0_Px | Float32 | Beam Momentum (x-component) |
| p4_0_Py | Float32 | Beam Momentum (y-component) |
| p4_0_Pz | Float32 | Beam Momentum (z-component) |
| eps_0_x | Float32 | Beam Polarization (x-component) |
| eps_0_y | Float32 | Beam Polarization (y-component) |
| eps_0_z | Float32 | Beam Polarization (z-component) |
| p4_1_E | Float32 | Recoil Proton Energy |
| p4_1_Px | Float32 | Recoil Proton Momentum (x-component) |
| p4_1_Py | Float32 | Recoil Proton Momentum (y-component) |
| p4_1_Pz | Float32 | Recoil Proton Momentum (z-component) |
| p4_2_E | Float32 | Decay Product 1 Energy |
| p4_2_Px | Float32 | Decay Product 1 Momentum (x-component) |
| p4_2_Py | Float32 | Decay Product 1 Momentum (y-component) |
| p4_2_Pz | Float32 | Decay Product 1 Momentum (z-component) |
| p4_3_E | Float32 | Decay Product 2 Energy |
| p4_3_Px | Float32 | Decay Product 2 Momentum (x-component) |
| p4_3_Py | Float32 | Decay Product 2 Momentum (y-component) |
| p4_3_Pz | Float32 | Decay Product 2 Momentum (z-component) |
| weight | Float32 | Event Weight |
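
As a rough sketch of how such a file might be produced (this is not part of laddu itself; pandas and pyarrow are assumed to be installed, and the values below are placeholders rather than physical four-momenta):

import numpy as np
import pandas as pd

# Build placeholder columns following the naming convention in the table above.
n_events = 10
rng = np.random.default_rng(0)
columns = {}
for i in range(4):  # particles 0 (beam), 1 (recoil), 2 and 3 (decay products)
    for component in ("E", "Px", "Py", "Pz"):
        columns[f"p4_{i}_{component}"] = rng.normal(size=n_events).astype(np.float32)
for component in ("x", "y", "z"):  # optional polarization columns for the beam
    columns[f"eps_0_{component}"] = rng.normal(size=n_events).astype(np.float32)
columns["weight"] = np.ones(n_events, dtype=np.float32)

# Write the dataset; laddu can then read it with ld.open("events.parquet").
pd.DataFrame(columns).to_parquet("events.parquet", index=False)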

To make it easier to get started, we can directly convert from the AmpTools format using the provided amptools-to-laddu script (see the bin directory of this repository). This is not bundled with the Python library (yet) but may be in the future.

Future Plans

  • Introduce Rust-side function minimization. My ganesh crate was written with this library in mind, and bindings will eventually be included to smooth over the fitting interface.
  • Allow users to build likelihood functions from multiple terms, including non-amplitude terms like LASSO.
  • Create a nice interface for binning datasets along a particular variable and fitting the binned data.
  • MPI and GPU integration (these are incredibly difficult to do right now, but it's something I'm looking into).
  • As always, more tests and documentation.

Alternatives

While this is likely the first amplitude analysis library written in Rust (aside from my previous attempt, rustitude), there are several other amplitude analysis programs out there at the time of writing. This library is a rewrite of rustitude, which was written when I was just learning Rust and didn't have a firm grasp of many of the core concepts required to make the analysis pipeline memory- and CPU-efficient. In particular, rustitude worked well, but it ate up a ton of memory and did not handle precalculation as nicely.

AmpTools

The main inspiration for this project is the library most of my collaboration uses, AmpTools. AmpTools has several advantages over laddu: it's probably faster for almost every use case, but this is mainly because it is fully integrated with MPI and GPU support. I'm not actually sure if there's a fair benchmark between the two libraries, but I'd wager AmpTools would still win. AmpTools is a much older, more developed project, dating back to 2010.

However, it does have its disadvantages. First and foremost, the primary interaction with the library is through configuration files, which are not really code and sort of represent a domain-specific language. As such, there isn't really a way to check whether a particular config will work before running it. Users could technically code up their analyses in C++ as well, but I think this would generally be more work for very little benefit. AmpTools primarily interacts with Minuit, so there aren't simple ways to use alternative optimization algorithms, and the output is a file which must also be parsed by code written by the user. This usually means some boilerplate setup for each analysis, a slew of input and output files, and, since it doesn't ship with any amplitudes, integration with other libraries. The data format is also very rigid, to the point where including beam polarization information feels hacked on (see the Zlm implementation here, which requires the event-by-event polarization to be stored in the beam's four-momentum). While there isn't an official Python interface, Lawrence Ng has made some progress porting the code here.

PyPWA

PyPWA is a library written in pure Python. While this might seem like an issue for performance (and it sort of is), the library has several features which encourage the use of JIT compilers. The upside is that analyses can be quickly prototyped and run with very few dependencies, and it can even run on GPUs and use multiprocessing. The downside is that recent development has been slow and the actual implementation of common amplitudes is, in my opinion, messy. I don't think that's a reason not to use it, but it does make it difficult for new users to get started.

ComPWA

ComPWA is a newcomer to the field. It's also a pure Python implementation and comprises three separate libraries. QRules can be used to validate and generate particle reaction topologies using conservation rules. AmpForm uses SymPy to transform these topologies into mathematical expressions, and it can also simplify those expressions through SymPy's built-in CAS. Finally, TensorWaves connects AmpForm to various fitting methods. In general, these libraries have tons of neat features, are well-documented, and are really quite nice to use. I would like to eventually see laddu as a companion to ComPWA (rather than direct competition), but I don't really know enough about the libraries to say much more than that.

Others

It could be the case that I am leaving out software with which I am not familiar. If so, I'd love to include it here for reference. I don't think that laddu will ever be the end-all-be-all of amplitude analysis, just an alternative that might improve on existing systems. It is important for physicists to be aware of these alternatives. For example, if you really don't want to learn Rust but need to implement an amplitude which isn't already included here, laddu isn't for you, and one of these alternatives might be best.

[^1]: Mathieu, V., Albaladejo, M., Fernández-Ramírez, C., Jackura, A. W., Mikhasenko, M., Pilloni, A., & Szczepaniak, A. P. (2019). Moments of angular distribution and beam asymmetries in $\eta\pi^0$ photoproduction at GlueX. Physical Review D, 100(5). doi:10.1103/physrevd.100.054017
