
Rust and WebAssembly library for late interaction models.

Project description

pylate-rs

Efficient Inference for PyLate

 

⭐️ Overview

pylate-rs is a high-performance inference engine for PyLate models, meticulously crafted in Rust for optimal speed and efficiency.

While model training is handled by PyLate, which supports a variety of late interaction models, pylate-rs is engineered to execute these models at maximum speed.

  • Accelerated Performance: Experience significantly faster model loading and rapid cold starts, making it ideal for serverless environments and low-latency applications.

  • Lightweight Design: Built on the Candle ML framework, pylate-rs maintains a minimal footprint suitable for resource-constrained systems like serverless functions and edge computing.

  • Broad Hardware Support: Optimized for diverse hardware, with dedicated builds for standard CPUs, Intel (MKL), Apple Silicon (Accelerate & Metal), and NVIDIA GPUs (CUDA).

  • Cross-Platform Integration: Seamlessly integrate pylate-rs into your projects with bindings for Python, Rust, and JavaScript/WebAssembly.

For a complete, high-performance multi-vector search pipeline, pair pylate-rs with its companion library, FastPlaid, at inference time.

Explore our WebAssembly live demo.

 

💻 Installation

Install the version of pylate-rs that matches your hardware for optimal performance.

Python

Target Hardware         Installation Command
Standard CPU            pip install pylate-rs
Apple CPU (macOS)       pip install pylate-rs-accelerate
Intel CPU (MKL)         pip install pylate-rs-mkl
Apple GPU (M1/M2/M3)    pip install pylate-rs-metal

Python GPU support

To install pylate-rs with GPU (CUDA) support, please build it from source using the following command:

pip install git+https://github.com/lightonai/pylate-rs.git

or by cloning the repository and installing it locally:

git clone https://github.com/lightonai/pylate-rs.git
cd pylate-rs
pip install .

Any help pre-building and distributing the CUDA wheels would be greatly appreciated.

 

Rust

Add pylate-rs to your Cargo.toml by enabling the feature flag that corresponds to your backend.

Feature        Target Hardware         Installation Command
(default)      Standard CPU            cargo add pylate-rs
accelerate     Apple CPU (macOS)       cargo add pylate-rs --features accelerate
mkl            Intel CPU (MKL)         cargo add pylate-rs --features mkl
metal          Apple GPU (M1/M2/M3)    cargo add pylate-rs --features metal
cuda           NVIDIA GPU (CUDA)       cargo add pylate-rs --features cuda

 

⚡️ Quick Start

Python

Get started in just a few lines of Python.

from pylate_rs import models

# Initialize the model for your target device ("cpu", "cuda", or "mps")
model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
    device="cuda"
)

# Encode queries and documents
queries_embeddings = model.encode(
    sentences=["What is the capital of France?", "How big is the sun?"],
    is_query=True
)

documents_embeddings = model.encode(
    sentences=["Paris is the capital of France.", "The sun is a star."],
    is_query=False
)

# Calculate similarity scores
similarities = model.similarity(queries_embeddings, documents_embeddings)

print(f"Similarity scores:\n{similarities}")

# Use hierarchical pooling to reduce document embedding size and speed up downstream tasks
pooled_documents_embeddings = model.encode(
    sentences=["Paris is the capital of France.", "The sun is a star."],
    is_query=False,
    pool_factor=2, # Halves the number of token embeddings
)

similarities_pooled = model.similarity(queries_embeddings, pooled_documents_embeddings)

print(f"Similarity scores with pooling:\n{similarities_pooled}")
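The similarity above is the late-interaction (MaxSim) score: each query token embedding is matched against its most similar document token embedding, and the per-token maxima are summed. The NumPy sketch below is a conceptual illustration only (the function names and toy data are ours, not part of the pylate-rs API); the adjacent-pair averaging is a naive stand-in for PyLate's hierarchical pooling, included just to show how pool_factor=2 halves the number of token embeddings.

```python
import numpy as np

def maxsim(query_emb: np.ndarray, doc_emb: np.ndarray) -> float:
    """MaxSim late-interaction score: for each query token, take the best
    cosine similarity among document tokens, then sum over query tokens.
    Rows are assumed to be L2-normalized token embeddings."""
    sims = query_emb @ doc_emb.T          # (n_query_tokens, n_doc_tokens)
    return float(sims.max(axis=1).sum())  # best-matching doc token per query token

def halve_tokens(emb: np.ndarray) -> np.ndarray:
    """Naive stand-in for pool_factor=2: average adjacent token pairs.
    (PyLate's real hierarchical pooling groups similar tokens instead.)"""
    n = emb.shape[0] - emb.shape[0] % 2
    pooled = (emb[:n:2] + emb[1:n:2]) / 2
    if emb.shape[0] % 2:                  # keep a leftover odd token as-is
        pooled = np.vstack([pooled, emb[-1:]])
    return pooled

# Toy data: 2 query tokens and 3 document tokens, dimension 4, L2-normalized.
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4)); q /= np.linalg.norm(q, axis=1, keepdims=True)
d = rng.normal(size=(3, 4)); d /= np.linalg.norm(d, axis=1, keepdims=True)

score = maxsim(q, d)                       # full-resolution score
pooled_score = maxsim(q, halve_tokens(d))  # score after halving doc tokens
```

Because each row is unit-norm, a query scored against itself yields exactly one maximum of 1.0 per token, so the self-score equals the number of query tokens.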

 

Rust

use anyhow::Result;
use candle_core::Device;
use pylate_rs::{hierarchical_pooling, ColBERT};

fn main() -> Result<()> {
    // Set the device (e.g., Cpu, Cuda, Metal)
    let device = Device::Cpu;

    // Initialize the model
    let mut model: ColBERT = ColBERT::from("lightonai/GTE-ModernColBERT-v1")
        .with_device(device)
        .try_into()?;

    // Encode queries and documents
    let queries = vec!["What is the capital of France?".to_string()];
    let documents = vec!["Paris is the capital of France.".to_string()];

    let query_embeddings = model.encode(&queries, true)?;
    let document_embeddings = model.encode(&documents, false)?;

    // Calculate similarity
    let similarities = model.similarity(&query_embeddings, &document_embeddings)?;
    println!("Similarity score: {}", similarities.data[0][0]);

    // Use hierarchical pooling
    let pooled_document_embeddings = hierarchical_pooling(&document_embeddings, 2)?;
    let pooled_similarities = model.similarity(&query_embeddings, &pooled_document_embeddings)?;
    println!("Similarity score after hierarchical pooling: {}", pooled_similarities.data[0][0]);

    Ok(())
}

 

📊 Benchmarks

Device    Backend      Queries / second      Documents / second      Model loading time (s)
cpu       PyLate       350.10                32.16                   2.06
cpu       pylate-rs    386.21 (+10%)         42.15 (+31%)            0.07 (-97%)

cuda      PyLate       2236.48               882.66                  3.62
cuda      pylate-rs    4046.88 (+81%)        976.23 (+11%)           1.95 (-46%)

mps       PyLate       580.81                103.10                  1.95
mps       pylate-rs    291.71 (-50%)         23.26 (-77%)            0.08 (-96%)

Benchmarks were run from Python. pylate-rs provides significant performance improvements, especially in scenarios requiring fast startup. On a Mac, loading a model and encoding a single query takes up to 5 seconds with the Transformers backend, while pylate-rs achieves this in just 0.11 seconds, making it ideal for low-latency applications. Don't expect pylate-rs to be much faster than PyLate when encoding large batches, as PyTorch is heavily optimized for that workload.
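The percentage deltas in the table are plain relative changes; as a quick sanity check, the CPU row can be recomputed from the raw figures above:

```python
# CPU-row figures taken from the benchmark table above.
pylate_qps, pylate_rs_qps = 350.10, 386.21    # queries per second
pylate_load, pylate_rs_load = 2.06, 0.07      # model loading time, seconds

# Relative change in percent, rounded as in the table.
qps_delta = round((pylate_rs_qps / pylate_qps - 1) * 100)     # +10
load_delta = round((pylate_rs_load / pylate_load - 1) * 100)  # -97
```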

 

📦 Using Custom Models

pylate-rs is compatible with any model saved in the PyLate format, whether hosted on the Hugging Face Hub or stored in a local directory. PyLate itself supports a wide range of models, including those from Sentence Transformers, Hugging Face Transformers, and custom models. Before using pylate-rs, make sure your model is saved in the PyLate format; you can easily convert and upload your own models with PyLate.

Pushing a model to the Hugging Face Hub in PyLate format is straightforward. First, install PyLate:

pip install pylate

Then, you can use the following Python code snippet to push your model:

from pylate import models

# Load your model
model = models.ColBERT(model_name_or_path="your-base-model-on-hf")

# Push in PyLate format
model.push_to_hub(
    repo_id="YourUsername/YourModelName",
    private=False,
    token="YOUR_HUGGINGFACE_TOKEN",
)

If you want to save a model in PyLate format locally, you can do so with the following code snippet:

from pylate import models

# Load your model
model = models.ColBERT(model_name_or_path="your-base-model-on-hf")

# Save in PyLate format
model.save_pretrained("path/to/save/GTE-ModernColBERT-v1-pylate")

An existing set of models compatible with pylate-rs is available on the Hugging Face Hub under the LightOn namespace.

 

Retrieval pipeline

pip install pylate-rs fast-plaid

Here is sample code for running ColBERT with pylate-rs and fast-plaid.

import torch
from fast_plaid import search
from pylate_rs import models

model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
    device="cpu", # mps or cuda
)

documents = [
    "1st Arrondissement: Louvre, Tuileries Garden, Palais Royal, historic, tourist.",
    "2nd Arrondissement: Bourse, financial, covered passages, Sentier, business.",
    "3rd Arrondissement: Marais, Musée Picasso, galleries, trendy, historic.",
    "4th Arrondissement: Notre-Dame, Marais, Hôtel de Ville, LGBTQ+.",
    "5th Arrondissement: Latin Quarter, Sorbonne, Panthéon, student, intellectual.",
    "6th Arrondissement: Saint-Germain-des-Prés, Luxembourg Gardens, chic, artistic, cafés.",
    "7th Arrondissement: Eiffel Tower, Musée d'Orsay, Les Invalides, affluent, prestigious.",
    "8th Arrondissement: Champs-Élysées, Arc de Triomphe, luxury, shopping, Élysée.",
    "9th Arrondissement: Palais Garnier, department stores, shopping, theaters.",
    "10th Arrondissement: Gare du Nord, Gare de l'Est, Canal Saint-Martin.",
    "11th Arrondissement: Bastille, nightlife, Oberkampf, revolutionary, hip.",
    "12th Arrondissement: Bois de Vincennes, Opéra Bastille, Bercy, residential.",
    "13th Arrondissement: Chinatown, Bibliothèque Nationale, modern, diverse, street-art.",
    "14th Arrondissement: Montparnasse, Catacombs, residential, artistic, quiet.",
    "15th Arrondissement: Residential, family, populous, Parc André Citroën.",
    "16th Arrondissement: Trocadéro, Bois de Boulogne, affluent, elegant, embassies.",
    "17th Arrondissement: Diverse, Palais des Congrès, residential, Batignolles.",
    "18th Arrondissement: Montmartre, Sacré-Cœur, Moulin Rouge, artistic, historic.",
    "19th Arrondissement: Parc de la Villette, Cité des Sciences, canals, diverse.",
    "20th Arrondissement: Père Lachaise, Belleville, cosmopolitan, artistic, historic.",
]

# Encoding documents
documents_embeddings = model.encode(
    sentences=documents,
    is_query=False,
    pool_factor=2, # Let's divide the number of embeddings by 2.
)

# Creating the FastPlaid index
fast_plaid = search.FastPlaid(index="index")


fast_plaid.create(
    documents_embeddings=[torch.tensor(embedding) for embedding in documents_embeddings]
)

We can then load the existing index and search for the most relevant documents:

import torch
from fast_plaid import search
from pylate_rs import models

fast_plaid = search.FastPlaid(index="index")

queries = [
    "arrondissement with the Eiffel Tower and Musée d'Orsay",
    "Latin Quarter and Sorbonne University",
    "arrondissement with Sacré-Cœur and Moulin Rouge",
    "arrondissement with the Louvre and Tuileries Garden",
    "arrondissement with Notre-Dame Cathedral and the Marais",
]

queries_embeddings = model.encode(
    sentences=queries,
    is_query=True,
)

scores = fast_plaid.search(
    queries_embeddings=torch.tensor(queries_embeddings),
    top_k=3,
)

print(scores)

📝 Citation

If you use pylate-rs in your research or project, please cite it as follows:

@misc{PyLate,
  title={PyLate: Flexible Training and Retrieval for Late Interaction Models},
  author={Chaffin, Antoine and Sourty, Raphaël},
  url={https://github.com/lightonai/pylate},
  year={2024}
}

 

WebAssembly

For JavaScript and TypeScript projects, install the WASM package from npm.

npm install pylate-rs

Load the model by fetching the required files from a local path or the Hugging Face Hub.

import { ColBERT } from "pylate-rs";

const REQUIRED_FILES = [
  "tokenizer.json",
  "model.safetensors",
  "config.json",
  "config_sentence_transformers.json",
  "1_Dense/model.safetensors",
  "1_Dense/config.json",
  "special_tokens_map.json",
];

async function loadModel(modelRepo) {
  const fetchAllFiles = async (basePath) => {
    const responses = await Promise.all(
      REQUIRED_FILES.map((file) => fetch(`${basePath}/${file}`))
    );
    for (const response of responses) {
      if (!response.ok) throw new Error(`File not found: ${response.url}`);
    }
    return Promise.all(
      responses.map((res) => res.arrayBuffer().then((b) => new Uint8Array(b)))
    );
  };

  try {
    let modelFiles;
    try {
      // Attempt to load from a local `models` directory first
      modelFiles = await fetchAllFiles(`models/${modelRepo}`);
    } catch (e) {
      console.warn(
        `Local model not found, falling back to Hugging Face Hub.`,
        e
      );
      // Fallback to fetching directly from the Hugging Face Hub
      modelFiles = await fetchAllFiles(
        `https://huggingface.co/${modelRepo}/resolve/main`
      );
    }

    const [
      tokenizer,
      model,
      config,
      stConfig,
      dense,
      denseConfig,
      tokensConfig,
    ] = modelFiles;

    // Instantiate the model with the loaded files
    const colbertModel = new ColBERT(
      model,
      dense,
      tokenizer,
      config,
      stConfig,
      denseConfig,
      tokensConfig,
      32
    );

    // You can now use `colbertModel` for encoding
    console.log("Model loaded successfully!");
    return colbertModel;
  } catch (error) {
    console.error("Model Loading Error:", error);
  }
}
