
Rust and WebAssembly library for late interaction models.


pylate-rs

Blog post: Efficient Inference for PyLate

 

⭐️ Overview

pylate-rs is a high-performance inference engine for PyLate models, built in Rust for speed and efficiency. pylate-rs does not provide training capabilities; it is designed to run models trained with PyLate, which supports a wide range of late interaction models such as ColBERT, GTE-based variants, and more.

  • Accelerated Performance: Significantly reduces model loading times, enabling fast cold starts in serverless environments and low-latency applications.

  • Lightweight Design: Built with the Candle ML framework, pylate-rs has a minimal resource footprint ideal for serverless functions, edge computing, and other resource-constrained systems.

  • Broad Hardware Support: Optimized for a range of hardware configurations, with dedicated builds for standard CPUs, Intel (MKL), Apple Silicon (Accelerate & Metal), and NVIDIA GPUs (CUDA).

  • Cross-Platform Integration: Bindings for use in Python, Rust, and JavaScript/WebAssembly projects.

PyLate is the go-to tool for training late interaction models. At inference time, pylate-rs can be paired with its companion library FastPlaid for a high-performance multi-vector search pipeline.

A WebAssembly demo is available online with insights into pylate-rs.

 

💻 Installation

Install the version of pylate-rs that matches your hardware for optimal performance.

Python

Target Hardware         Installation Command
Standard CPU            pip install pylate-rs
Apple CPU (macOS)       pip install pylate-rs-accelerate
Intel CPU (MKL)         pip install pylate-rs-mkl
Apple GPU (M1/M2/M3)    pip install pylate-rs-metal
NVIDIA GPU (CUDA)       pip install pylate-rs-cuda

 

Rust

Add pylate-rs to your Cargo.toml by enabling the feature flag that corresponds to your backend.

Feature      Target Hardware         Installation Command
(default)    Standard CPU            cargo add pylate-rs
accelerate   Apple CPU (macOS)       cargo add pylate-rs --features accelerate
mkl          Intel CPU (MKL)         cargo add pylate-rs --features mkl
metal        Apple GPU (M1/M2/M3)    cargo add pylate-rs --features metal
cuda         NVIDIA GPU (CUDA)       cargo add pylate-rs --features cuda

 

⚡️ Quick Start

Python

Get started in just a few lines of Python.

from pylate_rs import models

# Initialize the model for your target device ("cpu", "cuda", or "mps")
model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
    device="cpu"
)

# Encode queries and documents
queries_embeddings = model.encode(
    sentences=["What is the capital of France?", "How big is the sun?"],
    is_query=True
)

documents_embeddings = model.encode(
    sentences=["Paris is the capital of France.", "The sun is a star."],
    is_query=False
)

# Calculate similarity scores
similarities = model.similarity(queries_embeddings, documents_embeddings)

print(f"Similarity scores:\n{similarities}")

# Use hierarchical pooling to reduce document embedding size and speed up downstream tasks
pooled_documents_embeddings = model.encode(
    sentences=["Paris is the capital of France.", "The sun is a star."],
    is_query=False,
    pool_factor=2, # Halves the number of token embeddings
)

similarities_pooled = model.similarity(queries_embeddings, pooled_documents_embeddings)

print(f"Similarity scores with pooling:\n{similarities_pooled}")
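The similarity used above is the standard late-interaction MaxSim score: each query token embedding is matched against its best-scoring document token embedding, and the per-token maxima are summed. The following pure-Python sketch with toy 2-D vectors illustrates the idea (it is not the library's implementation):

```python
def maxsim(query_tokens, document_tokens):
    """Late-interaction score: for each query token, take the maximum
    dot product over all document tokens, then sum over query tokens."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return sum(max(dot(q, d) for d in document_tokens) for q in query_tokens)

query = [[1.0, 0.0], [0.0, 1.0]]   # two query token embeddings
doc_a = [[0.9, 0.1], [0.1, 0.9]]   # well-aligned document tokens
doc_b = [[0.5, 0.5], [0.5, 0.5]]   # less aligned document tokens

print(maxsim(query, doc_a))  # 1.8
print(maxsim(query, doc_b))  # 1.0
```

Because each query token is matched independently, MaxSim rewards documents that cover every aspect of the query, which is what makes late interaction models effective retrievers.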

 

Rust

use anyhow::Result;
use candle_core::Device;
use pylate_rs::{hierarchical_pooling, ColBERT};

fn main() -> Result<()> {
    // Set the device (e.g., Cpu, Cuda, Metal)
    let device = Device::Cpu;

    // Initialize the model
    let mut model: ColBERT = ColBERT::from("lightonai/GTE-ModernColBERT-v1")
        .with_device(device)
        .try_into()?;

    // Encode queries and documents
    let queries = vec!["What is the capital of France?".to_string()];
    let documents = vec!["Paris is the capital of France.".to_string()];

    let query_embeddings = model.encode(&queries, true)?;
    let document_embeddings = model.encode(&documents, false)?;

    // Calculate similarity
    let similarities = model.similarity(&query_embeddings, &document_embeddings)?;
    println!("Similarity score: {}", similarities.data[0][0]);

    // Use hierarchical pooling
    let pooled_document_embeddings = hierarchical_pooling(&document_embeddings, 2)?;
    let pooled_similarities = model.similarity(&query_embeddings, &pooled_document_embeddings)?;
    println!("Similarity score after hierarchical pooling: {}", pooled_similarities.data[0][0]);

    Ok(())
}
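Hierarchical pooling, used in both snippets above, shrinks a document's representation by merging similar token embeddings. The sketch below is a simplified pure-Python illustration (greedy averaging of the most similar adjacent pair); the actual algorithm in pylate-rs may differ:

```python
def hierarchical_pool(embeddings, pool_factor):
    """Greedily average the most similar adjacent pair of token embeddings
    until only len(embeddings) // pool_factor vectors remain.
    Simplified illustration, not pylate-rs's exact algorithm."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(x * x for x in b) ** 0.5
        return dot / (na * nb)

    vectors = [list(v) for v in embeddings]
    target = max(1, len(vectors) // pool_factor)
    while len(vectors) > target:
        # Merge the adjacent pair with the highest cosine similarity.
        i = max(range(len(vectors) - 1), key=lambda k: cos(vectors[k], vectors[k + 1]))
        merged = [(x + y) / 2 for x, y in zip(vectors[i], vectors[i + 1])]
        vectors[i:i + 2] = [merged]
    return vectors

pooled = hierarchical_pool([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]], 2)
print(len(pooled))  # 2
```

With pool_factor=2, a document's token embeddings are halved, which roughly halves index size and downstream scoring cost at a small accuracy trade-off.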

 

📊 Benchmarks

Device    Backend        Queries per second        Documents per second        Model loading time (s)
cpu       PyLate         350.10                    32.16                       2.06
cpu       pylate-rs      386.21 (+10%)             42.15 (+31%)                0.07 (-97%)
cuda      PyLate         2236.48                   882.66                      3.62
cuda      pylate-rs      4046.88 (+81%)            976.23 (+11%)               1.95 (-46%)
mps       PyLate         580.81                    103.10                      1.95
mps       pylate-rs      291.71 (-50%)             23.26 (-77%)                0.08 (-96%)

Benchmarks were run from Python. pylate-rs provides significant performance improvements, especially in scenarios requiring fast startup. While it can take up to 5 seconds on a Mac to load a model with the Transformers backend and encode a single query, pylate-rs does so in just 0.11 seconds, making it ideal for low-latency applications. Don't expect pylate-rs to be much faster than PyLate when encoding large batches, as PyTorch is heavily optimized for that workload.
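Loading-time comparisons like the one above can be reproduced with a small timing harness. The snippet below is a generic sketch with a placeholder workload; substitute your own model-loading call (e.g. constructing a ColBERT model) for the `sum(range(...))` example:

```python
import time

def time_call(fn, *args, **kwargs):
    """Return (result, elapsed_seconds) for a single call to fn."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# Placeholder workload; replace with your model-loading call to benchmark it.
result, elapsed = time_call(sum, range(1_000_000))
print(f"elapsed: {elapsed:.4f}s")
```

For cold-start measurements, make sure each run starts from a fresh process so that no caches from a previous load skew the numbers.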

 

📦 Using Custom Models

pylate-rs is compatible with any model saved in the PyLate format, whether from the Hugging Face Hub or a local directory. PyLate itself is compatible with a wide range of models, including those from Sentence Transformers, Hugging Face Transformers, and custom models. Before using pylate-rs, ensure your model is saved in the PyLate format; you can easily convert and upload your own models with PyLate.

Pushing a model to the Hugging Face Hub in PyLate format is straightforward. Here’s how you can do it:

pip install pylate

Then, you can use the following Python code snippet to push your model:

from pylate import models

# Load your model
model = models.ColBERT(model_name_or_path="your-base-model-on-hf")

# Push in PyLate format
model.push_to_hub(
    repo_id="YourUsername/YourModelName",
    private=False,
    token="YOUR_HUGGINGFACE_TOKEN",
)

If you want to save a model in PyLate format locally, you can do so with the following code snippet:

from pylate import models

# Load your model
model = models.ColBERT(model_name_or_path="your-base-model-on-hf")

# Save in PyLate format
model.save_pretrained("path/to/save/GTE-ModernColBERT-v1-pylate")

An existing set of models compatible with pylate-rs is available on the Hugging Face Hub under the LightOn namespace.

 

Retrieval pipeline

pip install pylate-rs fast-plaid

Here is sample code for running ColBERT with pylate-rs and fast-plaid.

import torch
from fast_plaid import search
from pylate_rs import models

model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
    device="cpu", # mps or cuda
)

documents = [
    "1st Arrondissement: Louvre, Tuileries Garden, Palais Royal, historic, tourist.",
    "2nd Arrondissement: Bourse, financial, covered passages, Sentier, business.",
    "3rd Arrondissement: Marais, Musée Picasso, galleries, trendy, historic.",
    "4th Arrondissement: Notre-Dame, Marais, Hôtel de Ville, LGBTQ+.",
    "5th Arrondissement: Latin Quarter, Sorbonne, Panthéon, student, intellectual.",
    "6th Arrondissement: Saint-Germain-des-Prés, Luxembourg Gardens, chic, artistic, cafés.",
    "7th Arrondissement: Eiffel Tower, Musée d'Orsay, Les Invalides, affluent, prestigious.",
    "8th Arrondissement: Champs-Élysées, Arc de Triomphe, luxury, shopping, Élysée.",
    "9th Arrondissement: Palais Garnier, department stores, shopping, theaters.",
    "10th Arrondissement: Gare du Nord, Gare de l'Est, Canal Saint-Martin.",
    "11th Arrondissement: Bastille, nightlife, Oberkampf, revolutionary, hip.",
    "12th Arrondissement: Bois de Vincennes, Opéra Bastille, Bercy, residential.",
    "13th Arrondissement: Chinatown, Bibliothèque Nationale, modern, diverse, street-art.",
    "14th Arrondissement: Montparnasse, Catacombs, residential, artistic, quiet.",
    "15th Arrondissement: Residential, family, populous, Parc André Citroën.",
    "16th Arrondissement: Trocadéro, Bois de Boulogne, affluent, elegant, embassies.",
    "17th Arrondissement: Diverse, Palais des Congrès, residential, Batignolles.",
    "18th Arrondissement: Montmartre, Sacré-Cœur, Moulin Rouge, artistic, historic.",
    "19th Arrondissement: Parc de la Villette, Cité des Sciences, canals, diverse.",
    "20th Arrondissement: Père Lachaise, Belleville, cosmopolitan, artistic, historic.",
]

# Encoding documents
documents_embeddings = model.encode(
    sentences=documents,
    is_query=False,
    pool_factor=2, # Let's divide the number of embeddings by 2.
)

# Creating the FastPlaid index
fast_plaid = search.FastPlaid(index="index")


fast_plaid.create(
    documents_embeddings=[torch.tensor(embedding) for embedding in documents_embeddings]
)

We can then load the existing index and search for the most relevant documents:

import torch
from fast_plaid import search
from pylate_rs import models

# Re-initialize the model, since this snippet runs independently
model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
    device="cpu", # mps or cuda
)

fast_plaid = search.FastPlaid(index="index")

queries = [
    "arrondissement with the Eiffel Tower and Musée d'Orsay",
    "Latin Quarter and Sorbonne University",
    "arrondissement with Sacré-Cœur and Moulin Rouge",
    "arrondissement with the Louvre and Tuileries Garden",
    "arrondissement with Notre-Dame Cathedral and the Marais",
]

queries_embeddings = model.encode(
    sentences=queries,
    is_query=True,
)

scores = fast_plaid.search(
    queries_embeddings=torch.tensor(queries_embeddings),
    top_k=3,
)

print(scores)

📝 Citation

If you use pylate-rs in your research or project, please cite it as follows:

@misc{PyLate,
  title={PyLate: Flexible Training and Retrieval for Late Interaction Models},
  author={Chaffin, Antoine and Sourty, Raphaël},
  url={https://github.com/lightonai/pylate},
  year={2024}
}

 

WebAssembly

For JavaScript and TypeScript projects, install the WASM package from npm.

npm install pylate-rs

Load the model by fetching the required files from a local path or the Hugging Face Hub.

import { ColBERT } from "pylate-rs";

const REQUIRED_FILES = [
  "tokenizer.json",
  "model.safetensors",
  "config.json",
  "config_sentence_transformers.json",
  "1_Dense/model.safetensors",
  "1_Dense/config.json",
  "special_tokens_map.json",
];

async function loadModel(modelRepo) {
  const fetchAllFiles = async (basePath) => {
    const responses = await Promise.all(
      REQUIRED_FILES.map((file) => fetch(`${basePath}/${file}`))
    );
    for (const response of responses) {
      if (!response.ok) throw new Error(`File not found: ${response.url}`);
    }
    return Promise.all(
      responses.map((res) => res.arrayBuffer().then((b) => new Uint8Array(b)))
    );
  };

  try {
    let modelFiles;
    try {
      // Attempt to load from a local `models` directory first
      modelFiles = await fetchAllFiles(`models/${modelRepo}`);
    } catch (e) {
      console.warn(
        `Local model not found, falling back to Hugging Face Hub.`,
        e
      );
      // Fallback to fetching directly from the Hugging Face Hub
      modelFiles = await fetchAllFiles(
        `https://huggingface.co/${modelRepo}/resolve/main`
      );
    }

    const [
      tokenizer,
      model,
      config,
      stConfig,
      dense,
      denseConfig,
      tokensConfig,
    ] = modelFiles;

    // Instantiate the model with the loaded files
    const colbertModel = new ColBERT(
      model,
      dense,
      tokenizer,
      config,
      stConfig,
      denseConfig,
      tokensConfig,
      32
    );

    // You can now use `colbertModel` for encoding
    console.log("Model loaded successfully!");
    return colbertModel;
  } catch (error) {
    console.error("Model Loading Error:", error);
  }
}
