An ultra-high-performance package for sending requests to Baseten Embedding Inference.

Reason this release was yanked:

renamed; use pip install baseten-performance-client==0.0.1 instead

Project description

High-performance client for Baseten.co

This library provides a high-performance Python client for Baseten.co endpoints, including embeddings, reranking, and classification. It was built for massively concurrent POST requests to any URL, including endpoints outside of baseten.co. InferenceClient releases the GIL while performing requests in Rust and supports simultaneous sync and async usage. It was benchmarked at >1200 requests per second from a single-core machine against baseten.co. InferenceClient is built on top of pyo3, reqwest, and tokio, and is MIT licensed.

Installation

pip install baseten_inference_client

Usage

import os
import asyncio
from baseten_inference_client import InferenceClient, OpenAIEmbeddingsResponse, RerankResponse, ClassificationResponse

api_key = os.environ.get("BASETEN_API_KEY")
base_url_embed = "https://model-yqv0rjjw.api.baseten.co/environments/production/sync"
# Also works with 3rd party endpoints.
# base_url_embed = "https://api.openai.com" or "https://api.mixedbread.com"
client = InferenceClient(base_url=base_url_embed, api_key=api_key)

Embeddings

Synchronous Embedding

texts = ["Hello world", "Example text", "Another sample"]
response = client.embed(
    input=texts,
    model="my_model",
    batch_size=4,
    max_concurrent_requests=32,
    timeout_s=360
)

# Accessing embedding data
print(f"Model used: {response.model}")
print(f"Total tokens used: {response.usage.total_tokens}")

for i, embedding_data in enumerate(response.data):
    print(f"Embedding for text {i} (original input index {embedding_data.index}):")
    # embedding_data.embedding can be List[float] or str (base64)
    if isinstance(embedding_data.embedding, list):
        print(f"  First 3 dimensions: {embedding_data.embedding[:3]}")
        print(f"  Length: {len(embedding_data.embedding)}")

# Using the numpy() method (requires numpy to be installed)
import numpy as np
numpy_array = response.numpy()
print("\nEmbeddings as NumPy array:")
print(f"  Shape: {numpy_array.shape}")
print(f"  Data type: {numpy_array.dtype}")
if numpy_array.shape[0] > 0:
    print(f"  First 3 dimensions of the first embedding: {numpy_array[0][:3]}")

Note: The embed method is versatile and can be used with any embeddings service, e.g. OpenAI API embeddings, not just for Baseten deployments.
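
For example, pointing the client at the OpenAI embeddings API only requires a different base URL. A minimal sketch (text-embedding-3-small is one of OpenAI's embedding models; OPENAI_API_KEY is assumed to be set in the environment):

openai_client = InferenceClient(
    base_url="https://api.openai.com",
    api_key=os.environ["OPENAI_API_KEY"],
)
response = openai_client.embed(
    input=["Hello world"],
    model="text-embedding-3-small",
    batch_size=16,
    max_concurrent_requests=8,
    timeout_s=60
)
print(response.data[0].embedding[:3])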

Asynchronous Embedding

async def async_embed():
    texts = ["Async hello", "Async example"]
    response = await client.async_embed(
        input=texts,
        model="my_model",
        batch_size=2,
        max_concurrent_requests=16,
        timeout_s=360
    )
    print("Async embedding response:", response.data)

# To run:
# asyncio.run(async_embed())

Embedding Benchmarks

Comparison against pip install openai for /v1/embeddings. Tested with ./scripts/compare_latency_openai.py using a mini_batch_size of 128 and 4 server-side replicas. Results against the OpenAI API are similar; OpenAI allows a maximum mini_batch_size of 2048.

Number of inputs / embeddings | Number of tasks | InferenceClient (s) | AsyncOpenAI (s) | Speedup
128                           | 1               | 0.12                | 0.13            | 1.08×
512                           | 4               | 0.14                | 0.21            | 1.50×
8,192                         | 64              | 0.83                | 1.95            | 2.35×
131,072                       | 1,024           | 4.63                | 39.07           | 8.44×
2,097,152                     | 16,384          | 70.92               | 903.68          | 12.74×

General Batch POST

The batch_post method is generic: it can send POST requests to any URL, not just Baseten endpoints. Each payload and each response can be any JSON value (see the third-party endpoint sketch at the end of this section).

Synchronous Batch POST

payload1 = {"model": "my_model", "input": ["Batch request sample 1"]}
payload2 = {"model": "my_model", "input": ["Batch request sample 2"]}
response1, response2 = client.batch_post(
    url_path="/v1/embeddings",
    payloads=[payload1, payload2],
    max_concurrent_requests=96,
    timeout_s=360
)
print("Batch POST responses:", response1, response2)

Asynchronous Batch POST

async def async_batch_post():
    payload = {"model": "my_model", "input": ["Async batch sample"]}
    responses = await client.async_batch_post(
        url_path="/v1/embeddings",
        payloads=[payload, payload],
        max_concurrent_requests=4,
        timeout_s=360
    )
    print("Async batch POST responses: list[Any]", responses)

# To run:
# asyncio.run(async_batch_post())
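
Since batch_post is not tied to the embeddings schema, it can target any JSON endpoint. A hedged sketch (https://httpbin.org is used purely as an illustrative third-party echo endpoint, and api_key="unused" is a placeholder assumption):

echo_client = InferenceClient(base_url="https://httpbin.org", api_key="unused")
responses = echo_client.batch_post(
    url_path="/post",  # httpbin echoes the request body back
    payloads=[{"item": i} for i in range(8)],
    max_concurrent_requests=4,
    timeout_s=30
)
# Each element is the parsed JSON response for the corresponding payload.
print(responses[0]["json"])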

Reranking

Reranking is compatible with BEI or text-embeddings-inference.

Synchronous Reranking

query = "What is the best framework?"
documents = ["Doc 1 text", "Doc 2 text", "Doc 3 text"]
rerank_response = client.rerank(
    query=query,
    texts=documents,
    return_text=True,
    batch_size=2,
    max_concurrent_requests=16,
    timeout_s=360
)
for res in rerank_response.data:
    print(f"Index: {res.index} Score: {res.score}")

Asynchronous Reranking

async def async_rerank():
    query = "Async query sample"
    docs = ["Async doc1", "Async doc2"]
    response = await client.async_rerank(
        query=query,
        texts=docs,
        return_text=True,
        batch_size=1,
        max_concurrent_requests=8,
        timeout_s=360
    )
    for res in response.data:
        print(f"Async Index: {res.index} Score: {res.score}")

# To run:
# asyncio.run(async_rerank())

Classification

Predict (classification endpoint), compatible with BEI or text-embeddings-inference.

Synchronous Classification

texts_to_classify = [
    "This is great!",
    "I did not like it.",
    "Neutral experience."
]
classify_response = client.classify(
    inputs=texts_to_classify,
    batch_size=2,
    max_concurrent_requests=16,
    timeout_s=360
)
for group in classify_response.data:
    for result in group:
        print(f"Label: {result.label}, Score: {result.score}")

Asynchronous Classification

async def async_classify():
    texts = ["Async positive", "Async negative"]
    response = await client.async_classify(
        inputs=texts,
        batch_size=1,
        max_concurrent_requests=8,
        timeout_s=360
    )
    for group in response.data:
        for res in group:
            print(f"Async Label: {res.label}, Score: {res.score}")

# To run:
# asyncio.run(async_classify())

Error Handling

The client can raise several types of errors. Here's how to handle common ones:

  • requests.exceptions.HTTPError: This error is raised for HTTP issues, such as authentication failures (e.g., 403 Forbidden if the API key is wrong), server errors (e.g., 5xx), or if the endpoint is not found (404). You can inspect e.response.status_code and e.response.text (or e.response.json() if the body is JSON) for more details.
  • ValueError: This error can occur due to invalid input parameters (e.g., an empty input list for embed, invalid batch_size or max_concurrent_requests values). It can also be raised by response.numpy() if embeddings are not float vectors or have inconsistent dimensions.

Here's an example demonstrating how to catch these errors for the embed method:

import requests

# client = InferenceClient(base_url="your_b10_url", api_key="your_b10_api_key")

texts_to_embed = ["Hello world", "Another text example"]
try:
    response = client.embed(
        input=texts_to_embed,
        model="your_embedding_model", # Replace with your actual model name
        batch_size=2,
        max_concurrent_requests=4,
        timeout_s=60 # Timeout in seconds
    )
    # Process successful response
    print(f"Model used: {response.model}")
    print(f"Total tokens: {response.usage.total_tokens}")
    for item in response.data:
        embedding_preview = item.embedding[:3] if isinstance(item.embedding, list) else "Base64 Data"
        print(f"Index {item.index}, Embedding (first 3 dims or type): {embedding_preview}")

except requests.exceptions.HTTPError as e:
    print(f"An HTTP error occurred: {e}")
    if e.response is not None:
        print(f"  Status code: {e.response.status_code}")
except ValueError as e:
    print(f"Invalid input: {e}")

For asynchronous methods (async_embed, async_rerank, async_classify, async_batch_post), the same exceptions will be raised by the await call and can be caught using a try...except block within an async def function.
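
A minimal sketch of the same pattern for the async variants:

async def safe_async_embed():
    try:
        response = await client.async_embed(
            input=["Hello world"],
            model="your_embedding_model",
            batch_size=2,
            max_concurrent_requests=4,
            timeout_s=60
        )
        print(f"Total tokens: {response.usage.total_tokens}")
    except requests.exceptions.HTTPError as e:
        print(f"An HTTP error occurred: {e}")
    except ValueError as e:
        print(f"Invalid input: {e}")

# To run:
# asyncio.run(safe_async_embed())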

Development

# Install prerequisites
sudo apt-get install patchelf
# Install cargo if not already installed.

# Set up a Python virtual environment
python -m venv .venv
source .venv/bin/activate

# Install development dependencies
pip install "maturin[patchelf]" pytest requests numpy

# Build and install the Rust extension in development mode
maturin develop
# Format the Rust code
cargo fmt
# Run tests
pytest tests

Contributions

Feel free to contribute to this repo; tag @michaelfeil for review.

License

MIT License

Download files

Download the file for your platform.

Source Distribution

baseten_inference_client-0.0.1.tar.gz (35.8 kB)

Built Distributions

baseten_inference_client-0.0.1-cp313-cp313t-musllinux_1_2_x86_64.whl (4.1 MB) - CPython 3.13t, musllinux: musl 1.2+, x86-64
baseten_inference_client-0.0.1-cp313-cp313t-musllinux_1_2_i686.whl (4.0 MB) - CPython 3.13t, musllinux: musl 1.2+, i686
baseten_inference_client-0.0.1-cp313-cp313t-musllinux_1_2_armv7l.whl (3.7 MB) - CPython 3.13t, musllinux: musl 1.2+, ARMv7l
baseten_inference_client-0.0.1-cp313-cp313t-musllinux_1_2_aarch64.whl (4.4 MB) - CPython 3.13t, musllinux: musl 1.2+, ARM64
baseten_inference_client-0.0.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.9 MB) - CPython 3.13t, manylinux: glibc 2.17+, x86-64
baseten_inference_client-0.0.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl (4.0 MB) - CPython 3.13t, manylinux: glibc 2.17+, ppc64le
baseten_inference_client-0.0.1-cp313-cp313t-manylinux_2_17_i686.manylinux2014_i686.whl (4.0 MB) - CPython 3.13t, manylinux: glibc 2.17+, i686
baseten_inference_client-0.0.1-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl (3.4 MB) - CPython 3.13t, manylinux: glibc 2.17+, ARMv7l
baseten_inference_client-0.0.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (4.1 MB) - CPython 3.13t, manylinux: glibc 2.17+, ARM64
baseten_inference_client-0.0.1-cp313-cp313t-macosx_11_0_arm64.whl (1.7 MB) - CPython 3.13t, macOS 11.0+, ARM64
baseten_inference_client-0.0.1-cp313-cp313t-macosx_10_12_x86_64.whl (1.7 MB) - CPython 3.13t, macOS 10.12+, x86-64
baseten_inference_client-0.0.1-cp38-abi3-win_amd64.whl (1.5 MB) - CPython 3.8+, Windows x86-64
baseten_inference_client-0.0.1-cp38-abi3-win32.whl (1.4 MB) - CPython 3.8+, Windows x86
baseten_inference_client-0.0.1-cp38-abi3-musllinux_1_2_x86_64.whl (4.1 MB) - CPython 3.8+, musllinux: musl 1.2+, x86-64
baseten_inference_client-0.0.1-cp38-abi3-musllinux_1_2_i686.whl (4.1 MB) - CPython 3.8+, musllinux: musl 1.2+, i686
baseten_inference_client-0.0.1-cp38-abi3-musllinux_1_2_armv7l.whl (3.7 MB) - CPython 3.8+, musllinux: musl 1.2+, ARMv7l
baseten_inference_client-0.0.1-cp38-abi3-musllinux_1_2_aarch64.whl (4.4 MB) - CPython 3.8+, musllinux: musl 1.2+, ARM64
baseten_inference_client-0.0.1-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.9 MB) - CPython 3.8+, manylinux: glibc 2.17+, x86-64
baseten_inference_client-0.0.1-cp38-abi3-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl (4.0 MB) - CPython 3.8+, manylinux: glibc 2.17+, ppc64le
baseten_inference_client-0.0.1-cp38-abi3-manylinux_2_17_i686.manylinux2014_i686.whl (4.0 MB) - CPython 3.8+, manylinux: glibc 2.17+, i686
baseten_inference_client-0.0.1-cp38-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl (3.4 MB) - CPython 3.8+, manylinux: glibc 2.17+, ARMv7l
baseten_inference_client-0.0.1-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (4.1 MB) - CPython 3.8+, manylinux: glibc 2.17+, ARM64
baseten_inference_client-0.0.1-cp38-abi3-macosx_11_0_arm64.whl (1.7 MB) - CPython 3.8+, macOS 11.0+, ARM64
baseten_inference_client-0.0.1-cp38-abi3-macosx_10_12_x86_64.whl (1.8 MB) - CPython 3.8+, macOS 10.12+, x86-64
