
An ultra-high-performance package for sending requests to Baseten Embedding Inference.

Reason this release was yanked:

Renamed; install via pip install baseten-performance-client==0.0.1 instead.

Project description

High performance client for Baseten.co

This library provides a high-performance Python client for Baseten.co endpoints, including embeddings, reranking, and classification. It was built for massive numbers of concurrent POST requests to any URL, including endpoints outside of baseten.co. InferenceClient releases the GIL while performing requests in Rust and supports simultaneous sync and async usage. It was benchmarked at >1200 requests per second from a single-core machine against baseten.co. InferenceClient is built on top of pyo3, reqwest, and tokio, and is MIT licensed.
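Because the client releases the GIL during requests, it can be driven from plain Python threads as well as from asyncio. A minimal sketch of the threaded pattern, with a stand-in blocking function in place of a real network call (no endpoint or API key is assumed here):

```python
import concurrent.futures
import time

def blocking_request(text: str) -> str:
    # Stand-in for a GIL-releasing network call such as client.embed().
    time.sleep(0.01)
    return f"embedded:{text}"

texts = ["Hello world", "Example text", "Another sample"]
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    # pool.map preserves input order even though calls run concurrently.
    results = list(pool.map(blocking_request, texts))
print(results)
```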

Installation

pip install baseten_inference_client

Usage

import os
import asyncio
from baseten_inference_client import InferenceClient, OpenAIEmbeddingsResponse, RerankResponse, ClassificationResponse

api_key = os.environ.get("BASETEN_API_KEY")
api_base_embed = "https://model-yqv0rjjw.api.baseten.co/environments/production/sync"
# Also works with 3rd party endpoints.
# api_base_embed = "https://api.openai.com" or "https://api.mixedbread.com"
client = InferenceClient(api_base=api_base_embed, api_key=api_key)

Synchronous Embedding

texts = ["Hello world", "Example text", "Another sample"]
response = client.embed(
    input=texts,
    model="my_model",
    batch_size=4,
    max_concurrent_requests=32,
    timeout_s=360
)

# Accessing embedding data
print(f"Model used: {response.model}")
print(f"Total tokens used: {response.usage.total_tokens}")

for i, embedding_data in enumerate(response.data):
    print(f"Embedding for text {i} (original input index {embedding_data.index}):")
    # embedding_data.embedding can be List[float] or str (base64)
    if isinstance(embedding_data.embedding, list):
        print(f"  First 3 dimensions: {embedding_data.embedding[:3]}")
        print(f"  Length: {len(embedding_data.embedding)}")
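When a service returns base64 embeddings (the str case noted in the comment above), the string is typically a little-endian float32 buffer. A hedged sketch of decoding one, assuming that encoding; the helper name is illustrative, not part of the library:

```python
import base64
import struct

def decode_base64_embedding(b64: str) -> list:
    """Decode a base64 string into a list of float32 values (little-endian)."""
    raw = base64.b64decode(b64)
    count = len(raw) // 4  # 4 bytes per float32
    return list(struct.unpack(f"<{count}f", raw))

# Round-trip demo with a synthetic 3-dimensional vector.
encoded = base64.b64encode(struct.pack("<3f", 0.1, 0.2, 0.3)).decode()
vector = decode_base64_embedding(encoded)
print(vector)  # approximately [0.1, 0.2, 0.3], within float32 precision
```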

# Using the numpy() method (requires numpy to be installed)
import numpy as np
numpy_array = response.numpy()
print("\nEmbeddings as NumPy array:")
print(f"  Shape: {numpy_array.shape}")
print(f"  Data type: {numpy_array.dtype}")
if numpy_array.shape[0] > 0:
    print(f"  First 3 dimensions of the first embedding: {numpy_array[0][:3]}")

Note: The embed method is versatile and can be used with any embeddings service, e.g. OpenAI API embeddings, not just for Baseten deployments.

Asynchronous Embedding

async def async_embed():
    texts = ["Async hello", "Async example"]
    response = await client.aembed(
        input=texts,
        model="my_model",
        batch_size=2,
        max_concurrent_requests=16,
        timeout_s=360
    )
    print("Async embedding response:", response.data)

# To run:
# asyncio.run(async_embed())
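Several async calls can also be issued concurrently with asyncio.gather. Sketched here with a stand-in coroutine in place of client.aembed, since no live endpoint is assumed:

```python
import asyncio

async def fake_aembed(texts):
    # Stand-in for client.aembed(); a real call would await the network here.
    await asyncio.sleep(0.01)
    return [f"vec({t})" for t in texts]

async def main():
    batches = [["Async hello"], ["Async example"], ["Third batch"]]
    # gather runs all three coroutines concurrently and returns results in order.
    results = await asyncio.gather(*(fake_aembed(b) for b in batches))
    print(results)
    return results

gathered = asyncio.run(main())
```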

Synchronous Batch POST

payload1 = {"model": "my_model", "input": ["Batch request sample 1"]}
payload2 = {"model": "my_model", "input": ["Batch request sample 2"]}
response1, response2 = client.batch_post(
    url_path="/v1/embeddings",
    payloads=[payload1, payload2],
    max_concurrent_requests=96,
    timeout_s=360
)
print("Batch POST responses:", response1, response2)

Note: The batch_post method is generic. It can be used to send POST requests to any URL, not limited to Baseten endpoints.
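batch_post sends one request per payload, so client-side batching is just a matter of chunking your inputs before building payloads. A minimal helper sketch (the function name and payload shape mirror the examples above, but are not part of the library):

```python
def build_payloads(model: str, inputs: list, batch_size: int) -> list:
    """Split inputs into batch_size chunks, producing one JSON payload per chunk."""
    return [
        {"model": model, "input": inputs[i:i + batch_size]}
        for i in range(0, len(inputs), batch_size)
    ]

payloads = build_payloads("my_model", ["a", "b", "c", "d", "e"], batch_size=2)
print(len(payloads))  # 3 payloads: ["a","b"], ["c","d"], ["e"]
```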

Asynchronous Batch POST

async def async_batch_post():
    payload = {"model": "my_model", "input": ["Async batch sample"]}
    responses = await client.abatch_post(
        url_path="/v1/embeddings",
        payloads=[payload, payload],
        max_concurrent_requests=4,
        timeout_s=360
    )
    print("Async batch POST responses:", responses)

# To run:
# asyncio.run(async_batch_post())

Synchronous Reranking

query = "What is the best framework?"
documents = ["Doc 1 text", "Doc 2 text", "Doc 3 text"]
rerank_response = client.rerank(
    query=query,
    texts=documents,
    return_text=True,
    batch_size=2,
    max_concurrent_requests=16,
    timeout_s=360
)
for res in rerank_response.data:
    print(f"Index: {res.index} Score: {res.score}")
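rerank returns one scored result per input document; to present the most relevant documents first, sort by score in descending order. Sketched with plain dicts standing in for the response objects:

```python
# Each entry mirrors a rerank result: original input index plus relevance score.
results = [
    {"index": 0, "score": 0.12},
    {"index": 1, "score": 0.87},
    {"index": 2, "score": 0.45},
]
ranked = sorted(results, key=lambda r: r["score"], reverse=True)
print([r["index"] for r in ranked])  # [1, 2, 0]
```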

Asynchronous Reranking

async def async_rerank():
    query = "Async query sample"
    docs = ["Async doc1", "Async doc2"]
    response = await client.arerank(
        query=query,
        texts=docs,
        return_text=True,
        batch_size=1,
        max_concurrent_requests=8,
        timeout_s=360
    )
    for res in response.data:
        print(f"Async Index: {res.index} Score: {res.score}")

# To run:
# asyncio.run(async_rerank())

Synchronous Classification

texts_to_classify = [
    "This is great!",
    "I did not like it.",
    "Neutral experience."
]
classify_response = client.classify(
    inputs=texts_to_classify,
    batch_size=2,
    max_concurrent_requests=16,
    timeout_s=360
)
for group in classify_response.data:
    for result in group:
        print(f"Label: {result.label}, Score: {result.score}")
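Each group in classify_response.data holds one (label, score) entry per class, so the predicted class is simply the highest-scoring entry. A sketch with plain tuples standing in for the result objects:

```python
def top_label(group):
    """Return the (label, score) pair with the highest score."""
    return max(group, key=lambda pair: pair[1])

group = [("positive", 0.91), ("negative", 0.04), ("neutral", 0.05)]
label, score = top_label(group)
print(label)  # positive
```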

Asynchronous Classification

async def async_classify():
    texts = ["Async positive", "Async negative"]
    response = await client.aclassify(
        inputs=texts,
        batch_size=1,
        max_concurrent_requests=8,
        timeout_s=360
    )
    for group in response.data:
        for res in group:
            print(f"Async Label: {res.label}, Score: {res.score}")

# To run:
# asyncio.run(async_classify())

Development

# Install prerequisites
sudo apt-get install patchelf
# Install cargo if not already installed.

# Set up a Python virtual environment
python -m venv .venv
source .venv/bin/activate

# Install development dependencies
pip install "maturin[patchelf]" pytest requests numpy

# Build and install the Rust extension in development mode
maturin develop
cargo fmt
# Run tests
pytest tests

Error Handling

The client can raise several types of errors. Here's how to handle common ones:

  • requests.exceptions.HTTPError: This error is raised for HTTP issues, such as authentication failures (e.g., 403 Forbidden if the API key is wrong), server errors (e.g., 5xx), or if the endpoint is not found (404). You can inspect e.response.status_code and e.response.text (or e.response.json() if the body is JSON) for more details.
  • ValueError: This error can occur due to invalid input parameters (e.g., an empty input list for embed, invalid batch_size or max_concurrent_requests values). It can also be raised by response.numpy() if embeddings are not float vectors or have inconsistent dimensions.
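The numpy() check described above can be illustrated in plain Python: a hypothetical validator (not the library's code) that raises ValueError for the same reasons, non-float values or inconsistent dimensions:

```python
def validate_embeddings(vectors):
    """Raise ValueError unless all vectors are float lists of one consistent length."""
    if not vectors:
        raise ValueError("empty embedding list")
    dim = len(vectors[0])
    for v in vectors:
        if len(v) != dim or not all(isinstance(x, float) for x in v):
            raise ValueError("embeddings must be float vectors of equal dimension")
    return dim

print(validate_embeddings([[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]))  # 3
```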

Here's an example demonstrating how to catch these errors for the embed method:

import requests # Make sure to import requests to catch its specific exceptions

# client = InferenceClient(api_base="your_api_base", api_key="your_api_key")

texts_to_embed = ["Hello world", "Another text example"]
try:
    response = client.embed(
        input=texts_to_embed,
        model="your_embedding_model", # Replace with your actual model name
        batch_size=2,
        max_concurrent_requests=4,
        timeout_s=60 # Timeout in seconds
    )
    # Process successful response
    print(f"Model used: {response.model}")
    print(f"Total tokens: {response.usage.total_tokens}")
    for item in response.data:
        embedding_preview = item.embedding[:3] if isinstance(item.embedding, list) else "Base64 Data"
        print(f"Index {item.index}, Embedding (first 3 dims or type): {embedding_preview}")

except requests.exceptions.HTTPError as e:
    print(f"An HTTP error occurred: {e}, status code {e.response.status_code}")
except ValueError as e:
    print(f"Invalid input or response: {e}")

For asynchronous methods (aembed, arerank, aclassify, abatch_post), the same exceptions will be raised by the await call and can be caught using a try...except block within an async def function.
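The async pattern wraps the await itself in the try block. A self-contained sketch, with a stand-in coroutine that raises in place of a real failing request:

```python
import asyncio

async def failing_aembed(texts):
    # Stand-in for client.aembed(); raises like a failing request would.
    raise RuntimeError("simulated HTTP failure")

async def main():
    try:
        await failing_aembed(["hello"])
    except RuntimeError as e:
        # In real code, catch requests.exceptions.HTTPError here instead.
        print(f"Caught during await: {e}")

asyncio.run(main())
```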

Contributions

Feel free to contribute to this repo, tag @michaelfeil for review.

License

MIT License


