
An ultra-high-performance package for sending requests to Baseten Embedding Inference.

Project description

High-performance client for Baseten.co

This library provides a high-performance Python client for Baseten.co endpoints, including embeddings, reranking, and classification. It was built for massive numbers of concurrent POST requests to any URL, including endpoints outside baseten.co. InferenceClient releases the GIL while performing requests in Rust and supports simultaneous sync and async usage. It was benchmarked at over 1,200 requests per second from a single-core machine against baseten.co. InferenceClient is built on top of pyo3, reqwest, and tokio, and is MIT licensed.

Installation

pip install baseten_inference_client

Usage

import os
import asyncio
from baseten_inference_client import InferenceClient, OpenAIEmbeddingsResponse, RerankResponse, ClassificationResponse

api_key = os.environ.get("BASETEN_API_KEY")
api_base_embed = "https://model-yqv0rjjw.api.baseten.co/environments/production/sync"
# Also works with 3rd party endpoints.
# api_base_embed = "https://api.openai.com" or "https://api.mixedbread.com"
client = InferenceClient(api_base=api_base_embed, api_key=api_key)

Synchronous Embedding

texts = ["Hello world", "Example text", "Another sample"]
response = client.embed(
    input=texts,
    model="my_model",
    batch_size=4,
    max_concurrent_requests=32,
    timeout_s=360
)

# Accessing embedding data
print(f"Model used: {response.model}")
print(f"Total tokens used: {response.usage.total_tokens}")

for i, embedding_data in enumerate(response.data):
    print(f"Embedding for text {i} (original input index {embedding_data.index}):")
    # embedding_data.embedding can be List[float] or str (base64)
    if isinstance(embedding_data.embedding, list):
        print(f"  First 3 dimensions: {embedding_data.embedding[:3]}")
        print(f"  Length: {len(embedding_data.embedding)}")

# Using the numpy() method (requires numpy to be installed)
import numpy as np
numpy_array = response.numpy()
print("\nEmbeddings as NumPy array:")
print(f"  Shape: {numpy_array.shape}")
print(f"  Data type: {numpy_array.dtype}")
if numpy_array.shape[0] > 0:
    print(f"  First 3 dimensions of the first embedding: {numpy_array[0][:3]}")
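A common next step with the NumPy array is computing pairwise cosine similarity between embeddings. A standalone sketch, with a hard-coded array standing in for `response.numpy()`:

```python
import numpy as np

# Stand-in for response.numpy(): 3 embeddings of dimension 4.
embeddings = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
])

# Normalize each row to unit length; a matrix product of unit vectors
# then yields the pairwise cosine-similarity matrix.
norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
unit = embeddings / norms
similarity = unit @ unit.T
print(np.round(similarity, 3))
```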

Note: The embed method is versatile and can be used with any OpenAI-compatible embeddings service (e.g., the OpenAI API), not just Baseten deployments.
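As the comment in the example above notes, `embedding_data.embedding` may arrive as a base64 string rather than a list of floats. A minimal decoding sketch, assuming the common convention of little-endian float32 bytes (as used by the OpenAI embeddings API; verify against your endpoint):

```python
import base64
import numpy as np

def decode_base64_embedding(b64: str) -> np.ndarray:
    # Assumes the server packs the vector as little-endian float32 bytes.
    return np.frombuffer(base64.b64decode(b64), dtype="<f4")

# Round-trip demo with a known vector, so the sketch runs standalone.
vec = np.array([0.1, -0.2, 0.3], dtype="<f4")
encoded = base64.b64encode(vec.tobytes()).decode("ascii")
decoded = decode_base64_embedding(encoded)
print(decoded)
```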

Asynchronous Embedding

async def async_embed():
    texts = ["Async hello", "Async example"]
    response = await client.aembed(
        input=texts,
        model="my_model",
        batch_size=2,
        max_concurrent_requests=16,
        timeout_s=360
    )
    print("Async embedding response:", response.data)

# To run:
# asyncio.run(async_embed())

Synchronous Batch POST

payload1 = {"model": "my_model", "input": ["Batch request sample 1"]}
payload2 = {"model": "my_model", "input": ["Batch request sample 2"]}
response1, response2 = client.batch_post(
    url_path="/v1/embeddings",
    payloads=[payload1, payload2],
    max_concurrent_requests=96,
    timeout_s=360
)
print("Batch POST responses:", response1, response2)

Note: The batch_post method is generic. It can be used to send POST requests to any URL, not limited to Baseten endpoints.
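To feed batch_post with a large input list, you typically chunk it into per-request payloads first. A hypothetical helper (the payload shape mirrors the embeddings examples above; the function name is illustrative):

```python
def build_payloads(texts, model, batch_size):
    # Split texts into fixed-size batches and wrap each batch
    # in an embeddings-style payload dict.
    return [
        {"model": model, "input": texts[i : i + batch_size]}
        for i in range(0, len(texts), batch_size)
    ]

payloads = build_payloads(["a", "b", "c", "d", "e"], "my_model", batch_size=2)
print(len(payloads))  # 3
```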

Asynchronous Batch POST

async def async_batch_post():
    payload = {"model": "my_model", "input": ["Async batch sample"]}
    responses = await client.abatch_post(
        url_path="/v1/embeddings",
        payloads=[payload, payload],
        max_concurrent_requests=4,
        timeout_s=360
    )
    print("Async batch POST responses:", responses)

# To run:
# asyncio.run(async_batch_post())

Synchronous Reranking

query = "What is the best framework?"
documents = ["Doc 1 text", "Doc 2 text", "Doc 3 text"]
rerank_response = client.rerank(
    query=query,
    texts=documents,
    return_text=True,
    batch_size=2,
    max_concurrent_requests=16,
    timeout_s=360
)
for res in rerank_response.data:
    print(f"Index: {res.index} Score: {res.score}")
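The `index` field on each result refers back to the original document list, so reordering documents by score is a one-line sort. A standalone sketch with hard-coded (index, score) pairs standing in for `rerank_response.data`:

```python
documents = ["Doc 1 text", "Doc 2 text", "Doc 3 text"]
# Hypothetical (index, score) pairs mirroring the fields on rerank_response.data.
scored = [(0, 0.12), (2, 0.98), (1, 0.45)]

# Sort by score, highest first, then map indices back to documents.
ranked = sorted(scored, key=lambda pair: pair[1], reverse=True)
top_docs = [documents[i] for i, _ in ranked]
print(top_docs)  # ['Doc 3 text', 'Doc 2 text', 'Doc 1 text']
```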

Asynchronous Reranking

async def async_rerank():
    query = "Async query sample"
    docs = ["Async doc1", "Async doc2"]
    response = await client.arerank(
        query=query,
        texts=docs,
        return_text=True,
        batch_size=1,
        max_concurrent_requests=8,
        timeout_s=360
    )
    for res in response.data:
        print(f"Async Index: {res.index} Score: {res.score}")

# To run:
# asyncio.run(async_rerank())

Synchronous Classification

texts_to_classify = [
    "This is great!",
    "I did not like it.",
    "Neutral experience."
]
classify_response = client.classify(
    inputs=texts_to_classify,
    batch_size=2,
    max_concurrent_requests=16,
    timeout_s=360
)
for group in classify_response.data:
    for result in group:
        print(f"Label: {result.label}, Score: {result.score}")
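Since each input yields a group of label/score pairs, picking the top label per input is a `max` over each group. A standalone sketch with hard-coded groups standing in for `classify_response.data`:

```python
# Hypothetical (label, score) groups mirroring classify_response.data:
# one inner list per classified input.
groups = [
    [("positive", 0.93), ("negative", 0.07)],
    [("positive", 0.08), ("negative", 0.92)],
]

# Take the highest-scoring label from each group.
top_labels = [max(group, key=lambda ls: ls[1])[0] for group in groups]
print(top_labels)  # ['positive', 'negative']
```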

Asynchronous Classification

async def async_classify():
    texts = ["Async positive", "Async negative"]
    response = await client.aclassify(
        inputs=texts,
        batch_size=1,
        max_concurrent_requests=8,
        timeout_s=360
    )
    for group in response.data:
        for res in group:
            print(f"Async Label: {res.label}, Score: {res.score}")

# To run:
# asyncio.run(async_classify())

Development

# Install prerequisites
sudo apt-get install patchelf
# Install cargo if not already installed.

# Set up a Python virtual environment
python -m venv .venv
source .venv/bin/activate

# Install development dependencies
pip install "maturin[patchelf]" pytest requests numpy

# Build and install the Rust extension in development mode
maturin develop
cargo fmt
# Run tests
pytest tests

Error Handling

The client can raise several types of errors. Here's how to handle common ones:

  • requests.exceptions.HTTPError: This error is raised for HTTP issues, such as authentication failures (e.g., 403 Forbidden if the API key is wrong), server errors (e.g., 5xx), or if the endpoint is not found (404). You can inspect e.response.status_code and e.response.text (or e.response.json() if the body is JSON) for more details.
  • ValueError: This error can occur due to invalid input parameters (e.g., an empty input list for embed, invalid batch_size or max_concurrent_requests values). It can also be raised by response.numpy() if embeddings are not float vectors or have inconsistent dimensions.

Here's an example demonstrating how to catch these errors for the embed method:

import requests # Make sure to import requests to catch its specific exceptions

# client = InferenceClient(api_base="your_api_base", api_key="your_api_key")

texts_to_embed = ["Hello world", "Another text example"]
try:
    response = client.embed(
        input=texts_to_embed,
        model="your_embedding_model", # Replace with your actual model name
        batch_size=2,
        max_concurrent_requests=4,
        timeout_s=60 # Timeout in seconds
    )
    # Process successful response
    print(f"Model used: {response.model}")
    print(f"Total tokens: {response.usage.total_tokens}")
    for item in response.data:
        embedding_preview = item.embedding[:3] if isinstance(item.embedding, list) else "Base64 Data"
        print(f"Index {item.index}, Embedding (first 3 dims or type): {embedding_preview}")

except requests.exceptions.HTTPError as e:
    status = e.response.status_code if e.response is not None else "unknown"
    print(f"An HTTP error occurred: {e} (status code: {status})")
except ValueError as e:
    print(f"Invalid input or response data: {e}")

For asynchronous methods (aembed, arerank, aclassify, abatch_post), the same exceptions will be raised by the await call and can be caught using a try...except block within an async def function.
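The async pattern looks like the following; `fake_aembed` is a stand-in for `client.aembed` that simulates an auth failure, so the sketch runs without a live endpoint:

```python
import asyncio
import requests

async def fake_aembed(**kwargs):
    # Stand-in for client.aembed: simulate a 403 authentication failure.
    response = requests.models.Response()
    response.status_code = 403
    raise requests.exceptions.HTTPError("403 Forbidden", response=response)

async def main():
    try:
        await fake_aembed(input=["hi"], model="my_model")
    except requests.exceptions.HTTPError as e:
        return f"HTTP error, status {e.response.status_code}"

print(asyncio.run(main()))  # HTTP error, status 403
```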

Contributions

Feel free to contribute to this repo; tag @michaelfeil for review.

License

MIT License

Project details

Download files

Source Distribution

baseten_inference_client-0.0.1rc2.tar.gz (32.6 kB)

Built Distributions

baseten_inference_client-0.0.1rc2-cp313-cp313t-musllinux_1_2_x86_64.whl (4.1 MB)
baseten_inference_client-0.0.1rc2-cp313-cp313t-musllinux_1_2_i686.whl (4.0 MB)
baseten_inference_client-0.0.1rc2-cp313-cp313t-musllinux_1_2_armv7l.whl (3.7 MB)
baseten_inference_client-0.0.1rc2-cp313-cp313t-musllinux_1_2_aarch64.whl (4.4 MB)
baseten_inference_client-0.0.1rc2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.9 MB)
baseten_inference_client-0.0.1rc2-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl (4.0 MB)
baseten_inference_client-0.0.1rc2-cp313-cp313t-manylinux_2_17_i686.manylinux2014_i686.whl (4.0 MB)
baseten_inference_client-0.0.1rc2-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl (3.4 MB)
baseten_inference_client-0.0.1rc2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (4.1 MB)
baseten_inference_client-0.0.1rc2-cp313-cp313t-macosx_11_0_arm64.whl (1.7 MB)
baseten_inference_client-0.0.1rc2-cp313-cp313t-macosx_10_12_x86_64.whl (1.7 MB)
baseten_inference_client-0.0.1rc2-cp38-abi3-win_amd64.whl (1.5 MB)
baseten_inference_client-0.0.1rc2-cp38-abi3-win32.whl (1.4 MB)
baseten_inference_client-0.0.1rc2-cp38-abi3-musllinux_1_2_x86_64.whl (4.1 MB)
baseten_inference_client-0.0.1rc2-cp38-abi3-musllinux_1_2_i686.whl (4.1 MB)
baseten_inference_client-0.0.1rc2-cp38-abi3-musllinux_1_2_armv7l.whl (3.7 MB)
baseten_inference_client-0.0.1rc2-cp38-abi3-musllinux_1_2_aarch64.whl (4.4 MB)
baseten_inference_client-0.0.1rc2-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.9 MB)
baseten_inference_client-0.0.1rc2-cp38-abi3-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl (4.0 MB)
baseten_inference_client-0.0.1rc2-cp38-abi3-manylinux_2_17_i686.manylinux2014_i686.whl (4.0 MB)
baseten_inference_client-0.0.1rc2-cp38-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl (3.4 MB)
baseten_inference_client-0.0.1rc2-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (4.1 MB)
baseten_inference_client-0.0.1rc2-cp38-abi3-macosx_11_0_arm64.whl (1.7 MB)
baseten_inference_client-0.0.1rc2-cp38-abi3-macosx_10_12_x86_64.whl (1.7 MB)

