Python API client for Bytez service


API Documentation

Introduction

Welcome to the Bytez API documentation! This API provides access to various machine learning models for serverless operation. Below, you will find examples demonstrating how to interact with the API using our Python client library.

Python Client Library Usage Examples

Authentication

Getting Your Key

To use this API, you need an API key. Obtain your key by visiting the settings page on Bytez.

Always include your API key when initializing the client:

from bytez import Bytez

client = Bytez('YOUR_API_KEY')
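
If you prefer not to hard-code the key, a common pattern (not specific to this library) is to read it from an environment variable:

import os

from bytez import Bytez

# BYTEZ_API_KEY is an illustrative variable name, not one the client reads automatically
client = Bytez(os.environ['BYTEZ_API_KEY'])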

List Available Models

Lists the currently available models and provides basic information about each one, such as the RAM required.

models = client.list_models()

print(models)

List Serverless Instances

List your serverless instances

instances = client.list_instances()

print(instances)

Make a Model Serverless

Make a HuggingFace model serverless + available on this API! Running this command queues a job. You'll receive an email when the model is ready.

@param modelId The HuggingFace modelId, for example openai-community/gpt2

model_id = 'openai-community/gpt2'

job_status = client.process(model_id)

print(job_status)

Get a Model

Get a model, so you can check its status, load, run, or shut it down.

@param modelId The HuggingFace modelId, for example openai-community/gpt2

model = client.model('openai-community/gpt2')

Start the model

Convenience method that runs model.start() and then waits for the model to be ready.

@param options Serverless configuration

model.load()

## serverless params by default are {'concurrency': 1, 'timeout': 300}
# Concurrency
# Number of serverless instances.
#
# For example, if you set to `3`, then you can do 3 parallel inferences.
#
# If you set to `1`, then you can do 1 inference at a time.
#
# Default: `1`

# Timeout
# Seconds to wait before serverless instance auto-shuts down.
#
# By default, if an instance doesn't receive a request after `300` seconds, then it shuts down.
#
# Receiving a request resets this timer.
#
# Default: `300`
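
For example, to allow three parallel inferences and extend the idle timeout, you could pass the options directly (a sketch, assuming load() accepts the options dictionary described above):

model.load({'concurrency': 3, 'timeout': 600})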

Check Model Status

Check on the status of the model to see if it is deploying, running, or stopped.

status = model.status()

print(status)
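
If you want to wait for a particular state, you can poll status() in a loop. A minimal sketch; the state names below are assumptions, so adapt the comparison to whatever status() actually returns:

import time

while True:
  status = model.status()
  print(status)
  if status in ('RUNNING', 'STOPPED', 'FAILED'):  # assumed state names
    break
  time.sleep(5)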

Run a Model

Run inference

output = model.run("Once upon a time there was a")

print(output)

Run a Model with HuggingFace params

Run inference with HuggingFace parameters.

output = model.run("Once upon a time there was a", model_params={"max_new_tokens":1,"min_new_tokens":1})

print(output)
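
Any generation parameter the underlying HuggingFace model supports can be passed the same way; the values below are only illustrative:

output = model.run(
  "Once upon a time there was a",
  model_params={"max_new_tokens": 20, "temperature": 0.7, "do_sample": True},
)

print(output)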

Stream the response

Streaming text

output = model.run("Once upon a time there was a", stream=True)

for chunk in output:
  print(chunk)

Shutdown a Model

Serverless models auto-shut down on their own, though you can stop one early with this method.

model.stop()
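
Putting the pieces together, an end-to-end sketch that simply chains the calls shown above and stops the instance even if inference raises (illustrative only):

from bytez import Bytez

client = Bytez('YOUR_API_KEY')
model = client.model('openai-community/gpt2')

model.load()
try:
  output = model.run("Once upon a time there was a")
  print(output)
finally:
  # Stop early rather than waiting for the idle timeout
  model.stop()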

Feedback

We value your feedback to improve our documentation and services. If you have any suggestions, please join our Discord or contact us via email.
