
Python API client for Bytez service

Project description

API Documentation

Introduction

Welcome to the Bytez API documentation! This API provides access to various machine learning models for serverless operation. Below, you will find examples demonstrating how to interact with the API using our Python client library.

Python Client Library Usage Examples

Authentication

Getting Your Key

To use this API, you need an API key. Obtain your key by visiting the settings page on Bytez.

Always include your API key when initializing the client:

from bytez import Bytez

client = Bytez('YOUR_API_KEY')

List Available Models

Lists the currently available models and provides basic information about each one, such as the RAM required.

models = client.list_models()

print(models)
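As a sketch of what you might do with the list, you could filter models client-side by their RAM requirement. The field names below (`modelId`, `ramRequiredGB`) are assumptions for illustration; the real response shape may differ:

```python
# Hypothetical response shape -- the real field names may differ.
models = [
    {"modelId": "openai-community/gpt2", "ramRequiredGB": 4},
    {"modelId": "openai-community/gpt2-medium", "ramRequiredGB": 8},
]

def models_within_ram(models, max_gb):
    """Return the IDs of models that fit within a given RAM budget."""
    return [m["modelId"] for m in models if m["ramRequiredGB"] <= max_gb]

print(models_within_ram(models, 4))  # ['openai-community/gpt2']
```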

List Serverless Instances

List your serverless instances

instances = client.list_instances()

print(instances)

Make a Model Serverless

Make a HuggingFace model serverless + available on this API! Running this command queues a job. You'll receive an email when the model is ready.

@param modelId The HuggingFace modelId, for example openai-community/gpt2

model_id = 'openai-community/gpt2'

job_status = client.process(model_id)

print(job_status)
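Since processing is queued and completes asynchronously, a script may want to poll until the model is ready rather than wait for the email. A minimal polling helper (the status strings in the commented usage are assumptions, not the API's documented values):

```python
import time

def wait_for(fetch_status, done_states, interval=5, max_attempts=60):
    """Call fetch_status() repeatedly until it returns one of done_states."""
    for _ in range(max_attempts):
        status = fetch_status()
        if status in done_states:
            return status
        time.sleep(interval)
    raise TimeoutError("model did not reach a terminal state in time")

# Usage against the client might look like (status strings are hypothetical):
# wait_for(lambda: model.status(), done_states={"RUNNING", "FAILED"})
```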

Get a Model

Get a model, so you can check its status, load, run, or shut it down.

@param modelId The HuggingFace modelId, for example openai-community/gpt2

model = client.model('openai-community/gpt2')

Start the model

Convenience method that runs model.start() and then waits for the model to be ready.

@param options Serverless configuration

model.load()

# Serverless params default to {'concurrency': 1, 'timeout': 300}
# Concurrency
# Number of serverless instances.
#
# For example, if you set to `3`, then you can do 3 parallel inferences.
#
# If you set to `1`, then you can do 1 inference at a time.
#
# Default: `1`

# Timeout
# Seconds to wait before serverless instance auto-shuts down.
#
# By default, if an instance doesn't receive a request after `300` seconds, then it shuts down.
#
# Receiving a request resets this timer.
#
# Default: `300`
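A small sketch of building and sanity-checking an options dict before passing it to `model.load(options)`. The field names follow the defaults above; the helper itself is illustrative, not part of the client:

```python
DEFAULTS = {"concurrency": 1, "timeout": 300}

def serverless_options(**overrides):
    """Merge user overrides onto the defaults, with basic validation."""
    opts = {**DEFAULTS, **overrides}
    if opts["concurrency"] < 1:
        raise ValueError("concurrency must be at least 1")
    if opts["timeout"] <= 0:
        raise ValueError("timeout must be a positive number of seconds")
    return opts

options = serverless_options(concurrency=3)  # 3 parallel inferences
# model.load(options)
print(options)  # {'concurrency': 3, 'timeout': 300}
```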

Check Model Status

Check the status of the model to see whether it's deploying, running, or stopped.

status = model.status()

print(status)

Run a Model

Run inference

output = model.run("Once upon a time there was a")

print(output)

Run a Model with HuggingFace params

Run inference with HuggingFace parameters.

output = model.run("Once upon a time there was a", model_params={"max_new_tokens":1,"min_new_tokens":1})

print(output)
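`max_new_tokens` and `min_new_tokens` bound the generated length; pinning both to the same value forces exactly that many new tokens. A hypothetical helper for assembling a HuggingFace-style params dict with a quick consistency check (not part of the client library):

```python
def generation_params(max_new_tokens, min_new_tokens=1, **extra):
    """Build a HuggingFace-style generation params dict (illustrative)."""
    if min_new_tokens > max_new_tokens:
        raise ValueError("min_new_tokens cannot exceed max_new_tokens")
    return {"max_new_tokens": max_new_tokens,
            "min_new_tokens": min_new_tokens,
            **extra}

params = generation_params(20, temperature=0.7)
# output = model.run("Once upon a time there was a", model_params=params)
```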

Stream the response

Streaming text

stream = model.run("Once upon a time there was a", stream=True)

for chunk in stream:
  print(chunk)
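Chunks arrive incrementally, so if you also want the full text you can accumulate them while printing. Sketched below with a fake chunk source; with the client, the chunks come from iterating the streamed response:

```python
def collect_stream(chunks):
    """Print chunks as they arrive and return the assembled text."""
    pieces = []
    for chunk in chunks:
        print(chunk, end="", flush=True)
        pieces.append(chunk)
    return "".join(pieces)

# With the client this would be:
# full_text = collect_stream(model.run(prompt, stream=True))
fake_chunks = ["Once", " upon", " a", " time"]
full_text = collect_stream(fake_chunks)  # "Once upon a time"
```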

Shutdown a Model

Serverless models shut down automatically, but you can stop one early with this method.

model.stop()
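To guarantee a model is stopped even if inference raises, you can wrap the load/stop pair in a context manager. A sketch, assuming only the `load()` and `stop()` methods shown above:

```python
from contextlib import contextmanager

@contextmanager
def running_model(model):
    """Load a model and guarantee it is stopped afterwards."""
    model.load()
    try:
        yield model
    finally:
        model.stop()

# Usage:
# with running_model(client.model("openai-community/gpt2")) as m:
#     print(m.run("Once upon a time there was a"))
```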

Feedback

We value your feedback to improve our documentation and services. If you have any suggestions, please join our Discord or contact us via email.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

bytez-0.2.26.tar.gz (5.4 kB)

Uploaded Source

Built Distribution

bytez-0.2.26-py3-none-any.whl (5.9 kB)

Uploaded Python 3

File details

Details for the file bytez-0.2.26.tar.gz.

File metadata

  • Download URL: bytez-0.2.26.tar.gz
  • Upload date:
  • Size: 5.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.18

File hashes

Hashes for bytez-0.2.26.tar.gz

  • SHA256: a3184f7fc6bac3ba05f0e45b83dcd5bf842474d4aa5e88c3afb1d0f9b64de09d
  • MD5: dbc40c5a7b1f1953fa94e3645bc33f34
  • BLAKE2b-256: 4768e285f98ebcabd5c3ec93f3c2865c4a9294e793fd1d167852f068719cf3c1


File details

Details for the file bytez-0.2.26-py3-none-any.whl.

File metadata

  • Download URL: bytez-0.2.26-py3-none-any.whl
  • Upload date:
  • Size: 5.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.18

File hashes

Hashes for bytez-0.2.26-py3-none-any.whl

  • SHA256: b9a13f22ed5fa9bfc1f21a40a75e143cbc73e360581fe28e11546acfa9a82f8d
  • MD5: ed983d06be7ffb7b718771ee5fe8e7f4
  • BLAKE2b-256: 373d224fca046c9dfa08fcc73479d19f0ce864db5d9a5ccbc45b949dfee1fd54

