
Python client for Together's Cloud Platform!

Project description

The Together Python Library is the official Python client for Together's API platform. It provides a convenient way to interact with the Together APIs and makes it easy to integrate the inference API into your applications.

Installation

To install the Together library, simply run:

pip install --upgrade together

Usage

🚧 You will need to create a free account with together.ai to obtain a Together API Key.

The Python Library requires your Together API Key to be configured. This key can be found in your account settings on the Playground: click the Profile button and navigate to Settings > API Keys.

The API Key can be configured by either setting the TOGETHER_API_KEY environment variable, like this:

export TOGETHER_API_KEY=xxxxx

Or by setting together.api_key:

import together
together.api_key = "xxxxx"
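To avoid hard-coding the key in source files, a common pattern is to read it from the environment at startup. This is just an illustrative sketch (the fallback placeholder is not part of the library):

```python
import os

# Read the key from the TOGETHER_API_KEY environment variable; fall back
# to a placeholder so a missing variable surfaces at the API call instead
# of crashing here.
api_key = os.environ.get("TOGETHER_API_KEY", "xxxxx")

# together.api_key = api_key  # hand the key to the library as shown above
```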

Once you've provided your API key, you can browse our list of available models:

import together

# set your API key
together.api_key = "xxxxx"

# list available models and descriptions
models = together.Models.list()

# print the first model's name
print(models[0]['name'])

Let's start an instance of one of the models in the list above. You can also start an instance by clicking play on any model in the models playground.

together.Models.start("togethercomputer/RedPajama-INCITE-7B-Base")

Once you've started a model instance, you can start querying:

import together

# set your API key
together.api_key = "xxxxx"

output = together.Complete.create("Space robots", model="togethercomputer/RedPajama-INCITE-7B-Base")

# print generated text
print(output['output']['choices'][0]['text'])
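Completion requests usually take sampling parameters as well. The parameter names below (max_tokens, temperature, stop) follow common completion-API conventions and are assumptions here; check the Complete docs for the exact set your SDK version supports. The network call itself is commented out since it needs a running model instance:

```python
# Hypothetical parameter set for Complete.create; confirm the exact
# names against the Complete docs.
params = {
    "prompt": "Space robots",
    "model": "togethercomputer/RedPajama-INCITE-7B-Base",
    "max_tokens": 64,     # cap on the number of generated tokens
    "temperature": 0.8,   # higher values sample more randomly
    "stop": ["\n\n"],     # stop generating at the first blank line
}

# output = together.Complete.create(**params)  # needs a running instance
# print(output['output']['choices'][0]['text'])
```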

Check which models have been started or stopped:

together.Models.instances()

To stop your model instance:

together.Models.stop("togethercomputer/RedPajama-INCITE-7B-Base")

Chat

The chat command is a CLI-based chat application that can be used for back-and-forth conversations with models in a pre-defined format.

Refer to the Chat docs on how to chat with your favorite models.

Complete

The complete command can be used to run inference with any model available in the Together Playground. This is recommended for custom applications and raw queries. It provides all the functions you need to run inference on the leading open-source models available with the Together API, either from the command-line utility in your terminal or from your own Python applications.

Refer to the Complete docs on how you can query these models.

Image

The image command can be used to generate images with the leading open-source image-generation models available with the Together API, either from the command-line utility in your terminal or from your own Python applications.

Refer to the Image docs on how you can generate images.

Files

Files are used for uploading training and validation datasets that are used for fine-tuning.
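Fine-tuning datasets are plain JSONL: one JSON object per line. The "text" field name used below is an assumption modeled on common fine-tuning formats, so verify it against the Files docs. A minimal sketch of writing and validating such a file:

```python
import json
import os
import tempfile

# Two toy training examples, one JSON object per line.
# (The "text" field name is an assumption; check the Files docs.)
samples = [
    {"text": "<human>: What is a robot?\n<bot>: A machine that can act autonomously."},
    {"text": "<human>: Name a red planet.\n<bot>: Mars."},
]

path = os.path.join(tempfile.gettempdir(), "train_sample.jsonl")
with open(path, "w") as f:
    for sample in samples:
        f.write(json.dumps(sample) + "\n")

# Basic validation: every line must parse as JSON and contain "text".
with open(path) as f:
    records = [json.loads(line) for line in f]
assert all("text" in record for record in records)
```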

Refer to the Files docs for the correct way to prepare and manage your files.

Fine-tuning

Run and manage your fine-tuning jobs, enabling you to tune all model layers, control hyper-parameters, and download weights and checkpoints.
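Conceptually, a fine-tuning job ties a previously uploaded training file to a base model plus hyper-parameters. The field names below are assumptions modeled on typical fine-tuning APIs (the real ones are in the Fine-tuning docs), and the SDK call is commented out:

```python
# Hypothetical fine-tuning job description; field names are assumptions,
# confirm them against the Fine-tuning docs.
job_request = {
    "training_file": "file-xxxxx",  # ID of an uploaded training dataset
    "model": "togethercomputer/RedPajama-INCITE-7B-Base",
    "n_epochs": 4,          # example hyper-parameters
    "learning_rate": 1e-5,
}

# job = together.Finetune.create(**job_request)  # requires the SDK + API key
```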

Refer to the Fine-tuning docs on how to get started.

Command-line interface

All the above commands are also available through a CLI:

# list commands
together --help

# list available models
together models list

# start a model
together models start togethercomputer/RedPajama-INCITE-7B-Base

# create completion
together complete "Space robots" -m togethercomputer/RedPajama-INCITE-7B-Base

# check which models are running
together models instances

# stop a model
together models stop togethercomputer/RedPajama-INCITE-7B-Base

Contributing

  1. Clone the repo and make your changes
  2. Run pip install "together[quality]" (quoted so the shell does not expand the brackets)
  3. From the root of the repo, run
    • black .
    • ruff .
      • And if necessary, ruff . --fix
    • mypy --strict .
  4. Create a PR

Project details



Download files

Download the file for your platform.

Source Distribution

together-0.1.3.tar.gz (23.2 kB)


Built Distribution


together-0.1.3-py3-none-any.whl (32.1 kB)


File details

Details for the file together-0.1.3.tar.gz.

File metadata

  • Download URL: together-0.1.3.tar.gz
  • Size: 23.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.7.16

File hashes

Hashes for together-0.1.3.tar.gz
  • SHA256: 7eb7761a5ed85ba89e08aa6138df45afe97bfbde4656f572967832cb7da9d472
  • MD5: b779a8677df448b8fa560f89503cdc33
  • BLAKE2b-256: 2df3290573ee479b7a2c0da5d59ab46d51a3422582528fa8b96f3e116d244633


File details

Details for the file together-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: together-0.1.3-py3-none-any.whl
  • Size: 32.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.7.16

File hashes

Hashes for together-0.1.3-py3-none-any.whl
  • SHA256: 208e2eb847d02fc5ad43e950a842ba253fa1aa20831ebed21deac5e5aa38fd70
  • MD5: e19da02c14ba3f322995ed943bc32f37
  • BLAKE2b-256: 86357cf7540441e5dc6ab3091ebe4f66dcff54fde131cd07dd85e8a170a1ff3b

