
Python client for Together's Cloud Platform!

Project description

The Together Python Library is the official Python client for Together's API platform. It provides a convenient way to interact with the Together APIs and enables easy integration of the inference API into your applications.

Installation

To install the Together CLI, simply run:

pip install --upgrade together

Usage

🚧 You will need to create a free account with together.ai to obtain a Together API Key.

The Python Library requires your Together API Key to be configured. This key can be found in your account settings on the Playground: navigate to Profile Button > Settings > API Keys.

The API Key can be configured by either setting the TOGETHER_API_KEY environment variable, like this:

export TOGETHER_API_KEY=xxxxx

Or by setting together.api_key:

import together
together.api_key = "xxxxx"
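A common pattern in application code is to combine the two approaches, reading the key from the environment with a fallback. A minimal sketch using only the standard library (the placeholder value is illustrative):

```python
import os

# Prefer the environment variable; fall back to a placeholder
api_key = os.environ.get("TOGETHER_API_KEY", "xxxxx")
```

You can then assign the result to together.api_key.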

Once you've provided your API key, you can browse our list of available models:

import together

# set your API key
together.api_key = "xxxxx"

# list available models and descriptions
models = together.Models.list()

# print the first model's name
print(models[0]['name'])
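Each entry in the returned list is a dict with a 'name' field, as the lookup above shows. For illustration, filtering that list by name might look like this (the sample entries below are assumptions; real entries carry more fields):

```python
# Illustrative entries shaped like the list from together.Models.list()
models = [
    {"name": "togethercomputer/RedPajama-INCITE-7B-Base"},
    {"name": "togethercomputer/RedPajama-INCITE-7B-Chat"},
]

# Collect only the chat-tuned variants by name
chat_models = [m["name"] for m in models if "Chat" in m["name"]]
print(chat_models)  # → ['togethercomputer/RedPajama-INCITE-7B-Chat']
```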

Let's start an instance of one of the models in the list above. You can also start an instance by clicking play on any model in the models playground.

together.Models.start("togethercomputer/RedPajama-INCITE-7B-Base")

Once you've started a model instance, you can start querying:

import together

# set your API key
together.api_key = "xxxxx"

# generate a completion for a prompt
output = together.Complete.create("Space robots", model="togethercomputer/RedPajama-INCITE-7B-Base")

# print generated text
print(output['output']['choices'][0]['text'])
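The response is a plain nested dict, so the lookup above can be traced on a minimal, illustrative response (the text value here is invented for the sketch):

```python
# A response shaped like the one returned by together.Complete.create
output = {
    "output": {
        "choices": [
            {"text": " are machines built to operate in orbit."}
        ]
    }
}

# Same lookup as above: the first choice's generated text
text = output["output"]["choices"][0]["text"]
print(text)
```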

Check which models have been started or stopped:

together.Models.instances()

To stop your model instance:

together.Models.stop("togethercomputer/RedPajama-INCITE-7B-Base")

Chat

The chat command is a CLI-based chat application that can be used for back-and-forth conversations with models in a pre-defined format.

Refer to the Chat docs on how to chat with your favorite models.

Complete

The complete command can be used to run inference with all the models available in the Together Playground. This is recommended for custom applications and raw queries. It provides all the functions you need to run inference on the leading open-source models available with the Together API. You can use these functions from the command-line utility in your terminal or in your custom Python applications.

Refer to the Complete docs on how you can query these models.

Image

The image command can be used to generate images from the leading open-source image generation models available with the Together API. You can use these functions from the command-line utility in your terminal or in your custom Python applications.

Refer to the Image docs on how you can generate images.

Files

Files are used for uploading training and validation datasets that are used for fine-tuning.

Refer to the Files docs for the correct way to prepare and manage your files.
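Training data is commonly stored as newline-delimited JSON (one example per line). A minimal sketch of writing such a file — the "text" field name is an assumption here, so check the Files docs for the exact schema your model expects:

```python
import json

# Two toy training examples; real datasets are much larger
samples = [
    {"text": "Space robots repair satellites in orbit."},
    {"text": "Space robots survey the lunar surface."},
]

# Write one JSON object per line (JSONL)
with open("train.jsonl", "w") as f:
    for sample in samples:
        f.write(json.dumps(sample) + "\n")
```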

Fine-tuning

Run and manage your fine-tuning jobs, enabling you to tune all model layers, control hyper-parameters, and download the weights and checkpoints.

Refer to the Fine-tuning docs on how to get started.

Command-line interface

All the above commands are also available through a CLI:

# list commands
together --help

# list available models
together models list

# start a model
together models start togethercomputer/RedPajama-INCITE-7B-Base

# create completion
together complete "Space robots" -m togethercomputer/RedPajama-INCITE-7B-Base

# check which models are running
together models instances

# stop a model
together models stop togethercomputer/RedPajama-INCITE-7B-Base

Contributing

  1. Clone the repo and make your changes
  2. Run pip install "together[quality]" (quoted so your shell does not expand the brackets)
  3. From the root of the repo, run
    • black .
    • ruff .
      • And if necessary, ruff . --fix
    • mypy --strict .
  4. Create a PR

Project details


Release history

This version: 0.1.2
