
llm-togetherai


LLM plugin for models hosted by Together AI

Installation

First, install the LLM command-line utility.

Now install this plugin in the same environment as LLM.

llm install llm-togetherai

Configuration

You will need an API key from Together AI. You can create one in your Together AI account settings.

You can set that as an environment variable called TOGETHER_API_KEY, or add it to LLM's collection of saved keys using:

llm keys set together
Enter key: <paste key here>

Usage

To list available models, run:

llm models list

You should see a list that looks something like this:

together: together/meta-llama/Llama-2-7b-chat-hf
together: together/meta-llama/Llama-2-13b-chat-hf
together: together/meta-llama/Llama-2-70b-chat-hf
together: together/mistralai/Mistral-7B-Instruct-v0.1
together: together/NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO
...
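Each line pairs the provider name with the full model ID. If you want to script against this output, a small helper can pull out just the Together IDs (illustrative; the `together: ` prefix is taken from the listing above):

```python
def together_model_ids(listing: str) -> list[str]:
    """Extract full model IDs from `llm models list` output,
    keeping only lines for the Together provider."""
    ids = []
    for line in listing.splitlines():
        if line.startswith("together: "):
            ids.append(line.split(": ", 1)[1].strip())
    return ids
```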

To run a prompt against a model, pass its full model ID to the -m option, like this:

llm -m together/meta-llama/Llama-2-7b-chat-hf "Five creative names for a pet robot"

You can set a shorter alias for a model using the llm aliases command like so:

llm aliases set llama2-7b together/meta-llama/Llama-2-7b-chat-hf

Now you can prompt the model using:

cat llm_togetherai.py | llm -m llama2-7b -s 'write some pytest tests for this'

Vision models

Some Together AI models can accept image attachments. To find them, run:

llm models --options -q together

And look for models that list these attachment types:

  Attachment types:
    image/gif, image/jpeg, image/png, image/webp

You can feed these models images as URLs or file paths, for example:

curl https://static.simonwillison.net/static/2024/pelicans.jpg | llm \
    -m together/meta-llama/Llama-3.2-11B-Vision-Instruct-Turbo 'describe this image' -a -
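The attachment types above are plain MIME types, so before sending a local file you can sanity-check it with the standard library (a sketch; the supported set here is copied from the listing above and may vary per model):

```python
import mimetypes

SUPPORTED_IMAGE_TYPES = {"image/gif", "image/jpeg", "image/png", "image/webp"}

def is_supported_image(filename: str) -> bool:
    """True if the filename's guessed MIME type is in the supported set."""
    guessed, _ = mimetypes.guess_type(filename)
    return guessed in SUPPORTED_IMAGE_TYPES
```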

Listing models

The llm models -q together command lists all available models. For more detailed information about each model, run:

llm together models

Output starts like this:

- id: meta-llama/Llama-2-7b-chat-hf
  name: Llama-2-7b-chat-hf
  context_length: 4,096
  type: chat
  organization: Together
  pricing: input $0.2/M, output $0.2/M

- id: meta-llama/Llama-2-13b-chat-hf
  name: Llama-2-13b-chat-hf
  context_length: 4,096
  type: chat
  organization: Together
  pricing: input $0.3/M, output $0.3/M
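The pricing lines are quoted per million tokens, so a rough per-request cost is a simple proportion (illustrative arithmetic only; check Together AI's current pricing before relying on it):

```python
def estimate_cost_usd(input_tokens: int, output_tokens: int,
                      input_per_million: float,
                      output_per_million: float) -> float:
    """Dollar cost of one request at per-million-token prices."""
    return (input_tokens * input_per_million
            + output_tokens * output_per_million) / 1_000_000
```

At Llama-2-7b's listed $0.2/M in both directions, a 1,000-token prompt with a 500-token reply would cost about $0.0003.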

Add --json to get back JSON instead:

llm together models --json
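Assuming the JSON mirrors the fields shown above (an assumption worth verifying against the actual output), it is easy to filter, for example to chat-type models only:

```python
import json

def chat_model_ids(models_json: str) -> list[str]:
    """Return IDs of models whose type is "chat" from the --json output."""
    return [m["id"] for m in json.loads(models_json)
            if m.get("type") == "chat"]
```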

Refreshing the model cache

The plugin caches the list of available models for 1 hour. To refresh this cache manually:

llm together refresh

This will fetch the latest models from the Together AI API and update the local cache.

API Endpoint

This plugin uses the Together AI API endpoint:

https://api.together.xyz/v1/models

The models are cached locally in your LLM user directory to improve performance and reduce API calls.
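Authenticated requests to that endpoint carry the key as a Bearer token. A minimal sketch of building such a request with the standard library (the real plugin may use a different HTTP client):

```python
import urllib.request

TOGETHER_MODELS_URL = "https://api.together.xyz/v1/models"

def build_models_request(api_key: str) -> urllib.request.Request:
    """Build a GET request for the models endpoint with Bearer auth."""
    return urllib.request.Request(
        TOGETHER_MODELS_URL,
        headers={"Authorization": f"Bearer {api_key}"},
    )
```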

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-togetherai
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

llm install -e '.[test]'

To run the tests:

pytest

License

Apache 2.0
