

Project description

llm-anyscale-endpoints


LLM plugin for models hosted by Anyscale Endpoints

Installation

First, install the LLM command-line utility.
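
If you don't already have it, LLM itself can be installed with pip (pipx and Homebrew installs also work):

pip install llm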

Now install this plugin in the same environment as LLM.

llm install llm-anyscale-endpoints
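
To confirm the plugin was picked up, you can ask LLM to list its installed plugins; llm-anyscale-endpoints should appear in the output:

llm plugins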

Configuration

You will need an API key from Anyscale Endpoints; you can obtain one from your Anyscale Endpoints account.

You can set that as an environment variable called LLM_ANYSCALE_ENDPOINTS_KEY, or add it to LLM's set of saved keys using:

llm keys set anyscale-endpoints
Enter key: <paste key here>
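
If you prefer the environment-variable route, a minimal example (the value shown is a placeholder):

export LLM_ANYSCALE_ENDPOINTS_KEY='<paste key here>'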

Usage

To list available models, run:

llm models list

You should see a list that looks something like this:

AnyscaleEndpoints: meta-llama/Llama-2-7b-chat-hf
AnyscaleEndpoints: meta-llama/Llama-2-13b-chat-hf
AnyscaleEndpoints: mistralai/Mixtral-8x7B-Instruct-v0.1
AnyscaleEndpoints: mistralai/Mistral-7B-Instruct-v0.1
AnyscaleEndpoints: meta-llama/Llama-2-70b-chat-hf
AnyscaleEndpoints: codellama/CodeLlama-70b-Instruct-hf
AnyscaleEndpoints: mistralai/Mixtral-8x22B-Instruct-v0.1
AnyscaleEndpoints: mlabonne/NeuralHermes-2.5-Mistral-7B
AnyscaleEndpoints: google/gemma-7b-it
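
If you have other plugins installed the full list will be longer; one way to show only these models is to filter the output with standard shell tools, for example:

llm models list | grep AnyscaleEndpoints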

To run a prompt against a model, pass its full model ID to the -m option, like this:

llm -m mistralai/Mixtral-8x22B-Instruct-v0.1 \
  'Five strident names for a pet walrus' \
  --system 'You love coming up with creative names for pets'
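
Because these models run through LLM, the usual LLM workflow applies. For example, assuming the command above was your most recent prompt, you can continue that conversation with -c:

llm -c 'Pick your favorite of those names and explain why'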

You can set a shorter alias for a model using the llm aliases command like so:

llm aliases set mix22b mistralai/Mixtral-8x22B-Instruct-v0.1

Now you can prompt Mixtral-8x22B-Instruct-v0.1 using the alias mix22b:

cat llm_anyscale_endpoints.py | \
  llm -m mix22b -s 'explain this code'
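
Aliases are managed by LLM itself rather than by this plugin; to check which aliases are currently defined, run:

llm aliases list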

You can refresh the list of models by running:

llm anyscale-endpoints refresh

This will fetch the latest list of models from Anyscale Endpoints and store it in a local cache file.

Development

To set up this plugin locally, first check out the code, then create a new virtual environment:

cd llm-anyscale-endpoints
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest
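
To run a subset of tests while iterating, the usual pytest selection flags work; the -k pattern below is just an illustrative example:

pytest -x -k anyscale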

