
Project description

llm-anyscale-endpoints


LLM plugin for models hosted by Anyscale Endpoints

Installation

First, install the LLM command-line utility.
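If you don't already have LLM, one way to install it (assuming a working Python 3 environment) is with pip:

pip install llm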

Now install this plugin in the same environment as LLM.

llm install llm-anyscale-endpoints

Configuration

You will need an API key from Anyscale Endpoints, which you can obtain from the Anyscale Endpoints site.

You can set that as an environment variable called LLM_ANYSCALE_ENDPOINTS_KEY, or add it to LLM's set of saved keys using:

llm keys set anyscale-endpoints
Enter key: <paste key here>
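
Alternatively, to use the environment variable instead (Bash/Zsh syntax shown; adjust for your shell):

export LLM_ANYSCALE_ENDPOINTS_KEY='<paste key here>'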

Usage

To list available models, run:

llm models list

You should see a list that looks something like this:

AnyscaleEndpoints: meta-llama/Llama-2-7b-chat-hf
AnyscaleEndpoints: meta-llama/Llama-2-13b-chat-hf
AnyscaleEndpoints: mistralai/Mixtral-8x7B-Instruct-v0.1
AnyscaleEndpoints: mistralai/Mistral-7B-Instruct-v0.1
AnyscaleEndpoints: meta-llama/Llama-2-70b-chat-hf
AnyscaleEndpoints: codellama/CodeLlama-70b-Instruct-hf
AnyscaleEndpoints: mistralai/Mixtral-8x22B-Instruct-v0.1
AnyscaleEndpoints: mlabonne/NeuralHermes-2.5-Mistral-7B
AnyscaleEndpoints: google/gemma-7b-it
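
If you have other LLM plugins installed, their models will appear in this list too. One way to narrow the output to this plugin's models is to filter on the AnyscaleEndpoints prefix, for example:

llm models list | grep AnyscaleEndpoints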

To run a prompt against a model, pass its full model ID to the -m option, like this:

llm -m mistralai/Mixtral-8x22B-Instruct-v0.1 \
  'Five strident names for a pet walrus' \
  --system 'You love coming up with creative names for pets'
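
You can also run the same prompt from Python using LLM's Python API. This is a minimal sketch, assuming the plugin is installed in the same environment and your key has been saved as described above (you can alternatively assign it to model.key):

import llm

# Look up the model by its full ID
model = llm.get_model("mistralai/Mixtral-8x22B-Instruct-v0.1")

# Prompt the model; the saved anyscale-endpoints key is picked up automatically
response = model.prompt(
    "Five strident names for a pet walrus",
    system="You love coming up with creative names for pets",
)
print(response.text())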

You can set a shorter alias for a model using the llm aliases command like so:

llm aliases set mix22b mistralai/Mixtral-8x22B-Instruct-v0.1

Now you can prompt Mixtral-8x22B-Instruct-v0.1 using the alias mix22b (below, -s is the short form of --system):

cat llm_anyscale_endpoints.py | \
  llm -m mix22b -s 'explain this code'
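
If you find yourself reaching for one model most of the time, LLM also lets you set a default model, so the -m option can be omitted, for example:

llm models default mistralai/Mixtral-8x22B-Instruct-v0.1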

You can refresh the list of models by running:

llm anyscale-endpoints refresh

This will fetch the latest list of models from Anyscale Endpoints and store it in a local cache file.

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-anyscale-endpoints
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llm_anyscale_endpoints-0.5.tar.gz (7.7 kB, source)

Built Distribution

llm_anyscale_endpoints-0.5-py3-none-any.whl (8.0 kB, Python 3 wheel)

File details

Details for the file llm_anyscale_endpoints-0.5.tar.gz.

File metadata

  • Download URL: llm_anyscale_endpoints-0.5.tar.gz
  • Upload date:
  • Size: 7.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for llm_anyscale_endpoints-0.5.tar.gz
SHA256: bc99508b8bf4b6ef5d482a195202e4e966edc0977fd4d3b474bc13f93fb4c110
MD5: 2a3722af19ea6e6e31aa62c89bd3b508
BLAKE2b-256: 49888469d98dd0836cf9eba550bd84b0c84f5b77b882ce42647f4c8d57286f6f

See more details on using hashes here.

File details

Details for the file llm_anyscale_endpoints-0.5-py3-none-any.whl.

File metadata

File hashes

Hashes for llm_anyscale_endpoints-0.5-py3-none-any.whl
SHA256: 80ac5d14f918dc54ab1913fab99fdac06a8a4b1e0ab06d5529dad3a150234285
MD5: ac5906b522b3efc932d33b724bd4cf7c
BLAKE2b-256: c771017662ac517fad65f955c7b4a701602b537d0f50bd562812af5f78d0c1da

See more details on using hashes here.
