An LLM CLI plugin for Cloudflare Workers AI models.
Project description
llm-cloudflare
A plugin for the llm CLI that allows you to use the text generation models (LLMs) running globally on Cloudflare Workers AI, including models like Llama 3.1, Mistral 7B, Gemma and a number of task-specific fine-tunes.
llm-cloudflare is useful for:
- Using and building with LLMs that may not run efficiently on your local machine (limited GPU, memory, etc.), letting Workers AI run them on a GPU near you instead.
- Validating the performance of and/or comparing multiple models.
- Experimenting without needing to download models ahead-of-time.
Usage
Prerequisite: You'll need the llm CLI installed first.
Install and setup the plugin:
# Install the plugin from PyPI
llm install llm-cloudflare
# Provide a valid Workers AI token
# Docs: https://developers.cloudflare.com/workers-ai/get-started/rest-api/#1-get-api-token-and-account-id
llm keys set cloudflare
# Set your Cloudflare account ID
# Docs: https://developers.cloudflare.com/workers-ai/get-started/rest-api/#1-get-api-token-and-account-id
export CLOUDFLARE_ACCOUNT_ID="33charlonghexstringhere"
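Optionally, you can confirm your token and account ID work by calling the Workers AI REST API (linked above) directly before going through llm. This is a sketch only: it assumes your token is exported as CLOUDFLARE_API_TOKEN for this check (the plugin itself uses the key stored with llm keys set cloudflare), and the model and prompt are just examples.
# Optional smoke test: call the Workers AI REST API directly
# Assumes CLOUDFLARE_ACCOUNT_ID is exported (as above) and the API token is
# exported as CLOUDFLARE_API_TOKEN for this check only
curl "https://api.cloudflare.com/client/v4/accounts/${CLOUDFLARE_ACCOUNT_ID}/ai/run/@cf/meta/llama-3.1-8b-instruct" \
  -H "Authorization: Bearer ${CLOUDFLARE_API_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Say hello in one word."}]}'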
Use it by specifying a Workers AI model:
llm -m "@cf/meta/llama-3.1-8b-instruct" "Write a Cloudflare Worker in ESM format that returns an empty JSON object as a response. Show only the code."
You can set a Workers AI model as the default model in llm:
# Set Llama 3.1 8B as the default
llm models default "@cf/meta/llama-3.1-8b-instruct"
# See what model is set as the default
llm models default
# @cf/meta/llama-3.1-8b-instruct
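Once a default is set, prompts no longer need the -m flag:
# The default model is used when -m is omitted
llm "Summarize what Cloudflare Workers AI is in one sentence."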
Available models
This plugin provides access to the text generation models (LLMs) provided by Workers AI.
To see what models are available, invoke llm models. Models prefixed with Cloudflare Workers AI are provided by this plugin.
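For example, to list only the models registered by this plugin (the exact output format may vary between llm versions, so the sample line below is illustrative):
# Filter the model list down to Workers AI models
llm models | grep "Cloudflare Workers AI"
# Cloudflare Workers AI: @cf/meta/llama-3.1-8b-instruct
# ...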
The list of supported models is generated by scripts, so newly released Workers AI models only become available once this plugin is updated.
In the future, this plugin may also add support for Workers AI's embedding models for use with llm embed.
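If that support lands, usage would likely follow llm's standard embedding interface. The model name below is hypothetical and is not currently provided by this plugin:
# Hypothetical: embed a string with a Workers AI embedding model
llm embed -m "@cf/baai/bge-base-en-v1.5" -c "Hello, Workers AI"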
Credits
Credit to @hex for https://github.com/hex/llm-perplexity, which heavily inspired the design of this plugin.
License
Copyright Cloudflare, Inc (2024). Apache-2.0 licensed. See the LICENSE file for details.
File details
Details for the file llm_cloudflare-0.5.2.tar.gz.
File metadata
- Download URL: llm_cloudflare-0.5.2.tar.gz
- Upload date:
- Size: 8.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 2a8d0bc6fd274c8e26395bbe9f9aab98398725f8274d57d7eb42b8ee629550a7
MD5 | 5e18dc32ff998a63c1840a49d037bbab
BLAKE2b-256 | 1fa992b24c56cf7c8fe48ea9d57a546af0f4bc712c6b8c3b7c4f279070fbf30d
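To check a downloaded copy of the sdist against the published digest, you can recompute the SHA256 locally. This is a sketch: pip download fetches a wheel by default, so --no-binary forces the source distribution, and on macOS you would use shasum -a 256 instead of sha256sum.
# Download only the source distribution, without dependencies
pip download llm-cloudflare==0.5.2 --no-deps --no-binary :all:
# Compute its SHA256 and compare it to the digest in the table above
sha256sum llm_cloudflare-0.5.2.tar.gz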
Provenance
The following attestation bundles were made for llm_cloudflare-0.5.2.tar.gz:
Publisher: workflow.yml on elithrar/llm-cloudflare
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: llm_cloudflare-0.5.2.tar.gz
- Subject digest: 2a8d0bc6fd274c8e26395bbe9f9aab98398725f8274d57d7eb42b8ee629550a7
- Sigstore transparency entry: 145230133
- Sigstore integration time:
File details
Details for the file llm_cloudflare-0.5.2-py3-none-any.whl.
File metadata
- Download URL: llm_cloudflare-0.5.2-py3-none-any.whl
- Upload date:
- Size: 8.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5ccf0db7635adca0b09fbc7fdaad6b49ff8379d82b9bf5947a1132561492abe5
MD5 | 46a23520a4afa431be38c07d20353215
BLAKE2b-256 | 2082e2f4bd8d7c6dcf44d315b410bc568878f2725fc016b0abdcda3ab1df0a7c
Provenance
The following attestation bundles were made for llm_cloudflare-0.5.2-py3-none-any.whl:
Publisher: workflow.yml on elithrar/llm-cloudflare
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: llm_cloudflare-0.5.2-py3-none-any.whl
- Subject digest: 5ccf0db7635adca0b09fbc7fdaad6b49ff8379d82b9bf5947a1132561492abe5
- Sigstore transparency entry: 145230134
- Sigstore integration time: