
CLI utility and Python library for interacting with Large Language Models from providers such as OpenAI, Anthropic, and Google (Gemini), plus local models installed on your own machine.

Project description

LLM


A CLI tool and Python library for interacting with OpenAI, Anthropic’s Claude, Google’s Gemini, Meta’s Llama and dozens of other Large Language Models, both via remote APIs and with models that can be installed and run on your own machine.

Watch the talk "Language models on the command-line" on YouTube for a demo, or read the accompanying detailed notes.

With LLM you can run prompts from the command line, pipe files and images into models, start interactive chat sessions, and call the same models from Python code.

Quick start

First, install LLM using pip or Homebrew or pipx or uv:

pip install llm

Or with Homebrew (see warning note):

brew install llm

Or with pipx:

pipx install llm

Or with uv:

uv tool install llm

If you have an OpenAI API key you can run this:

# Paste your OpenAI API key into this
llm keys set openai

# Run a prompt (with the default gpt-4o-mini model)
llm "Ten fun names for a pet pelican"

# Extract text from an image
llm "extract text" -a scanned-document.jpg

# Use a system prompt against a file
cat myfile.py | llm -s "Explain this code"

Run prompts against Gemini or Anthropic with their respective plugins:

llm install llm-gemini
llm keys set gemini
# Paste Gemini API key here
llm -m gemini-2.0-flash 'Tell me fun facts about Mountain View'

llm install llm-anthropic
llm keys set anthropic
# Paste Anthropic API key here
llm -m claude-4-opus 'Impress me with wild facts about turnips'

You can also install a plugin to access models that can run on your local device. If you use Ollama:

# Install the plugin
llm install llm-ollama

# Download the Llama 3.2 model and run a prompt against it
ollama pull llama3.2:latest
llm -m llama3.2:latest 'What is the capital of France?'

To start an interactive chat with a model, use llm chat:

llm chat -m gpt-4.1
Chatting with gpt-4.1
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
Type '!edit' to open your default editor and modify the prompt.
Type '!fragment <my_fragment> [<another_fragment> ...]' to insert one or more fragments
> Tell me a joke about a pelican
Why don't pelicans like to tip waiters?

Because they always have a big bill!

For more background on this project, see the llm tag on my blog.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llm-0.26.tar.gz (82.0 kB)


Built Distribution

llm-0.26-py3-none-any.whl (79.4 kB)


File details

Details for the file llm-0.26.tar.gz.

File metadata

  • Download URL: llm-0.26.tar.gz
  • Size: 82.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for llm-0.26.tar.gz:

  • SHA256: c2e9ddbc582da10c61112c0f983383fa0dc7bee3ca7d6f2ade1a2d003771eb1b
  • MD5: 6b951f2fcf564491096dac1e9cf777a5
  • BLAKE2b-256: 4e881acdae8cc6122d3ab2d3d30477d3273608f92177a6bb17ab98881af65627

See more details on using hashes here.
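To verify a downloaded file against a published SHA256 digest, you can compute the digest locally. A self-contained sketch using Python's standard hashlib module (the filename is illustrative):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the digest published above, e.g.:
# sha256_of("llm-0.26.tar.gz") == "c2e9ddbc582da10c61112c0f983383fa0dc7bee3ca7d6f2ade1a2d003771eb1b"
```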

Provenance

The following attestation bundles were made for llm-0.26.tar.gz:

Publisher: publish.yml on simonw/llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file llm-0.26-py3-none-any.whl.

File metadata

  • Download URL: llm-0.26-py3-none-any.whl
  • Size: 79.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for llm-0.26-py3-none-any.whl:

  • SHA256: 93dc331facbfb909f946df8812fb1d627ec9ce764aab2ebd29f9011ed647339c
  • MD5: e2c6adffa83908b609680a5aad1bc1f4
  • BLAKE2b-256: da5fed9d5c47c46632f2c3a844c14412f434bc1263faa4890306b5f911cd3aaf

See more details on using hashes here.

Provenance

The following attestation bundles were made for llm-0.26-py3-none-any.whl:

Publisher: publish.yml on simonw/llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
