
Project description

llm


Access large language models from the command-line

Installation

Install this tool using pip:

pip install llm

You need an OpenAI API key. Either set it in the OPENAI_API_KEY environment variable, or save it in a plain text file called ~/.openai-api-key.txt in your home directory.
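
For example, either of the following will work - your-key-here is a placeholder for your actual key:

# Option 1: environment variable
export OPENAI_API_KEY='your-key-here'

# Option 2: plain text file in your home directory
echo 'your-key-here' > ~/.openai-api-key.txt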

Usage

The default command for this is llm chatgpt - you can use llm instead if you prefer.
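
These two invocations are equivalent:

llm chatgpt 'Your prompt here'
llm 'Your prompt here'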

To run a prompt:

llm 'Ten names for cheesecakes'

To stream the results a token at a time:

llm 'Ten names for cheesecakes' -s

To switch from ChatGPT 3.5 (the default) to GPT-4 if you have access:

llm 'Ten names for cheesecakes' -4

Pass --model <model name> to use a different model.
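
For example, to ask for a model by name (gpt-4 here is the same model the -4 shortcut selects):

llm 'Ten names for cheesecakes' --model gpt-4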

You can also send a prompt to standard input, for example:

echo 'Ten names for cheesecakes' | llm

Using with a shell

To generate a description of changes made to a Git repository since the last commit:

llm "Describe these changes: $(git diff)"

This pattern of using $(command) inside a double quoted string is a useful way to quickly assemble prompts.
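
Another example of the same pattern, where notes.txt is an illustrative file assumed to exist in the current directory:

llm "Summarize these notes: $(cat notes.txt)"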

System prompts

You can use --system '...' to set a system prompt.

llm 'SQL to calculate total sales by month' -s \
  --system 'You are an exaggerated sentient cheesecake that knows SQL and talks about cheesecake a lot'

This is useful for piping content to standard input, for example:

curl -s 'https://simonwillison.net/2023/May/15/per-interpreter-gils/' | \
  llm --system 'Suggest topics for this post as a JSON array' --stream

The --code option sets a system prompt for you that asks the model to output just code with no explanation, and strips off any leading or trailing Markdown code block syntax from the response. You can use this to generate code and write it straight to a file:

llm 'Python CLI tool: reverse string passed to stdin' --code > fetch.py

Be very careful executing code generated by an LLM - always read it first!
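
For example, you could review the generated file before trying it. This assumes the fetch.py written above, and that the generated script really does read from standard input as requested:

cat fetch.py
echo 'hello world' | python fetch.py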

Logging to SQLite

If a SQLite database file exists at ~/.llm/log.db, the tool will log all prompts and responses to it.

You can create that file by running the init-db command:

llm init-db

Now any prompts you run will be logged to that database.
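
One quick way to confirm logging is working is to inspect the database with the sqlite3 command-line tool, if you have it installed:

sqlite3 ~/.llm/log.db '.tables'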

To avoid logging a prompt, pass --no-log or -n to the command:

llm 'Ten names for cheesecakes' -n

Viewing the logs

You can view the logs using the llm logs command:

llm logs

This will output the three most recent logged items as a JSON array of objects.

Add -n 10 to see the ten most recent items:

llm logs -n 10

Or -n 0 to see everything that has ever been logged:

llm logs -n 0
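
Because the output is a JSON array, it combines well with tools like jq. For example, to count how many items came back (assuming jq is installed):

llm logs -n 10 | jq length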

You can also use Datasette to browse your logs like this:

datasette ~/.llm/log.db
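
Datasette is a separate tool; if you don't already have it you can install it with pip:

pip install datasette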

Help

For help, run:

llm --help

You can also use:

python -m llm --help
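
Each subcommand should have its own --help output as well, for example:

llm logs --help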

Development

To contribute to this tool, first check out the code, then create a new virtual environment:
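
# Clone the repository (hosted on GitHub) if you do not already have a checkout
git clone https://github.com/simonw/llm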

cd llm
python -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llm-0.3.tar.gz (9.2 kB)

Built Distribution

llm-0.3-py3-none-any.whl (9.3 kB)

File details

Details for the file llm-0.3.tar.gz.

File metadata

  • Download URL: llm-0.3.tar.gz
  • Upload date:
  • Size: 9.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.3

File hashes

Hashes for llm-0.3.tar.gz

  • SHA256: bc945eaf45ff42595990320c4b1735749e355f3eedbcc279ba5aacaf18a559af
  • MD5: 3d6ea1d23f74dc35a255a06778ec59a9
  • BLAKE2b-256: 69167edb4bc510f4cc69267cc3c4ebe65e1b7fed276cd07cf74752887a644f64


File details

Details for the file llm-0.3-py3-none-any.whl.

File metadata

  • Download URL: llm-0.3-py3-none-any.whl
  • Upload date:
  • Size: 9.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.3

File hashes

Hashes for llm-0.3-py3-none-any.whl

  • SHA256: cdd607aba75252d1795d710951402e25b088e10e3a32bc9864a4a7e0ac2fe329
  • MD5: e91531139a988d86c4c0a641a9660770
  • BLAKE2b-256: 46cf2d59943fa7c79960087a82d041094882e15634ced827f81bf86826a0d07d

