
A CLI utility and Python library for interacting with Large Language Models, including OpenAI, PaLM and local models installed on your own machine.


LLM


A CLI utility and Python library for interacting with Large Language Models, both via remote APIs and models that can be installed and run on your own machine.

Run prompts from the command-line, store the results in SQLite, generate embeddings and more.
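Every prompt and response is logged to a SQLite database. As an illustration only, here is a sketch of reading such a log with Python's sqlite3 module, using a simplified hypothetical schema; the real schema llm uses differs and is described in the documentation:

```python
import sqlite3

# Hypothetical, simplified schema for illustration; the actual logs
# database used by llm is documented at llm.datasette.io.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE responses ("
    "id INTEGER PRIMARY KEY, model TEXT, prompt TEXT, response TEXT)"
)
con.execute(
    "INSERT INTO responses (model, prompt, response) VALUES (?, ?, ?)",
    ("gpt-3.5-turbo", "Five cute names for a pet penguin", "1. Waddles ..."),
)

# List the most recent logged prompts, newest first
for model, prompt in con.execute(
    "SELECT model, prompt FROM responses ORDER BY id DESC LIMIT 3"
):
    print(model, "-", prompt)
```

Because the results are plain SQLite, any tool that reads SQLite (including Datasette) can browse them.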

Full documentation: llm.datasette.io

Installation

Install this tool using pip:

pip install llm

Or using pipx:

pipx install llm

Detailed installation instructions.

Getting started

If you have an OpenAI API key you can get started using the OpenAI models right away.

As an alternative to OpenAI, you can install plugins to access models by other providers, including models that can be installed and run on your own device.

Save your OpenAI API key like this:

llm keys set openai

This will prompt you for your key like so:

Enter key: <paste here>
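Saved keys are written to a JSON file on disk (run `llm keys path` to see where yours lives). As a hypothetical sketch of reading such a file, assuming a simple name-to-key layout that may not match the real file exactly:

```python
import json
import tempfile
from pathlib import Path

# Write a stand-in keys.json; the real file lives at the path shown
# by `llm keys path`, and its exact layout is an assumption here.
keys_file = Path(tempfile.mkdtemp()) / "keys.json"
keys_file.write_text(json.dumps({"openai": "sk-example-not-a-real-key"}))

# Reading a named key back out
keys = json.loads(keys_file.read_text())
print(keys["openai"])
```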

Now that you've saved a key you can run a prompt like this:

llm "Five cute names for a pet penguin"
1. Waddles
2. Pebbles
3. Bubbles
4. Flappy
5. Chilly

Read the usage instructions for more.
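The same prompt can also be run from Python code. A minimal sketch using the library's documented API; it needs `pip install llm` plus a saved OpenAI key to actually run, so treat it as an outline rather than a drop-in script:

```python
def penguin_names() -> str:
    """Run the penguin prompt through the llm Python API (sketch)."""
    # Requires `pip install llm` and a configured OpenAI key, so the
    # import happens lazily when the function is called.
    import llm

    model = llm.get_model("gpt-3.5-turbo")
    response = model.prompt("Five cute names for a pet penguin")
    return response.text()

# Usage (needs a configured key):
#     print(penguin_names())
```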

Installing a model that runs on your own machine

LLM plugins can add support for alternative models, including models that run on your own machine.

To download and run Llama 2 13B locally, you can install the llm-mlc plugin:

llm install llm-mlc
llm mlc pip install --pre --force-reinstall \
  mlc-ai-nightly \
  mlc-chat-nightly \
  -f https://mlc.ai/wheels
llm mlc setup

Then download the 15GB Llama 2 13B model like this:

llm mlc download-model Llama-2-13b-chat --alias llama2

And run a prompt through it:

llm -m llama2 'difference between a llama and an alpaca'

You can also start a chat session with the model using the llm chat command:

llm chat -m llama2
Chatting with mlc-chat-Llama-2-13b-chat-hf-q4f16_1
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> 
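The `exit`/`quit` and `!multi` ... `!end` conventions above can be sketched as a simple input loop. This is a hypothetical re-implementation for illustration, not llm's actual code:

```python
def read_messages(lines):
    """Group raw input lines into chat messages, honoring the
    exit/quit and !multi ... !end conventions."""
    messages = []
    it = iter(lines)
    for line in it:
        if line in ("exit", "quit"):
            break
        if line == "!multi":
            # Collect everything until !end into one multi-line message
            buffer = []
            for inner in it:
                if inner == "!end":
                    break
                buffer.append(inner)
            messages.append("\n".join(buffer))
        else:
            messages.append(line)
    return messages

print(read_messages(["hello", "!multi", "line one", "line two", "!end", "quit"]))
# ['hello', 'line one\nline two']
```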

Using a system prompt

You can use the -s/--system option to set a system prompt, providing instructions for processing other input to the tool.

To describe how the code in a file works, try this:

cat mycode.py | llm -s "Explain this code"

Help

For help, run:

llm --help

You can also use:

python -m llm --help
