An LLM wrapper for Humans

Project description

python-llm: An LLM API for Humans

The simplicity and elegance of python-requests, but for LLMs. The library currently supports models from OpenAI and Anthropic; I will add more as time allows, and pull requests are warmly accepted.

Usage

import llm
llm.set_api_key(openai="sk-...", anthropic="sk-...")

# Chat
llm.chat("what is 2+2") # 4. Uses GPT-3 by default if key is provided.
llm.chat("what is 2+2", engine="anthropic:claude-instant-v1") # 4.

# Completion
llm.complete("hello, I am") # A GPT model.
llm.complete("hello, I am", engine="openai:gpt-4") # A big GPT model.
llm.complete("hello, I am ", engine="anthropic:claude-instant-v1") # Claude.

# Back-and-forth chat [human, assistant, human]
llm.chat(["hi", "hi there, how are you?", "good, tell me a joke"]) # Why did chicken cross road?

# Streaming chat
llm.stream_chat(["what is 2+2"]) # 4. 
llm.multi_stream_chat(["what is 2+2"],
                      engines=["anthropic:claude-instant-v1",
                               "openai:gpt-3.5-turbo"])
# Results will stream back to you from both models at the same time like this:
# ["anthropic:claude-instant-v1", "hi"], ["openai:gpt-3.5-turbo", "howdy"], 
# ["anthropic:claude-instant-v1", " there"] ["openai:gpt-3.5-turbo", " my friend"]

# Engines are in the provider:model format, as in openai:gpt-4 or anthropic:claude-instant-v1.
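
Because every call takes the same provider:model engine string, it is easy to run one prompt against several models. Below is a minimal sketch (not part of the library) that loops over engines using only the documented llm.chat(prompt, engine=...) call and assumes the reply comes back as a plain string:

import llm
llm.set_api_key(openai="sk-...", anthropic="sk-...")

# Compare one prompt across several engines; engine strings are provider:model.
engines = ["openai:gpt-3.5-turbo", "openai:gpt-4", "anthropic:claude-instant-v1"]
for engine in engines:
    reply = llm.chat("Summarize Hamlet in one sentence.", engine=engine)
    print(engine, "->", reply)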

Multi-Stream Chat in Action

Since this feature is very lively, I've included a video of it in action.

https://github.com/danielgross/python-llm/assets/279531/d68eb843-7a32-4ffe-8ac2-b06b81e764b0
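
If you'd rather see the interleaving in code, here is a minimal consumption sketch. It assumes multi_stream_chat yields (engine, text-chunk) pairs, as the example output in the Usage section suggests:

import llm
llm.set_api_key(openai="sk-...", anthropic="sk-...")

# Accumulate streamed chunks per engine, assuming (engine, chunk) pairs are yielded.
buffers = {}
for engine, chunk in llm.multi_stream_chat(
        ["what is 2+2"],
        engines=["anthropic:claude-instant-v1", "openai:gpt-3.5-turbo"]):
    buffers[engine] = buffers.get(engine, "") + chunk

for engine, text in buffers.items():
    print(engine, "->", text)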

Installation

To install python-llm, use pip: pip install python-llm.

Configuration

You can set API keys in a few ways:

  1. Through environment variables (a .env file also works):

     export OPENAI_API_KEY=sk-...
     export ANTHROPIC_API_KEY=sk-...

  2. By calling the method manually:

     import llm
     llm.set_api_key(openai="sk-...", anthropic="sk-...")

  3. By passing the path to a JSON file:

     llm.set_api_key("path/to/api_keys.json")

The JSON should look like:

{
  "openai": "sk-...",
  "anthropic": "sk-..."
}
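
As a convenience, here is a small illustrative sketch that writes that JSON file (with placeholder keys) and points the library at it:

import json
import llm

# Write placeholder keys to a JSON file, then load them via set_api_key.
with open("api_keys.json", "w") as f:
    json.dump({"openai": "sk-...", "anthropic": "sk-..."}, f)

llm.set_api_key("api_keys.json")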

TODO

  • Caching!
  • More LLM vendors!
  • More tests!

Download files

Download the file for your platform.

Source Distribution

python-llm-0.2.0.tar.gz (11.3 kB)

File details

Details for the file python-llm-0.2.0.tar.gz.

File metadata

  • Download URL: python-llm-0.2.0.tar.gz
  • Upload date:
  • Size: 11.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.9

File hashes

Hashes for python-llm-0.2.0.tar.gz

  Algorithm     Hash digest
  SHA256        bbc8cd3277b1d71ec02f490e712de80052429ae8319af1d2410792912d25f8fe
  MD5           0b7b301cef23281dccba5674aee57016
  BLAKE2b-256   24f00381c058f196fc5a9b2dd3f8bba4f90f984385cf02f4e54affb3135c7f14

