LLM Interface

A flexible Python interface for working with various large language model (LLM) providers, including OpenAI, Anthropic, and Ollama. The library provides a unified way to interact with different providers while supporting structured outputs, tool execution, and response caching.

Features

  • Multiple Provider Support

    • OpenAI (GPT models)
    • Anthropic (Claude models)
    • Ollama (local and remote)
    • Remote Ollama via SSH
  • Advanced Capabilities

    • Structured output parsing with Pydantic models
    • Function/tool calling support
    • Response caching
    • Comprehensive logging
    • JSON mode support
    • System prompt handling
  • Developer-Friendly

    • Type hints throughout
    • Extensive test coverage
    • Flexible configuration options
    • Error handling and retries

Installation

Install using pip:

pip install llm-interface

Or using Poetry:

poetry add llm-interface

Basic Usage

Simple Chat Completion

from llm_interface import llm_from_config

# Create an OpenAI interface
llm = llm_from_config(
    provider="openai",
    model_name="gpt-4",
)

# Simple chat
response = llm.chat([
    {"role": "user", "content": "What is the capital of France?"}
])
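Follow-up turns reuse the same message-dict format. A minimal sketch, assuming roles follow the usual system/user/assistant convention and that chat returns the assistant's reply:

# Multi-turn conversation (sketch): prior turns are passed back as messages.
response = llm.chat([
    {"role": "system", "content": "Answer in one short sentence."},
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris."},
    {"role": "user", "content": "What is its population?"},
])
print(response)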

Structured Output with Pydantic

from pydantic import BaseModel

class LocationInfo(BaseModel):
    city: str
    country: str
    population: int

response = llm.generate_pydantic(
    prompt_template="Provide information about Paris",
    output_schema=LocationInfo,
    system="You are a helpful geography assistant"
)
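Assuming generate_pydantic returns a validated instance of the schema rather than raw JSON, the result can be used as an ordinary typed object:

# Fields are parsed and type-checked by Pydantic (sketch of typical usage).
print(response.city)        # e.g. "Paris"
print(response.population)  # an int, enforced by the schema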

Tool/Function Calling

from llm_interface.llm_tool import tool

@tool(name="get_weather")
def get_weather(location: str, units: str = "celsius") -> str:
    """Get weather information for a location.
    
    Args:
        location: City or location name
        units: Temperature units (celsius/fahrenheit)
    """
    # Implementation here
    return f"Weather in {location}"

response = llm.chat(
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=[get_weather]
)
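Several tools can be passed together. The decorator presumably derives each tool's schema from the type hints and the Args section of the docstring, as in the example above; the second tool here is a sketch:

@tool(name="get_time")
def get_time() -> str:
    """Get the current UTC time."""
    from datetime import datetime, timezone
    return datetime.now(timezone.utc).isoformat()

response = llm.chat(
    messages=[{"role": "user", "content": "What's the weather and time in Paris?"}],
    tools=[get_weather, get_time],
)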

Remote Ollama Setup

llm = llm_from_config(
    provider="remote_ollama",
    model_name="llama2",
    hostname="example.com",
    username="user"
)
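A local Ollama server uses the ollama provider with the host option shown in the configuration section below (a sketch; http://localhost:11434 is Ollama's standard default address, not something this library documents):

llm = llm_from_config(
    provider="ollama",
    model_name="llama2",
    host="http://localhost:11434",  # assumption: Ollama's default address
)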

Configuration

The library supports various configuration options through the llm_from_config function:

llm = llm_from_config(
    provider="openai",     # "openai", "anthropic", "ollama", or "remote_ollama"
    model_name="gpt-4",    # Model name
    max_tokens=4096,       # Maximum tokens in the response
    host=None,             # Local Ollama host
    hostname=None,         # Remote SSH hostname
    username=None,         # Remote SSH username
    log_dir="logs",        # Directory for logs
    use_cache=True,        # Enable response caching
)
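For instance, an Anthropic interface with caching disabled might look like this (a sketch using the options above; the model name is illustrative, not taken from the library's docs):

claude = llm_from_config(
    provider="anthropic",
    model_name="claude-3-5-sonnet-latest",  # illustrative model name
    max_tokens=1024,
    use_cache=False,
)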

Environment Variables

Required environment variables, depending on the provider (a quick runtime check is sketched after this list):

  • OpenAI: OPENAI_API_KEY
  • Anthropic: ANTHROPIC_API_KEY
  • Remote Ollama: SSH key loaded in SSH agent
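This illustrative check fails fast when a key is missing (the library itself may raise its own error; the check is an assumption, not part of the API):

import os

# Assumption: the OpenAI provider reads OPENAI_API_KEY from the environment.
if "OPENAI_API_KEY" not in os.environ:
    raise RuntimeError("Set OPENAI_API_KEY before creating an OpenAI interface")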

Development

This project uses Poetry for dependency management:

# Install dependencies
poetry install

# Run tests
poetry run pytest

# Format code
poetry run black .

# Run linter
poetry run flake8

License

Apache License 2.0 - See LICENSE file for details.
