# LLM Interface
A flexible Python interface for working with various Language Model providers, including OpenAI, Anthropic, and Ollama. This library provides a unified way to interact with different LLM providers while supporting features like structured outputs, tool execution, and response caching.
## Features
- **Multiple Provider Support**
  - OpenAI (GPT models)
  - Anthropic (Claude models)
  - Ollama (local and remote)
  - Remote Ollama via SSH
- **Advanced Capabilities**
  - Structured output parsing with Pydantic models
  - Function/tool calling support
  - Response caching
  - Comprehensive logging
  - JSON mode support
  - System prompt handling
- **Developer-Friendly**
  - Type hints throughout
  - Extensive test coverage
  - Flexible configuration options
  - Error handling and retries
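The response cache listed above can be understood as keying each completed request on a stable hash of its inputs. A minimal stdlib-only sketch of that idea (the `cache_key` helper is illustrative, not the library's actual implementation):

```python
import hashlib
import json

def cache_key(provider: str, model: str, messages: list) -> str:
    """Derive a stable cache key from the request parameters."""
    payload = json.dumps(
        {"provider": provider, "model": model, "messages": messages},
        sort_keys=True,  # stable key ordering so equal requests hash equally
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Identical requests map to the same key; any change yields a different one.
msgs = [{"role": "user", "content": "What is the capital of France?"}]
key1 = cache_key("openai", "gpt-4", msgs)
key2 = cache_key("openai", "gpt-4", list(msgs))
```

Hashing a canonical JSON serialization (rather than `repr`) keeps the key independent of dict insertion order.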
## Installation

Install using pip:

```bash
pip install llm-interface
```

Or using Poetry:

```bash
poetry add llm-interface
```
## Basic Usage

### Simple Chat Completion

```python
from llm_interface import llm_from_config

# Create an OpenAI interface
llm = llm_from_config(
    provider="openai",
    model_name="gpt-4",
)

# Simple chat
response = llm.chat([
    {"role": "user", "content": "What is the capital of France?"}
])
```
### Structured Output with Pydantic

```python
from pydantic import BaseModel

class LocationInfo(BaseModel):
    city: str
    country: str
    population: int

response = llm.generate_pydantic(
    prompt_template="Provide information about Paris",
    output_schema=LocationInfo,
    system="You are a helpful geography assistant"
)
```
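Structured output of this kind generally works by asking the model for JSON that matches the schema and then validating the parsed result. A rough stdlib-only sketch of the validation step (the dataclass stands in for the Pydantic model; the `parse_structured` helper is illustrative, not the library's API):

```python
import json
from dataclasses import dataclass

@dataclass
class LocationInfo:
    city: str
    country: str
    population: int

def parse_structured(raw: str) -> LocationInfo:
    """Parse a model's JSON reply and check the expected fields."""
    data = json.loads(raw)
    # Fail loudly if the model used a wrong type for a field.
    if not isinstance(data.get("population"), int):
        raise ValueError("population must be an integer")
    return LocationInfo(**data)  # raises TypeError on missing/extra fields

reply = '{"city": "Paris", "country": "France", "population": 2102650}'
info = parse_structured(reply)
```

Pydantic additionally coerces compatible types and reports all validation errors at once, which is why the library uses it instead of hand-rolled checks like these.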
### Tool/Function Calling

```python
from llm_interface.llm_tool import tool

@tool(name="get_weather")
def get_weather(location: str, units: str = "celsius") -> str:
    """Get weather information for a location.

    Args:
        location: City or location name
        units: Temperature units (celsius/fahrenheit)
    """
    # Implementation here
    return f"Weather in {location}"

response = llm.chat(
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=[get_weather]
)
```
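Under the hood, a decorator like `@tool` typically inspects the function's signature to build a JSON-schema-style description that provider APIs can consume. A simplified sketch of that pattern using only the standard library (an illustration, not the library's actual decorator):

```python
import inspect

def describe_tool(fn) -> dict:
    """Build a minimal JSON-schema-like description from a signature."""
    type_names = {str: "string", int: "integer", float: "number", bool: "boolean"}
    params = {}
    for name, param in inspect.signature(fn).parameters.items():
        params[name] = {
            "type": type_names.get(param.annotation, "string"),
            # A parameter with no default must be supplied by the model.
            "required": param.default is inspect.Parameter.empty,
        }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": params,
    }

def get_weather(location: str, units: str = "celsius") -> str:
    """Get weather information for a location."""
    return f"Weather in {location}"

schema = describe_tool(get_weather)
```

This is why the docstring and type hints on a tool function matter: they are the only description of the tool the model ever sees.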
### Remote Ollama Setup

```python
llm = llm_from_config(
    provider="remote_ollama",
    model_name="llama2",
    hostname="example.com",
    username="user"
)
```
## Configuration

The library supports various configuration options through the `llm_from_config` function:

```python
llm = llm_from_config(
    provider="openai",   # "openai", "anthropic", "ollama", or "remote_ollama"
    model_name="gpt-4",  # Model name
    max_tokens=4096,     # Maximum tokens in the response
    host=None,           # Local Ollama host
    hostname=None,       # Remote SSH hostname
    username=None,       # Remote SSH username
    log_dir="logs",      # Directory for logs
    use_cache=True       # Enable response caching
)
```
## Environment Variables

Required environment variables, depending on the provider:

- OpenAI: `OPENAI_API_KEY`
- Anthropic: `ANTHROPIC_API_KEY`
- Remote Ollama: an SSH key loaded in the SSH agent
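For the hosted providers, the keys can be exported in the shell before running your program (the values below are placeholders):

```shell
# API keys for hosted providers (replace with your own values)
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
```

For remote Ollama, load your key into the agent with `ssh-add` before connecting.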
## Development

This project uses Poetry for dependency management:

```bash
# Install dependencies
poetry install

# Run tests
poetry run pytest

# Format code
poetry run black .

# Run linter
poetry run flake8
```
## License

Apache License 2.0 - see the LICENSE file for details.