AbstractLLM
A lightweight, unified interface for interacting with multiple Large Language Model providers.
[This project is a work in progress; stay tuned!]
Features
- 🔄 Unified API: Consistent interface for OpenAI, Anthropic, Ollama, and Hugging Face models
- 🔌 Provider Agnostic: Switch between providers with minimal code changes
- 🎛️ Configurable: Flexible configuration at initialization or per-request
- 📝 System Prompts: Standardized handling of system prompts across providers
- 📊 Capabilities Inspection: Query models for their capabilities
- 📝 Logging: Built-in request and response logging
Installation
```bash
# Basic installation
pip install abstractllm

# With provider-specific dependencies
pip install abstractllm[openai]
pip install abstractllm[anthropic]
pip install abstractllm[huggingface]

# All dependencies
pip install abstractllm[all]
```
Quick Start
```python
from abstractllm import create_llm

# Create an LLM instance
llm = create_llm("openai", api_key="your-api-key")

# Generate a response
response = llm.generate("Explain quantum computing in simple terms.")
print(response)
```
Supported Providers
OpenAI
```python
llm = create_llm(
    "openai",
    api_key="your-api-key",
    model="gpt-4",
)
```
Anthropic
```python
llm = create_llm(
    "anthropic",
    api_key="your-api-key",
    model="claude-3-opus-20240229",
)
```
Ollama
```python
llm = create_llm(
    "ollama",
    base_url="http://localhost:11434",
    model="llama2",
)
```
Hugging Face
```python
llm = create_llm(
    "huggingface",
    model="google/gemma-7b",
)
```
Configuration
You can configure the LLM's behavior in several ways:
```python
# At initialization
llm = create_llm("openai", temperature=0.7, system_prompt="You are a helpful assistant.")

# Update later
llm.set_config({"temperature": 0.5})

# Per-request
response = llm.generate("Hello", temperature=0.9)
```
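The three levels presumably compose with the most specific winning: per-request arguments override `set_config` updates, which override initialization defaults. As a sketch of that precedence (`resolve_config` is a hypothetical helper for illustration, not part of AbstractLLM):

```python
def resolve_config(init_cfg: dict, updates: dict, request_cfg: dict) -> dict:
    """Merge configuration layers; later (more specific) layers win."""
    merged = dict(init_cfg)     # defaults from create_llm(...)
    merged.update(updates)      # overrides from set_config(...)
    merged.update(request_cfg)  # per-request keyword arguments
    return merged

# With the values from the example above, the effective temperature is 0.9:
effective = resolve_config(
    {"temperature": 0.7, "system_prompt": "You are a helpful assistant."},
    {"temperature": 0.5},
    {"temperature": 0.9},
)
print(effective["temperature"])  # 0.9
```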
System Prompts
System prompts help shape the model's personality and behavior:
```python
llm = create_llm("openai", system_prompt="You are a helpful scientific assistant.")

# Or for a specific request
response = llm.generate(
    "What is quantum entanglement?",
    system_prompt="You are a physics professor explaining to a high school student.",
)
```
Capabilities
Check what capabilities a provider supports:
```python
capabilities = llm.get_capabilities()
print(capabilities)
# Example: {'streaming': True, 'max_tokens': 4096, 'supports_system_prompt': True}
```
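The capabilities dictionary can be used to guard provider-specific behavior before making a request. For example, a hypothetical helper (not part of AbstractLLM) that only passes a system prompt when the provider reports support for it:

```python
def build_generate_kwargs(capabilities: dict, system_prompt=None) -> dict:
    """Build keyword arguments for generate(), dropping unsupported options."""
    kwargs = {}
    if system_prompt and capabilities.get("supports_system_prompt"):
        kwargs["system_prompt"] = system_prompt
    return kwargs

caps = {"streaming": True, "max_tokens": 4096, "supports_system_prompt": True}
print(build_generate_kwargs(caps, "You are concise."))
# {'system_prompt': 'You are concise.'}
```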
Logging
AbstractLLM includes built-in logging:
```python
import logging
from abstractllm.utils.logging import setup_logging

# Set up logging at the desired level
setup_logging(level=logging.DEBUG)
```
Advanced Usage
See the Usage Guide for advanced usage patterns, including:
- Using multiple providers
- Implementing fallback chains
- Error handling
- And more
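As a taste of what a fallback chain might look like, here is a sketch that is not part of AbstractLLM itself: it tries a list of generate callables in order and returns the first successful response.

```python
from typing import Callable, List


def generate_with_fallback(prompt: str, generators: List[Callable[[str], str]]) -> str:
    """Try each generator in order; return the first successful response."""
    errors: List[Exception] = []
    for generate in generators:
        try:
            return generate(prompt)
        except Exception as exc:  # a real chain might catch narrower provider errors
            errors.append(exc)
    raise RuntimeError(f"All providers failed: {errors}")


# Usage with AbstractLLM instances (assuming llm_a and llm_b were created via create_llm):
# response = generate_with_fallback("Hello", [llm_a.generate, llm_b.generate])
```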
Contributing
Contributions are welcome! See the CONTRIBUTING file for guidelines, then feel free to submit a pull request.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Download files
File details
Details for the file abstractllm-0.1.0.tar.gz.
File metadata
- Download URL: abstractllm-0.1.0.tar.gz
- Upload date:
- Size: 17.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.7
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `8b8865111824350564dabac577e35f9d1f090a73feed32716af67d1ff93c7fda` |
| MD5 | `1b5e5fd2b4c74d7716853bed81803c9c` |
| BLAKE2b-256 | `bb339ef835db917fe25776cca4c41ed7c0d7377b0851a16c078c6f85c051758f` |
File details
Details for the file abstractllm-0.1.0-py3-none-any.whl.
File metadata
- Download URL: abstractllm-0.1.0-py3-none-any.whl
- Upload date:
- Size: 3.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.7
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `d2c22540c4caac33b03951b75318027eda3a2283840885ed98bc2cf39c28fec0` |
| MD5 | `bdef520eac81e46c12a8b09b2b15c369` |
| BLAKE2b-256 | `9fc1db42f3c4c93892d41c1e33ec268de0db66d44e3e53fe7a3fe29d4652eaf2` |