Simplified United LLM
A streamlined, lightweight LLM client library that provides unified access to OpenRouter and Ollama providers with structured output generation and comprehensive logging capabilities.
Features
- Unified Interface: Single client for both OpenRouter and Ollama providers
- Structured Output: Generate structured data using string-schema definitions
- Comprehensive Logging: Daily organized logs with detailed request/response tracking
- Simple Configuration: Dictionary-based configuration from JSON files
- Provider Auto-Detection: Automatic provider selection based on model prefixes
- Pydantic Integration: Built-in validation using Pydantic models
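The provider auto-detection mentioned above can be sketched as a simple prefix split. This is an illustrative reimplementation based on the documented model-string format, not the library's actual internals, and the helper name `detect_provider` is hypothetical:

```python
# Illustrative sketch of prefix-based provider detection.
# `detect_provider` is a hypothetical helper name, not part of this package.
def detect_provider(model):
    """Split a 'provider:model' string into (provider, model_name)."""
    provider, sep, name = model.partition(":")
    if not sep or provider not in ("openrouter", "ollama"):
        raise ValueError(f"Unsupported model string: {model!r}")
    # Ollama model names may themselves contain colons (e.g. qwen2.5:3b),
    # so only the first colon separates the provider prefix.
    return provider, name
```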
Installation
```shell
pip install simplified-united-llm
```
Dependencies
- Python 3.8+
- pydantic>=2.0.0
- string-schema>=0.1.0
- requests>=2.25.0
- openai>=1.0.0
Quick Start
1. Create Configuration File
Create a united_llm.json file with your API keys and settings:
```json
{
  "api_keys": {
    "openrouter": "sk-or-v1-your-openrouter-key",
    "ollama": null
  },
  "base_urls": {
    "openrouter": "https://openrouter.ai/api/v1",
    "ollama": "http://localhost:11434/v1"
  },
  "log_dir": "logs/llm_calls"
}
```
2. Initialize Client
```python
import json

from united_llm import LLMClient

# Load configuration
with open('united_llm.json', 'r') as f:
    config = json.load(f)

# Initialize client
client = LLMClient(config_dict=config)
```
3. Generate Structured Output
```python
# Extract structured data using OpenRouter
result = client.generate_dict(
    prompt="Extract info: John Doe, 30, from NYC, works as engineer",
    schema="{name, age:int, city, job}",
    model="openrouter:google/gemini-2.5-flash-lite"
)
print(result)
# Output: {"name": "John Doe", "age": 30, "city": "NYC", "job": "engineer"}

# Generate using a local Ollama model
result = client.generate_dict(
    prompt="Analyze sentiment: This product is amazing!",
    schema="{sentiment, confidence:float}",
    model="ollama:qwen2.5:3b"
)
print(result)
# Output: {"sentiment": "positive", "confidence": 0.95}
```
Schema Syntax
The library uses string-schema for defining structured output formats:
Basic Types
```python
# Simple object
"{name, age:int, email}"

# With arrays
"{name, hobbies: [string]}"

# Nested objects
"{user: {name, email}, posts: [{title, content}]}"

# Different data types
"{score:float, active:bool, count:int}"
```
Array Schemas
```python
# Array of objects
"[{name, score:float}]"

# Array of primitives
"[string]"
"[int]"
```
Supported Providers
OpenRouter
- Prefix: `openrouter:`
- Example: `openrouter:google/gemini-2.5-flash-lite`
- Requirements: API key required
- Models: all OpenRouter-supported models
Ollama
- Prefix: `ollama:`
- Example: `ollama:qwen2.5:3b`
- Requirements: a local Ollama server must be running
- Models: any model available in your Ollama installation
Configuration
Required Configuration Keys
- `api_keys`: dictionary with provider API keys
- `base_urls`: dictionary with provider base URLs
Optional Configuration Keys
- `log_dir`: directory for log files (default: `"logs/llm_calls"`)
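Because `log_dir` is optional, a configuration loader can fall back to the documented default. A small sketch; the `with_defaults` helper is illustrative, not part of the library:

```python
import json

# Default for the optional key documented above.
DEFAULTS = {"log_dir": "logs/llm_calls"}

def with_defaults(config):
    """Fill in optional configuration keys with their defaults."""
    return {**DEFAULTS, **config}

# Typical use:
# with open("united_llm.json") as f:
#     config = with_defaults(json.load(f))
```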
Example Configuration
```json
{
  "api_keys": {
    "openrouter": "sk-or-v1-...",
    "ollama": null
  },
  "base_urls": {
    "openrouter": "https://openrouter.ai/api/v1",
    "ollama": "http://localhost:11434/v1"
  },
  "log_dir": "logs/llm_calls"
}
```
Logging
The library automatically logs all requests and responses to daily log files:
```
logs/llm_calls/2025-01-23.log
```
Log format:
```
2025-01-23 10:30:15 | openrouter:google/gemini-2.5-flash-lite | openrouter | Extract info: John... | {name, age:int, city} | {"name": "John Doe", "age": 30, "city": "NYC"} | 1.25s
```
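Since each log line is pipe-delimited, it can be split back into fields for analysis. A rough parser for the format shown above; the field labels are my own, not defined by the library, and this naive split assumes prompts do not themselves contain `" | "`:

```python
def parse_log_line(line):
    """Split a pipe-delimited log line into labeled fields (labels are mine)."""
    keys = ["timestamp", "model", "provider", "prompt",
            "schema", "response", "duration"]
    fields = [part.strip() for part in line.split(" | ")]
    return dict(zip(keys, fields))

sample = ('2025-01-23 10:30:15 | openrouter:google/gemini-2.5-flash-lite'
          ' | openrouter | Extract info: John... | {name, age:int, city}'
          ' | {"name": "John Doe", "age": 30, "city": "NYC"} | 1.25s')
record = parse_log_line(sample)
```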
Testing
Run the included test program:
```shell
python examples/test_models.py
```
This will test both OpenRouter and Ollama providers with sample prompts.
Error Handling
The library provides clear error messages for common issues:
- Configuration errors: Invalid or missing configuration keys
- Provider errors: API failures, network issues, authentication problems
- Schema errors: Invalid string-schema definitions
- Model errors: Unsupported models or provider prefixes
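Because the library's exception classes are not documented here, application code may want a defensive wrapper that catches broadly and falls back to a default. This pattern, and the `safe_generate` name, is an assumption of mine, not part of the package API:

```python
def safe_generate(client, prompt, schema, model, default=None):
    """Call client.generate_dict, returning `default` on any failure."""
    try:
        return client.generate_dict(prompt=prompt, schema=schema, model=model)
    except Exception as exc:  # concrete exception types are library-specific
        print(f"LLM call failed: {exc}")
        return default
```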
Development
Project Structure
```
simplified-united-llm/
├── setup.py
├── requirements.txt
├── README.md
├── united_llm/
│   ├── __init__.py
│   ├── client.py
│   ├── providers/
│   │   ├── __init__.py
│   │   ├── openrouter.py
│   │   └── ollama.py
│   └── utils/
│       ├── __init__.py
│       ├── logging.py
│       └── schema_parser.py
├── tests/
│   ├── __init__.py
│   └── test_client.py
├── examples/
│   └── test_models.py
└── united_llm.json
```
Running Tests
```shell
pip install pytest
pytest tests/
```
License
MIT License - see LICENSE file for details.
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.