# LLM Engine

Unified LLM API provider and engine library for Python.
## Features
- Support for multiple LLM providers (OpenAI, DeepSeek, Ollama, Custom)
- Both synchronous and asynchronous API support
- Unified configuration via `providers.yml`
- Environment variable resolution
- Automatic retry logic
- Streaming support
- Token estimation and resource management
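The automatic retry behavior can be sketched as exponential backoff around a callable. This is a minimal illustration, not the library's actual implementation; the function name and parameters here are hypothetical:

```python
import time

def call_with_retries(fn, max_attempts=3, base_delay=0.5):
    """Retry fn with exponential backoff; re-raise after the last attempt."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            # Delay doubles after each failed attempt: base, 2*base, 4*base, ...
            time.sleep(base_delay * (2 ** attempt))

# Example: a flaky call that succeeds on the third attempt.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(call_with_retries(flaky, base_delay=0.01))  # ok
```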
## Installation

Install from PyPI:

```shell
pip install llm-engine
```

For local development:

```shell
cd llm-engine
pip install -e .
```

Or install with development dependencies:

```shell
pip install -e ".[dev]"
```
## Configuration

Create a `providers.yml` file:

```yaml
providers:
  deepseek:
    base_url: "https://api.deepseek.com/v1"
    api_key: ${DEEPSEEK_API_KEY}
    default_model: "deepseek-chat"
    models:
      - name: "deepseek-chat"
        context_length: 128000
        functions:
          json_output: true
```
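Environment variable resolution typically means placeholders like `${DEEPSEEK_API_KEY}` are substituted from the process environment at load time. A minimal sketch of one common way to do this (the helper below is illustrative, not the library's actual mechanism):

```python
import os
import re

# Matches ${VAR_NAME} placeholders (upper-case letters, digits, underscores).
_VAR = re.compile(r"\$\{([A-Z0-9_]+)\}")

def resolve_env(value: str) -> str:
    """Replace ${VAR} placeholders with values from the environment."""
    return _VAR.sub(lambda m: os.environ.get(m.group(1), ""), value)

os.environ["DEEPSEEK_API_KEY"] = "sk-demo"
print(resolve_env("${DEEPSEEK_API_KEY}"))  # sk-demo
```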
## Usage

### Async Usage

```python
import asyncio

from llm_engine import LLMConfig, LLMProvider, LLMEngine

config = LLMConfig(
    provider=LLMProvider.DEEPSEEK,
    model_name="deepseek-chat",
    api_key="your-api-key",
)

async def main():
    engine = LLMEngine(config)
    response = await engine.generate("Hello, world!")

asyncio.run(main())
```
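Streaming responses are usually consumed incrementally with `async for`. The async generator below is a stand-in to show the consumption pattern only; it is not the library's streaming API:

```python
import asyncio

async def stream_tokens(chunks):
    """Stand-in for a streaming provider response: yields chunks as they arrive."""
    for chunk in chunks:
        await asyncio.sleep(0)  # simulate waiting on the network
        yield chunk

async def main():
    out = []
    async for chunk in stream_tokens(["Hel", "lo"]):
        out.append(chunk)  # each chunk can be rendered as soon as it arrives
    return "".join(out)

print(asyncio.run(main()))  # Hello
```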
### Sync Usage

```python
from llm_engine import LLMConfig, LLMProvider
from llm_engine.providers.openai_compatible import OpenAICompatibleProvider

config = LLMConfig(
    provider=LLMProvider.DEEPSEEK,
    model_name="deepseek-chat",
    api_key="your-api-key",
)

provider = OpenAICompatibleProvider(config)
response = provider.call("Hello, world!")
```
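Token estimation is often done with a cheap character-based heuristic (roughly four characters per token for English text) to check a prompt against a model's `context_length` before sending it. A rough sketch under that assumption; these helpers are hypothetical, not the library's estimator:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: about 4 characters per token for English text."""
    return max(1, len(text) // 4)

def fits_context(text: str, context_length: int = 128000, reserve: int = 1024) -> bool:
    """Check whether a prompt plausibly fits, reserving room for the reply."""
    return estimate_tokens(text) + reserve <= context_length

print(estimate_tokens("Hello, world!"))  # 3
print(fits_context("Hello, world!"))     # True
```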
## License

MIT
## Download files

### Source Distribution

`llm_api_engine-0.1.2.tar.gz` (15.4 kB)
### Built Distribution

`llm_api_engine-0.1.2-py3-none-any.whl` (19.0 kB)
## File details

Details for the file `llm_api_engine-0.1.2.tar.gz`.

### File metadata

- Download URL: llm_api_engine-0.1.2.tar.gz
- Upload date:
- Size: 15.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.11

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `d8adc8b007cdefeb09bcbec23a2929b6100dfbfa5eb60ea73a174ec58782d56f` |
| MD5 | `e20bfbe8a7a92b5a29b0d622bb590ce7` |
| BLAKE2b-256 | `8cced2dd259e791bf7b69c95c57c48bcd68f9402b254d31b46e77517d5ab44c0` |
## File details

Details for the file `llm_api_engine-0.1.2-py3-none-any.whl`.

### File metadata

- Download URL: llm_api_engine-0.1.2-py3-none-any.whl
- Upload date:
- Size: 19.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.11

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `40fad96e30e6e8dc2b57a13ca1df7826bd1f103c23ac632cb97205019d2009e4` |
| MD5 | `7a4cb8c235ad54ecaa8950cd5dbef9ea` |
| BLAKE2b-256 | `f30ee87da01f3a797f96dce6bec7afe34ec5cb30b15f3725b01b6d9b03dfea46` |