

Project description

LLM Engine

Unified LLM API provider and engine library for Python.

Features

  • Support for multiple LLM providers (OpenAI, DeepSeek, Ollama, Custom)
  • Both synchronous and asynchronous API support
  • Unified configuration via providers.yml
  • Environment variable resolution
  • Automatic retry logic
  • Streaming support
  • Token estimation and resource management
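The automatic retry behavior can be illustrated with a generic exponential-backoff sketch. This is the general pattern such engines apply to transient network errors, not the library's actual implementation; call_with_retry and flaky are hypothetical names used for illustration only.

```python
import random
import time

def call_with_retry(fn, attempts=3, base_delay=0.5):
    """Call fn(), retrying on exception with exponential backoff and jitter."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                # Out of attempts: propagate the last error.
                raise
            # Sleep base_delay * 2^attempt plus a little jitter, then retry.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.05))

# Demo: a function that fails twice before succeeding.
calls = {"n": 0}

def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(call_with_retry(flaky, base_delay=0.01))  # -> ok
```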

Installation

Install from PyPI:

pip install llm-engine

For local development:

cd llm-engine
pip install -e .

Or install with development dependencies:

pip install -e ".[dev]"

Configuration

Create a providers.yml file:

providers:
  deepseek:
    base_url: "https://api.deepseek.com/v1"
    api_key: ${DEEPSEEK_API_KEY}
    default_model: "deepseek-chat"
    models:
      - name: "deepseek-chat"
        context_length: 128000
        functions:
          json_output: true
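Environment variable resolution means that placeholders such as ${DEEPSEEK_API_KEY} in providers.yml are replaced with values from the process environment when the configuration is loaded. A minimal sketch of the idea, using only the standard library; the resolve_env helper is hypothetical and not part of the library's public API:

```python
import os
import re

# Matches ${NAME} placeholders, e.g. ${DEEPSEEK_API_KEY}.
_VAR = re.compile(r"\$\{(\w+)\}")

def resolve_env(value: str) -> str:
    """Replace each ${NAME} placeholder with os.environ['NAME'] (or '')."""
    return _VAR.sub(lambda m: os.environ.get(m.group(1), ""), value)

os.environ["DEEPSEEK_API_KEY"] = "sk-test"
print(resolve_env("${DEEPSEEK_API_KEY}"))  # -> sk-test
```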

Usage

Async Usage

import asyncio

from llm_engine import LLMConfig, LLMProvider, LLMEngine

config = LLMConfig(
    provider=LLMProvider.DEEPSEEK,
    model_name="deepseek-chat",
    api_key="your-api-key",
)

async def main():
    engine = LLMEngine(config)
    response = await engine.generate("Hello, world!")
    print(response)

asyncio.run(main())

Sync Usage

from llm_engine import LLMConfig, LLMProvider
from llm_engine.providers.openai_compatible import OpenAICompatibleProvider

config = LLMConfig(
    provider=LLMProvider.DEEPSEEK,
    model_name="deepseek-chat",
    api_key="your-api-key",
)

provider = OpenAICompatibleProvider(config)
response = provider.call("Hello, world!")

License

MIT

Download files

Download the file for your platform.

Source Distribution

llm_api_engine-0.1.1.tar.gz (15.4 kB)


Built Distribution


llm_api_engine-0.1.1-py3-none-any.whl (19.0 kB)


File details

Details for the file llm_api_engine-0.1.1.tar.gz.

File metadata

  • Download URL: llm_api_engine-0.1.1.tar.gz
  • Upload date:
  • Size: 15.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.11

File hashes

Hashes for llm_api_engine-0.1.1.tar.gz
Algorithm    Hash digest
SHA256       f06dfa859c8543ce810daa2fcb623e112f85c869f46a92f4dc32c2103bd7edcd
MD5          301bb1cf1b0dd481808ae535b2636420
BLAKE2b-256  fd7f201d187c23198fb92c6fafa517ef828677cbafa7cd871bcb3b7774bd13d5


File details

Details for the file llm_api_engine-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: llm_api_engine-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 19.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.11

File hashes

Hashes for llm_api_engine-0.1.1-py3-none-any.whl
Algorithm    Hash digest
SHA256       95d27c03d557c2960e6d5ef21ef3217cf2b148f4c47ea5b8c111b103b33c5a1c
MD5          eb4faf72347f522b2b95ed04f3f6cc66
BLAKE2b-256  f4b3621ac9daeeb9d5ead201ca6229340f8c315bfefda16a164127326cebdd4e

