
Project description

LLM Bridge

LLM Bridge is a Python library that wraps multiple LLM providers into a consistent API while using each provider's native SDK internally, supporting multimodal I/O, file processing, and streaming output.

GitHub: https://github.com/windsnow1025/LLM-Bridge

PyPI: https://pypi.org/project/LLM-Bridge/

Workflow and Features

  1. Message Preprocessor: extracts text content from documents (Word, Excel, PowerPoint, code files, PDFs) that are not natively supported by the target model.
  2. Chat Client Factory: creates a client for the specific LLM API with model parameters.
    1. Model Message Converter: converts general messages to model-specific messages.
      1. Media Processor: converts general media (image, audio, video, PDF) to model-compatible formats.
  3. Chat Client: generates streaming or non-streaming responses.
    • Model Thoughts: captures the model's thinking process.
    • Code Execution: generates and executes Python code.
    • Web Search: generates responses from search results.
    • Token Counter: tracks and reports input and output token usage.
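The pipeline above can be sketched in a few lines. This is a purely illustrative mock of the data flow, not LLM Bridge's actual API: the names Message, preprocess_message, to_model_messages, and FakeChatClient are hypothetical stand-ins.

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    role: str
    text: str
    files: list = field(default_factory=list)  # (filename, extracted_text) pairs

def preprocess_message(msg: Message) -> Message:
    # Step 1: inline text extracted from documents the model cannot read natively
    extracted = [f"[{name}]: {content}" for name, content in msg.files]
    return Message(msg.role, "\n".join([msg.text, *extracted]))

def to_model_messages(msgs):
    # Step 2.1: convert general messages to a provider-specific wire format
    return [{"role": m.role, "content": m.text} for m in msgs]

class FakeChatClient:
    # Step 3: a stand-in client that streams a canned response chunk by chunk
    def generate_stream(self, model_messages):
        for chunk in ["Hello", ", ", "world"]:
            yield chunk

msgs = [Message("user", "Summarize this file:", files=[("notes.txt", "uv is fast")])]
model_messages = to_model_messages([preprocess_message(m) for m in msgs])
reply = "".join(FakeChatClient().generate_stream(model_messages))
print(reply)  # -> Hello, world
```

The real library swaps FakeChatClient for a provider-specific client built by the Chat Client Factory, while the message shape stays the same on the caller's side.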

Supported Features for API Types

The features listed represent the maximum capabilities of each API type supported by LLM Bridge.

API Type | Input Formats                  | Capabilities                                            | Output Formats
OpenAI   | Text, Image, PDF               | Thinking, Web Search, Code Execution, Structured Output | Text, Image
Gemini   | Text, Image, PDF, Audio, Video | Thinking, Web Search, Code Execution, Structured Output | Text, Image, File
Claude   | Text, Image, PDF               | Thinking, Web Search, Code Execution, Structured Output | Text, File
Grok     | Text, Image                    | (none)                                                  | Text
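The capability matrix above can be encoded as data for quick feature checks. This is only an illustrative encoding of the table, assuming nothing about how LLM Bridge represents capabilities internally:

```python
# Capability matrix from the table above -- illustrative encoding, not the library's API.
FULL_FEATURES = {"Thinking", "Web Search", "Code Execution", "Structured Output"}

CAPABILITIES = {
    "OpenAI": {"input": {"Text", "Image", "PDF"},
               "features": FULL_FEATURES,
               "output": {"Text", "Image"}},
    "Gemini": {"input": {"Text", "Image", "PDF", "Audio", "Video"},
               "features": FULL_FEATURES,
               "output": {"Text", "Image", "File"}},
    "Claude": {"input": {"Text", "Image", "PDF"},
               "features": FULL_FEATURES,
               "output": {"Text", "File"}},
    "Grok":   {"input": {"Text", "Image"},
               "features": set(),
               "output": {"Text"}},
}

def supports(api_type: str, kind: str, value: str) -> bool:
    """True if the given API type lists `value` under `kind` (input/features/output)."""
    return value in CAPABILITIES[api_type][kind]

print(supports("Gemini", "input", "Video"))      # True
print(supports("Grok", "features", "Thinking"))  # False
```

A lookup like this lets an application degrade gracefully, e.g. stripping video attachments before routing a request to an API type that cannot accept them.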

Planned Features

  • More features for API Types
  • Native support for Grok

Development

Python (uv)

  1. Install uv (Windows PowerShell): powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex" — on Linux/macOS: curl -LsSf https://astral.sh/uv/install.sh | sh
  2. Install Python with uv: uv python install 3.12; upgrade Python with uv: uv python upgrade 3.12
  3. Install dependencies:
uv sync --refresh

PyCharm

Add New Interpreter >> Add Local Interpreter

  • Environment: Select existing
  • Type: uv

Usage

Copy ./usage/.env.example to ./usage/.env, then fill in the environment variables.
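The .env file typically holds provider API keys as KEY=VALUE lines. The usage scripts presumably load it with a dotenv-style library (an assumption); a minimal stdlib-only loader sketch looks like this:

```python
# Minimal .env loader sketch -- illustrative only; not LLM Bridge's loading code.
import os
import tempfile

def load_env(path: str) -> dict:
    """Parse KEY=VALUE lines (skipping blanks and # comments) into os.environ."""
    loaded = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip().strip('"')
    os.environ.update(loaded)
    return loaded

# Demo with a temporary file standing in for ./usage/.env
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("# provider keys\nOPENAI_API_KEY=sk-test\n")
    path = f.name
vars_loaded = load_env(path)
os.unlink(path)
print(vars_loaded)  # -> {'OPENAI_API_KEY': 'sk-test'}
```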

Build

uv build


Download files


Source Distribution

llm_bridge-1.17.9.tar.gz (82.7 kB)

Built Distribution


llm_bridge-1.17.9-py3-none-any.whl (44.4 kB)

File details

Details for the file llm_bridge-1.17.9.tar.gz.

File metadata

  • Download URL: llm_bridge-1.17.9.tar.gz
  • Size: 82.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.11.6 (Ubuntu 24.04, CI)

File hashes

Hashes for llm_bridge-1.17.9.tar.gz:
  • SHA256: 1b514ffc1de53a7d23694cab4503c30e7ae286a0a345013880879bb38e0aea04
  • MD5: a2e43ec2b6fe92244dbd0536b0d61c51
  • BLAKE2b-256: f03dcaa8fba6b9f8952be09fec457a87810f30eade7dfb371e4deee23d855b4b

File details

Details for the file llm_bridge-1.17.9-py3-none-any.whl.

File metadata

  • Download URL: llm_bridge-1.17.9-py3-none-any.whl
  • Size: 44.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.11.6 (Ubuntu 24.04, CI)

File hashes

Hashes for llm_bridge-1.17.9-py3-none-any.whl:
  • SHA256: 7d8f93e363e9dc461bfda58bd4eda2804498bfb442acdef84e47f8b18ada65f3
  • MD5: 7256202cdc7ad2e07e6ebbbb3dcc418d
  • BLAKE2b-256: fde9854ea9b1da1e80da8a0b78e1f8f058eabc527bb1843a784b7c3dcddd9a80
