# ai-bridge-kit

Unified Python bridge for integrating AI providers and custom AI functions.

ai-bridge-kit is a Python library that integrates AI providers through one clean API.
You can:
- Switch providers without rewriting app logic.
- Register your own local AI functions.
- Use retries and timeouts consistently.
- Keep your invention logic inside a reusable, package-ready SDK.
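The "retries and timeouts" point above can be sketched in plain Python. This is an illustration of the general pattern only, assuming exponential backoff and a total time budget; the function name, defaults, and behavior are our assumptions, not ai-bridge-kit's internals:

```python
import time

def call_with_retries(fn, *, retries=3, backoff=0.5, timeout=10.0):
    """Retry a provider call with exponential backoff and a total time budget.

    Illustrative sketch only -- names and defaults are assumptions,
    not ai-bridge-kit's actual implementation.
    """
    deadline = time.monotonic() + timeout
    last_err = None
    for attempt in range(retries):
        if time.monotonic() >= deadline:
            break  # time budget exhausted; stop retrying
        try:
            return fn()
        except Exception as err:  # a real implementation would narrow this
            last_err = err
            time.sleep(backoff * (2 ** attempt))
    raise TimeoutError(f"call failed after {retries} attempts") from last_err
```

Applying this wrapper at the client layer is what lets every provider share one retry policy instead of each adapter reimplementing it.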
## Install

```bash
pip install -e .
```

With OpenAI support:

```bash
pip install -e ".[openai]"
```

With Anthropic support:

```bash
pip install -e ".[anthropic]"
```

Install all provider extras:

```bash
pip install -e ".[all]"
```

For development:

```bash
pip install -e ".[dev,all]"
```
## Quick Start

```python
from ai_bridge_kit import AIClient

client = AIClient()
# Uses built-in local provider by default.
chat = client.chat("Explain AI integration in one line.")
print(chat.content)

emb = client.embed(["hello world"])
print(len(emb.vectors[0]))
```
## Register your own AI functions

```python
from ai_bridge_kit import AIClient
from ai_bridge_kit.providers import LocalFunctionProvider

provider = LocalFunctionProvider(name="my-ai")
provider.register("chat", lambda payload: "Custom answer")
provider.set_chat_function("chat")

client = AIClient()
client.register_provider(provider, set_default=True)
print(client.chat("hi").content)
```
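The pattern above boils down to a name-to-callable registry with a designated chat entry point. A minimal stdlib-only sketch of that idea (a toy stand-in, not the library's actual `LocalFunctionProvider` class):

```python
class FunctionRegistry:
    """Toy version of the register/dispatch pattern shown above.

    Names and behavior here are illustrative assumptions, not
    ai-bridge-kit's real LocalFunctionProvider.
    """

    def __init__(self, name):
        self.name = name
        self._functions = {}       # registered name -> callable
        self._chat_function = None # which registered function handles chat

    def register(self, name, fn):
        self._functions[name] = fn

    def set_chat_function(self, name):
        if name not in self._functions:
            raise KeyError(f"unknown function: {name}")
        self._chat_function = name

    def chat(self, payload):
        return self._functions[self._chat_function](payload)


provider = FunctionRegistry(name="my-ai")
provider.register("chat", lambda payload: "Custom answer")
provider.set_chat_function("chat")
```

Because the registry exposes the same `chat` surface as any remote adapter, the client can dispatch to local functions and hosted providers interchangeably.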
## OpenAI Provider (optional)

```python
import os

from ai_bridge_kit import AIClient
from ai_bridge_kit.providers import OpenAIProvider

client = AIClient()
client.register_provider(
    OpenAIProvider(api_key=os.environ["OPENAI_API_KEY"]),
    set_default=True,
)

resp = client.chat("Give 3 startup names for an AI integration SDK.", model="gpt-4o-mini")
print(resp.content)
```
## Additional Provider Adapters

### Anthropic

```python
import os

from ai_bridge_kit import AIClient
from ai_bridge_kit.providers import AnthropicProvider

client = AIClient()
client.register_provider(
    AnthropicProvider(api_key=os.environ["ANTHROPIC_API_KEY"]),
    set_default=True,
)
print(client.chat("Summarize agentic AI in one sentence.").content)
```
### Ollama (local)

```python
from ai_bridge_kit import AIClient
from ai_bridge_kit.providers import OllamaProvider

client = AIClient()
client.register_provider(
    OllamaProvider(base_url="http://localhost:11434", default_chat_model="llama3.2"),
    set_default=True,
)
print(client.chat("Explain RAG briefly.").content)
```
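Under the hood, chatting with a local Ollama server is a plain HTTP POST to its `/api/chat` endpoint. A stdlib-only sketch of the request an adapter would send (the endpoint and payload shape follow Ollama's public API; the helper function is our illustration, not ai-bridge-kit code):

```python
import json
import urllib.request

def build_ollama_chat_request(base_url, model, message):
    """Build the POST request for Ollama's /api/chat endpoint.

    Payload shape follows Ollama's documented API; this helper is
    an illustration, not the library's actual adapter.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": message}],
        "stream": False,  # ask for one complete JSON response, not a stream
    }
    return urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_ollama_chat_request("http://localhost:11434", "llama3.2", "Explain RAG briefly.")
# urllib.request.urlopen(req) would return a JSON body containing a
# "message" field when an Ollama server is running locally.
```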
### OpenAI-compatible APIs (OpenRouter/Groq/Together)

```python
import os

from ai_bridge_kit import AIClient
from ai_bridge_kit.providers import OpenAICompatibleProvider

client = AIClient()
client.register_provider(
    OpenAICompatibleProvider.for_openrouter(api_key=os.environ["OPENROUTER_API_KEY"]),
    set_default=True,
)
print(client.chat("Give 3 names for an AI bridge SDK.").content)
```
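These adapters work because each service exposes the same OpenAI-style `/chat/completions` route at a different base URL, so one request builder covers them all. A sketch of that idea (the base URLs below are these providers' publicly documented endpoints as we understand them; verify against each provider's docs, and note the helper is illustrative, not library code):

```python
COMPAT_BASE_URLS = {
    # OpenAI-compatible endpoints -- confirm against each provider's docs.
    "openrouter": "https://openrouter.ai/api/v1",
    "groq": "https://api.groq.com/openai/v1",
    "together": "https://api.together.xyz/v1",
}

def chat_completions_url(provider):
    """Same OpenAI-style route on every compatible service."""
    return f"{COMPAT_BASE_URLS[provider]}/chat/completions"
```

Swapping providers then reduces to swapping the base URL and API key, which is exactly what convenience constructors like `for_openrouter` hide.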
## CLI

```bash
ai-bridge providers
ai-bridge chat --message "What is retrieval augmented generation?"
ai-bridge embed --text "ai" --text "python"
ai-bridge call --function echo --arguments "{}"
```
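The `--arguments` flag on `ai-bridge call` takes a JSON object. Parsing such a flag safely can be sketched with the stdlib (an illustration of the pattern with a hypothetical helper name; the CLI's actual parsing may differ):

```python
import json

def parse_cli_arguments(raw):
    """Parse a --arguments JSON string into a dict, rejecting non-objects.

    Illustrative sketch only; not the ai-bridge CLI's real code.
    """
    try:
        value = json.loads(raw)
    except json.JSONDecodeError as err:
        raise SystemExit(f"--arguments is not valid JSON: {err}")
    if not isinstance(value, dict):
        raise SystemExit('--arguments must be a JSON object, e.g. "{}"')
    return value
```

Raising `SystemExit` with a message is the idiomatic way for a CLI to report a bad flag value and exit nonzero.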
## Run tests

```bash
python -m pytest
```
## Publish to PyPI

See RELEASE.md for the full build and twine process.
## Patent workflow support

Use PATENT_DISCLOSURE_TEMPLATE.md to document your technical novelty before filing.
## File details

### ai_bridge_kit-0.1.1.tar.gz

- Size: 15.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | 00232c27de690983596bad0f2b27a1cb62bb65a58e38b1e325d72904a49e5521 |
| MD5 | 5b34202b480366e49bc5074ce040e5a1 |
| BLAKE2b-256 | f9aa38cc23d9d74a0fd7429e24a09c9481c0eb575abdf64c880d1a28a37a2251 |
### ai_bridge_kit-0.1.1-py3-none-any.whl

- Size: 19.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | e4120a3dd9b902dbcfd41e154eacd5bc03fbac18a833547c6f433abc832edd51 |
| MD5 | 40090d9e953f2ab8368147e9c258d182 |
| BLAKE2b-256 | c327b58898dfa73bef71e625b7df6439fe8d9b6c3d3b23e169c28472c7b7f6a7 |