Async-first tool calling and LLM client orchestration. OpenAI-compatible, cacheable, framework-agnostic.
🧙‍♂️ Kebogyro
An async-first, Javanese-coded LLM orchestration toolkit — made for geeks who speak tools.
kebogyro is a Python package born from the union of 🛠️ Gyro Gearloose’s mechanical genius and 🎶 Kebogiro, the Javanese gamelan suite played in moments of ceremony, transition, and wonder.
Like its namesake, kebogyro is all about orchestrating — not music, but function calls, LLM responses, and microservice bridges. It's fast, extendable, and knows its way around a toolbench.
🎡 What's in a Name?
- "Gyro" = Gearloose-style inventiveness and tool-wrangling
- "Kebo" = Javanese for buffalo, symbolizing calm power, persistence, and resilience — qualities you want in long-running AI services ⚙️💡
Together: Kebogyro — a humble-yet-geeky orchestration layer that never breaks under load, and plays well with others.
🧬 Why Use kebogyro?
Because building LLM apps should be fast, flexible, and async-native:
- 🔁 Multi-Provider Orchestration — OpenAI, OpenRouter, Groq, Mistral, and more, with one clean interface.
- 🪛 Tool Calling — Let your model invoke Python functions directly.
- 🧠 Cache Smart — Redis, Django ORM, or memory-backed tool caching.
- 🔌 Zero Framework Lock-In — Integrate with anything: FastAPI, Flask, raw ASGI, or Celery.
- 🛠 Hackable by Design — Extend it, override it, plug it in where you want.
🧠 Core Concepts
Kebogyro is built around three key components:
🧵 LLMClientWrapper
Wraps any supported LLM provider (OpenAI, OpenRouter, Groq, etc.) with support for:
- Tool calling via `additional_tools`
- System prompts
- Result caching
- Optional `mcp_client` integration
```python
import os

llm = LLMClientWrapper(
    provider="openrouter",
    model_name="mistralai/mistral-7b-instruct",
    model_info={"api_key": os.getenv("OPENROUTER_API_KEY")},
    system_prompt="You are a helpful AI assistant.",
    mcp_client=mcp_client,          # Optional
    additional_tools=[my_tool],     # Optional
    llm_cache=MyLLMCacheAdapter(),  # Optional
)
```
Works seamlessly with or without tools.
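The `my_tool` passed above can be any plain Python function wrapped as a tool. A minimal sketch (the function and its name are hypothetical; the wrapping call mirrors the `SimpleTool.from_fn` usage shown in the full example below):

```python
def get_weather(city: str) -> str:
    """Stand-in for a real weather lookup; returns a canned report."""
    return f"It is sunny in {city}."

# Wrapping it for the LLM (mirrors SimpleTool.from_fn shown later):
# from kebogyro.utils import SimpleTool
# my_tool = SimpleTool.from_fn(
#     name="get_weather",
#     description="Returns the current weather for a city.",
#     fn=get_weather,
# )
```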
🛠 BBServerMCPClient
Adapter for remote tool orchestration using the MCP protocol. Supports multiple transports: `stdio`, `sse`, and streamable HTTP.
```python
mcp_client = BBServerMCPClient(
    connections={
        # Keys are namespaces (server names). You can define multiple MCP servers.
        # Each one is scoped separately in cache.
        "workroom_tools": {
            "url": "http://localhost:5000/sse",
            "transport": "sse",
        },
        "finance_tools": {
            "url": "http://localhost:5100/sse",
            "transport": "sse",
        },
    },
    cache_adapter=MyLLMCacheAdapter(),
)
```
Namespaces isolate tools per server and reflect in cache keys.
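One way to picture that isolation (kebogyro's actual key format may differ; this only illustrates per-namespace scoping):

```python
def namespaced_key(namespace: str, tool_name: str) -> str:
    """Illustrative cache key: the server namespace prefixes the tool name,
    so identically named tools on different servers never collide."""
    return f"{namespace}:{tool_name}"

# Two servers can each expose a "search" tool without clashing in cache:
key_a = namespaced_key("workroom_tools", "search")
key_b = namespaced_key("finance_tools", "search")
```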
🎭 create_agent
Creates an orchestration pipeline that connects:
- LLM
- Optional tools
- Optional MCP tools
```python
from kebogyro.agent_executor import create_agent

agent = create_agent(
    llm_client=llm,
    tools=[my_tool],       # Optional standard tools
    mcp_tools=mcp_client,  # Optional BBServerMCPClient
    system_prompt="You're a coding agent.",
    stream=True,
)

response = await agent.ainvoke({"input": "What's the weather like in Yogyakarta?"})
```
You can mix & match standard tools and MCP tools — or use none at all.
⚡️ Full Example: Caching + Custom Tools
```python
import os

from kebogyro.wrapper import LLMClientWrapper
from kebogyro.mcp_adapter.client import BBServerMCPClient
from kebogyro.agent_executor import create_agent
from kebogyro.cache import AbstractLLMCache
from kebogyro.utils import SimpleTool


class MyLLMCacheAdapter(AbstractLLMCache):
    async def aget_value(self, key): ...
    async def aset_value(self, key, value, expiry_seconds): ...
    async def adelete_value(self, key): ...
    async def is_expired(self, key): ...


def say_hello(name: str) -> str:
    return f"Hello, {name}!"


hello_tool = SimpleTool.from_fn(
    name="say_hello",
    description="Greets the user.",
    fn=say_hello,
)

mcp_client = BBServerMCPClient(
    connections={
        "tools": {
            "url": "http://localhost:5000/sse",
            "transport": "sse",
        }
    },
    cache_adapter=MyLLMCacheAdapter(),
)

llm = LLMClientWrapper(
    provider="openrouter",
    model_name="mistralai/mistral-7b-instruct",
    model_info={"api_key": os.getenv("OPENROUTER_API_KEY")},
    mcp_client=mcp_client,
    additional_tools=[hello_tool],
    llm_cache=MyLLMCacheAdapter(),
)

agent = create_agent(
    llm_client=llm,
    tools=[hello_tool],
    mcp_tools=mcp_client,
    system_prompt="You're a cached, tool-capable agent.",
    stream=False,
)

response = await agent.ainvoke({"input": "Say hello to Lantip"})
```
🌐 LLM Provider Config
Edit config.py to register or override providers:
```python
from typing import Dict

from pydantic import BaseModel, Field, HttpUrl


class LLMClientConfig(BaseModel):
    base_urls: Dict[str, HttpUrl] = Field(
        default_factory=lambda: {
            "openrouter": "https://openrouter.ai/api/v1",
            "anthropic": "https://api.anthropic.com/v1/",
            "cerebras": "https://api.cerebras.ai/v1",
            "groq": "https://api.groq.ai/openai/v1",
            "requesty": "https://router.requesty.ai/v1",
        }
    )
    google_default_base_url: HttpUrl = "https://generativelanguage.googleapis.com/v1beta/openai/"
```
Add a custom provider:

```python
config = LLMClientConfig()
config.base_urls["my_llm"] = "https://my-custom-llm.com/api"
```
📦 Install
```bash
pip install kebogyro
```
🙌 Contributing
Pull requests, bug reports, ideas, memes — all welcome.
📄 License
MIT — Use it, remix it, extend it. Let the gamelan guide your tools.
File details

Details for the file kebogyro-0.1.1.tar.gz:

- Size: 22.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.3

| Algorithm | Hash digest |
|---|---|
| SHA256 | `4a143dcf5665658e3cd3d6f497949ec8a17c22a7327ef8abdd1a12329da57247` |
| MD5 | `62401d7faefb3a89cc2a331f5eb49385` |
| BLAKE2b-256 | `8dcb59526f146d8631be82b57a0ac0b3c72db774360f3459de749227af2f31a4` |
Details for the file kebogyro-0.1.1-py3-none-any.whl:

- Size: 21.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.3

| Algorithm | Hash digest |
|---|---|
| SHA256 | `73d7d3e3e043773289ee5cfc9c302518b0807c32bf24f81d36f69ca73a17a64a` |
| MD5 | `2dbe14259f813024aa560f5e9bc5519d` |
| BLAKE2b-256 | `b423582531740fb9c0bdac7f0d2ca7b771659d2e959ef61cad1997c9b9080d85` |