turborg
An open-source modular AI agent framework. Plug in IRC, Discord, Telegram, web — power it with Claude or any LLM.
```python
from turborg.core import Agent, OutboundEnvelope
from turborg.connectors.irc import IRCConnector, IRCSettings
from turborg.llm.anthropic import AnthropicProvider

agent = Agent(llm=AnthropicProvider(api_key="sk-ant-..."))
agent.add_connector(IRCConnector(IRCSettings(hostname="irc.libera.chat", nick="myturborg")))

@agent.on_command("ask")
async def ask(envelope):
    answer = await agent.llm.ask(" ".join(envelope.args))
    return OutboundEnvelope.reply(envelope, answer)

agent.run()
```
That's a working IRC chatbot powered by Claude. Same code shape ports to any future connector — Discord, Telegram, web, custom.
What is turborg?
turborg is a Python framework for writing AI agents that connect to chat networks. It separates how the bot talks to a network (the connector) from how it thinks (the LLM provider) from what it does (your handlers). Add a Discord connector and the same handlers work on Discord. Swap from Claude to OpenAI and the same connectors keep working.
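The decorator-based handler API at the top of this README suggests a simple command registry underneath. A self-contained toy sketch of that dispatch seam (the class and method names here are illustrative, not turborg's actual internals):

```python
import asyncio


class Agent:
    """Toy command registry (illustrative; not turborg's real Agent)."""

    def __init__(self):
        self._commands = {}

    def on_command(self, name):
        def register(handler):
            self._commands[name] = handler
            return handler
        return register

    async def dispatch(self, name, envelope):
        return await self._commands[name](envelope)


agent = Agent()


@agent.on_command("ping")
async def ping(envelope):
    return "pong"


print(asyncio.run(agent.dispatch("ping", {})))  # -> pong
```

Because handlers only ever see envelopes, the same registry serves every connector unchanged.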
The architecture lives at three layers:
```
Your handlers (commands, event hooks)
        │
        ▼
┌──────────┐        ┌──────────────┐
│  Agent   │ ←───── │ LLM provider │   Anthropic / OpenAI / custom
└──────────┘        └──────────────┘
        │
        ▼
┌────────────────┐
│   Connectors   │   IRC / Discord / web / Telegram / ...
└────────────────┘
```
A normalized Envelope is the lingua franca: the agent never sees IRC-specific or Discord-specific messages; each connector translates its own protocol into envelopes.
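The README does not spell out what an Envelope carries; here is a plausible minimal shape (every field name below is an assumption for illustration, not turborg's documented schema):

```python
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Envelope:
    """Illustrative normalized message; turborg's real fields may differ."""
    connector: str                    # e.g. "irc", "discord"
    sender: str                       # normalized sender identity
    channel: str                      # room/channel the message arrived on
    text: str                         # raw message body
    args: list[str] = field(default_factory=list)  # parsed command arguments


# A connector would build this from a raw protocol event, e.g. an IRC PRIVMSG:
env = Envelope(connector="irc", sender="alice", channel="#turborg-test",
               text="!ask what is a quine?",
               args=["what", "is", "a", "quine?"])
print(env.connector, len(env.args))
```

Keeping the envelope immutable (frozen) would make it safe to fan out to multiple handlers.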
For the long-term vision — hive.xshellz.com, a shared-intelligence cloud any turborg instance can attach to — see docs/hive.md.
Status
v0.1.0 — alpha. The IRC connector and Anthropic provider are production-ready and well-tested (≥90% coverage gate enforced in CI). Other connectors are on the roadmap.
| Connector | Status | Install |
|---|---|---|
| IRC | Stable | pip install turborg |
| Discord | Roadmap | pip install turborg[discord] (TBD) |
| Telegram | Roadmap | pip install turborg[telegram] (TBD) |
| WhatsApp | Roadmap | pip install turborg[whatsapp] (TBD) |
| Web | v0.2 hook | pip install turborg[web] |

| LLM provider | Status | Install |
|---|---|---|
| Anthropic | Default | pip install turborg |
| OpenAI | Available | pip install turborg[openai] |
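Swapping providers works because the agent depends only on a narrow ask() seam. A self-contained sketch of that contract (the stub classes and the ask() signature are assumptions, not turborg's real provider ABC):

```python
import asyncio


class AnthropicStub:
    """Stand-in for AnthropicProvider; only the call seam matters here."""

    async def ask(self, prompt: str) -> str:
        return f"[claude] {prompt}"


class OpenAIStub:
    """Stand-in for a hypothetical OpenAI provider exposing the same seam."""

    async def ask(self, prompt: str) -> str:
        return f"[gpt] {prompt}"


class Agent:
    """Minimal core: the agent only ever calls provider.ask()."""

    def __init__(self, llm):
        self.llm = llm


async def main():
    answers = []
    for provider in (AnthropicStub(), OpenAIStub()):
        agent = Agent(llm=provider)   # swapping providers is this one line
        answers.append(await agent.llm.ask("hello"))
    return answers


print(asyncio.run(main()))  # -> ['[claude] hello', '[gpt] hello']
```

Handlers and connectors never learn which provider answered, which is what makes the swap a one-line change.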
Install

```shell
pip install turborg
```

Or with uv:

```shell
uv add turborg
```

For a working bot you also need an LLM API key:

```shell
export ANTHROPIC_API_KEY=sk-ant-...
```

See docs/configuration.md for the full list of environment variables.
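turborg's config layer is built on pydantic-settings, but the TURBORG_IRC_* convention can be illustrated with the stdlib alone (the defaults and the JSON-encoded list format below are assumptions based on the quickstart examples):

```python
import json
import os


def load_irc_settings(env=None):
    """Illustrative mapping of TURBORG_IRC_* variables to IRC settings."""
    env = os.environ if env is None else env
    return {
        "hostname": env.get("TURBORG_IRC_HOSTNAME", "irc.libera.chat"),
        "nick": env.get("TURBORG_IRC_NICK", "turborg"),
        # List-valued settings arrive as JSON strings, e.g. '["#turborg-test"]'.
        "channels": json.loads(env.get("TURBORG_IRC_CHANNELS", "[]")),
    }


cfg = load_irc_settings({"TURBORG_IRC_NICK": "myturborg",
                         "TURBORG_IRC_CHANNELS": '["#turborg-test"]'})
print(cfg["nick"], cfg["channels"])  # -> myturborg ['#turborg-test']
```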
Quickstart
The full 5-minute tutorial is at docs/quickstart.md. The 30-second version:
```shell
pip install turborg
export ANTHROPIC_API_KEY=sk-ant-...
export TURBORG_IRC_HOSTNAME=irc.libera.chat
export TURBORG_IRC_NICK=myturborg
export TURBORG_IRC_CHANNELS='["#turborg-test"]'
python examples/claude_powered_irc.py
```

Then in IRC:

```
<you> !ask what is a quine?
<myturborg> A quine is a program that prints its own source code...
```
Run with Docker
```shell
cp .env.example .env   # fill in ANTHROPIC_API_KEY + TURBORG_IRC_*
docker compose up
```

Or without compose:

```shell
docker run --env-file .env turborg/turborg:latest
```

To run a different example or your own bot, override command: in docker-compose.yml, or pass python /app/examples/minimal_irc_bot.py to docker run. Mount your own script with -v "$PWD/mybot.py:/app/mybot.py".
The image is multi-stage, ~150 MB on disk, and runs as a non-root user.
Documentation
- Quickstart — get a bot online in 5 minutes
- Architecture — how the agent, connectors, and LLM fit together
- Configuration — every setting and environment variable
- LLM providers — default Anthropic, swapping providers
- Writing a connector — add Discord, Telegram, your own
- Hive — the future shared-intelligence cloud
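Writing a connector comes down to translating protocol traffic into envelopes. A self-contained sketch of that translation step (the base-class name, method, and dict-shaped envelope are assumptions; see the "Writing a connector" doc for the real ABC):

```python
import abc


class Connector(abc.ABC):
    """Illustrative stand-in for turborg's connector ABC."""

    @abc.abstractmethod
    def to_envelope(self, raw: str) -> dict:
        """Translate one raw protocol message into a normalized envelope."""


class LineConnector(Connector):
    """Toy connector: each input line is a message; '!cmd args' is a command."""

    def to_envelope(self, raw: str) -> dict:
        text = raw.strip()
        is_command = text.startswith("!")
        return {
            "connector": "line",
            "text": text,
            "command": text.split()[0][1:] if is_command else None,
            "args": text.split()[1:] if is_command else [],
        }


env = LineConnector().to_envelope("!ask what is a quine?")
print(env["command"], env["args"])  # -> ask ['what', 'is', 'a', 'quine?']
```

Everything protocol-specific (handshakes, reconnects, rate limits) stays inside the connector; the agent only ever sees the normalized result.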
Examples
- examples/minimal_irc_bot.py — the smallest possible bot (!ping → pong)
- examples/claude_powered_irc.py — !ask <question> proxies to Claude
Project layout
```
turborg/
├── src/turborg/
│   ├── core/          Agent, envelope, event bus, command registry
│   ├── connectors/    Connector ABC + per-protocol implementations
│   │   └── irc/       IRC connector (handshake, parser, bouncer)
│   ├── llm/           LLM provider ABC + Anthropic implementation
│   ├── hive/          Hive client extension hook (noop default)
│   ├── api/           v0.2+ control-plane HTTP API (placeholder)
│   ├── config/        pydantic-settings config
│   └── cli.py         turborg CLI entry point
├── tests/             unit + integration suites
├── examples/          runnable example bots
└── docs/              full documentation
```
Contributing
Contributions are welcome. See CONTRIBUTING.md for the dev setup, branching strategy, and PR rules. By submitting a contribution you agree to the Contributor License Agreement — the cla-assistant bot will guide you on first PR.
The maintainers run a strict CI gate: every PR must pass ruff, mypy --strict, and tests with ≥90% coverage. See the Style section below.
License
Apache License 2.0 — see TRADEMARKS.md for the trademark policy on the names "turborg" and "xshellz".
Security
Found a vulnerability? Please do not open a public issue. See SECURITY.md for the responsible-disclosure process.
Style
- Conventional Commits for PR titles (feat:, fix:, docs:, refactor:, chore:)
- Squash-merge to main; linear history
- 100-char line limit, ruff-formatted, mypy-strict
- pytest with branch coverage; fail under 90%
- No Co-Authored-By: AI trailers in commit messages
Powered by
- Anthropic Claude — default LLM
- pydantic — settings and envelope validation
- typer — CLI
- hatchling — build backend
- uv — env management
Part of the xshellz ecosystem. The future hosted hive lives at hive.xshellz.com.