
A terminal AI chat where everything has hooks


Kollabor


terminal AI where agents talk to each other, organize themselves, and keep working when you walk away.

i built this because every AI tool i tried was a single chat window. i wanted agents that actually collaborate -- one reads the codebase, another writes code, a third reviews it, and they coordinate without me micromanaging every step. so i built that.

three agents. you give one a task. it breaks it down, delegates to the others, they execute, report back, and you get the result. they remember things across sessions. they get smarter over time. and if you're not at your computer, they message you on telegram and you can talk back.

brew install kollaborai/tap/kollabor   # macOS
kollab                                  # start chatting
kollab --agent jarvis                   # launch with an agent persona
kollab --hub status                     # see who's online from the shell
kollab --hub msg jarvis "fix the auth"  # talk to agents without opening the app
kollab --hub kill lapis                 # shut down an agent remotely

what makes this different

  • agents that coordinate -- launch 3 agents, give one a task, watch it delegate to the others and report back
  • persistent memory -- agents remember what they learned across sessions. they dream when idle and crystallize knowledge
  • open channel -- every agent sees every message. like a team slack, not isolated chat windows
  • hook everything -- every stage of the pipeline is interceptable. user input, API calls, responses, tool use, rendering
  • any LLM -- OpenAI, Anthropic, Google, Azure, OpenRouter, Ollama, or any OpenAI-compatible endpoint
  • telegram bridge -- talk to your agents from your phone. send voice notes, they transcribe locally with whisper
  • plugin system -- drop a Python file in plugins/, it loads automatically
  • no tmux -- pure subprocess. no external dependencies for agent management
  • pipe mode -- echo "query" | kollab -p for scripting

Install

brew install kollaborai/tap/kollabor        # macOS (recommended)
curl -sS https://raw.githubusercontent.com/kollaborai/kollabor-cli/main/install.sh | bash  # cross-platform

Or manually: `uv tool install kollab` / `pipx install kollab` / `pip install kollab`

Quick Start

Kollabor auto-detects your API keys from standard environment variables:

| Environment Variable | Provider | Notes |
|---|---|---|
| `ANTHROPIC_API_KEY` | Anthropic | Claude models |
| `OPENAI_API_KEY` | OpenAI | GPT models |
| `GEMINI_API_KEY` | Google | Gemini models |
| `OPENROUTER_API_KEY` | OpenRouter | 300+ models from any provider |

export ANTHROPIC_API_KEY=sk-ant-...
kollab

That's it. No config files needed.

OpenAI OAuth (ChatGPT subscription)

Use your existing ChatGPT Plus/Pro account -- no API key needed:

kollab --login openai

Opens your browser, you authorize, and you're in. Uses the Responses API with your subscription quota.

Custom Profiles

For more control, create named profiles with env vars following the pattern KOLLABOR_{NAME}_{FIELD}:

# Local LLM via Ollama
KOLLABOR_LOCAL_PROVIDER=custom
KOLLABOR_LOCAL_BASE_URL=http://localhost:11434/v1
KOLLABOR_LOCAL_MODEL=llama3.1

kollab --profile local

Use /profile interactively to list, switch, and create profiles. See FEATURES.md for all configuration options.
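The `KOLLABOR_{NAME}_{FIELD}` convention maps naturally onto a dict of profiles. A minimal sketch of how such env vars could be collected -- our own illustration, not Kollabor's actual loader:

```python
# Toy parser for the KOLLABOR_{NAME}_{FIELD} convention (illustration
# only, NOT Kollabor's real config code): group every matching env var
# into a per-profile dict keyed by the lowercased name and field.
def load_profiles(environ: dict[str, str]) -> dict[str, dict[str, str]]:
    profiles: dict[str, dict[str, str]] = {}
    for key, value in environ.items():
        if not key.startswith("KOLLABOR_") or key.count("_") < 2:
            continue
        # "KOLLABOR_LOCAL_BASE_URL" -> name "LOCAL", field "BASE_URL"
        name, field = key.removeprefix("KOLLABOR_").split("_", 1)
        profiles.setdefault(name.lower(), {})[field.lower()] = value
    return profiles

env = {
    "KOLLABOR_LOCAL_PROVIDER": "custom",
    "KOLLABOR_LOCAL_BASE_URL": "http://localhost:11434/v1",
    "KOLLABOR_LOCAL_MODEL": "llama3.1",
}
profiles = load_profiles(env)
```

Note the toy parser treats the first underscore-delimited token as the profile name; multi-token names or hub settings like `KOLLABOR_HUB_BRIDGE_TOKEN` would need Kollabor's real parsing rules.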

Pipe Mode

kollab "What is the capital of France?"        # direct query
echo "Explain this code" | kollab -p           # from stdin
cat document.txt | kollab -p                   # from file
kollab --timeout 5min "Complex analysis task"  # with timeout
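Pipe mode composes with ordinary scripting. A hedged sketch driving `kollab -p` from Python with `subprocess` (assumes `kollab` is on PATH; combining `-p` with `--timeout` is our assumption):

```python
# Sketch: script against Kollabor's pipe mode. The prompt goes in on
# stdin (that's what -p is for) and the answer comes back on stdout.
import shutil
import subprocess

def build_cmd(timeout: str = "2min") -> list[str]:
    # -p reads the query from stdin; --timeout bounds the run
    return ["kollab", "-p", "--timeout", timeout]

def ask(prompt: str, timeout: str = "2min") -> str:
    result = subprocess.run(
        build_cmd(timeout), input=prompt, capture_output=True, text=True
    )
    return result.stdout.strip()

if shutil.which("kollab"):  # only run when kollabor is installed
    print(ask("What is the capital of France?"))
```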

Providers

| Provider | Type | How to Connect |
|---|---|---|
| Anthropic | Native | `ANTHROPIC_API_KEY` |
| OpenAI | Native | `OPENAI_API_KEY` or `kollab --login openai` |
| Google Gemini | Native | `GEMINI_API_KEY` |
| Azure OpenAI | Native | `KOLLABOR_AZURE_*` env vars |
| OpenRouter | Gateway | `OPENROUTER_API_KEY` (300+ models) |
| Ollama | Custom | `KOLLABOR_LOCAL_BASE_URL=http://localhost:11434/v1` |
| LM Studio | Custom | `KOLLABOR_LMSTUDIO_BASE_URL=http://localhost:1234/v1` |
| Any OpenAI-compatible | Custom | `KOLLABOR_{NAME}_BASE_URL=...` |

Hooks & Plugins

Every stage of the pipeline is hookable:

user input → pre_user_input → pre_api_request → [LLM API] → post_api_response → pre_message_display → output
                                                    ↓
                                              pre_tool_use → [tool execution] → post_tool_use
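The flow above can be modeled as a tiny event bus: hooks registered per stage run in priority order, and each may rewrite the context it receives. A toy model for intuition only, not Kollabor's internals:

```python
# Toy event bus modeling the hook pipeline above (NOT Kollabor's
# implementation): stages map to ordered hook lists, and emitting an
# event threads the context through every hook in priority order.
from collections import defaultdict

class ToyEventBus:
    def __init__(self):
        self._hooks = defaultdict(list)  # stage -> [(priority, fn)]

    def register_hook(self, stage, fn, priority=0):
        self._hooks[stage].append((priority, fn))
        self._hooks[stage].sort(key=lambda pair: pair[0])  # lower runs first

    def emit(self, stage, context):
        for _, fn in self._hooks[stage]:
            context = fn(context)  # each hook may rewrite the context
        return context

bus = ToyEventBus()
bus.register_hook(
    "pre_api_request",
    lambda ctx: {**ctx, "headers": {**ctx["headers"], "X-Custom": "value"}},
)
ctx = bus.emit("pre_api_request", {"messages": ["hi"], "headers": {}})
```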

Write a plugin:

from kollabor_plugins import BasePlugin
from kollabor_events import EventType, HookPriority

class MyPlugin(BasePlugin):
    def register_hooks(self):
        self.event_bus.register_hook(
            EventType.PRE_API_REQUEST,
            self.on_request,
            priority=HookPriority.NORMAL
        )

    async def on_request(self, context):
        # inject custom headers, modify messages, add tools, whatever
        context["headers"]["X-Custom"] = "value"
        return context

Drop it in plugins/ and it loads automatically. Plugins can register slash commands, add status bar widgets, merge config, and hook into 30+ event types.

Agent Hub

Agents auto-discover each other and form a peer-to-peer mesh. They see who's online, send messages, and coordinate work -- all through the open channel.

kollab --agent jarvis                   # launch with agent identity
kollab --agent coder --designation ruby  # override hub designation

Hub CLI

Manage the mesh without opening a TUI:

kollab --hub status                     # who's online
kollab --hub capture jarvis 100         # last 100 lines of agent output
kollab --hub kill lapis                 # remote shutdown (clean exit)
kollab --hub msg jarvis "fix the auth"  # send a message
kollab --attach jarvis                  # stream agent output (read-only)

Hub Features

  • Gem designations -- 24 gem-inspired names with color castes (lapis, peridot, ruby...)
  • Open channel -- all agents see all messages (like a team chat)
  • Vaults -- persistent memory across sessions (stream, working memory, crystallized insights)
  • Dreaming -- idle agents review their vault and crystallize knowledge automatically
  • Task Ledger -- compaction-proof task persistence with QA review flow (assign, checkpoint, complete, QA approve/reject)
  • Hub Cron -- schedule recurring messages to agents on intervals (e.g. every 5m, 1h)
  • Telegram Bridge -- bidirectional messaging with voice transcription via local whisper
  • Skill routing -- coordinator matches tasks to agents by capability
  • Organizations -- launch entire teams from JSON org charts
  • User broadcasting -- when you talk to one agent, all peers see it
  • Remote kill -- shut down any agent on the mesh from the CLI or another agent
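The cron intervals above ("every 5m, 1h") read as number-plus-unit strings. One way they could be parsed, sketched for illustration (not Kollabor's scheduler):

```python
# Sketch: convert interval specs like "5m" or "1h" into seconds.
# The s/m/h/d unit set is our assumption; the docs only show m and h.
UNITS = {"s": 1, "m": 60, "h": 3600, "d": 86400}

def interval_seconds(spec: str) -> int:
    value, unit = spec[:-1], spec[-1]
    return int(value) * UNITS[unit]
```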

Telegram Bridge

Talk to your agents from your phone. Bidirectional -- send messages, receive responses, voice notes transcribed locally via whisper.

# Setup
export KOLLABOR_HUB_BRIDGE_TOKEN=your-bot-token    # from @BotFather
export KOLLABOR_HUB_BRIDGE_CHAT_ID=your-chat-id    # from @userinfobot
kollab --agent jarvis
/hub bridge setup     # auto-detects and sends test message

Everything jarvis sees appears on Telegram. Agent arrivals, departures, hub chatter, responses -- all forwarded to your phone.
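Conceptually, a bridge like this posts outbound messages to Telegram's Bot API `sendMessage` method. A stand-alone stdlib sketch (not Kollabor's bridge code) reading the same env vars as the setup above:

```python
# Sketch: send one message through the Telegram Bot API, stdlib only.
# Illustrates what a bridge does on the outbound side; Kollabor's
# actual bridge code is not shown here.
import json
import os
import urllib.request

def api_url(token: str, method: str = "sendMessage") -> str:
    return f"https://api.telegram.org/bot{token}/{method}"

def send(token: str, chat_id: str, text: str) -> None:
    data = json.dumps({"chat_id": chat_id, "text": text}).encode()
    req = urllib.request.Request(
        api_url(token), data=data,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

token = os.environ.get("KOLLABOR_HUB_BRIDGE_TOKEN")
chat_id = os.environ.get("KOLLABOR_HUB_BRIDGE_CHAT_ID")
if token and chat_id:  # only fires when the bridge env vars are set
    send(token, chat_id, "hello from outside the TUI")
```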

See Telegram Bridge Setup for the full walkthrough.

Hub Slash Commands

| Command | Description |
|---|---|
| `/hub status` | Show online agents |
| `/hub whoami` | Show your designation |
| `/hub msg <agent> <text>` | Send a message |
| `/hub broadcast <text>` | Broadcast to all agents |
| `/hub kill <agent>` | Remote shutdown (clean exit) |
| `/hub console` | Agent management UI (sidebar + feed) |
| `/hub feed` | Live dashboard |
| `/hub spawn <type> <task>` | Spawn a sub-agent |
| `/hub capture <agent>` | Capture agent output |
| `/hub org <name>` | Launch an organization |
| `/hub cron add\|list\|delete\|clear` | Schedule recurring messages |
| `/hub tasks list\|mine\|assign\|cancel\|status` | Task management with QA flow |
| `/hub bridge status\|send\|enable\|disable\|setup` | Telegram bridge controls |
| `/hub vault [name]` | Show vault info for an agent |
| `/hub vaults` | List all agent vaults |

Slash Commands

| Command | Description |
|---|---|
| `/profile` | List, switch, create LLM profiles |
| `/save` | Save conversation (markdown, jsonl, clipboard) |
| `/hub` | Agent mesh hub (status, msg, kill, console, feed, spawn) |
| `/terminal` | Manage tmux sessions |
| `/permissions` | Configure tool approval modes |
| `/login` | OAuth login (OpenAI) |
| `/mcp` | Manage MCP servers |
| `/resume` | Resume a previous conversation |
| `/config` | Fullscreen settings editor |
| `/help` | Show all available commands |

Type / in the app to see the full command menu with 20+ commands.

Architecture

Kollabor is a monorepo. Each package is independently versioned and installable.

| Package | What it does |
|---|---|
| `kollabor-ai` | LLM providers, profiles, OAuth, streaming |
| `kollabor-agent` | Tool execution, MCP, permissions |
| `kollabor-tui` | Terminal UI, rendering, design system |
| `kollabor-events` | Event bus, hook registry |
| `kollabor-config` | Configuration system |
| `kollabor-plugins` | Plugin framework, SDK |
| `kollabor-engine` | Web UI backend |

The kollabor/ directory is a thin orchestration layer that wires the packages together.

.
├── kollabor/                  # Orchestration (app lifecycle, CLI, commands)
├── packages/                  # Independent packages (see table above)
├── plugins/                   # Plugin implementations
├── tests/                     # Test suite
└── main.py                    # Entry point

Development

git clone https://github.com/kollaborai/kollabor-cli.git
cd kollabor-cli
pip install -e ".[dev]"
python main.py
python tests/run_tests.py                    # all tests
python -m black kollabor/ plugins/ tests/    # format
python -m mypy kollabor/ plugins/            # type check

See CLAUDE.md for architecture details, coding standards, and contribution guidelines.


License

MIT

Download files

Download the file for your platform.

Source Distribution

kollab-0.5.3.tar.gz (2.1 MB)

Built Distribution


kollab-0.5.3-py3-none-any.whl (2.2 MB)

File details

Details for the file kollab-0.5.3.tar.gz.

File metadata

  • Download URL: kollab-0.5.3.tar.gz
  • Upload date:
  • Size: 2.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for `kollab-0.5.3.tar.gz`:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `358542225253e5b607ae16a4a7526c07e10e116bdfe09ceba88529292814f34c` |
| MD5 | `8275b14a0e1b1411a4b774c109023c95` |
| BLAKE2b-256 | `78881a077f1272ca9aa2944218230149dd5b57a1fde1aa5ea2871f4a953aca62` |


File details

Details for the file kollab-0.5.3-py3-none-any.whl.

File metadata

  • Download URL: kollab-0.5.3-py3-none-any.whl
  • Upload date:
  • Size: 2.2 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for `kollab-0.5.3-py3-none-any.whl`:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `1523ac0d1d36998d72cd1c2cec1fc48981dfadee354d176456dc6065f15d416a` |
| MD5 | `d4ce7fe58ce967a690f317110ef8fc2e` |
| BLAKE2b-256 | `757377b5611042a39c3c527cc89e10604093ba6609a06201f332a4ad954cd147` |

