# Kollabor
A terminal AI chat where everything has hooks. Every action -- user input, API calls, responses, tool use, rendering -- triggers hooks that plugins can intercept and modify. Use any LLM provider, build custom plugins, automate with pipe mode.
```bash
brew install kollaborai/tap/kollabor   # macOS
kollab                                 # start chatting
```
## Why Kollabor
Most terminal AI tools are closed boxes. Kollabor is an open pipeline -- every stage is a hook you can tap into.
- Hook everything -- intercept user input, transform API requests, post-process responses, modify tool calls
- Plugin system -- drop a Python file in `plugins/`; it auto-discovers and loads
- Any provider -- OpenAI, Anthropic, Google, Azure, OpenRouter, Ollama, LM Studio, or any OpenAI-compatible endpoint
- OAuth login -- use your ChatGPT subscription directly with `kollab --login openai`
- MCP support -- Model Context Protocol for tool integration
- Pipe mode -- `echo "query" | kollab -p` for scripting and automation
- Agent system -- multi-step task execution with tool calling
- Slash commands -- `/profile`, `/save`, `/terminal`, `/permissions`, and more
## Install

```bash
brew install kollaborai/tap/kollabor   # macOS (recommended)
curl -sS https://raw.githubusercontent.com/kollaborai/kollabor-cli/main/install.sh | bash   # cross-platform
```

Or manually: `uv tool install kollabor` / `pipx install kollabor` / `pip install kollabor`
## Quick Start
Kollabor auto-detects your API keys from standard environment variables:
| Environment Variable | Provider | Notes |
|---|---|---|
| `ANTHROPIC_API_KEY` | Anthropic | Claude models |
| `OPENAI_API_KEY` | OpenAI | GPT models |
| `GEMINI_API_KEY` | Google | Gemini models |
| `OPENROUTER_API_KEY` | OpenRouter | 300+ models from any provider |
```bash
export ANTHROPIC_API_KEY=sk-ant-...
kollab
```
That's it. No config files needed.
### OpenAI OAuth (ChatGPT subscription)

Use your existing ChatGPT Plus/Pro account -- no API key needed:

```bash
kollab --login openai
```
Opens your browser, you authorize, and you're in. Uses the Responses API with your subscription quota.
### Custom Profiles

For more control, create named profiles with env vars following the pattern `KOLLABOR_{NAME}_{FIELD}`:

```bash
# Local LLM via Ollama
KOLLABOR_LOCAL_PROVIDER=custom
KOLLABOR_LOCAL_BASE_URL=http://localhost:11434/v1
KOLLABOR_LOCAL_MODEL=llama3.1
kollab --profile local
```

Use `/profile` interactively to list, switch, and create profiles. See FEATURES.md for all configuration options.
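The `KOLLABOR_{NAME}_{FIELD}` convention is simple to parse. The sketch below shows how such variables could be grouped into per-profile dicts; the `load_profiles` function is a hypothetical illustration of the pattern, not Kollabor's actual loader:

```python
import os
from collections import defaultdict

def load_profiles(env=None):
    """Group KOLLABOR_{NAME}_{FIELD} variables into per-profile dicts (illustrative)."""
    env = os.environ if env is None else env
    profiles = defaultdict(dict)
    for key, value in env.items():
        parts = key.split("_", 2)  # ["KOLLABOR", NAME, FIELD]
        if len(parts) == 3 and parts[0] == "KOLLABOR":
            name, field = parts[1].lower(), parts[2].lower()
            profiles[name][field] = value
    return dict(profiles)

# Using the Ollama example from above as a sample environment:
sample = {
    "KOLLABOR_LOCAL_PROVIDER": "custom",
    "KOLLABOR_LOCAL_BASE_URL": "http://localhost:11434/v1",
    "KOLLABOR_LOCAL_MODEL": "llama3.1",
}
print(load_profiles(sample))
# {'local': {'provider': 'custom', 'base_url': 'http://localhost:11434/v1', 'model': 'llama3.1'}}
```

Because the profile name is the second underscore-delimited segment, `kollab --profile local` would resolve to everything registered under `KOLLABOR_LOCAL_*`.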
## Pipe Mode

```bash
kollab "What is the capital of France?"         # direct query
echo "Explain this code" | kollab -p            # from stdin
cat document.txt | kollab -p                    # from file
kollab --timeout 5min "Complex analysis task"   # with timeout
```
## Providers

| Provider | Type | How to Connect |
|---|---|---|
| Anthropic | Native | `ANTHROPIC_API_KEY` |
| OpenAI | Native | `OPENAI_API_KEY` or `kollab --login openai` |
| Google Gemini | Native | `GEMINI_API_KEY` |
| Azure OpenAI | Native | `KOLLABOR_AZURE_*` env vars |
| OpenRouter | Gateway | `OPENROUTER_API_KEY` (300+ models) |
| Ollama | Custom | `KOLLABOR_LOCAL_BASE_URL=http://localhost:11434/v1` |
| LM Studio | Custom | `KOLLABOR_LMSTUDIO_BASE_URL=http://localhost:1234/v1` |
| Any OpenAI-compatible | Custom | `KOLLABOR_{NAME}_BASE_URL=...` |
## Hooks & Plugins

Every stage of the pipeline is hookable:

```
user input → pre_user_input → pre_api_request → [LLM API] → post_api_response → pre_message_display → output
                                                    ↓
                                  pre_tool_use → [tool execution] → post_tool_use
```
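Conceptually, a pipeline like this is a priority-ordered event bus: each stage emits an event, and registered hooks receive a context dict and return a (possibly modified) one. The sketch below is a simplified stand-in for that idea; the `EventBus` class and its method names are hypothetical, not Kollabor's implementation:

```python
from collections import defaultdict

class EventBus:
    """Minimal priority-ordered hook registry (illustrative only)."""

    def __init__(self):
        self._hooks = defaultdict(list)

    def register_hook(self, event, fn, priority=50):
        # Lower priority numbers run first.
        self._hooks[event].append((priority, fn))
        self._hooks[event].sort(key=lambda pair: pair[0])

    def emit(self, event, context):
        # Each hook sees the context left by the previous hook.
        for _, fn in self._hooks[event]:
            context = fn(context)
        return context

bus = EventBus()
# A hook that rewrites the request before it reaches the LLM API:
bus.register_hook("pre_api_request", lambda ctx: {**ctx, "model": "llama3.1"})
result = bus.emit("pre_api_request", {"messages": []})
print(result)
# {'messages': [], 'model': 'llama3.1'}
```

Because hooks are ordered by priority and chained through the same context, any plugin can observe or rewrite what every earlier plugin produced.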
Write a plugin:

```python
from kollabor_plugins import BasePlugin
from kollabor_events import EventType, HookPriority

class MyPlugin(BasePlugin):
    def register_hooks(self):
        self.event_bus.register_hook(
            EventType.PRE_API_REQUEST,
            self.on_request,
            priority=HookPriority.NORMAL,
        )

    async def on_request(self, context):
        # Inject custom headers, modify messages, add tools -- whatever you need.
        context["headers"]["X-Custom"] = "value"
        return context
```
Drop it in `plugins/` and it loads automatically. Plugins can register slash commands, add status bar widgets, merge config, and hook into 30+ event types.
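Auto-discovery of this kind typically boils down to a directory scan plus dynamic import. The sketch below illustrates the mechanism with the standard library; the `discover_plugins` function and the local `BasePlugin` stand-in are hypothetical, not Kollabor's code:

```python
import importlib.util
import inspect
from pathlib import Path

class BasePlugin:
    """Local stand-in for the base class a real plugin framework would export."""

def discover_plugins(plugin_dir):
    """Import every .py file in plugin_dir and collect BasePlugin subclasses."""
    found = []
    for path in sorted(Path(plugin_dir).glob("*.py")):
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        # Inject the stand-in so example plugin files can reference BasePlugin
        # without an import (a real framework would have them import it).
        module.BasePlugin = BasePlugin
        spec.loader.exec_module(module)
        for _, obj in inspect.getmembers(module, inspect.isclass):
            if issubclass(obj, BasePlugin) and obj is not BasePlugin:
                found.append(obj)
    return found
```

Scanning on startup is what makes the "drop a file in, it loads" workflow possible: no registration step, just a subclass in the right directory.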
## Slash Commands

| Command | Description |
|---|---|
| `/profile` | List, switch, and create LLM profiles |
| `/save` | Save conversation (markdown, jsonl, clipboard) |
| `/terminal` | Manage tmux sessions |
| `/permissions` | Configure tool approval modes |
| `/login` | OAuth login (OpenAI) |
| `/mcp` | Manage MCP servers |
| `/resume` | Resume a previous conversation |
| `/help` | Show all available commands |
Type `/` in the app to see the full command menu with 20+ commands.
## Architecture
Kollabor is a monorepo. Each package is independently versioned and installable.
| Package | What it does |
|---|---|
| `kollabor-ai` | LLM providers, profiles, OAuth, streaming |
| `kollabor-agent` | Tool execution, MCP, permissions |
| `kollabor-tui` | Terminal UI, rendering, design system |
| `kollabor-events` | Event bus, hook registry |
| `kollabor-config` | Configuration system |
| `kollabor-plugins` | Plugin framework, SDK |
| `kollabor-engine` | Web UI backend |
The `kollabor/` directory is a thin orchestration layer that wires the packages together.

```
.
├── kollabor/   # Orchestration (app lifecycle, CLI, commands)
├── packages/   # Independent packages (see table above)
├── plugins/    # Plugin implementations
├── tests/      # Test suite
└── main.py     # Entry point
```
## Development

```bash
git clone https://github.com/kollaborai/kollabor-cli.git
cd kollabor-cli
pip install -e ".[dev]"
python main.py

python tests/run_tests.py                   # all tests
python -m black kollabor/ plugins/ tests/   # format
python -m mypy kollabor/ plugins/           # type check
```
See CLAUDE.md for architecture details, coding standards, and contribution guidelines.
## License

MIT