Banabot: Ultra-Lightweight Personal AI Assistant
banabot is an ultra-lightweight personal AI assistant, a fork of nanobot.
It delivers core agent functionality in just ~4,000 lines of code, 99% smaller than Clawdbot's 430k+ lines.
Real-time line count: 3,761 lines (run `bash core_agent_lines.sh` to verify anytime)
News
- 2026-02-19 banabot v0.2.0 released! Fork of nanobot with multi-provider web search and complete rebranding.
- 2026-02-19 Multi-provider web search: DuckDuckGo (free, no API key), Brave, Tavily, Serper, SearXNG.
- 2026-02-19 Complete rebranding: new logo, CLI command `banabot`, config path `~/.banabot`.
Historical news (from nanobot)
- 2026-02-17 Released v0.1.4: MCP support, progress streaming, new providers. See nanobot releases.
- 2026-02-14 MCP support added! See the MCP section for details.
- 2026-02-09 Added Slack, Email, and QQ support.
- 2026-02-02 nanobot officially launched!
Key Features of banabot:
- Ultra-Lightweight: just ~4,000 lines of core agent code, 99% smaller than Clawdbot.
- Research-Ready: clean, readable code that's easy to understand, modify, and extend for research.
- Lightning Fast: minimal footprint means faster startup, lower resource usage, and quicker iterations.
- Easy to Use: one-click deploy and you're ready to go.
Architecture
Features
| 24/7 Real-Time Market Analysis | Full-Stack Software Engineer | Smart Daily Routine Manager | Personal Knowledge Assistant |
|---|---|---|---|
| Discovery • Insights • Trends | Develop • Deploy • Scale | Schedule • Automate • Organize | Learn • Memory • Reasoning |
Install
Install from source (latest features, recommended for development)
git clone https://github.com/Mrbanano/banabot.git
cd banabot
pip install -e .
Install with uv (stable, fast)
uv tool install banabot-ai
Install from PyPI (stable)
pip install banabot-ai
Quick Start
[!TIP] Set your API key in `~/.banabot/config.json`. Get API keys: OpenRouter (Global) · Web search works out of the box with DuckDuckGo (free)
1. Initialize
banabot onboard
2. Configure (~/.banabot/config.json)
Add or merge these two parts into your config (other options have defaults).
Set your API key (e.g. OpenRouter, recommended for global users):
{
"providers": {
"openrouter": {
"apiKey": "sk-or-v1-xxx"
}
}
}
Set your model:
{
"agents": {
"defaults": {
"model": "anthropic/claude-opus-4-5"
}
}
}
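Both snippets are partial configs meant to be merged into `~/.banabot/config.json`, not to replace it. The merge can be sketched in Python (illustrative only; `deep_merge` and the `maxToolCalls` key are hypothetical, not part of banabot):

```python
def deep_merge(base: dict, extra: dict) -> dict:
    """Recursively merge `extra` into `base`, returning a new dict."""
    out = dict(base)
    for key, value in extra.items():
        if isinstance(value, dict) and isinstance(out.get(key), dict):
            out[key] = deep_merge(out[key], value)
        else:
            out[key] = value
    return out

config = {"agents": {"defaults": {"maxToolCalls": 10}}}  # pretend existing config
config = deep_merge(config, {"providers": {"openrouter": {"apiKey": "sk-or-v1-xxx"}}})
config = deep_merge(config, {"agents": {"defaults": {"model": "anthropic/claude-opus-4-5"}}})
# Existing keys elsewhere in the config are preserved.
```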
3. Chat
banabot agent
That's it! You have a working AI assistant in 2 minutes.
Chat Apps
Connect banabot to your favorite chat platform.
| Channel | What you need |
|---|---|
| Telegram | Bot token from @BotFather |
| Discord | Bot token + Message Content intent |
| WhatsApp | QR code scan |
| Feishu | App ID + App Secret |
| Mochat | Claw token (auto-setup available) |
| DingTalk | App Key + App Secret |
| Slack | Bot token + App-Level token |
| Email | IMAP/SMTP credentials |
| QQ | App ID + App Secret |
Telegram (Recommended)
1. Create a bot
- Open Telegram, search `@BotFather`
- Send `/newbot`, follow the prompts
- Copy the token
2. Configure
{
"channels": {
"telegram": {
"enabled": true,
"token": "YOUR_BOT_TOKEN",
"allowFrom": ["YOUR_USER_ID"]
}
}
}
You can find your User ID in Telegram settings, where it is shown as `@yourUserId`. Copy this value without the `@` symbol and paste it into the config file.
3. Run
banabot gateway
Mochat (Claw IM)
Uses Socket.IO WebSocket by default, with HTTP polling fallback.
1. Ask banabot to set up Mochat for you
Simply send this message to banabot (replace xxx@xxx with your real email):
Read https://raw.githubusercontent.com/HKUDS/MoChat/refs/heads/main/skills/nanobot/skill.md and register on MoChat. My Email account is xxx@xxx Bind me as your owner and DM me on MoChat.
banabot will automatically register, configure ~/.banabot/config.json, and connect to Mochat.
2. Restart gateway
banabot gateway
That's it: banabot handles the rest!
Manual configuration (advanced)
If you prefer to configure manually, add the following to ~/.banabot/config.json:
Keep `claw_token` private. It should only be sent in the `X-Claw-Token` header to your Mochat API endpoint.
{
"channels": {
"mochat": {
"enabled": true,
"base_url": "https://mochat.io",
"socket_url": "https://mochat.io",
"socket_path": "/socket.io",
"claw_token": "claw_xxx",
"agent_user_id": "6982abcdef",
"sessions": ["*"],
"panels": ["*"],
"reply_delay_mode": "non-mention",
"reply_delay_ms": 120000
}
}
}
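One plausible reading of the two `reply_delay_*` options above, as a sketch (a hypothetical helper, not banabot's actual implementation): in `"non-mention"` mode, messages that @mention the agent are answered immediately, while other chatter waits `reply_delay_ms`:

```python
def reply_delay(mode: str, delay_ms: int, mentions_bot: bool) -> int:
    """Return how long to wait (ms) before replying, per the config above."""
    if mode == "non-mention" and not mentions_bot:
        return delay_ms  # batch up chatter that doesn't address the bot
    return 0  # mentions (or other modes) get an immediate reply
```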
Discord
1. Create a bot
- Go to https://discord.com/developers/applications
- Create an application → Bot → Add Bot
- Copy the bot token
2. Enable intents
- In the Bot settings, enable MESSAGE CONTENT INTENT
- (Optional) Enable SERVER MEMBERS INTENT if you plan to use allow lists based on member data
3. Get your User ID
- Discord Settings → Advanced → enable Developer Mode
- Right-click your avatar → Copy User ID
4. Configure
{
"channels": {
"discord": {
"enabled": true,
"token": "YOUR_BOT_TOKEN",
"allowFrom": ["YOUR_USER_ID"]
}
}
}
5. Invite the bot
- OAuth2 → URL Generator
- Scopes: `bot`
- Bot Permissions: `Send Messages`, `Read Message History`
- Open the generated invite URL and add the bot to your server
6. Run
banabot gateway
WhatsApp
Requires Node.js ≥ 18.
1. Link device
banabot channels login
# Scan QR with WhatsApp → Settings → Linked Devices
2. Configure
{
"channels": {
"whatsapp": {
"enabled": true,
"allowFrom": ["+1234567890"]
}
}
}
3. Run (two terminals)
# Terminal 1
banabot channels login
# Terminal 2
banabot gateway
Feishu
Uses a WebSocket long connection; no public IP required.
1. Create a Feishu bot
- Visit Feishu Open Platform
- Create a new app → Enable Bot capability
- Permissions: add `im:message` (send messages)
- Events: add `im.message.receive_v1` (receive messages)
- Select Long Connection mode (requires running banabot first to establish the connection)
- Get the App ID and App Secret from "Credentials & Basic Info"
- Publish the app
2. Configure
{
"channels": {
"feishu": {
"enabled": true,
"appId": "cli_xxx",
"appSecret": "xxx",
"encryptKey": "",
"verificationToken": "",
"allowFrom": []
}
}
}
`encryptKey` and `verificationToken` are optional for Long Connection mode. `allowFrom`: leave empty to allow all users, or add `["ou_xxx"]` to restrict access.
3. Run
banabot gateway
[!TIP] Feishu uses WebSocket to receive messages, so no webhook or public IP is needed!
QQ
Uses the botpy SDK with WebSocket; no public IP required. Currently supports private messages only.
1. Register & create bot
- Visit the QQ Open Platform → register as a developer (personal or enterprise)
- Create a new bot application
- Go to Developer Settings → copy the AppID and AppSecret
2. Set up sandbox for testing
- In the bot management console, find the Sandbox Config
- In the message-list configuration, click Add Member and add your own QQ number
- Once added, scan the bot's QR code with mobile QQ → open the bot profile → tap "Send Message" to start chatting
3. Configure
- `allowFrom`: leave empty for public access, or add user openids to restrict. You can find openids in the banabot logs when a user messages the bot.
- For production: submit a review in the bot console and publish. See the QQ Bot Docs for the full publishing flow.
{
"channels": {
"qq": {
"enabled": true,
"appId": "YOUR_APP_ID",
"secret": "YOUR_APP_SECRET",
"allowFrom": []
}
}
}
4. Run
banabot gateway
Now send a message to the bot from QQ; it should respond!
DingTalk
Uses Stream Mode; no public IP required.
1. Create a DingTalk bot
- Visit DingTalk Open Platform
- Create a new app → Add Robot capability
- Configuration:
- Toggle Stream Mode ON
- Permissions: Add necessary permissions for sending messages
- Get AppKey (Client ID) and AppSecret (Client Secret) from "Credentials"
- Publish the app
2. Configure
{
"channels": {
"dingtalk": {
"enabled": true,
"clientId": "YOUR_APP_KEY",
"clientSecret": "YOUR_APP_SECRET",
"allowFrom": []
}
}
}
`allowFrom`: leave empty to allow all users, or add `["staffId"]` to restrict access.
3. Run
banabot gateway
Slack
Uses Socket Mode; no public URL required.
1. Create a Slack app
- Go to Slack API → Create New App → "From scratch"
- Pick a name and select your workspace
2. Configure the app
- Socket Mode: toggle ON → generate an App-Level Token with the `connections:write` scope → copy it (`xapp-...`)
- OAuth & Permissions: add bot scopes `chat:write`, `reactions:write`, `app_mentions:read`
- Event Subscriptions: toggle ON → subscribe to bot events `message.im`, `message.channels`, `app_mention` → Save Changes
- App Home: scroll to Show Tabs → enable Messages Tab → check "Allow users to send Slash commands and messages from the messages tab"
- Install App: click Install to Workspace → Authorize → copy the Bot Token (`xoxb-...`)
3. Configure banabot
{
"channels": {
"slack": {
"enabled": true,
"botToken": "xoxb-...",
"appToken": "xapp-...",
"groupPolicy": "mention"
}
}
}
4. Run
banabot gateway
DM the bot directly or @mention it in a channel; it should respond!
[!TIP]
- `groupPolicy`: `"mention"` (default, respond only when @mentioned), `"open"` (respond to all channel messages), or `"allowlist"` (restrict to specific channels).
- DM policy defaults to open. Set `"dm": {"enabled": false}` to disable DMs.
Email
Give banabot its own email account. It polls IMAP for incoming mail and replies via SMTP, like a personal email assistant.
1. Get credentials (Gmail example)
- Create a dedicated Gmail account for your bot (e.g. `my-nanobot@gmail.com`)
- Enable 2-Step Verification → create an App Password
- Use this app password for both IMAP and SMTP
2. Configure
- `consentGranted` must be `true` to allow mailbox access. This is a safety gate; set it to `false` to fully disable the channel.
- `allowFrom`: leave empty to accept emails from anyone, or restrict to specific senders.
- `smtpUseTls` and `smtpUseSsl` default to `true`/`false` respectively, which is correct for Gmail (port 587 + STARTTLS). No need to set them explicitly.
- Set `"autoReplyEnabled": false` if you only want to read/analyze emails without sending automatic replies.
{
"channels": {
"email": {
"enabled": true,
"consentGranted": true,
"imapHost": "imap.gmail.com",
"imapPort": 993,
"imapUsername": "my-nanobot@gmail.com",
"imapPassword": "your-app-password",
"smtpHost": "smtp.gmail.com",
"smtpPort": 587,
"smtpUsername": "my-nanobot@gmail.com",
"smtpPassword": "your-app-password",
"fromAddress": "my-nanobot@gmail.com",
"allowFrom": ["your-real-email@gmail.com"]
}
}
}
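Sender filtering against `allowFrom` can be sketched like this (a hypothetical helper; banabot's actual matching logic may differ). `email.utils.parseaddr` extracts the bare address from a `From:` header:

```python
from email.utils import parseaddr

def sender_allowed(from_header: str, allow_from: list[str]) -> bool:
    """Empty allowFrom accepts anyone; otherwise match the bare address."""
    _, addr = parseaddr(from_header)  # "Alice <a@x>" -> "a@x"
    return not allow_from or addr.lower() in {a.lower() for a in allow_from}
```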
3. Run
banabot gateway
Agent Social Network
banabot can link to the agent social network (agent community). Just send one message and your banabot joins automatically!
| Platform | How to Join (send this message to your bot) |
|---|---|
| Moltbook | Read https://moltbook.com/skill.md and follow the instructions to join Moltbook |
| ClawdChat | Read https://clawdchat.ai/skill.md and follow the instructions to join ClawdChat |
Simply send the command above to your banabot (via CLI or any chat channel), and it will handle the rest.
Configuration
Config file: ~/.banabot/config.json
Providers
[!TIP]
- Groq provides free voice transcription via Whisper. If configured, Telegram voice messages will be automatically transcribed.
- Zhipu Coding Plan: if you're on Zhipu's coding plan, set `"apiBase": "https://open.bigmodel.cn/api/coding/paas/v4"` in your zhipu provider config.
- MiniMax (Mainland China): if your API key is from MiniMax's mainland China platform (minimaxi.com), set `"apiBase": "https://api.minimaxi.com/v1"` in your minimax provider config.
| Provider | Purpose | Get API Key |
|---|---|---|
| `custom` | Any OpenAI-compatible endpoint (direct, no LiteLLM) | – |
| `openrouter` | LLM (recommended, access to all models) | openrouter.ai |
| `anthropic` | LLM (Claude direct) | console.anthropic.com |
| `openai` | LLM (GPT direct) | platform.openai.com |
| `deepseek` | LLM (DeepSeek direct) | platform.deepseek.com |
| `groq` | LLM + voice transcription (Whisper) | console.groq.com |
| `gemini` | LLM (Gemini direct) | aistudio.google.com |
| `minimax` | LLM (MiniMax direct) | platform.minimax.io |
| `aihubmix` | LLM (API gateway, access to all models) | aihubmix.com |
| `siliconflow` | LLM (SiliconFlow, API gateway) | siliconflow.cn |
| `dashscope` | LLM (Qwen) | dashscope.console.aliyun.com |
| `moonshot` | LLM (Moonshot/Kimi) | platform.moonshot.cn |
| `zhipu` | LLM (Zhipu GLM) | open.bigmodel.cn |
| `vllm` | LLM (local, any OpenAI-compatible server) | – |
| `openai_codex` | LLM (Codex, OAuth) | `banabot provider login openai-codex` |
| `github_copilot` | LLM (GitHub Copilot, OAuth) | `banabot provider login github-copilot` |
OpenAI Codex (OAuth)
Codex uses OAuth instead of API keys. Requires a ChatGPT Plus or Pro account.
1. Login:
banabot provider login openai-codex
2. Set model (merge into ~/.banabot/config.json):
{
"agents": {
"defaults": {
"model": "openai-codex/gpt-5.1-codex"
}
}
}
3. Chat:
banabot agent -m "Hello!"
Docker users: use `docker run -it` for interactive OAuth login.
Custom Provider (Any OpenAI-compatible API)
Connects directly to any OpenAI-compatible endpoint: LM Studio, llama.cpp, Together AI, Fireworks, Azure OpenAI, or any self-hosted server. Bypasses LiteLLM; the model name is passed as-is.
{
"providers": {
"custom": {
"apiKey": "your-api-key",
"apiBase": "https://api.your-provider.com/v1"
}
},
"agents": {
"defaults": {
"model": "your-model-name"
}
}
}
For local servers that don't require a key, set `apiKey` to any non-empty string (e.g. `"no-key"`).
vLLM (local / OpenAI-compatible)
Run your own model with vLLM or any OpenAI-compatible server, then add to config:
1. Start the server (example):
vllm serve meta-llama/Llama-3.1-8B-Instruct --port 8000
2. Add to config (partial โ merge into ~/.banabot/config.json):
Provider (key can be any non-empty string for local):
{
"providers": {
"vllm": {
"apiKey": "dummy",
"apiBase": "http://localhost:8000/v1"
}
}
}
Model:
{
"agents": {
"defaults": {
"model": "meta-llama/Llama-3.1-8B-Instruct"
}
}
}
Adding a New Provider (Developer Guide)
banabot uses a Provider Registry (banabot/providers/registry.py) as the single source of truth.
Adding a new provider takes just two steps; no if-elif chains to touch.
Step 1. Add a ProviderSpec entry to PROVIDERS in banabot/providers/registry.py:
ProviderSpec(
name="myprovider", # config field name
keywords=("myprovider", "mymodel"), # model-name keywords for auto-matching
env_key="MYPROVIDER_API_KEY", # env var for LiteLLM
display_name="My Provider", # shown in `banabot status`
litellm_prefix="myprovider", # auto-prefix: model → myprovider/model
skip_prefixes=("myprovider/",), # don't double-prefix
)
Step 2. Add a field to ProvidersConfig in banabot/config/schema.py:
class ProvidersConfig(BaseModel):
...
myprovider: ProviderConfig = ProviderConfig()
That's it! Environment variables, model prefixing, config matching, and banabot status display will all work automatically.
Common ProviderSpec options:
| Field | Description | Example |
|---|---|---|
| `litellm_prefix` | Auto-prefix model names for LiteLLM | `"dashscope"` → `dashscope/qwen-max` |
| `skip_prefixes` | Don't prefix if the model already starts with these | `("dashscope/", "openrouter/")` |
| `env_extras` | Additional env vars to set | `(("ZHIPUAI_API_KEY", "{api_key}"),)` |
| `model_overrides` | Per-model parameter overrides | `(("kimi-k2.5", {"temperature": 1.0}),)` |
| `is_gateway` | Can route any model (like OpenRouter) | `True` |
| `detect_by_key_prefix` | Detect gateway by API key prefix | `"sk-or-"` |
| `detect_by_base_keyword` | Detect gateway by API base URL | `"openrouter"` |
| `strip_model_prefix` | Strip existing prefix before re-prefixing | `True` (for AiHubMix) |
MCP (Model Context Protocol)
[!TIP] The config format is compatible with Claude Desktop / Cursor. You can copy MCP server configs directly from any MCP server's README.
banabot supports MCP: connect external tool servers and use them as native agent tools.
Add MCP servers to your config.json:
{
"tools": {
"mcpServers": {
"filesystem": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/dir"]
}
}
}
}
Two transport modes are supported:
| Mode | Config | Example |
|---|---|---|
| Stdio | `command` + `args` | Local process via `npx` / `uvx` |
| HTTP | `url` | Remote endpoint (`https://mcp.example.com/sse`) |
MCP tools are automatically discovered and registered on startup. The LLM can use them alongside built-in tools; no extra configuration is needed.
Security
[!TIP] For production deployments, set `"restrictToWorkspace": true` in your config to sandbox the agent.
| Option | Default | Description |
|---|---|---|
| `tools.restrictToWorkspace` | `false` | When `true`, restricts all agent tools (shell, file read/write/edit, list) to the workspace directory. Prevents path traversal and out-of-scope access. |
| `channels.*.allowFrom` | `[]` (allow all) | Whitelist of user IDs. Empty = allow everyone; non-empty = only listed users can interact. |
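The path-traversal check behind a workspace restriction can be sketched like this (illustrative only; `within_workspace` is a hypothetical helper, not banabot's implementation):

```python
from pathlib import Path

def within_workspace(path: str, workspace: str) -> bool:
    """True if `path` resolves inside `workspace` (blocks ../ traversal)."""
    ws = Path(workspace).resolve()
    return Path(ws, path).resolve().is_relative_to(ws)

within_workspace("notes/todo.md", "/tmp/banabot-ws")   # inside: allowed
within_workspace("../etc/passwd", "/tmp/banabot-ws")   # escapes: blocked
```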
Web Search
banabot supports multiple search providers and works out of the box with DuckDuckGo (free, no API key required).
| Provider | API Key | Get Key |
|---|---|---|
| `duckduckgo` (default) | No | – |
| `brave` | Yes | Brave Search API |
| `tavily` | Yes | Tavily |
| `serper` | Yes | Serper |
| `searxng` | No (self-hosted) | SearXNG |
Configuration (~/.banabot/config.json):
{
"tools": {
"web": {
"search": {
"defaultProvider": "duckduckgo",
"maxResults": 5,
"providers": {
"brave": { "apiKey": "YOUR_KEY", "enabled": true },
"duckduckgo": { "enabled": true },
"tavily": { "apiKey": "YOUR_KEY", "enabled": false },
"serper": { "apiKey": "YOUR_KEY", "enabled": false },
"searxng": { "apiBase": "http://localhost:8080", "enabled": false }
}
}
}
}
}
If no `defaultProvider` is set, DuckDuckGo (free) is used. Set `defaultProvider` to use a different provider by default.
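The fallback behavior can be sketched as follows (a guess at the selection logic, not banabot's actual code):

```python
def pick_search_provider(cfg: dict) -> str:
    """Use defaultProvider if set and enabled; otherwise fall back to DuckDuckGo."""
    default = cfg.get("defaultProvider")
    providers = cfg.get("providers", {})
    if default and providers.get(default, {}).get("enabled"):
        return default
    return "duckduckgo"

search_cfg = {
    "defaultProvider": "brave",
    "providers": {
        "brave": {"apiKey": "YOUR_KEY", "enabled": True},
        "duckduckgo": {"enabled": True},
    },
}
```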
CLI Reference
| Command | Description |
|---|---|
| `banabot onboard` | Initialize config & workspace |
| `banabot agent -m "..."` | Chat with the agent |
| `banabot agent` | Interactive chat mode |
| `banabot agent --no-markdown` | Show plain-text replies |
| `banabot agent --logs` | Show runtime logs during chat |
| `banabot gateway` | Start the gateway |
| `banabot status` | Show status |
| `banabot provider login openai-codex` | OAuth login for providers |
| `banabot channels login` | Link WhatsApp (scan QR) |
| `banabot channels status` | Show channel status |
Interactive mode exits: `exit`, `quit`, `/exit`, `/quit`, `:q`, or Ctrl+D.
Scheduled Tasks (Cron)
# Add a job
banabot cron add --name "daily" --message "Good morning!" --cron "0 9 * * *"
banabot cron add --name "hourly" --message "Check status" --every 3600
# List jobs
banabot cron list
# Remove a job
banabot cron remove <job_id>
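Cron expressions like `"0 9 * * *"` above mean "minute 0, hour 9, every day". A minimal matcher for just the minute/hour fields can be sketched in Python (illustrative; real cron syntax also supports ranges, steps, and lists, which this sketch ignores):

```python
def cron_matches(expr: str, minute: int, hour: int) -> bool:
    """Check the minute/hour fields of a 5-field cron expression.
    Only "*" and single numbers are handled -- enough for "0 9 * * *"."""
    m, h, *_ = expr.split()

    def ok(field: str, value: int) -> bool:
        return field == "*" or int(field) == value

    return ok(m, minute) and ok(h, hour)

cron_matches("0 9 * * *", minute=0, hour=9)   # fires at 09:00
cron_matches("0 9 * * *", minute=30, hour=9)  # does not fire at 09:30
```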
Docker
[!TIP] The `-v ~/.banabot:/root/.banabot` flag mounts your local config directory into the container, so your config and workspace persist across container restarts.
Docker Compose
docker compose run --rm banabot-cli onboard # first-time setup
vim ~/.banabot/config.json # add API keys
docker compose up -d banabot-gateway # start gateway
docker compose run --rm banabot-cli agent -m "Hello!" # run CLI
docker compose logs -f banabot-gateway # view logs
docker compose down # stop
Docker
# Build the image
docker build -t banabot .
# Initialize config (first time only)
docker run -v ~/.banabot:/root/.banabot --rm banabot onboard
# Edit config on host to add API keys
vim ~/.banabot/config.json
# Run gateway (connects to enabled channels, e.g. Telegram/Discord/Mochat)
docker run -v ~/.banabot:/root/.banabot -p 18790:18790 banabot gateway
# Or run a single command
docker run -v ~/.banabot:/root/.banabot --rm banabot agent -m "Hello!"
docker run -v ~/.banabot:/root/.banabot --rm banabot status
Project Structure
banabot/
├── agent/            # Core agent logic
│   ├── loop.py       # Agent loop (LLM ↔ tool execution)
│   ├── context.py    # Prompt builder
│   ├── memory.py     # Persistent memory
│   ├── skills.py     # Skills loader
│   ├── subagent.py   # Background task execution
│   └── tools/        # Built-in tools (incl. spawn)
├── skills/           # Bundled skills (github, weather, tmux...)
├── channels/         # Chat channel integrations
├── bus/              # Message routing
├── cron/             # Scheduled tasks
├── heartbeat/        # Proactive wake-up
├── providers/        # LLM providers (OpenRouter, etc.)
├── session/          # Conversation sessions
├── config/           # Configuration
└── cli/              # Commands
Development Guide
Prerequisites
- Python 3.11+
- Node.js 20+ (only needed for WhatsApp bridge)
- Git
Setup
# 1. Clone the repo
git clone https://github.com/Mrbanano/banabot.git
cd banabot
# 2. Create and activate a virtual environment
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# 3. Install in editable mode with dev dependencies
pip install -e ".[dev]"
# 4. Initialize config & workspace
banabot onboard
# 5. Add an API key to ~/.banabot/config.json (e.g. OpenRouter)
# {
# "providers": {
# "openrouter": { "apiKey": "sk-or-v1-xxx" }
# }
# }
# 6. Verify everything works
banabot status
banabot agent -m "Hello!"
Running Tests
# Run all tests
pytest
# Run a specific test file
pytest tests/test_commands.py
# Run a specific test function
pytest tests/test_commands.py::test_onboard_fresh_install
# Verbose output
pytest -v
Tests use pytest-asyncio (auto mode) for async tests and unittest.mock for mocking config/paths.
Linting & Formatting
The project uses Ruff for both linting and formatting.
# Check for lint errors
ruff check banabot/
# Auto-fix lint errors
ruff check --fix banabot/
# Format code
ruff format banabot/
Rules configured: E (pycodestyle), F (Pyflakes), I (isort), N (naming), W (whitespace). Line length: 100 chars.
Debugging
# Run agent with runtime logs visible
banabot agent -m "test" --logs
# Run gateway in verbose mode
banabot gateway --verbose
Building the WhatsApp Bridge (optional)
Only needed if you're working on WhatsApp integration:
cd bridge
npm install
npm run build
Key Extension Points
Adding a New Tool
- Create `banabot/agent/tools/mytool.py` extending the `Tool` base class
- Implement `name`, `description`, `parameters` (JSON schema), and `execute(**kwargs)`
- Register it in the `AgentLoop` tool setup
from banabot.agent.tools.base import Tool
class MyTool(Tool):
name = "my_tool"
description = "Does something useful"
parameters = {
"type": "object",
"properties": {
"input": {"type": "string", "description": "The input value"}
},
"required": ["input"]
}
async def execute(self, **kwargs):
return f"Result: {kwargs['input']}"
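To try the tool standalone, the example above can be run against a minimal stand-in base class (the real `Tool` in `banabot/agent/tools/base.py` has more to it; this sketch only mirrors the attributes the example uses):

```python
import asyncio

class Tool:
    """Minimal stand-in for banabot.agent.tools.base.Tool."""
    name: str
    description: str
    parameters: dict

class MyTool(Tool):
    name = "my_tool"
    description = "Does something useful"
    parameters = {
        "type": "object",
        "properties": {"input": {"type": "string", "description": "The input value"}},
        "required": ["input"],
    }

    async def execute(self, **kwargs):
        return f"Result: {kwargs['input']}"

result = asyncio.run(MyTool().execute(input="hello"))  # "Result: hello"
```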
Adding a New Channel
- Create `banabot/channels/myservice.py` extending `Channel`
- Implement `start()`, `stop()`, and message sending logic
- Subscribe to the inbound message bus
- Add a config class to `banabot/config/schema.py`
- Register in `ChannelManager.start_all()`
Creating a Custom Skill
Skills are Markdown files that give the agent domain-specific instructions:
- Create `~/.banabot/workspace/skills/myskill/SKILL.md`
- Write instructions, examples, and notes in Markdown
- The agent will auto-discover and use it
See banabot/skills/README.md for the full skill format.
Architecture Overview
| Component | Path | Role |
|---|---|---|
| Agent Loop | `banabot/agent/loop.py` | Core LLM ↔ tool execution cycle |
| Context Builder | `banabot/agent/context.py` | Assembles prompts from workspace files |
| Memory | `banabot/agent/memory.py` | Two-layer: MEMORY.md (facts) + HISTORY.md (events) |
| Message Bus | `banabot/bus/` | Async inbound/outbound queues decoupling channels from the agent |
| Provider Registry | `banabot/providers/registry.py` | Single registry for 18+ LLM providers |
| Session Manager | `banabot/session/manager.py` | JSONL-based per-channel conversation storage |
| Tool Registry | `banabot/agent/tools/registry.py` | Manages built-in + MCP tools |
| Channel Manager | `banabot/channels/manager.py` | Starts/stops all enabled channel integrations |
| Cron Service | `banabot/cron/service.py` | Scheduled task execution (cron, interval, one-time) |
| Config Schema | `banabot/config/schema.py` | Pydantic models for all config sections |
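The message bus's decoupling can be sketched with two `asyncio` queues (illustrative only; not banabot's actual bus code). Channels and the agent never call each other directly, they only put and get on the queues:

```python
import asyncio

async def demo() -> str:
    inbound: asyncio.Queue = asyncio.Queue()   # channel -> agent
    outbound: asyncio.Queue = asyncio.Queue()  # agent -> channel

    # A channel enqueues an incoming message.
    await inbound.put({"channel": "telegram", "text": "hi"})

    # Agent side: consume one inbound message, produce a reply.
    msg = await inbound.get()
    await outbound.put({"channel": msg["channel"], "text": f"echo: {msg['text']}"})

    # Channel side: deliver the reply.
    reply = await outbound.get()
    return reply["text"]

reply_text = asyncio.run(demo())  # "echo: hi"
```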
PR Workflow
# Create a feature branch
git checkout -b feature/my-feature
# Make changes, then lint and test
ruff check --fix banabot/
ruff format banabot/
pytest
# Commit and push
git add .
git commit -m "feat: description of change"
git push origin feature/my-feature
Then open a PR against `main`. Use conventional commit prefixes: `feat:`, `fix:`, `docs:`, `chore:`, `refactor:`.
Contribute & Roadmap
PRs welcome! The codebase is intentionally small and readable.
Roadmap: pick an item and open a PR!
- Multi-modal: see and hear (images, voice, video)
- Long-term memory: never forget important context
- Better reasoning: multi-step planning and reflection
- More integrations: calendar and more
- Self-improvement: learn from feedback and mistakes
Contributors
banabot is a fork of nanobot. We thank the original nanobot contributors; see CREDITS.md for full attribution.