Otto

Self-hosted AI agent platform. Telegram bot, MCP tools, scheduled jobs, persistent memory.


Otto is a personal AI agent that runs on your machine and talks to you through Telegram. It connects to any LLM via LiteLLM, exposes tools through MCP, runs scheduled jobs, and remembers context across conversations.

No cloud platform. No vendor lock-in. Just pip install otto-agent and go.

Install

pip install otto-agent

Or with uv:

uv tool install otto-agent

Setup

otto setup

The wizard walks you through:

  • Choosing a model (provider/model-name format — Anthropic, OpenAI, Google, Ollama, OpenRouter, etc.)
  • Connecting your Telegram bot token (from @BotFather)
  • Setting an owner ID so only you can use it

Config lives in ~/.otto/.
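As a sketch of what lands there: the setup wizard collects a model, a Telegram bot token, and an owner ID, so config.yaml plausibly looks something like the following. The key names here are assumptions, not a documented schema — check the generated file for the real layout.

```yaml
# Illustrative ~/.otto/config.yaml (key names are assumptions)
model: openai/gpt-4o          # provider/model-name format, as used by `otto config model set`
telegram:
  token: "123456:ABC-your-bot-token"   # from @BotFather
  owner_id: 123456789                  # only this Telegram user ID can talk to the bot
```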

Usage

Start the Telegram bot:

otto start        # daemonized
otto run          # foreground (for debugging)

Manage the process:

otto status       # check if running
otto stop         # stop the daemon
otto logs         # tail recent logs

Configure the model:

otto config model get
otto config model set openai/gpt-4o
otto config model list

Telegram Commands

Command    What it does
/model     Switch LLM model
/tools     List available tools
/memory    Search stored memories
/stop      Cancel a running response
/session   Start a fresh conversation

/session pin + Telegram Topics (important)

/session pin <Topic Name> binds context to a Telegram Topic thread.

For group chats: Topics must be enabled for the supergroup and the bot must have topic-management rights.

For 1:1 private chats: Telegram requires enabling Topics in private chats for your bot in @BotFather (eligible bots only). If this is off, createForumTopic fails in DM.

BotFather path:

  • @BotFather/mybots → select bot → settings/features
  • Enable Topics in private chats
  • (Optional) Enable user topic create/delete controls

If you don't see this toggle, the bot may be ineligible or Telegram may not have rolled the feature to your account/client path yet (desktop/web BotFather usually exposes new settings first).

Fee note (Telegram Terms): enabling topics in private chats may apply a 15% non-refundable Telegram Stars fee for purchases made in that bot while the feature is enabled.

Discord Channel (Operator Rollout)

Discord setup and rollout runbook:

  • docs/specs/discord-operator-rollout.md
  • docs/specs/discord-channel-architecture.md

Channel config snippet:

[[bots.channels]]
type = "discord"
token = "${DISCORD_BOT_TOKEN}"
enabled = true

Rollout rule: keep Discord in dev/canary until the Discord test suite is green, then promote it phase by phase with observability gates and a documented rollback path to Telegram.

Features

Multi-backend LLM — Any model supported by LiteLLM: Anthropic, OpenAI, Google, Ollama, OpenRouter, and more. Switch models mid-conversation with /model. Supports OAuth-authenticated models (Claude Code, Google Code, OpenAI Codex).

MCP Tool Gateway — Tools are MCP servers defined in ~/.otto/tools.yaml. Otto connects to them at startup and exposes them to the agent. Includes Workspace Policies for sandboxed file operations (default vs. strict modes).
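The exact tools.yaml schema isn't shown here; as a hedged sketch, a typical MCP stdio-server entry follows the common command-plus-args pattern. Server name, key layout, and the workspace path below are illustrative assumptions:

```yaml
# Illustrative ~/.otto/tools.yaml (schema assumed, not documented here)
servers:
  filesystem:
    command: npx
    args: ["-y", "@modelcontextprotocol/server-filesystem", "/home/you/workspace"]
```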

Persistent Memory — Stores and retrieves context across sessions. Memory is searchable and can persist identity/personality rules.

Agent Orchestration — Otto can delegate tasks to async sub-agents running in parallel. Delegation is fire-and-forget: Otto returns immediately and notifies you via Telegram when the job is done. Sub-agents run with the same tools and model access as the main session. Delegation is contract-based — specify deliverables, constraints, and optional validation commands so results are verified before delivery.

Built-in delegation tools:

  • delegate_task — spawn a background sub-agent with a structured contract
  • list_jobs — inspect status of all delegated jobs
  • cancel_job — cancel a running sub-agent
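Since delegation is contract-based (deliverables, constraints, optional validation commands), a delegate_task contract might carry fields like these. Every field name here is hypothetical — it mirrors the concepts described above, not Otto's actual schema:

```yaml
# Hypothetical delegate_task contract (field names illustrative only)
deliverables:
  - "A markdown summary of open issues, written to summary.md"
constraints:
  - "Read-only except for summary.md"
validation:
  - "test -s summary.md"   # verified before the result is delivered
```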

Scheduled Jobs — Cron-style scheduling built in, with background prompt execution.
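A scheduled job pairs a cron expression with a prompt to run in the background. The definition below is a sketch under that assumption — field names and file location are illustrative, not Otto's documented format:

```yaml
# Hypothetical scheduled-job definition (field names illustrative)
jobs:
  - name: morning-briefing
    schedule: "0 8 * * *"    # cron syntax: every day at 08:00
    prompt: "Summarize my unread messages and today's calendar."
```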

Web UI — Built-in dashboard for monitoring status, viewing logs, and managing configuration (default: http://localhost:7070).

Telegram UX — Interactive commands, inline controls, status cards, and chunked delivery for long responses.

File Sending — Send files (PDFs, images, documents) directly to Telegram.

Architecture

You (Telegram / Web)
      │
      ▼
┌──────────────┐
│ Telegram Bot │──── Commands (/model, /tools, /stop, ...)
└──────┬───────┘
       │
┌──────▼───────┐
│    Web UI    │──── Dashboard, monitoring, config
└──────┬───────┘
       ▼
┌──────────────┐
│  Chat Layer  │──── Sessions, memory, system prompt
└──────┬───────┘
       ▼
┌──────────────┐
│    Agent     │──── Tool-calling loop (LiteLLM → any LLM)
└──────┬───────┘
       ▼
┌──────────────┐
│ MCP Gateway  │──── Connects to tool servers defined in tools.yaml
└──────────────┘

Configuration

All config is in ~/.otto/:

File          Purpose
config.yaml   Model, Telegram token, owner ID, web/workspace settings
tools.yaml    MCP tool server definitions
skills/       Custom skill modules
memory.db     Persistent memory store
sessions/     Conversation history
logs/         Structured logs
credentials/  OAuth provider tokens

Development

git clone https://github.com/1broseidon/otto.git
cd otto
uv sync --dev
make check         # lint + test
otto web           # start web UI for development

License

MIT
