
OpenBotX — Personal AI Assistant

Your personal AI assistant for any OS and any platform.

OpenBotX is a Python-based personal AI assistant. It uses skills to define behavior, tools to execute actions, gateways for communication, and providers to connect to external services and AI models.

Features

  • Multiple Gateways: CLI, WebSocket, Telegram, HTTP API
  • Skills System: Define AI capabilities in Markdown files
  • Unlimited Tools: Register Python functions as tools for the AI
  • MCP Support: Model Context Protocol integration
  • Scheduling: Cron jobs and one-time scheduled tasks
  • Memory: Persistent conversation history per channel
  • Security: Built-in prompt injection detection
  • API: Full REST API for all operations
  • Providers: Modular architecture for LLM, storage, database, transcription, and TTS backends

Installation

Prerequisites

Python Version: This project requires Python 3.11 to 3.13 (3.14 is not yet supported due to dependencies).

Install uv - a fast Python package installer and resolver:

# macOS/Linux (recommended)
curl -LsSf https://astral.sh/uv/install.sh | sh

# macOS with Homebrew
brew install uv

# Windows
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"

# Or with pipx (if you have it)
pipx install uv

From PyPI (Users)

# Install globally as a tool (recommended)
uv tool install openbotx

# Or install in current environment
uv pip install openbotx

All features included: Telegram, Audio (Whisper), S3, MCP, Screenshot, and more.

From Source (Developers)

# Clone and setup
git clone https://github.com/openbotx/openbotx.git
cd openbotx
make setup

# Activate virtual environment
source .venv/bin/activate

Quick Start

For Users (after installation)

# Create a new project from starter template
mkdir my-bot
cd my-bot
openbotx init

# Edit .env file in your project directory
nano .env

# Start the bot in CLI mode
openbotx start --cli-mode

# Or start the API server
openbotx start

The starter template includes:

  • Pre-configured config.yml with sensible defaults
  • Example skills in the skills/ directory
  • Environment template (.env.example)
  • Ready-to-use folder structure

For Developers (from source)

# First time setup
make setup

# Activate virtual environment
source .venv/bin/activate

# Create a project with the starter template
mkdir my-project
cd my-project
openbotx init

# Edit .env file in your project directory
nano .env

# Start the bot in CLI mode
openbotx start --cli-mode

# Or start the API server
openbotx start

Project Structure

Package Structure

openbotx/
├── openbotx/               # Main package (library code)
│   ├── core/               # Core components (orchestrator, message bus, etc.)
│   ├── providers/          # Provider implementations (LLM, gateway, storage)
│   ├── api/                # FastAPI REST API
│   ├── cli/                # CLI commands
│   ├── models/             # Pydantic data models
│   ├── agent/              # PydanticAI agent
│   └── tools/              # Built-in tools
├── docs/                   # Documentation
├── tests/                  # Test suite
├── pyproject.toml          # Package configuration
├── Makefile                # Development commands
└── README.md

Configuration

When you run openbotx init, it creates a config.yml:

version: "1.0.0"

bot:
  name: "My Assistant"
  description: "AI Assistant powered by OpenBotX"

llm:
  provider: "anthropic"  # Any PydanticAI supported provider
  model: "claude-sonnet-4-20250514"
  # Optional: max_tokens, temperature, top_p, timeout, etc

gateways:
  cli:
    enabled: true
  websocket:
    enabled: true
    port: 8765
  telegram:
    enabled: false
    token: "${TELEGRAM_BOT_TOKEN}"
    allowed_users: []

api:
  host: "0.0.0.0"
  port: 8000
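Values like `${TELEGRAM_BOT_TOKEN}` are substituted from environment variables, typically loaded from the project's `.env` file. A minimal `.env` for the config above might look like the following — the LLM key variable name is an assumption here, so confirm the exact names against the generated `.env.example`:

```shell
# Hypothetical .env values; confirm variable names against .env.example
ANTHROPIC_API_KEY=your-api-key-here
TELEGRAM_BOT_TOKEN=your-telegram-bot-token
```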

Creating Skills

OpenBotX includes native skills (like screenshot) built into the package. You can also create custom skills in your project's skills/ directory, and these can override native skills.

Skills are Markdown files with YAML frontmatter:

---
name: code-review
description: Review code for quality and best practices
version: "1.0.0"
triggers:
  - review code
  - code review
  - check my code
tools:
  - read_file
---

# Code Review Skill

## Steps
1. Read the code file(s) provided
2. Analyze for common issues
3. Check coding standards
4. Suggest improvements

## Guidelines
- Be constructive, not critical
- Explain why changes are suggested
- Prioritize security issues
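Under the hood, a skill loader only needs to separate the YAML frontmatter from the Markdown body. A minimal sketch of that split — illustrative only, not OpenBotX's actual loader:

```python
def split_frontmatter(text: str) -> tuple[str, str]:
    """Split a skill file into (frontmatter, markdown_body)."""
    if text.startswith("---"):
        # "---\n<yaml>\n---\n<body>" -> ["", "<yaml>", "<body>"]
        _, frontmatter, body = text.split("---", 2)
        return frontmatter.strip(), body.strip()
    return "", text.strip()

skill = "---\nname: code-review\ntriggers:\n  - review code\n---\n# Code Review Skill"
meta, body = split_frontmatter(skill)
```

The frontmatter string can then be handed to a YAML parser to recover the name, triggers, and tool list.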

Creating Tools

Tools are Python functions that the AI can call:

from openbotx.core.tools_registry import tool

@tool(
    name="calculate",
    description="Perform mathematical calculations",
)
def tool_calculate(expression: str) -> str:
    """Calculate a math expression."""
    # Caution: eval() executes arbitrary Python. Validate or sandbox
    # the input before using this pattern with untrusted messages.
    return str(eval(expression))

CLI Commands

openbotx init               # Initialize from starter template
openbotx init --force       # Overwrite existing files
openbotx start              # Start the API server
openbotx start --cli-mode   # Start in interactive CLI mode
openbotx status             # Show server status
openbotx skills list        # List all skills
openbotx providers list     # List all providers
openbotx send "Hello!"      # Send a message
openbotx config             # Show configuration
openbotx version            # Show version

API Endpoints

Endpoint                      Description
POST /api/messages            Send a message
GET  /api/skills              List all skills
GET  /api/tools               List all tools
GET  /api/providers           List providers
POST /api/scheduler/cron      Create a cron job
GET  /api/memory/{channel}    Get conversation history
GET  /api/system/health       Health check
Development Commands (Makefile)

# Setup & Installation
make setup          # First time setup with uv (venv + deps)
make dev-install    # Install in editable mode with dev dependencies
make install        # Install in production mode

# Testing & Quality
make test           # Run tests
make test-cov       # Run tests with coverage
make lint           # Run linter
make format         # Format code
make check          # Run all checks (lint + type check)

# Building & Publishing
make build          # Build package
make publish-test   # Publish to TestPyPI
make publish        # Publish to PyPI

# Versioning
make version        # Show current version
make bump-patch     # Bump patch version (0.0.X)
make bump-minor     # Bump minor version (0.X.0)
make bump-major     # Bump major version (X.0.0)

# Cleanup
make clean          # Clean build artifacts
make clean-venv     # Remove virtual environment
make reset          # Reset environment (clean venv + setup)

Architecture

Gateway → MessageBus → Orchestrator → Agent → Response → Gateway
              ↓              ↓
           Security      Skills/Tools
              ↓              ↓
           Context        Memory
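The flow above can be sketched in a few lines. Everything here — class names, callables, the shape of memory — is a toy model for illustration, not the actual OpenBotX API:

```python
from dataclasses import dataclass

@dataclass
class Message:
    channel: str
    text: str

class Orchestrator:
    """Toy model of the pipeline: security check, per-channel memory,
    then the agent produces a reply that goes back to the gateway."""

    def __init__(self, agent, security, memory):
        self.agent = agent        # callable(text, history) -> reply
        self.security = security  # callable(text) -> bool (True = allowed)
        self.memory = memory      # dict: channel -> list of past messages

    def handle(self, msg: Message) -> str:
        if not self.security(msg.text):
            return "Message rejected by security check."
        history = self.memory.setdefault(msg.channel, [])
        history.append(msg.text)
        return self.agent(msg.text, history)

orch = Orchestrator(
    agent=lambda text, history: f"echo: {text}",
    security=lambda text: "ignore previous instructions" not in text.lower(),
    memory={},
)
reply = orch.handle(Message(channel="cli", text="hi"))  # "echo: hi"
```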

Providers:

  • LLM: Multiple providers supported (see Configuration)
  • Gateway: CLI, WebSocket, Telegram, HTTP
  • Storage: Local filesystem, S3
  • Database: SQLite
  • Scheduler: Cron, one-time schedules
  • Transcription: Multiple providers supported
  • TTS: Multiple providers supported

Documentation

For detailed documentation, see the docs/ directory in the repository.

License

MIT License - see LICENSE for details.
