
AI Assistant Framework with CLI and Telegram Bot


Assistants Framework

A flexible framework for creating AI assistants with multiple frontend interfaces.

Features

  • Multi-Front-End Support: CLI and Telegram interfaces built on the same core framework
  • CLI Features: Code highlighting, thread management, editor integration, file input, image generation
  • Multiple LLM Support: OpenAI (gpt-*, o*), Anthropic (claude-*), MistralAI (mistral-*, codestral-*), and image generation (DALL-E)
  • New Universal Assistant Interface: See MIGRATION_GUIDE.md for details
  • MCP (Model Context Protocol) Support: Connect to MCP servers and use their tools in conversations

Installation

Requires Python 3.11+

pip install assistants-framework

For Telegram bot functionality:

pip install assistants-framework[telegram]

Add commands to your PATH:

ai-cli install

Usage

Command Line Interface

ai-cli --help

Key CLI commands (prefixed with /):

  • /help - Show help message
  • /editor - Open editor for prompt composition
  • /image <prompt> - Generate an image
  • /copy - Copy response to clipboard
  • /new - Start new thread
  • /threads - List and select threads
  • /thinking <level> - Toggle thinking mode (for reasoning models)
  • /last - Retrieve last message
  • /mcp - List available MCP servers and tools

File Tagging in Prompts

You can include the contents of files directly in your prompt by tagging them with an @ followed by the file path. Both absolute and relative paths are supported. For example:

Hi Claude, can you check my @~/.zshrc and tell me what you think?
Hi Claude, can you check my @./my_local_config.txt and tell me what you think?

When you use a tag like @/path/to/file.txt or @relative/path/to/file.txt in your prompt, the assistant will automatically append the contents of that file to the end of your input, like this:

Hi Claude, can you check my @./my_local_config.txt and tell me what you think?

===./my_local_config.txt===
// file content
===EOF===
  • You can tag multiple files in a single prompt; each will be appended in the same format.
  • If a file cannot be read, an error message will be shown in place of its content.
  • Both absolute (starting with /) and relative paths (like ./file.txt or subdir/file.txt) are supported for tagging.
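The tagging behaviour described above can be sketched in a few lines of Python. This is an illustrative re-implementation, not the framework's actual code; the regex and error-message wording are assumptions, only the `===path=== … ===EOF===` output format comes from the examples above:

```python
import re
from pathlib import Path

# Matches @ followed by an absolute (/...), home (~/...), or relative path.
TAG_RE = re.compile(r"@((?:~/|\./|/)?[\w./-]+)")

def expand_file_tags(prompt: str) -> str:
    """Append the contents of each @-tagged file to the end of the prompt."""
    appended = []
    for path_str in TAG_RE.findall(prompt):
        path = Path(path_str).expanduser()
        try:
            content = path.read_text()
        except OSError as exc:
            # Unreadable files get an error message in place of their content.
            content = f"[error reading file: {exc}]"
        appended.append(f"==={path_str}===\n{content}\n===EOF===")
    return "\n\n".join([prompt, *appended]) if appended else prompt
```

Multiple tags in one prompt each produce their own `===path===` block, in the order they appear.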

Model-Specific Commands

Use the claude command for Anthropic models (now defaults to Claude 4):

claude -e  # Open editor for Claude

There's also a chatgpt command that uses the default ChatGPT model:

chatgpt -t  # Continue the last thread with ChatGPT (`gpt-4.1-mini`)

Database Management

Run migrations in case of breaking changes:

ai-cli migrate

Rebuild the database:

ai-cli rebuild

MCP (Model Context Protocol) Server Support

The framework supports connecting to MCP servers to extend the assistant's capabilities with external tools. MCP servers are configured via a JSON file at ~/.config/assistants/mcp.json (or wherever $ASSISTANTS_CONFIG_DIR points to).

Configuration Example:

Create ~/.config/assistants/mcp.json:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "your-api-key"
      }
    }
  }
}

Using MCP Tools:

  1. List available MCP servers and their tools:

    ai-cli  # Start the CLI
    /mcp    # List servers and tools
    
  2. Enable MCP tools when creating an assistant:

    from assistants.ai.universal import UniversalAssistant
    
    assistant = UniversalAssistant(
        model="gpt-4o",
        enable_mcp_tools=True
    )
    
  3. The assistant will automatically use MCP tools when appropriate during conversations.

Available MCP Servers:

Community and official MCP servers are listed in the Model Context Protocol project's server directories.

Requirements:

Install MCP support:

pip install "mcp[cli]"

Telegram Interface

The framework includes a Telegram bot interface with the following features:

  • User Management: Authorise/deauthorise users and chats, promote/demote users
  • Thread Management: Start new conversation threads
  • Auto-Reply Toggle: Enable/disable automatic responses
  • Media Generation: Generate images from text prompts
  • Voice Responses: Generate audio responses with the /voice command

Key Telegram commands:

  • /new_thread - Clear conversation history and start a new thread
  • /auto_reply - Toggle automatic responses on/off
  • /image <prompt> - Generate an image from a text prompt
  • /voice <text> - Generate an audio response

Environment Variables

  • ASSISTANT_INSTRUCTIONS - System message (default: "You are a helpful assistant")
  • ASSISTANTS_API_KEY_NAME - API key variable name (default: OPENAI_API_KEY)
  • ANTHROPIC_API_KEY_NAME - Anthropic API key variable (default: ANTHROPIC_API_KEY)
  • MISTRAL_API_KEY_NAME - Mistral API key variable (default: MISTRAL_API_KEY)
  • DEFAULT_MODEL - Default model (default: gpt-5-mini)
  • DEFAULT_CLAUDE_SONNET_MODEL - Default Claude model (default: claude-sonnet-4-20250514)
  • DEFAULT_CLAUDE_OPUS_MODEL - Default Claude Opus model (default: claude-opus-4-1-20250805)
  • DEFAULT_CHATGPT_MODEL - Default ChatGPT model (default: gpt-5-mini)
  • DEFAULT_GPT_REASONING_MODEL - Default GPT reasoning model (default: o4-mini)
  • CODE_MODEL - Model used for code-related tasks (default: o4-mini)
  • IMAGE_MODEL - Image model (default: gpt-image-1)
  • ASSISTANTS_DATA_DIR - Data directory (default: ~/.local/share/assistants)
  • ASSISTANTS_CONFIG_DIR - Config directory (default: ~/.config/assistants)
  • TG_BOT_TOKEN - Telegram bot token
  • OPEN_IMAGES_IN_BROWSER - Open images automatically (default: true)
  • DEFAULT_MAX_RESPONSE_TOKENS - Default max response tokens (default: 4096)
  • DEFAULT_MAX_HISTORY_TOKENS - Default max history tokens (default: 10000)
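As one way of wiring these together, a shell profile snippet might look like the following. The values are placeholders; only the variable names come from the list above:

```shell
# ~/.profile (illustrative values only)
export ANTHROPIC_API_KEY="sk-ant-..."           # key read via ANTHROPIC_API_KEY_NAME
export DEFAULT_MODEL="gpt-5-mini"               # model used when none is specified
export ASSISTANT_INSTRUCTIONS="You are a helpful assistant"
export ASSISTANTS_DATA_DIR="$HOME/.local/share/assistants"
export OPEN_IMAGES_IN_BROWSER=false             # don't auto-open generated images
```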

Contributing

Contributions welcome! Fork the repository, make changes, and submit a pull request.

Useful Make Commands

  • make help – Show all available make commands
  • make install – Install package for production
  • make install-dev – Install package with development dependencies
  • make dev-setup – Complete development environment setup
  • make lint – Run all pre-commit hooks on all files
  • make format – Format code with ruff
  • make mypy – Run mypy type checks (baseline)
  • make mypy-generate – Generate a new mypy baseline
  • make test – Run all tests with pytest
  • make clean – Remove build artifacts and cache files
  • make build – Build distribution packages
  • make version – Show current version

TODOs:

  • Improved conversation handling/truncation for token limits - currently uses tiktoken for all models
  • Additional model/API support
  • Additional database support

License

MIT License

Current Version

0.9.5