# Local Pigeon

*Local AI agent with Discord/Telegram, Google Workspace, and payments, powered by Ollama.*
A fully local AI agent powered by Ollama (or llama-cpp-python). It runs entirely on your device, connecting to Discord, Telegram, or a web interface while keeping all LLM inference local and private.
## Features
- Local LLM Inference - Uses Ollama for on-device model inference
- Privacy First - Your conversations never leave your device
- Multi-Platform - Discord, Telegram, and Web UI support
- Extensible Tools - Web search, browser automation, and more
- Browser Automation - Navigate dynamic websites with Playwright (Google Flights, etc.)
- Voice Input - Speech-to-text for hands-free interaction
- Vision Support - Image analysis with vision models (LLaVA, Moondream)
- Model Catalog - Curated inventory of thinking, vision, and coding models
- Google Workspace - Gmail, Calendar, and Drive integration
- Payment Capabilities - Stripe virtual cards and crypto (USDC/ETH)
- Human-in-the-Loop - Approval workflow for sensitive operations
- Activity Dashboard - Track interactions across all platforms
- Easy Setup - One-command installation
## Prerequisites

- Python 3.10+
- Ollama, or let Local Pigeon auto-download models via llama-cpp-python
- A supported LLM model (e.g., `gemma3`, `llama3.2`, `qwen2.5`)
## Quick Start

### Option 1: Auto-Installer (Recommended)

**Windows (PowerShell):**

```powershell
irm https://raw.githubusercontent.com/tradermichael/local_pigeon/main/install.ps1 | iex
```

**Mac/Linux:**

```bash
curl -sSL https://raw.githubusercontent.com/tradermichael/local_pigeon/main/install.sh | bash
```

### Option 2: pip Install

```bash
pip install local-pigeon

# Optional features:
pip install local-pigeon[browser]  # Browser automation (Playwright)
pip install local-pigeon[voice]    # Voice input (speech recognition)
pip install local-pigeon[all]      # Everything
```

### Option 3: From Source

```bash
git clone https://github.com/tradermichael/local_pigeon.git
cd local_pigeon
pip install -e .
```

### Option 4: Docker

```bash
docker-compose up -d
```
## Configuration

### 1. Set up Ollama (or skip for auto-download)

If you have Ollama installed, make sure it's running:

```bash
# Start Ollama (if not running)
ollama serve

# Pull a model
ollama pull gemma3:latest
```

No Ollama? Local Pigeon will automatically fall back to llama-cpp-python and download a model from Hugging Face on first run.
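Before wiring up any platform, you can confirm this decision point yourself. The sketch below is a standalone illustration (not part of Local Pigeon): it probes Ollama's documented `/api/tags` endpoint, which is the same kind of health signal a client can use to choose between Ollama and the llama-cpp-python fallback.

```python
import json
import urllib.error
import urllib.request


def ollama_available(host: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers on the given host.

    Uses Ollama's documented /api/tags endpoint, which lists
    locally installed models.
    """
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=timeout) as resp:
            json.load(resp)  # valid JSON means a healthy server
        return True
    except (urllib.error.URLError, ValueError, OSError):
        return False


if __name__ == "__main__":
    print("Ollama reachable:", ollama_available())
```

If this prints `False`, the fallback path described above would presumably be taken.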
### 2. Configure Local Pigeon

Run the setup wizard:

```bash
local-pigeon setup
```
Or manually create a `.env` file:

```bash
# Ollama
OLLAMA_HOST=http://localhost:11434
OLLAMA_MODEL=gemma3:latest

# Discord (optional)
DISCORD_BOT_TOKEN=your_discord_bot_token

# Telegram (optional)
TELEGRAM_BOT_TOKEN=your_telegram_bot_token

# Google Workspace (optional)
GOOGLE_CLIENT_ID=your_client_id
GOOGLE_CLIENT_SECRET=your_client_secret

# Browser automation (optional)
BROWSER_ENABLED=true
BROWSER_HEADLESS=true  # false to see browser window

# Payments (optional)
STRIPE_API_KEY=sk_...
PAYMENT_APPROVAL_THRESHOLD=25.00
```
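For illustration, here is one way such `.env` values might be read at runtime. The helper names (`env_bool`, `env_float`) are hypothetical, not Local Pigeon's actual API; only the variable names come from the template above.

```python
import os


def env_bool(name: str, default: bool = False) -> bool:
    """Parse a boolean-ish environment variable ("true"/"1"/"yes")."""
    return os.environ.get(name, str(default)).strip().lower() in {"true", "1", "yes"}


def env_float(name: str, default: float) -> float:
    """Parse a numeric environment variable, falling back on missing/bad values."""
    try:
        return float(os.environ.get(name, ""))
    except ValueError:
        return default


# Example: read a few of the settings shown in the .env template above
ollama_host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
browser_headless = env_bool("BROWSER_HEADLESS", default=True)
approval_threshold = env_float("PAYMENT_APPROVAL_THRESHOLD", default=25.0)
```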
### 3. Run Local Pigeon

```bash
# Start all enabled platforms
local-pigeon run

# Or run a specific platform
local-pigeon run --platform discord
local-pigeon run --platform telegram
local-pigeon run --platform web
```
## Model Catalog

Local Pigeon includes a curated catalog of models organized by capability:
| Category | Models | Use Case |
|---|---|---|
| Thinking/Reasoning | DeepSeek R1, Qwen 3, Kimi K2 | Complex problem solving, chain-of-thought |
| Vision | LLaVA, Moondream, Llama 3.2 Vision | Image analysis, OCR |
| Coding | Qwen 2.5 Coder, CodeLlama, DeepSeek Coder | Code generation, debugging |
| General | Gemma 3, Llama 3.2, Mistral | General conversation |
| Small/Fast | Qwen 2.5 0.5B-3B, Phi-3 Mini | Quick responses, low resources |
Install models via the Web UI (Settings > Model Discovery) or the command line:

```bash
# Reasoning model
ollama pull deepseek-r1:7b

# Vision model (for image analysis)
ollama pull llava:7b

# Coding model
ollama pull qwen2.5-coder:7b
```
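To see which catalog models are already pulled, Ollama's `/api/tags` endpoint returns JSON of the form `{"models": [{"name": "llava:7b", ...}, ...]}`. The helpers below are a sketch (not Local Pigeon code) that compares a wanted list against such a response.

```python
def installed_model_names(tags_response: dict) -> set[str]:
    """Extract model names from an Ollama /api/tags response."""
    return {m["name"] for m in tags_response.get("models", [])}


def missing_models(wanted: list[str], tags_response: dict) -> list[str]:
    """Return the models from `wanted` that are not yet pulled."""
    installed = installed_model_names(tags_response)
    return [name for name in wanted if name not in installed]


# Example with a stubbed response (no server needed):
sample = {"models": [{"name": "gemma3:latest"}, {"name": "llava:7b"}]}
print(missing_models(["deepseek-r1:7b", "llava:7b"], sample))  # ['deepseek-r1:7b']
```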
## Platforms

### Discord Bot

1. Create a bot at the Discord Developer Portal
2. Enable "Message Content Intent" under Bot settings
3. Copy the bot token to your `.env`
4. Invite the bot to your server with appropriate permissions

Features:

- Responds to mentions and DMs
- Streaming responses with message edits
- Slash commands: `/model`, `/clear`, `/status`
- Payment approval via DM
- Image analysis (send images for vision models to analyze)
### Telegram Bot

1. Create a bot via @BotFather
2. Copy the bot token to your `.env`
3. Optionally set `TELEGRAM_ALLOWED_USERS` to restrict access

Features:

- Message handling with a user whitelist
- Streaming responses
- Commands: `/model`, `/clear`, `/status`
- Inline keyboard for payment approvals
### Web UI

Access at http://localhost:7860 when running with `--platform web`.

Features:

- Chat interface with streaming
- Tool activity transparency: see which tools the agent uses in real time
- Voice input (microphone)
- Activity log across all platforms
- Model discovery and installation
- Settings panel
- OAuth setup for Google
## Tools

### Web Tools

- Web Search - Search using DuckDuckGo
- Web Fetch - Extract content from web pages
- Browser - Full browser automation (Playwright)
- Browser Search - Specialized search tasks (Google Flights, etc.)

### Google Workspace

- Gmail - Read, search, and send emails
- Calendar - View and create events
- Drive - List, search, and read files

### Payments

- Stripe Card - Virtual card for online payments
- Crypto Wallet - USDC/ETH on Base network

### Discord Tools

- Send Messages - Post to channels
- Send DMs - Direct message users
- Get Messages - Read channel history
- Add Reactions - React to messages
- List Channels - See available channels
- Create Threads - Start discussion threads
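Local Pigeon's actual tool interface is not documented here, but a minimal registry/dispatch pattern like the following sketch (all names hypothetical) shows the general shape of how tools such as those listed above can be exposed to an agent.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Tool:
    """Minimal tool descriptor: the agent selects tools by name and description."""
    name: str
    description: str
    run: Callable[..., str]


REGISTRY: dict[str, Tool] = {}


def register(tool: Tool) -> None:
    """Add a tool to the global registry, keyed by its name."""
    REGISTRY[tool.name] = tool


def dispatch(name: str, **kwargs) -> str:
    """Look up a tool by name and execute it with the given arguments."""
    if name not in REGISTRY:
        return f"Unknown tool: {name}"
    return REGISTRY[name].run(**kwargs)


# Example: a stand-in for the "Web Search" tool above
register(Tool("web_search", "Search the web via DuckDuckGo",
              run=lambda query: f"results for {query!r}"))
print(dispatch("web_search", query="local llm"))  # results for 'local llm'
```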
## Payment System

Local Pigeon supports both traditional and crypto payments.

### Stripe Virtual Cards

- Create virtual cards for online purchases
- Real-time transaction monitoring
- Human-in-the-loop approval for amounts above the threshold

### Crypto Wallet (CDP)

- USDC and ETH support on Base network
- Send and receive payments
- Approval workflow for security

### Approval Workflow

Payments above your configured threshold (default: $25) require approval:

1. The agent requests a payment
2. You receive an approval request (Discord DM, Telegram message, or Web UI)
3. You approve or deny within 5 minutes
4. The payment proceeds or is cancelled

Configure the threshold:

```bash
PAYMENT_APPROVAL_THRESHOLD=25.00
```
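The threshold check itself is simple. This hypothetical sketch mirrors the rule above, assuming "above the threshold" means strictly greater than:

```python
def requires_approval(amount: float, threshold: float = 25.0) -> bool:
    """Return True if a payment must wait for human approval.

    Mirrors PAYMENT_APPROVAL_THRESHOLD: amounts strictly above the
    threshold are held for an approve/deny decision.
    """
    return amount > threshold


print(requires_approval(10.00))  # False
print(requires_approval(99.99))  # True
```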
## Security
- Local Processing - LLM runs on your device via Ollama
- Encrypted Storage - OAuth tokens encrypted at rest
- Human Approval - Sensitive operations require confirmation
- User Whitelist - Restrict bot access to specific users
## Project Structure

```
local_pigeon/
├── src/
│   └── local_pigeon/
│       ├── core/          # Agent, LLM client, conversation, model catalog
│       ├── platforms/     # Discord, Telegram adapters
│       ├── tools/         # Web, Google, Payment, Discord tools
│       │   ├── web/       # Search, fetch, browser automation
│       │   ├── google/    # Gmail, Calendar, Drive
│       │   ├── discord/   # Discord action tools
│       │   └── payments/  # Stripe, crypto wallet
│       ├── storage/       # Database, credentials
│       ├── ui/            # Gradio web interface
│       ├── config.py      # Configuration management
│       └── cli.py         # Command-line interface
├── config.yaml            # YAML configuration
├── .env.example           # Environment template
├── install.ps1            # Windows installer
├── install.sh             # Mac/Linux installer
├── Dockerfile             # Docker build
└── docker-compose.yml     # Docker orchestration
```
## Development

### Setup Development Environment

```bash
git clone https://github.com/tradermichael/local_pigeon.git
cd local_pigeon
pip install -e ".[dev]"
```

### Run Tests

```bash
pytest
```

### Code Style

```bash
ruff check .
ruff format .
```
## Commands Reference

```bash
# Run the agent
local-pigeon run [--platform discord|telegram|web]

# Interactive setup wizard
local-pigeon setup

# Check system status
local-pigeon status

# List available models
local-pigeon models

# Interactive chat (terminal)
local-pigeon chat

# Show version
local-pigeon version
```
## Contributing

Contributions are welcome! Please read our contributing guidelines.

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Submit a pull request
## License
MIT License - see LICENSE for details.
## Acknowledgments
- Ollama - Local LLM runtime
- Playwright - Browser automation
- discord.py - Discord API wrapper
- aiogram - Telegram Bot framework
- Gradio - Web UI framework
- Stripe - Payment processing
- Coinbase CDP - Crypto infrastructure
Made with care for local-first AI
## Download files
### local_pigeon-0.4.0.tar.gz (source distribution)

- Size: 163.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.12
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `6e993b3fae8d77b2e7c9f9dbcad757c77aa11b23dfc1c974f174b4da35408409` |
| MD5 | `47b6922325db7da565684d35099f675e` |
| BLAKE2b-256 | `e4627b11ff4284425788aa1536c5bf2a5316ac2ee5276fd526891d715c77ffa2` |
### local_pigeon-0.4.0-py3-none-any.whl (built distribution)

- Size: 194.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.12
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `fe390c88f9781fd88970e7541c8ca2c5dfd0eee56fa319cf66d615f8fdfbab19` |
| MD5 | `628f15ed4730d289407744ada8438070` |
| BLAKE2b-256 | `1b8c70f53987d8f5cc60a77f0ea27bb7826c41ad70e956d657121e0d066945f7` |