Local AI agent with Discord/Telegram, Google Workspace, and payments - powered by Ollama
Local Pigeon
A fully local AI agent powered by Ollama (or llama-cpp-python). Your AI assistant that runs entirely on your device, connecting to Discord, Telegram, or a web interface while keeping all LLM inference local and private.
Features
- Local LLM Inference - Uses Ollama for on-device model inference
- Privacy First - Your conversations never leave your device
- Multi-Platform - Discord, Telegram, and Web UI support
- Extensible Tools - Web search, file operations, and more
- Google Workspace - Gmail, Calendar, and Drive integration
- Payment Capabilities - Stripe virtual cards and crypto (USDC/ETH)
- Human-in-the-Loop - Approval workflow for sensitive operations
- Easy Setup - One-command installation
Prerequisites
- Python 3.10+
- Ollama - or Local Pigeon can auto-download models via llama-cpp-python
- A supported LLM model (e.g., gemma3, llama3.2, qwen2.5)
Quick Start
Option 1: Auto-Installer (Recommended)
Windows (PowerShell):
irm https://raw.githubusercontent.com/tradermichael/local_pigeon/main/install.ps1 | iex
Mac/Linux:
curl -sSL https://raw.githubusercontent.com/tradermichael/local_pigeon/main/install.sh | bash
Option 2: From Source
git clone https://github.com/tradermichael/local_pigeon.git
cd local_pigeon
pip install -e .
Option 3: pip Install (coming soon)
# Not yet published to PyPI
pip install local-pigeon
Option 4: Docker
docker-compose up -d
Configuration
1. Set up Ollama (or skip for auto-download)
If you have Ollama installed, make sure it's running:
# Start Ollama (if not running)
ollama serve
# Pull a model
ollama pull gemma3:latest
No Ollama? Local Pigeon will automatically fall back to llama-cpp-python and download a model from HuggingFace on first run.
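As a quick sanity check before your first run, you can see which backend would be picked. This sketch only mirrors the documented fallback (Ollama if installed, otherwise llama-cpp-python) by testing for the `ollama` binary; the app's actual detection logic may differ:

```shell
# Sketch: report which inference backend is available on this machine.
# Mirrors the documented behavior; not Local Pigeon's real detection code.
if command -v ollama >/dev/null 2>&1; then
  echo "backend: ollama"
else
  echo "backend: llama-cpp-python (model auto-downloaded from HuggingFace)"
fi
```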
2. Configure Local Pigeon
Run the setup wizard:
local-pigeon setup
Or manually create a .env file:
# Ollama
OLLAMA_HOST=http://localhost:11434
OLLAMA_MODEL=gemma3:latest
# Discord (optional)
DISCORD_BOT_TOKEN=your_discord_bot_token
# Telegram (optional)
TELEGRAM_BOT_TOKEN=your_telegram_bot_token
# Google Workspace (optional)
GOOGLE_CLIENT_ID=your_client_id
GOOGLE_CLIENT_SECRET=your_client_secret
# Payments (optional)
STRIPE_API_KEY=sk_...
PAYMENT_APPROVAL_THRESHOLD=25.00
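If you manage `.env` by hand, a quick grep can confirm the keys you rely on actually made it in. A minimal sketch, assuming the key names from the example above (the `demo.env` file here is illustrative; point `ENV_FILE` at your real file and extend the key list for the platforms you enable):

```shell
# Sketch: warn about required keys missing from an env file.
ENV_FILE=demo.env
# Demo file with the model key intentionally left out:
printf 'OLLAMA_HOST=http://localhost:11434\n' > "$ENV_FILE"
for key in OLLAMA_HOST OLLAMA_MODEL; do
  grep -q "^${key}=" "$ENV_FILE" || echo "missing: $key"
done
# Prints: missing: OLLAMA_MODEL
```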
3. Run Local Pigeon
# Start all enabled platforms
local-pigeon run
# Or run specific platform
local-pigeon run --platform discord
local-pigeon run --platform telegram
local-pigeon run --platform web
Platforms
Discord Bot
- Create a bot at Discord Developer Portal
- Enable "Message Content Intent" under Bot settings
- Copy the bot token to your .env
- Invite the bot to your server with appropriate permissions
Features:
- Responds to mentions and DMs
- Streaming responses with message edits
- Slash commands: /model, /clear, /status
- Payment approval via DM
Telegram Bot
- Create a bot via @BotFather
- Copy the bot token to your .env
- Optionally set TELEGRAM_ALLOWED_USERS to restrict access
Features:
- Message handling with user whitelist
- Streaming responses
- Commands: /model, /clear, /status
- Inline keyboard for payment approvals
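The exact value format for TELEGRAM_ALLOWED_USERS is not documented here; a comma-separated list of numeric Telegram user IDs is a common convention for this kind of whitelist, so something like the following is a reasonable guess (verify against the setup wizard or config.yaml):

```
# Hypothetical format - confirm before relying on it
TELEGRAM_ALLOWED_USERS=123456789,987654321
```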
Web UI
Access at http://localhost:7860 when running with --platform web.
Features:
- Chat interface with streaming
- Settings panel
- OAuth setup for Google
- Tool execution display
Tools
Web Tools
- Web Search - Search using DuckDuckGo
- Web Fetch - Extract content from web pages
Google Workspace
- Gmail - Read, search, and send emails
- Calendar - View and create events
- Drive - List, search, and read files
Payments
- Stripe Card - Virtual card for online payments
- Crypto Wallet - USDC/ETH on Base network
Payment System
Local Pigeon supports both traditional and crypto payments:
Stripe Virtual Cards
- Create virtual cards for online purchases
- Real-time transaction monitoring
- Human-in-the-loop approval for amounts above threshold
Crypto Wallet (CDP)
- USDC and ETH support on Base network
- Send and receive payments
- Approval workflow for security
Approval Workflow
Payments above your configured threshold (default: $25) require approval:
- Agent requests payment
- You receive approval request (Discord DM, Telegram message, or Web UI)
- Approve or deny within 5 minutes
- Payment proceeds or is cancelled
Configure threshold:
PAYMENT_APPROVAL_THRESHOLD=25.00
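With the default threshold, only amounts strictly above $25.00 pause for approval. A sketch of the documented rule (not the project's actual code path):

```shell
# Sketch of the approval rule: amounts above the threshold need a human.
PAYMENT_APPROVAL_THRESHOLD=25.00
for amount in 9.99 25.00 49.95; do
  verdict=$(awk -v a="$amount" -v t="$PAYMENT_APPROVAL_THRESHOLD" \
    'BEGIN { print (a > t) ? "approval required" : "auto-approved" }')
  echo "$amount -> $verdict"
done
# 9.99 -> auto-approved
# 25.00 -> auto-approved
# 49.95 -> approval required
```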
Security
- Local Processing - LLM runs on your device via Ollama
- Encrypted Storage - OAuth tokens encrypted at rest
- Human Approval - Sensitive operations require confirmation
- User Whitelist - Restrict bot access to specific users
Project Structure
local_pigeon/
├── src/
│   └── local_pigeon/
│       ├── core/        # Agent, LLM client, conversation
│       ├── platforms/   # Discord, Telegram adapters
│       ├── tools/       # Web, Google, Payment tools
│       ├── storage/     # Database, credentials
│       ├── ui/          # Gradio web interface
│       ├── config.py    # Configuration management
│       └── cli.py       # Command-line interface
├── config.yaml          # YAML configuration
├── .env.example         # Environment template
├── install.ps1          # Windows installer
├── install.sh           # Mac/Linux installer
├── Dockerfile           # Docker build
└── docker-compose.yml   # Docker orchestration
Development
Setup Development Environment
git clone https://github.com/tradermichael/local_pigeon.git
cd local_pigeon
pip install -e ".[dev]"
Run Tests
pytest
Code Style
ruff check .
ruff format .
Commands Reference
# Run the agent
local-pigeon run [--platform discord|telegram|web]
# Interactive setup wizard
local-pigeon setup
# Check system status
local-pigeon status
# List available models
local-pigeon models
# Interactive chat (terminal)
local-pigeon chat
# Show version
local-pigeon version
Contributing
Contributions are welcome! Please read our contributing guidelines.
- Fork the repository
- Create a feature branch
- Make your changes
- Submit a pull request
License
MIT License - see LICENSE for details.
Acknowledgments
- Ollama - Local LLM runtime
- discord.py - Discord API wrapper
- aiogram - Telegram Bot framework
- Gradio - Web UI framework
- Stripe - Payment processing
- Coinbase CDP - Crypto infrastructure
Made with ❤️ for local-first AI
File details
Details for the file local_pigeon-0.1.0.tar.gz.
File metadata
- Download URL: local_pigeon-0.1.0.tar.gz
- Upload date:
- Size: 62.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 2a74742f806c0d548a3d6b8c88e855394b3200dae884757be5c6ae7f3a8d782a |
| MD5 | 657173c1c56539cd80c062aa8a7e9399 |
| BLAKE2b-256 | 154d3020fc06fe9eefea1ae08fae002a7d73cd1d1568309f6eac331fbbf666bc |
File details
Details for the file local_pigeon-0.1.0-py3-none-any.whl.
File metadata
- Download URL: local_pigeon-0.1.0-py3-none-any.whl
- Upload date:
- Size: 82.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 0c6eaa0eb9461696aa95df4be1c969478f5376f2d0dd9967c16ef35f83e421e0 |
| MD5 | 6f53195c3432f0f8d1abe6d5169354e1 |
| BLAKE2b-256 | 63f6b5c86173d9fafe87bce29b3791f3c2494255c884195700f9bd87e515a497 |