AI-powered terminal assistant with rich TUI - brings ChatGPT/Claude to your command line

Project description

Consoul

AI-Powered Terminal Assistant — Beautiful TUI · Powerful CLI · Flexible SDK

Bring modern AI assistance directly to your terminal. Chat with Claude, GPT-4, Gemini, and local models using a rich interactive interface or simple CLI commands.

📖 Full Documentation | 🚀 Quick Start | 🎨 Features


Quick Start

Installation

# Install with TUI (recommended)
pip install consoul[tui]

# Or install SDK/CLI only
pip install consoul

Set Your API Key

# Choose your provider
export ANTHROPIC_API_KEY=your-key-here  # Claude
export OPENAI_API_KEY=your-key-here     # GPT-4
export GOOGLE_API_KEY=your-key-here     # Gemini
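Setting any one of these keys is enough; the idea of picking a provider from whichever key is present can be sketched as below. Note that `detect_provider` and the precedence order are hypothetical illustrations, not part of the consoul package:

```python
import os

# Hypothetical sketch: map environment variables to providers and
# check them in a fixed order. Consoul's real selection logic may differ.
PROVIDER_KEYS = [
    ("ANTHROPIC_API_KEY", "anthropic"),
    ("OPENAI_API_KEY", "openai"),
    ("GOOGLE_API_KEY", "google"),
]

def detect_provider(env=None):
    """Return the first provider whose API key is set, else None."""
    env = os.environ if env is None else env
    for var, provider in PROVIDER_KEYS:
        if env.get(var):
            return provider
    return None

print(detect_provider({"OPENAI_API_KEY": "sk-test"}))  # openai
```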

Launch the TUI

consoul tui

Or Use the SDK

from consoul import Consoul

console = Consoul()
print(console.chat("What is 2+2?"))

Features

🎨 Beautiful TUI

Rich, interactive terminal interface powered by Textual

  • Multi-turn conversations with streaming responses
  • Conversation history and search
  • File attachments and image analysis
  • Customizable themes (light/dark)
  • Mouse and keyboard navigation

🤖 Multi-Provider Support

Use your favorite AI model or run locally:

  • Anthropic Claude - Claude 4.5 Sonnet, Opus, Haiku
  • OpenAI - GPT-5, GPT-4, GPT-3.5
  • Google Gemini - Gemini 3 Flash, Pro
  • Ollama - Run models locally (Llama, Qwen, GPT-OSS, etc.)
  • LlamaCpp - GGUF/MLX models with GPU acceleration

🛠️ AI-Powered Tools

File Editing

Let AI create, modify, and delete files with safety controls:

consoul ask "Add error handling to calculate_total in src/utils.py" --tools

Code Search

Navigate your codebase semantically:

consoul ask "Find all usages of deprecated_function" --tools

Image Analysis

Debug with screenshots:

consoul ask "What's wrong with this error?" --attach screenshot.png

And More:

  • Bash command execution with approval workflows
  • Web search and URL fetching
  • Session management and history
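The approval workflow mentioned above can be sketched as a simple gate in front of shell execution. This is an illustrative pattern only; `needs_approval`, `run_with_approval`, and the risky-prefix list are hypothetical names, not Consoul's actual API:

```python
import shlex
import subprocess

# Hypothetical sketch of an approval gate for shell commands.
RISKY_PREFIXES = ("rm", "mv", "chmod", "curl", "sudo")

def needs_approval(command: str) -> bool:
    """Flag commands whose first word is on the risky list."""
    first = shlex.split(command)[0] if command.strip() else ""
    return first in RISKY_PREFIXES

def run_with_approval(command: str, approve) -> str:
    """Run the command, asking `approve(command)` first if it looks risky."""
    if needs_approval(command) and not approve(command):
        return "(denied)"
    result = subprocess.run(command, shell=True, capture_output=True, text=True)
    return result.stdout

# A safe command runs without prompting; a risky one asks first.
print(run_with_approval("echo hello", approve=lambda c: False).strip())
print(run_with_approval("rm -rf /tmp/x", approve=lambda c: False))
```

In the TUI, the `approve` callback would be an interactive confirmation prompt rather than a lambda.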

📝 Simple CLI

For quick questions without the TUI:

# One-off questions
consoul ask "Explain Python decorators"

# Interactive chat mode
consoul chat

🔌 SDK Integration

Embed AI capabilities in your Python applications:

from consoul import Consoul

# Enable tools for file operations and code search
console = Consoul(tools=True)

# Stateful conversation
console.chat("List all TODO comments in Python files")
console.chat("Create a summary.md file with the results")

# Rich responses with metadata
response = console.ask("Summarize this project", show_tokens=True)
print(f"Tokens: {response.tokens}")
print(f"Cost: ${console.last_cost['total_cost']:.4f}")

Documentation

📖 Full Documentation


Configuration

Create ~/.consoul/config.yaml:

# Default profile
profiles:
  default:
    model:
      provider: anthropic
      model: claude-3-5-sonnet-20241022
      temperature: 0.7

    tools:
      enabled: true
      permission_policy: balanced  # Require approval for risky operations

    conversation:
      save_history: true
      max_history: 50

  # Local model profile
  local:
    model:
      provider: ollama
      model: llama3.2:latest

Switch profiles:

consoul --profile local
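Conceptually, the `--profile` flag selects one subtree of the config shown above. The sketch below illustrates that lookup with plain dictionaries; `select_profile` and the fallback behavior are hypothetical, and in practice Consoul reads ~/.consoul/config.yaml:

```python
# Hypothetical sketch of profile selection from the config above.
CONFIG = {
    "profiles": {
        "default": {"model": {"provider": "anthropic",
                              "model": "claude-3-5-sonnet-20241022"}},
        "local": {"model": {"provider": "ollama",
                            "model": "llama3.2:latest"}},
    }
}

def select_profile(config, name="default"):
    """Return the named profile, falling back to 'default' if missing."""
    profiles = config["profiles"]
    return profiles.get(name, profiles["default"])

print(select_profile(CONFIG, "local")["model"]["provider"])  # ollama
```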

See the Configuration Guide for all options.


Security

Consoul includes comprehensive security controls for tool execution:

  • Permission Policies - PARANOID, BALANCED, TRUSTING, UNRESTRICTED
  • Approval Workflows - Interactive confirmation for dangerous operations
  • Audit Logging - Complete execution history in JSONL format
  • Command Validation - Pattern-based blocking of risky commands

See the Security Policy for best practices.
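Two of the controls listed above, pattern-based command blocking and JSONL audit logging, can be sketched as follows. The patterns, function names, and log fields are illustrative assumptions, not Consoul's own implementation:

```python
import json
import re
import tempfile

# Hypothetical blocked patterns; a real policy would be more thorough.
BLOCKED_PATTERNS = [
    re.compile(r"\brm\s+-rf\s+/(\s|$)"),   # wipe the filesystem root
    re.compile(r"\bcurl\b.*\|\s*sh\b"),    # pipe a download into a shell
]

def validate_command(command: str) -> bool:
    """Return True if the command matches no blocked pattern."""
    return not any(p.search(command) for p in BLOCKED_PATTERNS)

def audit(log_path: str, command: str, allowed: bool) -> None:
    """Append one JSON object per decision (JSONL: one record per line)."""
    with open(log_path, "a") as f:
        f.write(json.dumps({"command": command, "allowed": allowed}) + "\n")

log = tempfile.NamedTemporaryFile(suffix=".jsonl", delete=False).name
for cmd in ["ls -la", "curl http://x.sh | sh"]:
    ok = validate_command(cmd)
    audit(log, cmd, ok)
    print(cmd, "->", "allowed" if ok else "blocked")
```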


Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.


License

MIT License - see LICENSE file for details.


Made with ❤️ by GoatBytes.IO


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

consoul-0.2.2.tar.gz (362.6 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

consoul-0.2.2-py3-none-any.whl (433.4 kB)

Uploaded Python 3

File details

Details for the file consoul-0.2.2.tar.gz.

File metadata

  • Download URL: consoul-0.2.2.tar.gz
  • Upload date:
  • Size: 362.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.12.3 Darwin/24.6.0

File hashes

Hashes for consoul-0.2.2.tar.gz
  • SHA256: bc2e70074e2ad7769036cf380cf2f6d5dc04d20a9f84d639466fd6a9c7e75a0a
  • MD5: c31319ffe03b801186ea85ff510392ac
  • BLAKE2b-256: 0d85daebe739e7ca29b494d3998e65631df51e3899a1f1d9dfed8d071db694d7

See the PyPI documentation for more details on using hashes.

File details

Details for the file consoul-0.2.2-py3-none-any.whl.

File metadata

  • Download URL: consoul-0.2.2-py3-none-any.whl
  • Upload date:
  • Size: 433.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.12.3 Darwin/24.6.0

File hashes

Hashes for consoul-0.2.2-py3-none-any.whl
  • SHA256: 18a5f13e5ef166988996e3cff7c5e9b8a0c97c656b205352348f2578dfac35e8
  • MD5: a0afff6f19086579d3f46e9a947cd81f
  • BLAKE2b-256: 9b532e07b0b3c0f30af39ea59055badf978715ffdb0c6b52eee56430e9a746fc

See the PyPI documentation for more details on using hashes.
