
Consoul Banner

Consoul

AI-Powered Terminal Assistant — Beautiful TUI · Powerful CLI · Flexible SDK

Bring modern AI assistance directly to your terminal. Chat with Claude, GPT-4, Gemini, and local models using a rich interactive interface or simple CLI commands.

📖 Full Documentation


Quick Start

Installation

# Install with TUI (recommended)
pip install 'consoul[tui]'

# Or install SDK/CLI only
pip install consoul

Set Your API Key

# Choose your provider
export ANTHROPIC_API_KEY=your-key-here  # Claude
export OPENAI_API_KEY=your-key-here     # GPT-4
export GOOGLE_API_KEY=your-key-here     # Gemini
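Consoul picks up whichever key is set. As an illustration, a small helper can check which providers are configured from the environment variable names in the examples above (the helper itself is hypothetical, not part of the consoul API):

```python
import os

# Map each provider to the environment variable shown above.
PROVIDER_KEYS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "google": "GOOGLE_API_KEY",
}

def configured_providers(env=None):
    """Return the providers whose API key is set in the environment."""
    env = os.environ if env is None else env
    return [name for name, var in PROVIDER_KEYS.items() if env.get(var)]
```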

Launch the TUI

consoul tui

Use the CLI

git diff | consoul ask --stdin "create a commit message and commit"

Or Use the SDK

from consoul import Consoul

console = Consoul()
print(console.chat("What is 2+2?"))

Features

🎨 Beautiful TUI

Rich, interactive terminal interface powered by Textual

Consoul TUI

  • Multi-turn conversations with streaming responses
  • Conversation history and search
  • File attachments and image analysis
  • Customizable themes (light/dark)
  • Mouse and keyboard navigation

🤖 Multi-Provider Support

Use your favorite AI model or run locally:

  • Anthropic Claude - Claude 4.5 Sonnet, Opus, Haiku
  • OpenAI - GPT-5, GPT-4, GPT-3.5
  • Google Gemini - Gemini 3 Flash, Pro
  • Ollama - Run models locally (Llama, Qwen, GPT-OSS, etc.)
  • LlamaCpp - GGUF/MLX models with GPU acceleration

🛠️ AI-Powered Tools

File Editing: Let AI create, modify, and delete files with safety controls:

consoul ask "Add error handling to calculate_total in src/utils.py" --tools

Code Search: Navigate your codebase semantically:

consoul ask "Find all usages of deprecated_function" --tools

Image Analysis: Debug with screenshots:

consoul ask "What's wrong with this error?" --attach screenshot.png

And More:

  • Bash command execution with approval workflows
  • Web search and URL fetching
  • Session management and history
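An approval workflow for command execution, like the one listed above, typically gates risky commands behind an explicit confirmation. The sketch below is illustrative only; the pattern list and function names are assumptions, not consoul's actual internals:

```python
import re

# Patterns for commands that should always require explicit confirmation.
RISKY_PATTERNS = [r"\brm\s+-rf?\b", r"\bsudo\b", r"\bchmod\s+777\b"]

def run_with_approval(command, execute, confirm):
    """Run `command` via `execute`, asking `confirm` first if it looks risky."""
    risky = any(re.search(p, command) for p in RISKY_PATTERNS)
    if risky and not confirm(command):
        return None  # user declined; nothing is executed
    return execute(command)
```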

📝 Simple CLI

For quick questions without the TUI:

# One-off questions
consoul ask "Explain Python decorators"

# Interactive chat mode
consoul chat

🔌 SDK Integration

Embed AI capabilities in your Python applications:

from consoul import Consoul

# Enable tools for file operations and code search
console = Consoul(tools=True)

# Stateful conversation
console.chat("List all TODO comments in Python files")
console.chat("Create a summary.md file with the results")

# Rich responses with metadata
response = console.ask("Summarize this project", show_tokens=True)
print(f"Tokens: {response.tokens}")
print(f"Cost: ${console.last_cost['total_cost']:.4f}")

Documentation

📖 Full Documentation


Configuration

Create ~/.consoul/config.yaml:

# Default profile
profiles:
  default:
    model:
      provider: anthropic
      model: claude-3-5-sonnet-20241022
      temperature: 0.7

    tools:
      enabled: true
      permission_policy: balanced  # Require approval for risky operations

    conversation:
      save_history: true
      max_history: 50

  # Local model profile
  local:
    model:
      provider: ollama
      model: llama3.2:latest

Switch profiles:

consoul --profile local
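Conceptually, profile switching is a lookup with a fallback to `default`. The sketch below mirrors the YAML above as a plain dict; the lookup logic is illustrative, not consoul's actual implementation:

```python
# Profiles mirroring the YAML example above.
PROFILES = {
    "default": {
        "model": {
            "provider": "anthropic",
            "model": "claude-3-5-sonnet-20241022",
            "temperature": 0.7,
        },
    },
    "local": {
        "model": {"provider": "ollama", "model": "llama3.2:latest"},
    },
}

def resolve_profile(name):
    """Pick the named profile, falling back to 'default' if it is unknown."""
    return PROFILES.get(name, PROFILES["default"])
```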

See the Configuration Guide for all options.


Security

Consoul includes comprehensive security controls for tool execution:

  • Permission Policies - PARANOID, BALANCED, TRUSTING, UNRESTRICTED
  • Approval Workflows - Interactive confirmation for dangerous operations
  • Audit Logging - Complete execution history in JSONL format
  • Command Validation - Pattern-based blocking of risky commands

See the Security Policy for best practices.
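The four policy tiers can be thought of as progressively looser gates. The sketch below is an assumption about their semantics (stricter tiers require approval more often), not consoul's actual classification logic:

```python
import re

# Example patterns for commands treated as risky (illustrative only).
RISKY = [r"\brm\s+-rf?\b", r"\bsudo\b"]

def decide(command, policy):
    """Return 'allow' or 'ask' for a command under a policy tier (a sketch)."""
    risky = any(re.search(p, command) for p in RISKY)
    if policy == "PARANOID":
        return "ask"          # every command needs explicit approval
    if policy == "BALANCED" and risky:
        return "ask"          # only risky commands need approval
    return "allow"            # TRUSTING / UNRESTRICTED
```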


Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.


License

MIT License - see LICENSE file for details.


Made with ❤️ by GoatBytes.IO
