
Consoul Banner

Consoul

AI-Powered Terminal Assistant — Beautiful TUI · Powerful CLI · Flexible SDK

Bring modern AI assistance directly to your terminal. Chat with Claude, GPT-4, Gemini, and local models using a rich interactive interface or simple CLI commands.

📖 Full Documentation


Quick Start

Installation

# Install with TUI (recommended)
pip install 'consoul[tui]'

# Or install SDK/CLI only
pip install consoul

Set Your API Key

# Choose your provider
export ANTHROPIC_API_KEY=your-key-here  # Claude
export OPENAI_API_KEY=your-key-here     # GPT-4
export GOOGLE_API_KEY=your-key-here     # Gemini
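
Consoul picks up whichever key is present. The precedence idea can be sketched as follows; the function name `detect_provider` and the ordering are illustrative, not Consoul's actual internal logic:

```python
import os

# Hypothetical sketch of provider auto-detection from environment
# variables; Consoul's real selection logic may differ.
PROVIDER_KEYS = [
    ("anthropic", "ANTHROPIC_API_KEY"),
    ("openai", "OPENAI_API_KEY"),
    ("google", "GOOGLE_API_KEY"),
]

def detect_provider():
    """Return the first provider whose API key is set, else None."""
    for provider, env_var in PROVIDER_KEYS:
        if os.environ.get(env_var):
            return provider
    return None  # fall back to a local provider such as Ollama
```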

Launch the TUI

consoul tui

Use the CLI

git diff | consoul ask --stdin "create a commit message and commit"

Or Use the SDK

from consoul import Consoul

console = Consoul()
print(console.chat("What is 2+2?"))

Features

🎨 Beautiful TUI

Rich, interactive terminal interface powered by Textual

Consoul TUI

  • Multi-turn conversations with streaming responses
  • Conversation history and search
  • File attachments and image analysis
  • Customizable themes (light/dark)
  • Mouse and keyboard navigation

🤖 Multi-Provider Support

Use your favorite AI model or run locally:

  • Anthropic Claude - Claude 4.5 Sonnet, Opus, Haiku
  • OpenAI - GPT-5, GPT-4, GPT-3.5
  • Google Gemini - Gemini 3 Flash, Pro
  • Ollama - Run models locally (Llama, Qwen, GPT-OSS, etc.)
  • LlamaCpp - GGUF/MLX models with GPU acceleration

🛠️ AI-Powered Tools

File Editing: Let AI create, modify, and delete files with safety controls:

consoul ask "Add error handling to calculate_total in src/utils.py" --tools

Code Search: Navigate your codebase semantically:

consoul ask "Find all usages of deprecated_function" --tools

Image Analysis: Debug with screenshots:

consoul ask "What's wrong with this error?" --attach screenshot.png

And More:

  • Bash command execution with approval workflows
  • Web search and URL fetching
  • Session management and history
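
The approval workflow for bash execution can be sketched roughly as below; the names (`needs_approval`, `run_with_approval`) and patterns are illustrative, not Consoul's actual API:

```python
# Rough sketch of an approval workflow for shell commands; the names
# and risky-pattern list here are illustrative, not Consoul's API.
RISKY_PATTERNS = ("rm -rf", "sudo", "> /dev/")

def needs_approval(command):
    """Flag commands that match a known-risky pattern."""
    return any(pattern in command for pattern in RISKY_PATTERNS)

def run_with_approval(command, ask):
    """Run `command` only if it is safe, or if the user approves it."""
    if needs_approval(command) and not ask(f"Run '{command}'? [y/N] "):
        return "denied"
    return "executed"  # real code would invoke subprocess here
```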

📝 Simple CLI

For quick questions without the TUI:

# One-off questions
consoul ask "Explain Python decorators"

# Interactive chat mode
consoul chat

🔌 SDK Integration

Embed AI capabilities in your Python applications:

from consoul import Consoul

# Enable tools for file operations and code search
console = Consoul(tools=True)

# Stateful conversation
console.chat("List all TODO comments in Python files")
console.chat("Create a summary.md file with the results")

# Rich responses with metadata
response = console.ask("Summarize this project", show_tokens=True)
print(f"Tokens: {response.tokens}")
print(f"Cost: ${console.last_cost['total_cost']:.4f}")

Documentation

📖 Full Documentation


Configuration

Create ~/.consoul/config.yaml:

# Default profile
profiles:
  default:
    model:
      provider: anthropic
      model: claude-3-5-sonnet-20241022
      temperature: 0.7

    tools:
      enabled: true
      permission_policy: balanced  # Require approval for risky operations

    conversation:
      save_history: true
      max_history: 50

  # Local model profile
  local:
    model:
      provider: ollama
      model: llama3.2:latest
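
A named profile's settings overlay the defaults. A minimal sketch of that fallback behavior, using an inline dict instead of the YAML file and a hypothetical `resolve_profile` helper:

```python
# Illustrative sketch of profile selection with fallback to `default`;
# Consoul's real config loader (YAML parsing, deep merges) is richer.
PROFILES = {
    "default": {"model": {"provider": "anthropic", "temperature": 0.7}},
    "local": {"model": {"provider": "ollama"}},
}

def resolve_profile(name):
    """Overlay the named profile's model settings on top of `default`."""
    merged = dict(PROFILES["default"]["model"])
    merged.update(PROFILES.get(name, {}).get("model", {}))
    return merged
```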

Switch profiles:

consoul --profile local

See the Configuration Guide for all options.


Security

Consoul includes comprehensive security controls for tool execution:

  • Permission Policies - PARANOID, BALANCED, TRUSTING, UNRESTRICTED
  • Approval Workflows - Interactive confirmation for dangerous operations
  • Audit Logging - Complete execution history in JSONL format
  • Command Validation - Pattern-based blocking of risky commands
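
Pattern-based blocking and JSONL audit logging can be sketched together as below; the patterns and record fields are illustrative assumptions, not Consoul's actual schema:

```python
import json
import re
import time

# Sketch of pattern-based command blocking plus a JSONL audit trail;
# the patterns and field names are illustrative, not Consoul's schema.
BLOCKED = [re.compile(p) for p in (r"\brm\s+-rf\b", r"\bmkfs\b")]

def validate(command):
    """Reject commands matching any blocked pattern."""
    return not any(p.search(command) for p in BLOCKED)

def audit(log, command, allowed):
    """Append one JSONL record per decision to the given file object."""
    record = {"ts": time.time(), "cmd": command, "allowed": allowed}
    log.write(json.dumps(record) + "\n")
```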

See the Security Policy for best practices.


Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.


License

MIT License - see LICENSE file for details.


Made with ❤️ by GoatBytes.IO
