
AI-powered terminal assistant with rich TUI - brings ChatGPT/Claude to your command line

Project description

Consoul Banner

Consoul

AI-Powered Terminal Assistant — Beautiful TUI · Powerful CLI · Flexible SDK

Bring modern AI assistance directly to your terminal. Chat with Claude, GPT-4, Gemini, and local models using a rich interactive interface or simple CLI commands.

📖 Full Documentation


Quick Start

Installation

# Install with TUI (recommended)
pip install 'consoul[tui]'

# Or install SDK/CLI only
pip install consoul

Set Your API Key

# Choose your provider
export ANTHROPIC_API_KEY=your-key-here  # Claude
export OPENAI_API_KEY=your-key-here     # GPT-4
export GOOGLE_API_KEY=your-key-here     # Gemini
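To confirm the key is actually exported in your current shell before launching Consoul, a generic POSIX-shell check works (this is plain shell, not a Consoul command; the key value is a placeholder):

```shell
# Set the key (placeholder value) and confirm it is visible to child processes
export ANTHROPIC_API_KEY=your-key-here
[ -n "$ANTHROPIC_API_KEY" ] && echo "key is set"
```

To persist the key across sessions, add the export line to your shell profile (e.g. ~/.bashrc or ~/.zshrc).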

Launch the TUI

consoul tui

Use the CLI

git diff | consoul ask --stdin "create a commit message and commit"

Or Use the SDK

from consoul import Consoul

console = Consoul()
print(console.chat("What is 2+2?"))

Features

🎨 Beautiful TUI

Rich, interactive terminal interface powered by Textual

Consoul TUI

  • Multi-turn conversations with streaming responses
  • Conversation history and search
  • File attachments and image analysis
  • Customizable themes (light/dark)
  • Mouse and keyboard navigation

🤖 Multi-Provider Support

Use your favorite AI model or run locally:

  • Anthropic Claude - Claude 4.5 Sonnet, Opus, Haiku
  • OpenAI - GPT-5, GPT-4, GPT-3.5
  • Google Gemini - Gemini 3 Flash, Pro
  • Ollama - Run models locally (Llama, Qwen, GPT-OSS, etc.)
  • LlamaCpp - GGUF/MLX models with GPU acceleration

🛠️ AI-Powered Tools

File Editing: let AI create, modify, and delete files with safety controls:

consoul ask "Add error handling to calculate_total in src/utils.py" --tools

Code Search: navigate your codebase semantically:

consoul ask "Find all usages of deprecated_function" --tools

Image Analysis: debug with screenshots:

consoul ask "What's wrong with this error?" --attach screenshot.png

And More:

  • Bash command execution with approval workflows
  • Web search and URL fetching
  • Session management and history

📝 Simple CLI

For quick questions without the TUI:

# One-off questions
consoul ask "Explain Python decorators"

# Interactive chat mode
consoul chat

🔌 SDK Integration

Embed AI capabilities in your Python applications:

from consoul import Consoul

# Enable tools for file operations and code search
console = Consoul(tools=True)

# Stateful conversation
console.chat("List all TODO comments in Python files")
console.chat("Create a summary.md file with the results")

# Rich responses with metadata
response = console.ask("Summarize this project", show_tokens=True)
print(f"Tokens: {response.tokens}")
print(f"Cost: ${console.last_cost['total_cost']:.4f}")

Documentation

📖 Full Documentation



Configuration

Create ~/.consoul/config.yaml:

# Default profile
profiles:
  default:
    model:
      provider: anthropic
      model: claude-3-5-sonnet-20241022
      temperature: 0.7

    tools:
      enabled: true
      permission_policy: balanced  # Require approval for risky operations

    conversation:
      save_history: true
      max_history: 50

  # Local model profile
  local:
    model:
      provider: ollama
      model: llama3.2:latest

Switch profiles:

consoul --profile local
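The same profile schema extends to other providers. For example, a hypothetical OpenAI profile (the model name is illustrative) would sit alongside the default and local profiles in ~/.consoul/config.yaml:

```yaml
# Hypothetical additional profile following the schema shown above
profiles:
  openai:
    model:
      provider: openai
      model: gpt-4
      temperature: 0.7
```

It would then be selected with consoul --profile openai.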

See the Configuration Guide for all options.


Security

Consoul includes comprehensive security controls for tool execution:

  • Permission Policies - PARANOID, BALANCED, TRUSTING, UNRESTRICTED
  • Approval Workflows - Interactive confirmation for dangerous operations
  • Audit Logging - Complete execution history in JSONL format
  • Command Validation - Pattern-based blocking of risky commands
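Because the audit log is plain JSONL (one JSON object per line), it can be inspected with standard tooling. A minimal sketch in Python, with hypothetical field names and inline sample records; the actual record schema and log path are defined by Consoul, not shown here:

```python
import json

# Hypothetical audit records; the field names ("tool", "approved") are
# assumptions for illustration, not Consoul's documented schema.
sample = "\n".join(json.dumps(e) for e in [
    {"tool": "bash", "command": "ls -la", "approved": True},
    {"tool": "file_edit", "path": "src/utils.py", "approved": False},
])

for line in sample.splitlines():
    event = json.loads(line)  # each line is a standalone JSON object
    print(f"{event['tool']}: approved={event['approved']}")
```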

See the Security Policy for best practices.


Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.


License

MIT License - see LICENSE file for details.




Made with ❤️ by GoatBytes.IO

Project details


Download files

Download the file for your platform.

Source Distribution

consoul-0.4.2.tar.gz (459.0 kB)

Uploaded Source

Built Distribution


consoul-0.4.2-py3-none-any.whl (564.5 kB)

Uploaded Python 3

File details

Details for the file consoul-0.4.2.tar.gz.

File metadata

  • Download URL: consoul-0.4.2.tar.gz
  • Size: 459.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.12.3 Darwin/24.6.0

File hashes

Hashes for consoul-0.4.2.tar.gz:

  • SHA256: 40082f77a252c01f5d21ea958733f85ad3b9169426e1fb2bd0dd7895000146d1
  • MD5: b5e179e1b9ea16969158157236feec2e
  • BLAKE2b-256: 8e6fe40f21f3e27bb73ab49bf04cf5513b906d80aa1f790ac241a3e142bf6d2b


File details

Details for the file consoul-0.4.2-py3-none-any.whl.

File metadata

  • Download URL: consoul-0.4.2-py3-none-any.whl
  • Size: 564.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.12.3 Darwin/24.6.0

File hashes

Hashes for consoul-0.4.2-py3-none-any.whl:

  • SHA256: fc786c8f9eafd77ba34cb1697939400ad1fca0a25c45b2dd042171c9da32f163
  • MD5: 3712a8dc718950cc062aa40ffc2464f1
  • BLAKE2b-256: c3d0ed862bcf1d2ec6c27a868a658a4f41b1d7c7ac97d4c5ea72f6d3a52e4cb9

