
A CLI tool for getting quick command-line suggestions from virtually any available LLM


Quick Question (qq)

A powerful, cross-platform CLI tool that generates and executes terminal commands using 100+ LLM providers through LiteLLM integration. It intelligently prioritizes local models for privacy and falls back to cloud providers when configured.


🚀 Key Features

Universal LLM Support (100+ Providers via LiteLLM)

  • Local Providers (Privacy-first, no API keys):

    • Ollama (port 11434) - Run open-source models locally
    • LM Studio (port 1234) - GUI-based local model management
  • Major Cloud Providers:

    • OpenAI (GPT-4o, GPT-5, ChatGPT models)
    • Anthropic (Claude 3.5 Sonnet/Haiku/Opus)
    • Google (Gemini, PaLM)
    • Amazon Bedrock
    • Azure OpenAI
    • Groq (Fast inference)
    • Grok (xAI)
  • Specialized Providers (via LiteLLM):

    • Cohere, Replicate, Hugging Face
    • Together AI, Anyscale, Perplexity
    • DeepInfra, AI21, Voyage AI
    • And 80+ more providers!
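
LiteLLM routes a request by reading a provider prefix off the model string (e.g. ollama/llama2); a bare model name is treated as OpenAI. A minimal sketch of that convention, with an illustrative provider table rather than LiteLLM's actual registry:

```python
# Sketch of LiteLLM-style model-string routing: the provider is encoded
# as a prefix on the model name, e.g. "ollama/llama2" -> provider "ollama".
# KNOWN_PROVIDERS below is a trimmed, illustrative set.

KNOWN_PROVIDERS = {"ollama", "openai", "anthropic", "groq", "gemini", "cohere"}

def split_model_string(model: str) -> tuple[str, str]:
    """Return (provider, model_name); bare names default to "openai"."""
    if "/" in model:
        provider, _, name = model.partition("/")
        if provider in KNOWN_PROVIDERS:
            return provider, name
    return "openai", model

print(split_model_string("ollama/llama2"))  # ('ollama', 'llama2')
print(split_model_string("gpt-4o"))         # ('openai', 'gpt-4o')
```

This is why a single tool can target 100+ backends: the model string carries all the routing information.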

Intelligent Features

  • ⚡ Smart Provider Selection: Automatically detects and uses available providers
  • 🎯 Model Optimization: Selects best models based on availability and performance
  • 📝 Command History: Track and replay previous commands
  • 🎨 Rich Interactive UI: Beautiful terminal interface with Textual TUI
  • 📋 Clipboard Integration: Copy or type commands directly
  • 🔧 Developer Mode: Extensible framework for custom actions
  • 🚄 Simple Mode: Streamlined one-shot command generation
  • 💾 Smart Caching: 1-hour TTL for providers and models

📦 Installation

From PyPI (Stable)

pip install qq

From Test PyPI (Latest Features)

pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ qq2

From Source (Development)

git clone https://github.com/yourusername/quickquestion.git
cd quickquestion
pip install -e .

🎯 Quick Start

Basic Usage

# Get a command suggestion
qq "find all large files over 100MB"

# Simple mode - instant command (no UI)
qq --simple "kill process on port 8080"

# Type command directly to terminal
qq --simple-type "list docker containers"

Configuration

# Interactive settings (Rich UI)
qq --settings

# Advanced configuration (Textual TUI)
qq --config

# View command history
qq --history

# Developer mode
qq --dev

⚙️ Configuration Options

Interactive Settings (qq --settings)

Navigate with arrow keys through:

  1. Default Provider - Choose from available providers
  2. Default Model - Select model for chosen provider
  3. Command Action - Run or Copy commands
  4. Simple Mode - Enable/disable streamlined mode
  5. Simple Mode Action - Copy or Type behavior

Advanced Config (qq --config)

Beautiful Textual TUI with tabs:

  • Quick Setup - Same as --settings but in modern UI
  • Providers - Browse and configure 100+ providers
  • Settings - General application settings
  • About - Version and documentation

Settings are persisted in ~/.qq_settings.json
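
Persistence of this kind is typically a small JSON read/write with defaults merged underneath. A hypothetical sketch; the key names are illustrative, not the actual schema of ~/.qq_settings.json:

```python
import json
from pathlib import Path

SETTINGS_PATH = Path.home() / ".qq_settings.json"

DEFAULTS = {  # illustrative keys, not qq's actual schema
    "default_provider": "ollama",
    "default_model": "llama2",
    "command_action": "run",
    "simple_mode": False,
}

def load_settings(path: Path = SETTINGS_PATH) -> dict:
    """Merge saved settings over defaults; a missing file yields defaults."""
    try:
        saved = json.loads(path.read_text())
    except (FileNotFoundError, json.JSONDecodeError):
        saved = {}
    return {**DEFAULTS, **saved}

def save_settings(settings: dict, path: Path = SETTINGS_PATH) -> None:
    path.write_text(json.dumps(settings, indent=2))
```

Merging saved values over defaults means new settings keys added in later releases pick up sane defaults without migration.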

🔌 Provider Setup

Local Providers (No API Key Required)

Ollama

# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model
ollama pull llama2

# qq will auto-detect Ollama on port 11434
qq "your question"

LM Studio

  1. Download from lmstudio.ai
  2. Load any GGUF model
  3. Start local server (port 1234)
  4. qq auto-detects LM Studio
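
Auto-detection of local providers can be as simple as probing their default ports; a sketch of the idea (not qq's actual detection code):

```python
import socket

# Default local endpoints mentioned above
LOCAL_PROVIDERS = {
    "ollama": ("127.0.0.1", 11434),
    "lmstudio": ("127.0.0.1", 1234),
}

def detect_local_providers(timeout: float = 0.25) -> list[str]:
    """Return local providers whose default port accepts a TCP connection."""
    available = []
    for name, (host, port) in LOCAL_PROVIDERS.items():
        try:
            with socket.create_connection((host, port), timeout=timeout):
                available.append(name)
        except OSError:
            pass  # nothing listening: provider not running
    return available
```

A short timeout keeps startup fast when neither server is running, which is why parallel or async probing (as described under Performance Optimizations) pays off.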

Cloud Providers

OpenAI

export OPENAI_API_KEY="sk-..."
qq "your question"

Anthropic

export ANTHROPIC_API_KEY="sk-ant-..."
qq "your question"

Other Providers

qq supports 100+ providers through LiteLLM. Set the appropriate environment variable:

export GROQ_API_KEY="..."
export XAI_API_KEY="..."  # For Grok
export GEMINI_API_KEY="..."
export COHERE_API_KEY="..."
# etc.
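
Cloud providers can be considered "configured" whenever their API-key environment variable is set; a sketch with a trimmed, illustrative mapping (LiteLLM's real table covers many more variables):

```python
import os

# Illustrative subset of the env-var -> provider mapping
PROVIDER_ENV_KEYS = {
    "OPENAI_API_KEY": "openai",
    "ANTHROPIC_API_KEY": "anthropic",
    "GROQ_API_KEY": "groq",
    "XAI_API_KEY": "grok",
    "GEMINI_API_KEY": "gemini",
    "COHERE_API_KEY": "cohere",
}

def configured_cloud_providers(env=os.environ) -> list[str]:
    """Providers whose API key is set to a non-empty value."""
    return [p for var, p in PROVIDER_ENV_KEYS.items() if env.get(var)]
```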

🎨 Usage Examples

Command Generation

# File operations
qq "find files modified today"
qq "compress all images in current directory"

# System management
qq "show memory usage by process"
qq "find what's using port 3000"

# Git operations
qq "undo last commit keeping changes"
qq "show commits by author in last week"

# Docker/Kubernetes
qq "remove all stopped containers"
qq "get pod logs from last hour"

Simple Mode (No UI)

# Copy to clipboard
qq --simple-copy "create python virtual environment"
# ✓ Copied: python -m venv venv

# Type to terminal
qq --simple-type "activate virtual environment"
# source venv/bin/activate [appears in terminal]

Developer Mode

qq --dev
# Access specialized developer actions and workflows

🛠️ Advanced Features

Custom Developer Actions

Create ~/QuickQuestion/CustomDevActions/my_action.py:

from quickquestion.dev_actions.base import DevAction

class MyAction(DevAction):
    @property
    def name(self) -> str:
        return "My Custom Action"
    
    @property
    def description(self) -> str:
        return "Does something special"
    
    def execute(self) -> bool:
        self.console.print("[green]Executing...[/green]")
        # Your logic here
        return True
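
How qq discovers these files is not documented here, but a plausible sketch is a directory scan that imports each module and collects DevAction subclasses. The stand-in base class below substitutes for quickquestion.dev_actions.base and is simplified for illustration:

```python
import importlib.util
import inspect
from pathlib import Path

class DevAction:  # stand-in for quickquestion.dev_actions.base.DevAction
    name = ""
    description = ""
    def execute(self) -> bool: ...

def discover_actions(directory: Path) -> list[type]:
    """Import every .py file in directory and collect DevAction subclasses."""
    actions = []
    for py_file in sorted(directory.glob("*.py")):
        spec = importlib.util.spec_from_file_location(py_file.stem, py_file)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        for _, cls in inspect.getmembers(module, inspect.isclass):
            if issubclass(cls, DevAction) and cls is not DevAction:
                actions.append(cls)
    return actions
```

With a pattern like this, dropping a new file into ~/QuickQuestion/CustomDevActions/ is enough; no registration step is needed.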

Performance Optimizations

  • Async Provider Detection: Parallel checking for fastest startup
  • Smart Caching: 1-hour TTL for providers, 30-second for other data
  • Lazy Loading: Deferred initialization in simple mode
  • Model Prioritization: Automatic selection of optimal models
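
The TTL caching described above can be sketched in a few lines; this illustrates the pattern, not the actual contents of cache.py:

```python
import json
import time
from pathlib import Path

class TTLCache:
    """Tiny JSON-file cache where each entry expires after its TTL."""

    def __init__(self, path: Path):
        self.path = path

    def _load(self) -> dict:
        try:
            return json.loads(self.path.read_text())
        except (FileNotFoundError, json.JSONDecodeError):
            return {}

    def get(self, key: str):
        entry = self._load().get(key)
        if entry and time.time() < entry["expires"]:
            return entry["value"]
        return None  # missing or expired

    def set(self, key: str, value, ttl: float = 3600.0) -> None:
        data = self._load()
        data[key] = {"value": value, "expires": time.time() + ttl}
        self.path.write_text(json.dumps(data))
```

Storing an absolute expiry timestamp (rather than the write time) makes reads a single comparison, and an expired entry simply behaves like a cache miss.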

Debugging

# Enable debug output
qq --debug "your question"

# Clear provider cache
qq --clear-cache

๐Ÿ“ File Locations

  • ~/.qq_settings.json - User preferences
  • ~/.qq_history.json - Command history (last 100)
  • ~/.qq_cache.json - Provider and model cache
  • ~/QuickQuestion/CustomDevActions/ - Custom actions

🔧 Troubleshooting

Provider Not Detected

# Clear cache and re-detect
qq --clear-cache
qq --settings  # Reconfigure

API Key Issues

# Verify environment variable
echo $OPENAI_API_KEY

# Set in shell profile
echo 'export OPENAI_API_KEY="sk-..."' >> ~/.bashrc

SSL Certificate Errors

# macOS-specific fix
export CERT_PATH=$(python -m certifi)
export SSL_CERT_FILE="$CERT_PATH"

🚢 CI/CD & Deployment

GitHub Actions / Gitea Actions

The project includes automated workflows for:

  • Testing on push/PR
  • Publishing to PyPI on version tags
  • Separate Test PyPI (qq2) and Production PyPI (qq) releases

Manual Deployment

# Build
python -m build

# Test locally
pip install dist/qq-*.whl

# Upload to PyPI
twine upload dist/*

📊 Architecture

quickquestion/
├── qq.py                 # Main entry point and CLI
├── llm_lite_provider.py  # LiteLLM integration (100+ providers)
├── settings_manager.py   # Configuration management
├── ui_library.py         # Rich terminal UI components
├── cache.py              # TTL-based caching system
├── provider_registry.py  # Provider catalog and metadata
├── config_app.py         # Textual TUI for configuration
└── dev_actions/          # Developer mode actions

🌟 What's New in v0.2.0

  • LiteLLM Integration: Support for 100+ LLM providers
  • Provider Registry: Organized catalog of all providers
  • Textual TUI: Modern configuration interface (--config)
  • GPT-5 Support: Compatible with latest OpenAI models
  • Enhanced Caching: Improved performance and reliability
  • CI/CD Pipeline: Automated testing and deployment
  • Bug Fixes: Provider persistence, model selection, and more

๐Ÿ—บ๏ธ Roadmap

  • Web UI for configuration
  • Plugin system for extensions
  • Multi-command workflows
  • Command explanation mode
  • Integration with shell history
  • Homebrew formula
  • Docker image
  • VSCode extension

📄 License

Proprietary - All rights reserved. See LICENSE file.

💬 Support & Contact

๐Ÿ™ Acknowledgments

  • Built with LiteLLM for universal LLM support
  • UI powered by Rich and Textual
  • Thanks to all contributors and users!

Quick Question - Your AI-powered command line companion 🚀

southbrucke.com | Documentation | PyPI
