A CLI tool for getting quick command-line suggestions from any available LLM provider

Quick Question (qq)

A command-line tool that suggests and executes terminal commands using various LLM providers. It prioritizes local LLM providers for privacy and cost efficiency, falling back to cloud providers when configured. This tool is still under development and has initially been tested on macOS; we'll continue expanding it as we receive feedback. Please feel free to reach out with ideas and feedback at cv@southbrucke.com.

Features

  • Multiple LLM provider support:
    • Local providers (prioritized):
      • LM Studio
      • Ollama
    • Cloud providers (requires API keys):
      • OpenAI
      • Anthropic
      • Groq
  • Interactive command selection
  • Command history tracking
  • Configurable settings
  • Copy to clipboard or direct execution options
  • macOS-optimized command suggestions

Installation

pip install qq2

Usage

Basic command:

qq "your question here"

Configure settings:

qq --settings

View command history:

qq

Provider Selection

The tool follows this priority order for LLM providers:

  1. Checks for running local providers (LM Studio or Ollama)
  2. If no local providers are available, checks for configured cloud provider API tokens
  3. Uses the first available provider unless a specific default is set in settings
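The priority order above can be sketched as a shell function. This is illustrative only, not qq's actual implementation: the port numbers are the providers' usual local defaults (1234 for LM Studio's server, 11434 for Ollama), and the function names are hypothetical.

```shell
# Illustrative sketch of the documented priority order (not qq's real code).
# A default provider chosen via `qq --settings` would override this order.
pick_provider() {
  if curl -s --max-time 1 "http://localhost:1234/v1/models" >/dev/null 2>&1; then
    echo "lmstudio"          # LM Studio's local server, default port 1234
  elif curl -s --max-time 1 "http://localhost:11434/api/tags" >/dev/null 2>&1; then
    echo "ollama"            # Ollama's local server, default port 11434
  elif [ -n "$OPENAI_API_KEY" ]; then
    echo "openai"            # cloud providers only if a key is configured
  elif [ -n "$ANTHROPIC_API_KEY" ]; then
    echo "anthropic"
  elif [ -n "$GROQ_API_KEY" ]; then
    echo "groq"
  else
    echo "none"              # nothing available
  fi
}
```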

Configuration

Use qq --settings to configure:

  • Default provider selection
  • Command action (execute or copy to clipboard)
  • Default model for each provider
  • API keys for cloud providers

Environment Variables

Cloud provider API keys can be set via environment variables:

  • OPENAI_API_KEY - for OpenAI
  • ANTHROPIC_API_KEY - for Anthropic
  • GROQ_API_KEY - for Groq
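For example, to make a key available for the current shell session (the key value below is a placeholder):

```shell
# Placeholder value; substitute your real key.
# Add this line to ~/.zshrc or ~/.bashrc to persist it across sessions.
export ANTHROPIC_API_KEY="sk-ant-placeholder"
# qq will now detect Anthropic as a configured cloud provider, e.g.:
#   qq "compress this directory into a tar.gz"
```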

Examples

Get file search commands:

qq "how do I search for files containing specific text"

Find process information:

qq "show me all running processes containing 'python'"

Requirements

  • Python >= 3.6
  • macOS (optimized for macOS terminal commands)
  • Local LLM provider (LM Studio or Ollama) or cloud provider API key

License

Proprietary - All rights reserved

Author

Cristian Vyhmeister (cv@southbrucke.com)

For more information, visit https://southbrucke.com


File details

Details for the file qq2-0.1.1.tar.gz.

File metadata

  • Download URL: qq2-0.1.1.tar.gz
  • Upload date:
  • Size: 11.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.13.0

File hashes

Hashes for qq2-0.1.1.tar.gz:

  • SHA256: 76e4d983d881e6511f486f4bd7a70c6ab13efcb5af064931f1517a4b39902cc4
  • MD5: 4d4bee319c50befd8a8b0eab70994160
  • BLAKE2b-256: d5bc50e64368e46b5faf8a708c14d9e167b99afb037e4ef82468d15be26392d1


File details

Details for the file qq2-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: qq2-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 13.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.13.0

File hashes

Hashes for qq2-0.1.1-py3-none-any.whl:

  • SHA256: 641a16a448dde9087c1c180d519aea84922942286a57621c62cbd0a3273961a7
  • MD5: 718569823bc567a7c9a7b4c4512752ac
  • BLAKE2b-256: 3bcbc25f9d587b183b262263e489e8263b6ab5e25eff996e2cd0a2a4d90ad4d6

