
how-cli

An AI-based CLI assistant to help you with command line & shell.

Demo

Install instructions and a better demo will be added soon; until then, check this out.

Setup

(setup demo screenshot)

Inferences

(inference demo screenshots)

Installation

1. Using pip

  • Ensure Python is installed on your system (tested against Python 3.11+).
  • Install the package using pip.
pip install -U how-cli

2. Manual Installation

  • Clone the repository & cd into it
git clone https://github.com/FireHead90544/how-cli && cd how-cli
  • Ensure you're in a virtual environment.
  • Install the application.
pip install -e .
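
After installing through either method, you can confirm the CLI is available with the --version and --help flags documented below:

$ how --version
$ how --help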

Usage

$ how [OPTIONS] COMMAND [ARGS]...

Options:

  • -v, --version: Shows the version of the application
  • --install-completion: Install completion for the current shell.
  • --show-completion: Show completion for the current shell, to copy it or customize the installation.
  • --help: Show this message and exit.
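
To enable tab completion for your current shell, invoke the --install-completion option listed above (the exact completion behavior depends on your shell):

$ how --install-completion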

Commands:

  • setup: Sets up the configuration required to run the application.
  • to: Sends the task to the LLM for analysis.

how setup

Sets up the configuration required to run the application. Set the LLM Provider & the corresponding API Key.

Usage:

$ how setup [OPTIONS]

Options:

  • --interactive / --no-interactive: Whether to use interactive mode for setting up the configuration. [default: interactive]
  • --provider TEXT: The LLM Provider, needs to be passed explicitly if using --no-interactive mode.
  • --api-key TEXT: The API Key for the LLM provider, needs to be passed explicitly if using --no-interactive mode.
  • --help: Show this message and exit.
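
As a sketch, a non-interactive setup could look like the following; the provider string "Google" follows the Providers table below, but the exact accepted value and the placeholder API key are assumptions:

$ how setup --no-interactive --provider Google --api-key YOUR_API_KEY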

how to

Sends the task to the LLM for analysis. Returns the commands to be executed in order to achieve that.

Usage:

$ how to [OPTIONS] TASK

Arguments:

  • TASK: The command line task to perform. [required]

Options:

  • --help: Show this message and exit.
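
For example, passing the task as a quoted string (the task text below is purely illustrative):

$ how to "find all files larger than 100MB in the current directory"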

Providers

how-cli uses ChatModels (which support chat messages, unlike TextModels). The providers below and their corresponding models are available to use.

Provider   Model              Package
Google     gemini-1.5-flash   langchain-google-genai
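
If the listed provider package is not already pulled in as a dependency of how-cli, it may need to be installed separately (an assumption; check the project's dependencies):

pip install -U langchain-google-genai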
