
Project description

how-cli

An AI-based CLI assistant to help you with command line & shell.

Demo

https://github.com/user-attachments/assets/effefe1a-c0ed-4b60-838c-98f992f6c25f

Installation

1. Using pip

  • Ensure Python is installed on your system (tested against Python 3.11+).
  • Install the package using pip.
pip install -U how-cli

2. Manual Installation

  • Clone the repository & cd into it
git clone https://github.com/FireHead90544/how-cli && cd how-cli
  • Ensure you're in a virtual environment (see the example after these steps).
  • Install the application.
pip install -e .
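
For the virtual environment step above, a minimal sketch using Python's built-in venv module (any environment manager works just as well):

python -m venv .venv
source .venv/bin/activate   # on Windows: .venv\Scripts\activate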

Usage

$ how [OPTIONS] COMMAND [ARGS]...

Options:

  • -v, --version: Shows the version of the application
  • --install-completion: Install completion for the current shell.
  • --show-completion: Show completion for the current shell, to copy it or customize the installation.
  • --help: Show this message and exit.

Commands:

  • setup: Sets up the configuration required to run the application.
  • to: Sends the task to the LLM for analysis.
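
For example, to check the installed version or enable shell completion using the options above:

$ how --version
$ how --install-completion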

how setup

Sets up the configuration required to run the application. Set the LLM Provider & the corresponding API Key.

Usage:

$ how setup [OPTIONS]

Options:

  • --interactive / --no-interactive: Whether to use interactive mode for setting up the configuration. [default: interactive]
  • --provider TEXT: The LLM Provider, needs to be passed explicitly if using --no-interactive mode.
  • --api-key TEXT: The API Key for the LLM provider, needs to be passed explicitly if using --no-interactive mode.
  • --help: Show this message and exit.
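
For instance, a non-interactive setup might look like the following (the provider name and API key are placeholders; the exact string expected by --provider should correspond to one of the providers listed in the Providers section below):

$ how setup --no-interactive --provider GoogleGenAI --api-key "YOUR_API_KEY"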

how to

Sends the task to the LLM for analysis. Returns the commands to be executed in order to achieve it.

Usage:

$ how to [OPTIONS] TASK

Arguments:

  • TASK: The command line task to perform. [required]

Options:

  • --help: Show this message and exit.
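
For example, asking for a one-off task (the task string is arbitrary):

$ how to "find all files larger than 100MB in the current directory"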

Providers

how-cli uses ChatModels (as opposed to TextModels) because they support chat messages. The model providers below and their corresponding models are available to use. If you are able to test any of the models marked with ❌, please create an issue or pull request along with the test results.

| Provider       | Model                      | Package                   | Tested |
| -------------- | -------------------------- | ------------------------- | ------ |
| GoogleGenAI    | gemini-1.5-flash           | langchain-google-genai    |        |
| GoogleVertexAI | gemini-1.5-flash           | langchain-google-vertexai |        |
| GroqMistralAI  | mixtral-8x7b-32768         | langchain-groq            |        |
| GroqLLaMa      | llama3-70b-8192            | langchain-groq            |        |
| OpenAI         | gpt-4o                     | langchain-openai          |        |
| Anthropic      | claude-3-5-sonnet-20240620 | langchain-anthropic       |        |
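
Depending on how how-cli declares its dependencies, the LangChain integration package for your chosen provider may need to be installed separately (this is an assumption; the package names are taken from the table above). For example:

pip install -U langchain-google-genai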

License

how-cli is licensed under the MIT License; the full license text can be found here.

Honourable Mentions

This project is greatly inspired by kynnyhsap's how. Though my implementation is completely different (refer to the image below for architectural details), at their core both projects aim to do the same thing. Also, check out LangChain & Typer, which this project was built with.

(Architecture diagram)

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

how_cli-0.2.0.tar.gz (9.0 kB)


Built Distribution

how_cli-0.2.0-py3-none-any.whl (10.2 kB)


File details

Details for the file how_cli-0.2.0.tar.gz.

File metadata

  • Download URL: how_cli-0.2.0.tar.gz
  • Upload date:
  • Size: 9.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for how_cli-0.2.0.tar.gz
| Algorithm   | Hash digest                                                      |
| ----------- | ---------------------------------------------------------------- |
| SHA256      | 22a8676e273360ac65f00ef7a00399698e661957956269da2452f6a7e2bf67ad |
| MD5         | cb32181d289a6116c16a3633f88b8653                                 |
| BLAKE2b-256 | 78c6d5a4b949ba6bd75594e88926462f66b7e33111edb0c87e67c13fb1005026 |

See more details on using hashes here.
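
For example, the SHA256 digest of a downloaded archive can be checked against the value above with a standard tool (sha256sum is assumed to be available; on macOS, shasum -a 256 works similarly):

sha256sum how_cli-0.2.0.tar.gz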

File details

Details for the file how_cli-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: how_cli-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 10.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for how_cli-0.2.0-py3-none-any.whl
| Algorithm   | Hash digest                                                      |
| ----------- | ---------------------------------------------------------------- |
| SHA256      | 894d071a76d21f01d67ac7ab51d6e7f031986a6feee0fc3d12397ae740a513ea |
| MD5         | 176c688ad8b45e5be3ed5a533678f7b3                                 |
| BLAKE2b-256 | 90ae302774407f3489500fecdf94fa9f1a393110dbcd1dec0a08f675b55431d1 |

See more details on using hashes here.
