A Language Model Enhanced Command Line Interface

Project description

LLM-Shell: Language Model Enhanced Command Line Interface

Overview

LLM-Shell is a command-line interface (CLI) tool that enhances your shell experience with the power of large language models (LLMs) such as GPT-4 and GPT-3.5 Turbo. It acts as a wrapper around your standard shell, allowing you to execute regular shell commands while also providing the capability to consult an LLM for programming assistance, code examples, and executing commands with natural language understanding.

Features

  • Execute standard shell commands with real-time output.
  • Use language models to process commands described in natural language.
  • Syntax highlighting for code blocks returned by the language model.
  • Set a context file to provide additional information to the LLM.
  • Change the underlying LLM backend (e.g., GPT-4 Turbo, GPT-4, GPT-3.5 Turbo).
  • Autocompletion for custom commands and file paths.
  • History tracking of commands and LLM responses.

Prerequisites

  • Python 3
  • requests library for making HTTP requests to the LLM API.
  • pygments library for syntax highlighting.
  • An API key from OpenAI for accessing their language models.
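The requests dependency is what carries the HTTP call to the language-model API. As an illustration only, a minimal sketch of such a call against OpenAI's public chat-completions endpoint (the `build_payload` and `ask_llm` names are illustrative, not the script's actual internals):

```python
import os

API_URL = "https://api.openai.com/v1/chat/completions"

def build_payload(prompt, model="gpt-3.5-turbo"):
    """Assemble the JSON body for a chat-completions request."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def ask_llm(prompt, model="gpt-3.5-turbo"):
    """Send a prompt to the API; requires CHATGPT_API_KEY to be set."""
    # Imported here so the payload helper above works even without the
    # third-party dependency installed.
    import requests

    headers = {"Authorization": f"Bearer {os.environ['CHATGPT_API_KEY']}"}
    resp = requests.post(API_URL, headers=headers,
                         json=build_payload(prompt, model), timeout=60)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```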

Installation

  1. Ensure you have Python 3 installed on your system.
  2. Install the required Python packages:
     pip install requests pygments
  3. Clone the repository or download the llm-shell.py script to your local machine.
  4. Make sure the script is executable:
     chmod +x llm-shell.py
  5. Set your OpenAI API key as the CHATGPT_API_KEY environment variable, or in a .env file that the script can read.
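For step 5, either of the following works (the key value shown is a placeholder, not a real key):

```shell
# Export the key for the current shell session
export CHATGPT_API_KEY="sk-your-key-here"

# Or persist it in a .env file next to the script
echo 'CHATGPT_API_KEY=sk-your-key-here' > .env
```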

Usage

To start the LLM-Shell, simply run the llm-shell.py script:

./llm-shell.py

Executing Commands

  • Standard shell commands are executed as normal, e.g., ls -la.
  • To use the LLM, prefix your command with a hash #, followed by the natural language instruction, e.g., # How do I list all files in the current directory?.
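The routing rule is simple: lines starting with # go to the model, everything else goes to the shell. A minimal sketch of that dispatch (the `dispatch` helper is illustrative; the real script may structure this differently):

```python
import subprocess

def dispatch(line, ask_llm):
    """Route a line: '#'-prefixed text to the LLM, the rest to the shell."""
    line = line.strip()
    if line.startswith("#"):
        # Strip the marker and hand the natural-language request to the model
        return ("llm", ask_llm(line.lstrip("#").strip()))
    # Otherwise run it as an ordinary shell command and capture its output
    result = subprocess.run(line, shell=True, capture_output=True, text=True)
    return ("shell", result.stdout)
```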

Special Commands

  • help - Displays a list of available custom commands within the LLM-Shell.
  • set-llm [backend] - Changes the LLM backend. Replace [backend] with one of the supported backends (currently gpt-4-turbo, gpt-4, and gpt-3.5-turbo).
  • context [filename] - Sets a context file that will be used to provide additional information to the LLM. Use context none to clear the context file.
  • exit - Exits the LLM-Shell.
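When a context file is set, its contents can simply be prepended to whatever is sent to the model. A sketch of that behavior (the `build_prompt` helper and the exact prompt layout are assumptions, not the script's actual format):

```python
def build_prompt(instruction, context_path=None):
    """Prepend the context file's contents, if any, to the instruction."""
    if context_path:
        with open(context_path) as f:
            return f"Context:\n{f.read()}\n\nRequest: {instruction}"
    return instruction
```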

Autocompletion

  • The LLM-Shell supports autocompletion for file paths and custom commands. Press Tab to autocomplete the current input.
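This kind of Tab completion is typically wired up with Python's standard readline module. A minimal completer sketch over the special commands listed above (illustrative, not the script's actual completer):

```python
import readline

COMMANDS = ["help", "set-llm", "context", "exit"]

def completer(text, state):
    """Return the state-th command matching the typed prefix, or None."""
    matches = [c for c in COMMANDS if c.startswith(text)]
    return matches[state] if state < len(matches) else None

readline.set_completer(completer)
readline.parse_and_bind("tab: complete")
```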

Customization

Modify the llm-shell.py script to add new features or change existing behavior to better suit your needs.

License

LLM-Shell is released under the MIT License. See the LICENSE file for more information.

Disclaimer

LLM-Shell is not an official product and is not affiliated with OpenAI. It is an open-source tool developed to showcase the integration of LLMs into a command-line environment. Use it at your own risk.



Download files

Download the file for your platform.

Source Distribution

llm-shell-0.1.0.tar.gz (7.0 kB)


Built Distribution

llm_shell-0.1.0-py3-none-any.whl (8.1 kB)


File details

Details for the file llm-shell-0.1.0.tar.gz.

File metadata

  • Download URL: llm-shell-0.1.0.tar.gz
  • Upload date:
  • Size: 7.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.12

File hashes

Hashes for llm-shell-0.1.0.tar.gz:

  • SHA256: bfbf23f237961f5443e37d9820403fb8c984f84f3c7d169c8125e5c2d27e4232
  • MD5: 397f5c48fd450d17ffe06730cbc29b03
  • BLAKE2b-256: 21c69c4dfba7928825478eb0fa914f491fe288b56b17d40ea14f616cc4f9578e


File details

Details for the file llm_shell-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: llm_shell-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 8.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.12

File hashes

Hashes for llm_shell-0.1.0-py3-none-any.whl:

  • SHA256: f626b87d8b7e5b75929f42fd031a6e3bfb8c5bdb868c99b304225c6919c4f29d
  • MD5: 70d09914825d8c330cec30d180cdc043
  • BLAKE2b-256: 0c40b8a99220fe2c76a2dc40623611250d7a807013c44e2db011cf1e60ab03b9

