
Project description

Shell-AI: let AI write your shell commands


Shell-AI (shai) is a CLI utility that brings natural language understanding to your command line. Simply describe what you want to do in plain language, and shai will suggest single-line commands that achieve your intent. Under the hood, Shell-AI leverages LangChain for LLM access and builds on the excellent InquirerPy for the interactive CLI.

(Demo GIF: demo-shell-ai)

Installation

You can install Shell-AI directly from PyPI using pip:

pip install shell-ai

Note that on Linux, Python 3.10 or later is required.

After installation, you can invoke the utility using the shai command.
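
As a quick sanity check, something along these lines works on most Linux systems (a sketch; your interpreter may be named python rather than python3):

    python3 --version      # should report Python 3.10 or newer on Linux
    pip install shell-ai
    command -v shai        # confirms the shai entry point is on your PATH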

Usage

To use Shell-AI, open your terminal and type:

shai run terraform dry run thingy

Shell-AI will then suggest 3 commands to fulfill your request:

  • terraform plan
  • terraform plan -input=false
  • terraform plan
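
The prompt can be any plain-language description of a task; for example (the suggestions you get back will vary with the model and settings you have configured):

    shai find all files larger than 100MB in my home directory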

Features

  • Natural Language Input: Describe what you want to do in plain English (or other supported languages).
  • Command Suggestions: Get single-line command suggestions that accomplish what you asked for.
  • Cross-Platform: Works on Linux, macOS, and Windows.
  • Azure Compatibility: Shell-AI now supports Azure OpenAI deployments.

Configuration

Shell-AI can be configured through environment variables or a config file located at ~/.config/shell-ai/config.json (Linux/macOS) or %APPDATA%\shell-ai\config.json (Windows).
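
For example, on Linux/macOS you could create the config file by hand and paste in one of the JSON examples below ($EDITOR stands in for whatever editor you use):

    mkdir -p ~/.config/shell-ai
    $EDITOR ~/.config/shell-ai/config.json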

Environment Variables

  • OPENAI_API_KEY: (Required) Your OpenAI API key; leave empty if you use Ollama
  • OPENAI_MODEL: The OpenAI model to use (default: "gpt-3.5-turbo")
  • OPENAI_API_BASE: The OpenAI API / OpenAI compatible API endpoint to use (default: None)
  • GROQ_API_KEY: (Required if using Groq) Your Groq API key
  • SHAI_SUGGESTION_COUNT: Number of suggestions to generate (default: 3)
  • SHAI_SKIP_CONFIRM: Skip command confirmation when set to "true"
  • SHAI_SKIP_HISTORY: Skip writing to shell history when set to "true"
  • SHAI_API_PROVIDER: Choose between "openai", "ollama", "azure", or "groq" (default: "groq")
  • SHAI_TEMPERATURE: Controls randomness in the output (default: 0.05). Lower values (e.g., 0.05) make output more focused and deterministic, while higher values (e.g., 0.7) make it more creative and varied.
  • CTX: Enable context mode when set to "true" (Note: outputs will be sent to the API)
  • OLLAMA_MODEL: The Ollama model to use (default: "phi3.5")
  • OLLAMA_API_BASE: The Ollama endpoint to use (default: "http://localhost:11434/v1/")
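
Putting a few of these together, a minimal environment-only setup for the OpenAI provider could look like the following (the API key is a placeholder; the model shown is just the documented default):

    export SHAI_API_PROVIDER=openai
    export OPENAI_API_KEY=your_openai_api_key_here
    export OPENAI_MODEL=gpt-3.5-turbo
    export SHAI_SUGGESTION_COUNT=3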

Config File Example

{
  "OPENAI_API_KEY": "your_openai_api_key_here",
  "OPENAI_MODEL": "gpt-3.5-turbo",
  "SHAI_SUGGESTION_COUNT": "3",
  "CTX": true
}

Config Example for an OpenAI-Compatible API

{
  "SHAI_API_PROVIDER": "openai",
  "OPENAI_API_KEY": "deepseek_api_key",
  "OPENAI_API_BASE": "https://api.deepseek.com",
  "OPENAI_MODEL": "deepseek-chat",
  "SHAI_SUGGESTION_COUNT": "3",
  "CTX": true
}

Config Example for MistralAI

{
  "SHAI_API_PROVIDER": "mistral",
  "MISTRAL_API_KEY": "mistral_api_key",
  "MISTRAL_API_BASE": "https://api.mistral.ai/v1",
  "MISTRAL_MODEL": "codestral-2508",
  "SHAI_SUGGESTION_COUNT": "3",
  "CTX": true
}

Config Example for Ollama

{
  "OPENAI_API_KEY": "",
  "SHAI_SUGGESTION_COUNT": "3",
  "SHAI_API_PROVIDER": "ollama",
  "OLLAMA_MODEL": "phi3.5",
  "OLLAMA_API_BASE": "http://localhost:11434/v1/",
  "SHAI_TEMPERATURE": "0.05"
}
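
This example assumes a local Ollama server with the phi3.5 model already available; if it is not, pulling it first with the standard Ollama CLI should be enough:

    ollama pull phi3.5     # download the model referenced in OLLAMA_MODEL
    ollama serve           # start the local server if it is not already running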

The application will read from this file if it exists, overriding any existing environment variables.

Run the application after setting these configurations.

Using with Groq

To use Shell-AI with Groq:

  1. Get your API key from Groq
  2. Set the following environment variables:
    export SHAI_API_PROVIDER=groq
    export GROQ_API_KEY=your_api_key_here
    export GROQ_MODEL=llama-3.3-70b-versatile
    

Contributing

This implementation can be made much smarter! Contribute your ideas as Pull Requests and make Shell-AI better for everyone.

Contributions are welcome! Please read the CONTRIBUTING.md for guidelines.

License

Shell-AI is licensed under the MIT License. See LICENSE for details.



Download files

Download the file for your platform.

Source Distribution

shell_ai-0.4.4.tar.gz (11.7 kB)

Uploaded Source

Built Distribution


shell_ai-0.4.4-py3-none-any.whl (11.4 kB)

Uploaded Python 3

File details

Details for the file shell_ai-0.4.4.tar.gz.

File metadata

  • Download URL: shell_ai-0.4.4.tar.gz
  • Upload date:
  • Size: 11.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for shell_ai-0.4.4.tar.gz

  • SHA256: 5d27ee5e44c8fc8f8b258bc8908b2d3af4c62bf87bfa10cfadd058f3f4bc6a92
  • MD5: 926f4582ee2ffd9852ecbb5650ee6ef4
  • BLAKE2b-256: 5fa39e5a19d1c47a535723db2fd9ecbcb1532a257583ff28304ddc8fdc345b7e
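
If you want to verify a downloaded archive against the SHA256 digest above, a quick check on Linux (shasum -a 256 is the macOS equivalent):

    sha256sum shell_ai-0.4.4.tar.gz    # compare the printed digest with the SHA256 value listed above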


File details

Details for the file shell_ai-0.4.4-py3-none-any.whl.

File metadata

  • Download URL: shell_ai-0.4.4-py3-none-any.whl
  • Upload date:
  • Size: 11.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for shell_ai-0.4.4-py3-none-any.whl

  • SHA256: cb7a89c1889c762ef072759b985c9d30a92c4b62e65fa19bcca6446ef7586193
  • MD5: 01adb2979c94c76a7f889ca4ecc1f53e
  • BLAKE2b-256: 6f4e011f0e858315a2df229ad6cb93bcfd0a57440fad3e23bce648d203a4886d

