
๐Ÿ”๏ธ Sherpa

Explains your terminal errors in plain English. Fully local, no API key.

Python 3.10+ · MIT License · PyPI


Sherpa Demo

$ python app.py
TypeError: unsupported operand type(s) for +: 'int' and 'str'  [line 42]

$ sherpa

sherpa is thinking...

╭─ Why it failed ──────────────────────────────────────────╮
│ You're trying to add an integer and a string at line 42. │
│ Python requires both sides of + to be the same type;     │
│ it won't auto-convert like JavaScript does.              │
╰──────────────────────────────────────────────────────────╯
╭─ Fix ────────────────────────────────────────────────────╮
│ total + int(user_input)                                  │
╰──────────────────────────────────────────────────────────╯

Install

pip install sherpa-dev

Or from source:

git clone https://github.com/RishiiGamer2201/sherpa
cd sherpa
pip install -e .

First Run

sherpa

On first run, Sherpa will prompt you to download a local AI model (~4GB). After that, everything runs offline: no internet, no API key, no external server. Ever.

Usage

# Explain last terminal error (default)
sherpa

# Explain a specific line in a file
sherpa explain app.py:42

# Ask a freeform question
sherpa ask why is my API returning 403 only in production

# Show current config
sherpa cfg show

# Switch to a different model
sherpa cfg set-model /path/to/custom-model.gguf

Why Sherpa?

Every developer hits errors in their terminal every day. The usual workflow:

  1. Read the error → feel confused
  2. Copy the error → open browser → Google/ChatGPT → read results → come back

That's a context switch. You leave your flow, lose your mental state, and waste 3–5 minutes on something that should take 5 seconds.

Sherpa eliminates that loop. The explanation and fix come to you, right where the error happened.

🔒 Your code never leaves your machine. Sherpa runs entirely locally using a quantized AI model. No data is sent anywhere. Ever.

How It Works

sherpa (you type this)
  │
  ├─ config.py    → checks if model exists
  ├─ setup.py     → downloads model on first run
  ├─ history.py   → reads last command + stderr from shell history
  ├─ ai.py        → loads local model, runs inference
  └─ display.py   → prints explanation + fix with rich styling
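The history.py step can be sketched as below. The history file names and the zsh extended-history line format are standard, but Sherpa's real implementation (including how it captures stderr) may differ:

```python
# Sketch of recovering the last command from a shell history file.
# File locations are the conventional defaults for each supported shell.
HISTORY_FILES = {
    "bash": ".bash_history",
    "zsh": ".zsh_history",
    "fish": ".local/share/fish/fish_history",
}

def last_command(history_text: str, shell: str = "bash") -> str:
    """Return the most recent command from raw history-file text."""
    lines = [ln for ln in history_text.splitlines() if ln.strip()]
    if not lines:
        return ""
    last = lines[-1]
    if shell == "zsh" and ";" in last:
        # zsh extended history lines look like ": 1700000000:0;python app.py"
        last = last.split(";", 1)[1]
    return last.strip()
```

With the failing command recovered, Sherpa can re-run it or pair it with captured stderr before building the model prompt.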

Component   Library            Why
CLI         click              Clean command routing, auto help text
Output      rich               Colors, panels, syntax highlighting, progress bars
AI          llama-cpp-python   Runs .gguf models inline, no server needed
Model       CodeLlama 7B Q4    Code-optimized, ~4GB, runs on CPU with 8GB RAM
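As a rough illustration of the ai.py step: prompts for instruct-tuned models like CodeLlama are conventionally wrapped in `[INST] … [/INST]` tags before inference. The template below is an assumption for illustration, not Sherpa's actual prompt:

```python
# Illustrative only: wrap a captured error in CodeLlama's instruction
# format. Sherpa's real prompt template is not documented here.
def build_prompt(error_text: str) -> str:
    return (
        "[INST] Explain this terminal error in plain English, "
        "then suggest a one-line fix:\n"
        f"{error_text} [/INST]"
    )
```

With llama-cpp-python, a string like this would be passed to a `Llama(model_path=...)` instance for completion; no server process is involved.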

Supported Models

Model                               Size   Best for
codellama-7b-instruct.Q4_K_M.gguf   4GB    Default: code-specific, fast
deepseek-coder-6.7b.Q4_K_M.gguf     4GB    Slightly better on debug tasks
mistral-7b-instruct.Q4_K_M.gguf     4GB    Good general fallback
gemma-2b-it.Q4_K_M.gguf             1.6GB  Low RAM machines (4GB or less)
llama3.2-3b-instruct.Q4_K_M.gguf    2GB    Fast, decent quality, mid-range

Switch models anytime:

sherpa cfg set-model /path/to/model.gguf
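Under the hood, `cfg set-model` presumably persists the chosen path somewhere. This is a hypothetical sketch of such a config write; the real file location and schema aren't documented here:

```python
import json
from pathlib import Path

# Assumed config location for this sketch only.
CONFIG_FILE = Path.home() / ".sherpa" / "config.json"

def set_model(path: str, config_file: Path = CONFIG_FILE) -> dict:
    """Record the active model path in a JSON config file."""
    config_file.parent.mkdir(parents=True, exist_ok=True)
    cfg = json.loads(config_file.read_text()) if config_file.exists() else {}
    cfg["model_path"] = path
    config_file.write_text(json.dumps(cfg, indent=2))
    return cfg
```

Keeping the model path in a plain JSON file means `cfg show` only needs to read and pretty-print it.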

Comparison

Tool              Leaves Terminal?  Explains Why?  Works Offline?  Needs API Key?
Sherpa            ❌ No             ✅ Yes         ✅ Yes          ❌ No
Stack Overflow    ✅ Yes            Sometimes      ❌ No           N/A
ChatGPT / Claude  ✅ Yes            ✅ Yes         ❌ No           ✅ Yes
GitHub Copilot    N/A (IDE)         ✅ Yes         ❌ No           ✅ Yes
thefuck           ❌ No             ❌ No          ✅ Yes          ❌ No

Supported Shells

  • ✅ Bash
  • ✅ Zsh
  • ✅ Fish

Requirements

  • Python 3.10+
  • 8GB RAM (for default 7B model)
  • ~4GB disk space for the model

Contributing

See CONTRIBUTING.md for open tasks and guidelines.

License

MIT


Built with Python and llama-cpp-python. Fully local. Your code never leaves your machine.
