AI-powered bash command generator
terminal-sherpa
A lightweight AI chat interface for fellow terminal dwellers.
Turn natural language into bash commands instantly. Stop googling syntax and start asking.
🚀 Getting Started
Get up and running:
```shell
# Install terminal-sherpa
pip install terminal-sherpa  # installs the `ask` CLI tool

# Set your API key
export ANTHROPIC_API_KEY="your-key-here"

# Try it out
ask "find all .py files modified in the last week"
```

Example output:

```shell
find . -name "*.py" -mtime -7
```
✨ Features
- Natural language to bash conversion - Describe what you want, get the command
- Multiple AI provider support - Choose between Anthropic (Claude), OpenAI (GPT), Google (Gemini), xAI (Grok) models, and local models via Ollama
- Flexible configuration system - Set defaults, customize models, and manage API keys
- XDG-compliant config files - Follows standard configuration file locations
- Verbose logging support - Debug and understand what's happening under the hood
📦 Installation
Requirements
- Python 3.9+
- API key for Anthropic, OpenAI, Google, or xAI (or local Ollama installation)
Install Methods
Using pip:

```shell
pip install terminal-sherpa
```

From source:

```shell
git clone https://github.com/lcford2/terminal-sherpa.git
cd terminal-sherpa
uv sync
uv run ask "your prompt here"
```

Verify installation:

```shell
ask --help
```
💡 Usage
Basic Syntax
```shell
ask "your natural language prompt"
```
Command Options
| Option | Description | Example |
|---|---|---|
| `--model provider:model` | Specify provider and model | `ask --model anthropic "list files"` |
| | | `ask --model anthropic:sonnet "list files"` |
| | | `ask --model openai "list files"` |
| | | `ask --model gemini "list files"` |
| | | `ask --model gemini:pro "list files"` |
| | | `ask --model grok "list files"` |
| | | `ask --model ollama "list files"` |
| | | `ask --model ollama:codellama "list files"` |
| `--verbose` | Enable verbose logging | `ask --verbose "compress this folder"` |
Practical Examples
File Operations:

```shell
ask "find all files larger than 100MB"
# Example output: find . -size +100M

ask "create a backup of config.txt with timestamp"
# Example output: cp config.txt config.txt.$(date +%Y%m%d_%H%M%S)
```

Git Commands:

```shell
ask "show git log for last 5 commits with one line each"
# Example output: git log --oneline -5

ask "delete all local branches that have been merged"
# Example output: git branch --merged | grep -v "\*\|main\|master" | xargs -n 1 git branch -d
```

System Administration:

```shell
ask "check disk usage of current directory sorted by size"
# Example output: du -sh * | sort -hr

ask "find processes using port 8080"
# Example output: lsof -i :8080
```

Text Processing:

```shell
ask "count lines in all Python files"
# Example output: find . -name "*.py" -exec wc -l {} + | tail -1

ask "replace all tabs with spaces in file.txt"
# Example output: sed -i 's/\t/ /g' file.txt
```

Network Operations:

```shell
ask "download file from URL and save to downloads folder"
# Example output: curl -o ~/Downloads/filename "https://example.com/file"

ask "check if port 443 is open on example.com"
# Example output: nc -zv example.com 443
```
⚙️ Configuration
Configuration File Locations
Ask follows the XDG Base Directory Specification:

- `$XDG_CONFIG_HOME/ask/config.toml`
- `~/.config/ask/config.toml` (if `XDG_CONFIG_HOME` is not set)
- `~/.ask/config.toml` (fallback)
Environment Variables
```shell
export ANTHROPIC_API_KEY="your-anthropic-key"
export OPENAI_API_KEY="your-openai-key"
export GEMINI_API_KEY="your-gemini-key"
export XAI_API_KEY="your-xai-key"
```
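A provider's key is read from its corresponding variable at runtime; a minimal sketch of that lookup (the function is illustrative, using only the variable names listed above):

```python
import os

# Environment variable that holds each provider's API key
API_KEY_VARS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "gemini": "GEMINI_API_KEY",
    "grok": "XAI_API_KEY",
}

def get_api_key(provider: str) -> str:
    """Fetch the API key for a provider, failing loudly if it is unset."""
    var = API_KEY_VARS[provider]
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"{var} is not set; export it before running ask")
    return key
```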
Example Configuration File
Create ~/.config/ask/config.toml:
```toml
[ask]
default_model = "anthropic"

[anthropic]
model = "claude-3-haiku-20240307"
max_tokens = 512

[anthropic.sonnet]
model = "claude-sonnet-4-20250514"
max_tokens = 1024

[openai]
model = "gpt-4o"
max_tokens = 1024

[gemini]
model = "gemini-2.5-flash-lite-preview-06-17"
max_tokens = 150

[gemini.pro]
model = "gemini-2.5-pro"
max_tokens = 1024

[grok]
model = "grok-3-fast"
max_tokens = 150
temperature = 0.5

[ollama]
model = "llama3.2"
host = "localhost"
port = 11434

[ollama.codellama]
model = "codellama"
```
🤖 Supported Providers
- Anthropic (Claude)
- OpenAI (GPT)
- Google (Gemini)
- xAI (Grok)
- Ollama (Local Models)
Note: Get API keys from Anthropic Console, OpenAI Platform, Google AI Studio, or xAI Console
Local Models with Ollama
For local inference without API costs:
- Install Ollama: visit ollama.ai for installation instructions
- Pull a model: `ollama pull llama3.2`
- Start Ollama: `ollama serve` (if not auto-started)
- Use with ask: `ask --model ollama "your prompt"`

Example:

```shell
ollama pull codellama
ask --model ollama:codellama "optimize this bash script"
```
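Under the hood, Ollama serves a local HTTP API (by default on `localhost:11434`, matching the `[ollama]` config section). A sketch of building a request against its `/api/generate` endpoint; the prompt wrapper text is illustrative, not terminal-sherpa's actual system prompt:

```python
import json
import urllib.request

def build_ollama_request(model: str, prompt: str,
                         host: str = "localhost", port: int = 11434):
    """Build (but do not send) a request for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,
        "prompt": f"Output only a bash command for: {prompt}",
        "stream": False,
    }
    return urllib.request.Request(
        f"http://{host}:{port}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_ollama_request("codellama", "optimize this bash script")
# With a running Ollama server you would then do:
#   with urllib.request.urlopen(req) as resp:
#       print(json.loads(resp.read())["response"])
```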
🛣️ Roadmap
- Shell integration and auto-completion
- Additional providers (Cohere, Mistral)
- Additional local model support (llama.cpp)
🔧 Development
Setup
```shell
git clone https://github.com/lcford2/terminal-sherpa.git
cd terminal-sherpa
uv sync --all-groups
uv run pre-commit install
```
Testing
```shell
uv run python -m pytest
```
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Run pre-commit checks: `uv run pre-commit run --all-files`
- Run tests: `uv run task test`
- Submit a pull request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Issues
Found a bug or have a feature request? Please open an issue on GitHub Issues.