nlsh - Natural Language Shell
An AI-augmented command line shell that wraps your existing shell and adds natural language command generation via LLM.
Features
- Shell Compatibility: Works with any shell (bash, zsh, fish, sh)
- Natural Language Commands: Use `llm: <prompt>` to generate shell commands from natural language
- Rich Context: The LLM has access to the current directory, file listings, shell info, and environment
- Command History: Full SQLite-based history tracking of all commands and LLM interactions
- Safety First: Always prompts for confirmation before executing AI-generated commands
- Rich Terminal Output: Beautiful terminal interface powered by Rich
- Streaming Responses: Real-time streaming with animated progress spinners for tool calls
- Tool-Enabled AI: AI can execute shell commands, read files, and gather system information with confirmation
Installation
Prerequisites
- Python 3.9 or higher
- Anthropic API key (preferred) or OpenAI API key
- uv package manager (recommended) or pip
Using uv (recommended)
# Clone the repository
git clone <repository-url>
cd nlsh
# Install with uv
uv pip install -e .
# Or install dependencies directly
uv pip install rich openai typer python-dotenv
Using pip
pip install -e .
Configuration
- Copy the example environment file:
cp .env.example .env
- Add your LLM API key(s) to `.env`:
Anthropic Claude (Preferred)
ANTHROPIC_API_KEY=your_anthropic_api_key_here
ANTHROPIC_MODEL=claude-3-5-sonnet-20241022 # Optional: default model
OpenAI (Alternative/Fallback)
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_MODEL=gpt-4o-mini # Optional: default model
Note: nlsh will prefer Anthropic if both keys are available. Anthropic's Claude models often provide better tool usage and reasoning for shell tasks.
- Install optional Anthropic support (if using Anthropic):
# With Poetry
poetry install --extras anthropic
# With pip
pip install langchain-anthropic
Usage
Starting nlsh
nlsh
This will start the natural language shell interface with streaming enabled by default.
Command Line Options
nlsh --help # Show all available options
nlsh --no-stream # Disable streaming (faster but less interactive)
nlsh --use-simple # Use simple OpenAI interface instead of LangGraph
nlsh --debug # Enable debug mode
Commands
Regular Shell Commands
Just type commands normally - they'll be passed through to your default shell:
nlsh $ ls -la
nlsh $ cd src
nlsh $ git status
Natural Language Commands
Use the llm: prefix for AI-generated commands:
nlsh $ llm: find all python files larger than 1MB
nlsh $ llm: show me git commits from last week
nlsh $ llm: compress all jpg files in this directory
nlsh $ llm: install the requests package
The AI will:
- Proactively use tools to understand your environment (directory contents, git status, system info, etc.)
- Analyze your request and current context (working directory, files, shell type)
- Gather additional information if needed (with animated progress indicators)
- Generate appropriate shell commands based on discovered information
- Show you the commands and ask for confirmation
- Execute the commands if you approve
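The confirm-then-execute step above can be sketched as follows (hypothetical helper name; the real implementation lives in src/nlsh/cli.py and may differ):

```python
import subprocess
from typing import Optional

def run_generated_command(command: str, auto_confirm: bool = False) -> Optional[str]:
    """Show an AI-generated command and execute it only after confirmation."""
    print(f"Proposed command: {command}")
    answer = "y" if auto_confirm else input("Execute? [y/N] ").strip().lower()
    if answer != "y":
        return None  # user declined: nothing is executed
    result = subprocess.run(command, shell=True, capture_output=True, text=True)
    return result.stdout

# Auto-confirmed here only for illustration; nlsh always prompts interactively.
print(run_generated_command("echo hello", auto_confirm=True))
```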
Exiting
nlsh $ exit
# or
nlsh $ quit
How It Works
Streaming Interface
nlsh features a modern streaming interface that shows:
- 🔄 Animated spinners for tool operations (file reading, directory listing, etc.)
- 📁 Real-time tool call descriptions ("Reading file: config.py", "Checking git status")
- 💬 Streaming AI responses as they're generated
- ✅ Tool completion confirmations with result previews
Tool-Enabled AI
The AI has access to powerful tools and uses them proactively:
- 📁 File Operations: List, read, and find files
- ⚡ Shell Commands: Execute commands with mandatory confirmation
- 📊 Git Integration: Check status, view logs, and diffs
- 💻 System Info: Access environment and system details
- 🌳 Directory Trees: Navigate and understand project structure
Key Feature: The AI automatically uses these tools when relevant - you don't need to explicitly ask it to "check files" or "use git status". It proactively gathers information to provide better responses and commands.
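One common way to wire up tools like these is a name-to-function registry that the model's tool calls dispatch through. This is an illustrative pattern only, not the project's actual implementation:

```python
import os
import subprocess
from pathlib import Path

# Registry mapping tool names to callables; the LLM's tool calls dispatch here.
TOOLS = {
    "list_files": lambda path=".": sorted(os.listdir(path)),
    "read_file": lambda path: Path(path).read_text(encoding="utf-8"),
    "git_status": lambda: subprocess.run(
        ["git", "status", "--short"], capture_output=True, text=True
    ).stdout,
}

def dispatch_tool(name: str, **kwargs):
    """Run a tool the model asked for; unknown names are reported, not executed."""
    if name not in TOOLS:
        return f"unknown tool: {name}"
    return TOOLS[name](**kwargs)

print(dispatch_tool("list_files", path="."))
```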
Context Awareness
The LLM receives rich context about your environment:
- Current working directory and file listings
- Shell information (type, version, features)
- System information (OS, architecture)
- Environment variables (PATH, HOME, etc.)
- Recent command history
- Session conversation history for long-running interactions
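A minimal sketch of the kind of context snapshot listed above (the field names are illustrative, not the actual src/nlsh/context.py API):

```python
import os
import platform

def gather_context(max_files: int = 20) -> dict:
    """Collect an environment snapshot to include in the LLM prompt."""
    cwd = os.getcwd()
    return {
        "cwd": cwd,
        "files": sorted(os.listdir(cwd))[:max_files],  # truncated file listing
        "shell": os.environ.get("SHELL", "/bin/sh"),
        "os": platform.system(),
        "arch": platform.machine(),
        "env": {k: os.environ.get(k, "") for k in ("PATH", "HOME")},
    }

ctx = gather_context()
print(ctx["os"], ctx["shell"])
```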
Shell Detection
nlsh automatically detects your shell from the $SHELL environment variable and adapts:
- bash/zsh: Standard POSIX syntax
- fish: Fish-specific syntax and features
- sh: Basic POSIX compatibility mode
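Detection along these lines can be done by inspecting $SHELL, falling back to sh for anything unrecognized (a sketch; the real logic in src/nlsh/shell.py may differ):

```python
import os

KNOWN_SHELLS = {"bash", "zsh", "fish", "sh"}

def detect_shell(default: str = "sh") -> str:
    """Return the shell name from $SHELL, falling back to plain sh."""
    shell_path = os.environ.get("SHELL", "")
    name = os.path.basename(shell_path)
    return name if name in KNOWN_SHELLS else default

print(detect_shell())
```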
History Tracking
All interactions are stored in an SQLite database (~/.nlsh/history.db):
- Shell commands and their output
- LLM interactions (prompts, generated commands, execution results)
- Context snapshots
- Session information
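A history table covering these records might look like the following. The schema is illustrative, not necessarily the one src/nlsh/history.py creates, and an in-memory database stands in for ~/.nlsh/history.db:

```python
import sqlite3

def open_history(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) the history database and ensure the table exists."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS history (
               id INTEGER PRIMARY KEY AUTOINCREMENT,
               kind TEXT NOT NULL,      -- 'shell' or 'llm'
               command TEXT NOT NULL,
               output TEXT,
               ts DATETIME DEFAULT CURRENT_TIMESTAMP
           )"""
    )
    return conn

conn = open_history()
conn.execute(
    "INSERT INTO history (kind, command, output) VALUES (?, ?, ?)",
    ("shell", "ls -la", ""),
)
print(conn.execute("SELECT kind, command FROM history").fetchall())
```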
Session History Awareness
nlsh maintains awareness of your current session's history to enable natural, long-running conversations:
- Previous commands and their results are included in the LLM context
- Past AI interactions help the AI understand conversation flow
- Failed commands are remembered to avoid repeating mistakes
- Successful patterns can be referenced in follow-up requests
This enables commands like:
nlsh $ llm: find all large log files
# AI finds files and shows commands
nlsh $ llm: now compress those files from the previous search
# AI remembers the previous search results and can reference them
Examples
File Operations
nlsh $ llm: find all Python files modified in the last 24 hours
# Generates: find . -name "*.py" -mtime -1
nlsh $ llm: backup all my config files to a tar archive
# Generates: tar -czf config_backup_$(date +%Y%m%d).tar.gz ~/.bashrc ~/.vimrc ~/.gitconfig
Development Tasks
nlsh $ llm: run the tests and show coverage
# Generates: python -m pytest --cov=. --cov-report=term-missing
nlsh $ llm: start a local web server on port 8080
# Generates: python -m http.server 8080
System Administration
nlsh $ llm: show me processes using more than 100MB of memory
# Generates: ps aux | awk '$6 > 102400 {print $0}' | sort -k6 -nr
nlsh $ llm: check disk usage and show largest directories
# Generates: du -h --max-depth=1 | sort -hr
Architecture
nlsh/
├── src/nlsh/
│ ├── cli.py # Main CLI interface and REPL loop
│ ├── shell.py # Shell detection and command execution
│ ├── llm.py # OpenAI integration and command generation
│ ├── context.py # Environment and filesystem context gathering
│ └── history.py # SQLite-based history management
├── pyproject.toml # Python package configuration
└── README.md
Development
Running from Source
# Install in development mode
uv pip install -e .
# Run directly
python -m nlsh.cli
# Or run the installed command
nlsh
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
Troubleshooting
Common Issues
"OPENAI_API_KEY environment variable is required"
- Make sure you've created a `.env` file with your OpenAI API key
- Verify the key is correct and has sufficient credits
"Shell detection failed"
- nlsh falls back to `sh` if it can't detect your shell
- You can manually set the `SHELL` environment variable if needed
"Permission denied" errors
- Make sure nlsh has permission to create the history database in ~/.nlsh/
- Check file permissions in your working directory
Debug Mode
Run with debug flag to see detailed error information:
nlsh --debug
Roadmap
Phase 2 Features (Planned)
- Follow-up Commands: "rerun that but with sudo", "do the same for .js files"
- Trusted Commands: Auto-execute safe commands without confirmation
- Plugin System: Extensible AI tools and integrations
- Better Shell Integration: Tab completion, command substitution
Phase 3 Features (Future)
- Full Shell Replacement: Built-in shell features, not just wrapper
- Advanced Context: Git repository awareness, project type detection
- Collaborative Sessions: Shared session history and context
- Custom LLM Models: Support for local/custom models
License
[License information]
Credits
Built with: