llamaline v1.0.2
MIT License
Author: Luke Steuber
Web: actuallyusefulai.com, lukesteuber.com

🦙 llamaline

A natural-language to shell/Python CLI assistant using local Ollama models.

Transform your everyday tasks into simple English commands! llamaline bridges the gap between natural language and code execution, making command-line operations accessible to everyone.
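Under the hood, tools in this category typically send the plain-English request to a local Ollama server and execute the snippet that comes back. Here is a minimal sketch of that round trip using Ollama's /api/generate endpoint (the function names and prompt wording are illustrative, not llamaline's actual internals):

```python
import json
import urllib.request

OLLAMA_ENDPOINT = "http://localhost:11434"  # default Ollama address


def build_payload(prompt, model="gemma3:4b"):
    """Assemble the JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": model,
        "prompt": "Translate this request into a single shell command:\n" + prompt,
        "stream": False,  # ask for one complete response, not a token stream
    }


def ask_ollama(prompt, model="gemma3:4b"):
    """Send the prompt to a local Ollama server and return its response text."""
    req = urllib.request.Request(
        OLLAMA_ENDPOINT + "/api/generate",
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]


# ask_ollama("show me disk usage")  # requires a running Ollama server
```

With a local model pulled (for example via `ollama pull gemma3:4b`), `ask_ollama` returns the model's suggested command as plain text, ready to be shown for confirmation before execution.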

✨ Features

  • 🗣️ Natural Language Processing: Type commands in plain English
  • 🛡️ Safety First: Confirmation prompts and unsafe operation blocking
  • 🎨 Rich Interface: Colorized output with syntax highlighting
  • ⚡ Quick Commands: Built-in cheat sheets for common tasks
  • 🔄 Model Flexibility: Switch between Ollama models on the fly
  • 🎯 Accessibility: Full keyboard navigation, screen reader compatible
  • 🔧 Developer Friendly: Easy installation and configuration

Installation

Via Conda (Recommended)

conda install -c conda-forge llamaline

Via Pip

pip install llamaline

Or from source:

git clone https://github.com/lukeslp/llamaline.git
cd llamaline
pip install .

Development Installation

git clone https://github.com/lukeslp/llamaline.git
cd llamaline
pip install -e .

🚀 Quick Start

Single Command Execution

llamaline "Show me disk usage"
llamaline "List all Python files in this directory"
llamaline "What's my current memory usage?"

Interactive Mode

llamaline

Then type natural language commands:

  • disk usage → df -h
  • running processes → ps aux
  • say hello → print('Hello, world!')
  • list files → ls -al
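Shortcut mappings like the ones above amount to a lookup table consulted before the model is ever called. A sketch of how such a cheat sheet could work (the table name and normalization are hypothetical, mirroring the examples above):

```python
from typing import Optional

# Hypothetical cheat-sheet table: exact phrases mapped straight to commands,
# checked before falling back to the language model.
CHEATS = {
    "disk usage": "df -h",
    "running processes": "ps aux",
    "say hello": "print('Hello, world!')",
    "list files": "ls -al",
}


def lookup(phrase: str) -> Optional[str]:
    """Return the canned command for a known phrase, or None to defer to the model."""
    return CHEATS.get(phrase.strip().lower())
```

A hit skips the model round trip entirely, which is why the built-in shortcuts respond instantly.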

Built-in Commands

  • help - Show available commands
  • cheats - List all cheat sheet shortcuts
  • model - Show current Ollama model
  • model llama2 - Switch to a different model (llama2, in this example)
  • quit - Exit the application

🎯 Example Sessions

System Administration:

> memory usage
Code to execute: vm_stat
Execute this? [Y/n]: y
=== Bash Output ===
Pages free:                   123456.
Pages active:                 234567.
...

File Management:

> show me all log files larger than 1MB
Code to execute: find . -name "*.log" -size +1M -ls
Execute this? [Y/n]: y
=== Bash Output ===
-rw-r--r--    1 user  staff   2097152 Dec 19 10:30 ./app.log
...

Accessibility

  • The CLI uses colorized output for clarity, but all prompts are also readable as plain text.
  • All commands are available via keyboard navigation.
  • No mouse interaction is required.

📋 Requirements

  • Python 3.7+
  • Local Ollama server running with at least one model installed
    • Install Ollama: https://ollama.com
    • Recommended model: ollama pull gemma3:4b
    • Or any compatible model you prefer

โš™๏ธ Configuration

Environment Variables

export OLLAMA_ENDPOINT="http://localhost:11434"  # Default
export OLLAMA_MODEL="gemma3:4b"                  # Default
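Resolving those variables with the documented defaults can be sketched as follows (the helper name is hypothetical; llamaline's actual configuration code may differ):

```python
import os

# Documented defaults for llamaline's two environment variables.
DEFAULTS = {
    "OLLAMA_ENDPOINT": "http://localhost:11434",
    "OLLAMA_MODEL": "gemma3:4b",
}


def resolve(name, env=None):
    """Look up a setting, falling back to the documented default."""
    env = os.environ if env is None else env
    return env.get(name, DEFAULTS[name])
```

Command-line flags (shown below) would then take precedence over whatever this resolves.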

Command Line Options

llamaline -e http://localhost:11434 -m llama2 "your command"

🛠 Development

Development Installation

git clone https://github.com/lukeslp/llamaline.git
cd llamaline
pip install -e .

Development Scripts

The scripts/ folder contains helpful automation scripts:

# Test package build and functionality
./scripts/test_package.sh

# Create GitHub release (requires git tag)
./scripts/release.sh

# Build conda package (requires conda-build)
./scripts/build_conda.sh

Project Structure

llamaline/
├── llamaline/
│   ├── __init__.py
│   └── llamaline.py      # Main CLI module
├── scripts/
│   ├── build_conda.sh    # Conda package building
│   ├── release.sh        # GitHub release automation
│   └── test_package.sh   # Package validation testing
├── conda-recipe/
│   ├── meta.yaml         # Traditional conda recipe
│   └── recipe.yaml       # Modern conda-forge recipe
├── pyproject.toml        # Package configuration
├── requirements.txt      # Dependencies
├── PROJECT_PLAN.md       # Roadmap and architecture
└── README.md             # This file

Contributing

  • See PROJECT_PLAN.md for roadmap and contribution guidelines
  • Follow accessibility best practices
  • Include tests for new features
  • Update documentation as needed

🔒 Safety & Security

llamaline includes several safety features:

  • Command confirmation before execution
  • Unsafe operation blocking (prevents sudo, rm -rf, etc.)
  • Temporary file execution for Python code
  • No persistent state between commands
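A denylist check along these lines is one way to implement unsafe-operation blocking; the patterns below are illustrative, not llamaline's actual list:

```python
import re

# Hypothetical denylist: patterns that should never reach the shell.
UNSAFE_PATTERNS = [
    r"\bsudo\b",                     # privilege escalation
    r"\brm\s+-(?=[a-z]*r)(?=[a-z]*f)",  # rm with both -r and -f, in any order
    r"\bmkfs\b",                     # filesystem formatting
    r">\s*/dev/sd",                  # writing directly to block devices
]


def is_unsafe(command: str) -> bool:
    """Return True if the command matches any denylisted pattern."""
    return any(re.search(p, command) for p in UNSAFE_PATTERNS)
```

Run before the confirmation prompt, a check like this refuses the command outright instead of relying on the user to catch it.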

🌟 Community & Support

Having fun with llamaline? We'd love to hear from you!

Connect With Us:

  • 🐛 Issues & Features: GitHub Issues
  • 🛠️ Source Code: GitHub Repository
  • 📧 Email: luke@lukesteuber.com
  • 🐦 Bluesky: @lukesteuber.com
  • 💼 LinkedIn: lukesteuber
  • ✉️ Newsletter: Substack
  • ☕ Support: Tip Jar

📄 License

Licensed under the MIT License by Luke Steuber. See LICENSE for details.


Made with ❤️ for the accessibility community
