

Project description

llamashell

llamashell is a powerful shell that's powered by a locally running LLM. We have tested it with Llama 3.2, Qwen 2.5, and Gemma 3.

Features

  • Interactive Shell: Execute standard shell commands like cd, ls, and more, with support for pipes (|), input redirection (<), and output redirection (> or >>).
  • LLM Integration: Interact with an LLM (default: meta-llama/Llama-3.2-1B-Instruct) for assistance using the -- prefix (e.g., -- write me an inspirational quote).
  • Command History: Persistent command history stored in ~/.llamashell_history.
  • Chat Log Management: Save and view LLM conversation logs with commands like --save-chat-logs and --view-chat-logs.
  • File Operations: Read files into the LLM context with --read <filename> and save individual LLM responses with --save <filename>.
  • Auto-Completion: Basic command and file auto-completion for a smoother user experience.
  • Cross-Platform: Supports GPU acceleration (CUDA/MPS) and CPU fallback for broad compatibility.
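
The pipe and redirection handling described above can be sketched with Python's standard shlex module. This is a simplified illustration of how a shell might tokenize such a line, not llamashell's actual implementation:

```python
import shlex

def parse_command(line):
    """Split one command into argv plus input/output redirection.

    Simplified sketch: real shells also handle here-docs, quoting
    edge cases, and file-descriptor redirects.
    """
    tokens = shlex.split(line)
    argv, stdin_file, stdout_file, append = [], None, None, False
    it = iter(tokens)
    for tok in it:
        if tok == "<":
            stdin_file = next(it)        # e.g. "< in.txt"
        elif tok in (">", ">>"):
            stdout_file = next(it)       # e.g. "> out.txt" or ">> log"
            append = tok == ">>"
        else:
            argv.append(tok)
    return argv, stdin_file, stdout_file, append

def parse_pipeline(line):
    """Split a pipeline on '|' and parse each stage.

    Naive split for illustration: a quoted '|' would need to be
    tokenized before splitting.
    """
    return [parse_command(stage) for stage in line.split("|")]
```

Each stage's argv can then be run with `subprocess`, wiring one stage's stdout to the next stage's stdin.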

Installation

pip3 install llamashell

Prerequisites

  • Python 3.11+
  • Linux or macOS

Usage

llamashell

You can use any instruct-tuned LLM available on Hugging Face. For example:

llamashell --model "Qwen/Qwen2.5-0.5B-Instruct"

or

llamashell --model "google/gemma-3-1b-it"

Special Commands

  • -- <message>: Send a message to the LLM.
  • --save-chat-logs: Save the entire LLM conversation to a file.
  • --save [filename]: Save the last LLM response to a file.
  • --view-chat-logs: Display the LLM conversation history.
  • --read <filename>: Read a file and add its contents to the LLM context.
  • --clear: Reset the LLM chat session.
  • history: Show the shell command history.
  • exit, quit, bye: Exit the shell.
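
A minimal sketch of how this prefix-based dispatch might work (hypothetical names and structure, not llamashell's actual code; `context` is an assumed dict holding the chat history):

```python
def dispatch(line, context):
    """Route a line to the LLM, a special command, or the shell.

    `context["chat"]` is a hypothetical list of role/content
    messages, in the style used by instruct chat templates.
    """
    if line.startswith("--read "):
        path = line.split(None, 1)[1]
        with open(path) as f:
            # Add the file's contents to the LLM context.
            context["chat"].append({"role": "user", "content": f.read()})
        return "read"
    if line == "--clear":
        context["chat"].clear()      # reset the chat session
        return "cleared"
    if line.startswith("-- "):
        context["chat"].append({"role": "user", "content": line[3:]})
        return "llm"                 # hand the chat history to the model
    if line in ("exit", "quit", "bye"):
        return "exit"
    return "shell"                   # fall through to normal execution
```

The ordering matters: longer prefixes like `--read` must be checked before the bare `-- ` message prefix.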

Project details


Download files

Download the file for your platform.

Source Distribution

llamashell-0.0.5.tar.gz (20.3 kB)

Uploaded Source

Built Distribution


llamashell-0.0.5-py3-none-any.whl (20.0 kB)

Uploaded Python 3

File details

Details for the file llamashell-0.0.5.tar.gz.

File metadata

  • Download URL: llamashell-0.0.5.tar.gz
  • Upload date:
  • Size: 20.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.11

File hashes

Hashes for llamashell-0.0.5.tar.gz

  • SHA256: cab122c51ef18a6434c099d5d5ac5e5ede8b3613c57b8793fdca00c19c57cf0a
  • MD5: 376bb9232f0857e087020ef5ef32cd9e
  • BLAKE2b-256: d80b161a4621fd5eca1f4c8524963bef20c1bfc4263650d9bb8318ff13b89d98

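
The published digests above can be checked locally before installing from a downloaded file, for example with Python's standard hashlib (the filename is the one listed above):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 16):
    """Compute the SHA-256 hex digest of a file, streaming in chunks
    so large files are never loaded into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published digest, e.g.:
# sha256_of("llamashell-0.0.5.tar.gz") == "cab122c5..."
```

pip can also enforce this automatically via hash-checking mode (`pip install --require-hashes -r requirements.txt`).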

File details

Details for the file llamashell-0.0.5-py3-none-any.whl.

File metadata

  • Download URL: llamashell-0.0.5-py3-none-any.whl
  • Upload date:
  • Size: 20.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.11

File hashes

Hashes for llamashell-0.0.5-py3-none-any.whl

  • SHA256: 015f9d1475d727a0c88dee5f9ddabf173dd7e1afd5e7a1fb17fef8ac98cdd5b3
  • MD5: 3d4257ac6903eee027734f1202471982
  • BLAKE2b-256: ba1bb230eb966f202cceb6ff2b5165d132b9fd1327396fea0ec438bb81d78478

