
PIP-BOY LLM - Vault-Tec Local AI Terminal

Run local LLMs with a retro Fallout PIP-BOY themed interface.

Installation

pip install pip-boy-llm

Optional Dependencies

# For 4-bit quantization (Mistral 7B)
pip install pip-boy-llm[quantization]

# For PDF file support
pip install pip-boy-llm[pdf]

# For Windows readline support
pip install pip-boy-llm[readline]

# Install all optional dependencies
pip install pip-boy-llm[all]

Quick Start

  1. Run the setup wizard (recommended for first-time users):

    pip-boy-setup
    
  2. Start the terminal:

    pip-boy-llm
    
  3. Select a model and start chatting!

Models

| Model | Description | Requirements |
|---|---|---|
| Gemma 3 1B | Fast, lightweight | HuggingFace login |
| Llama 3.2 1B | Fast, good quality | HuggingFace login + license |
| Mistral 7B | Best quality, 4-bit | bitsandbytes (optional) |
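Internally, a menu like the one above maps model names to HuggingFace repositories. A minimal sketch of such a registry (the repo ids and field names are assumptions for illustration, not taken from the package source):

```python
# Hypothetical registry pairing each menu entry with a HuggingFace
# repo id and whether an authenticated login is required.
MODELS = {
    "Gemma 3 1B": {"repo": "google/gemma-3-1b-it", "needs_login": True},
    "Llama 3.2 1B": {"repo": "meta-llama/Llama-3.2-1B-Instruct", "needs_login": True},
    "Mistral 7B": {"repo": "mistralai/Mistral-7B-Instruct-v0.3", "needs_login": False},
}

def requirements(name: str) -> str:
    """Return a short, human-readable requirement string for a model."""
    entry = MODELS[name]
    return "HuggingFace login" if entry["needs_login"] else "bitsandbytes (optional)"
```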

HuggingFace Login

Some models require HuggingFace authentication:

  1. Create an account at huggingface.co
  2. Generate a token at huggingface.co/settings/tokens
  3. Run pip-boy-setup and enter your token
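If you prefer not to enter the token interactively, the HuggingFace libraries (huggingface_hub, transformers) also read it from the HF_TOKEN environment variable. A minimal sketch, with a placeholder token value:

```python
import os

# Placeholder token: substitute the real token generated at
# huggingface.co/settings/tokens. huggingface_hub and transformers
# pick up HF_TOKEN from the environment when downloading gated models.
os.environ["HF_TOKEN"] = "hf_your_token_here"
```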

License Agreements

Llama and Gemma are gated models: you must accept their license agreements on the corresponding HuggingFace model pages before they can be downloaded.

Commands

| Command | Description |
|---|---|
| /help | Show available commands |
| /exit | Quit the terminal |
| /clear | Clear conversation history |
| /reset | Reset the AI model |
| /model | Show current model info |
| @filepath | Include file contents in a message |
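Slash commands like these are typically routed through a small dispatch step before input reaches the model. A hypothetical sketch (the handler logic is illustrative, not from the package):

```python
def handle(line: str, history: list) -> str:
    """Dispatch a slash command; non-commands fall through to the model."""
    if line == "/clear":
        history.clear()
        return "History cleared."
    if line == "/help":
        return "/help /exit /clear /reset /model"
    if line.startswith("/"):
        return f"Unknown command: {line}"
    return "chat"  # anything else is sent to the LLM
```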

File References

Include file contents in your messages using @:

> Explain this code: @main.py
> Summarize these files: @src/app.py @src/utils.py
> Review @"path with spaces/file.py"
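Expanding such @-references amounts to pulling quoted and unquoted paths out of the message. A minimal sketch of how that could work, assuming a regex-based parser (the actual package's handling may differ):

```python
import re

# Matches @"path with spaces" first, then bare @path tokens.
REF = re.compile(r'@"([^"]+)"|@(\S+)')

def extract_refs(message: str) -> list[str]:
    """Return the file paths referenced with @ in a chat message."""
    return [quoted or bare for quoted, bare in REF.findall(message)]
```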

CLI Reference

pip-boy-llm

Main terminal interface. Select a model and start chatting.

pip-boy-llm

pip-boy-setup

Setup wizard for first-time configuration:

  • Checks dependencies (PyTorch, Transformers, etc.)
  • Configures HuggingFace authentication
  • Verifies model access
  • Creates config directory

pip-boy-setup

pip-boy-update

Check for package updates:

pip-boy-update

Configuration

Config files are stored in ~/.airllm/:

  • config.yaml - User preferences
  • history.yaml - Chat history
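A hypothetical ~/.airllm/config.yaml might look like the fragment below; the keys are illustrative only, so check the file generated by pip-boy-setup for the real schema:

```yaml
# ~/.airllm/config.yaml (illustrative keys)
model: "Gemma 3 1B"     # last selected model
hf_token_saved: true    # whether a HuggingFace token is configured
```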

System Requirements

  • Python 3.9+
  • CUDA-capable GPU recommended (CPU mode available)
  • 4GB+ VRAM for 1B models
  • 8GB+ VRAM for Mistral 7B (4-bit)
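The Mistral 7B figure follows from a back-of-the-envelope estimate: at 4 bits per weight, 7B parameters occupy roughly 3.3 GiB before activations and KV cache, which is why 8 GB+ of VRAM gives comfortable headroom. The arithmetic:

```python
params = 7_000_000_000   # Mistral 7B parameter count (approximate)
bytes_per_param = 0.5    # 4-bit quantization = half a byte per weight
weights_gib = params * bytes_per_param / 1024**3  # weights alone, in GiB
```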

License

MIT License


Download files

Download the file for your platform.

Source Distribution

pip_boy_llm-1.0.8.tar.gz (23.9 kB)


Built Distribution

pip_boy_llm-1.0.8-py3-none-any.whl (26.3 kB)


File details

Details for the file pip_boy_llm-1.0.8.tar.gz.

File metadata

  • Download URL: pip_boy_llm-1.0.8.tar.gz
  • Size: 23.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.4

File hashes

Hashes for pip_boy_llm-1.0.8.tar.gz
| Algorithm | Hash digest |
|---|---|
| SHA256 | 55f7d4e23b84bb4ed0cc1ea09397f92048de8bfa33ea736b5f5f5af2cba88524 |
| MD5 | 5c7f10ff61f045863fe4454ebaed0b75 |
| BLAKE2b-256 | a735034b297ad6e9557530108435b06743f7ac5aa8edf292361247b51155dfc1 |


File details

Details for the file pip_boy_llm-1.0.8-py3-none-any.whl.

File metadata

  • Download URL: pip_boy_llm-1.0.8-py3-none-any.whl
  • Size: 26.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.4

File hashes

Hashes for pip_boy_llm-1.0.8-py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | 1fcf8edbdcad926f62d3cd018d10b2d4daa8cc37cd626cc5f2bfd674011b3218 |
| MD5 | 8874d84c81c53e2d4be7ef4d8955031e |
| BLAKE2b-256 | da54a12d2db4711ad78c1ee52b945ec2f994b2706399a703b0de36989051560d |

