
PIP-BOY LLM

Vault-Tec Local AI Terminal - Run local LLMs with a retro Fallout PIP-BOY themed interface.

Installation

pip install pip-boy-llm

Optional Dependencies

# For 4-bit quantization (Mistral 7B)
pip install pip-boy-llm[quantization]

# For PDF file support
pip install pip-boy-llm[pdf]

# For Windows readline support
pip install pip-boy-llm[readline]

# Install all optional dependencies
pip install pip-boy-llm[all]

Quick Start

  1. Run the setup wizard (recommended for first-time users):

    pip-boy-setup
    
  2. Start the terminal:

    pip-boy-llm
    
  3. Select a model and start chatting!

Models

| Model | Description | Requirements |
|---|---|---|
| Gemma 3 1B | Fast, lightweight | HuggingFace login |
| Llama 3.2 1B | Fast, good quality | HuggingFace login + license |
| Mistral 7B | Best quality, 4-bit | bitsandbytes (optional) |

HuggingFace Login

Some models require HuggingFace authentication:

  1. Create an account at huggingface.co
  2. Generate a token at huggingface.co/settings/tokens
  3. Run pip-boy-setup and enter your token
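User access tokens generated at huggingface.co/settings/tokens start with the `hf_` prefix, so a setup wizard can cheaply sanity-check input before saving it. This is an illustrative sketch, not pip-boy-setup's actual logic; the function name is an assumption:

```python
def looks_like_hf_token(token: str) -> bool:
    """Cheap format check before saving; real validation requires an API call."""
    return token.startswith("hf_") and len(token) > 10
```

A check like this only catches obvious typos; the wizard still needs to verify the token against the HuggingFace API to confirm it is valid.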

License Agreements

Llama and Gemma models require accepting a license agreement on the model's HuggingFace page before the weights can be downloaded; pip-boy-setup verifies your access during configuration.

Commands

| Command | Description |
|---|---|
| /help | Show available commands |
| /exit | Quit the terminal |
| /clear | Clear conversation history |
| /reset | Reset the AI model |
| /model | Show current model info |
| @filepath | Include file contents in message |
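A slash-command table like this is typically handled with a small dispatch map. The sketch below is illustrative only; the handler names and return values are assumptions, not the package's internals:

```python
from typing import Optional

def show_help() -> str:
    return "Commands: /help /exit /clear /reset /model"

def clear_history() -> str:
    return "Conversation history cleared."

# Map each slash command to its handler.
COMMANDS = {
    "/help": show_help,
    "/clear": clear_history,
}

def dispatch(line: str) -> Optional[str]:
    """Run a slash command; return None if the line is a normal chat message."""
    if not line.strip():
        return None
    handler = COMMANDS.get(line.strip().split()[0])
    return handler() if handler else None
```

Anything that is not a registered command falls through as a regular message to the model.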

File References

Include file contents in your messages using @:

> Explain this code: @main.py
> Summarize these files: @src/app.py @src/utils.py
> Review @"path with spaces/file.py"
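The `@` reference grammar above, including quoted paths with spaces, can be parsed with one regular expression. A minimal sketch, assuming this grammar; the function name and exact pattern are not the package's actual implementation:

```python
import re

# Matches @"quoted path" (may contain spaces) or @bare-path (no spaces).
_REF = re.compile(r'@"([^"]+)"|@(\S+)')

def extract_file_refs(message: str) -> list:
    """Return the file paths referenced with @ in a chat message, in order."""
    return [quoted or bare for quoted, bare in _REF.findall(message)]
```

The quoted alternative is tried first, so `@"path with spaces/file.py"` is captured whole rather than split at the first space.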

Commands Reference

pip-boy-llm

Main terminal interface. Select a model and start chatting.

pip-boy-llm

pip-boy-setup

Setup wizard for first-time configuration:

  • Checks dependencies (PyTorch, Transformers, etc.)
  • Configures HuggingFace authentication
  • Verifies model access
  • Creates config directory

pip-boy-setup

pip-boy-update

Check for package updates:

pip-boy-update

Configuration

Config files are stored in ~/.airllm/:

  • config.yaml - User preferences
  • history.yaml - Chat history
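For orientation, a config.yaml under ~/.airllm/ might look like the following; the keys shown are illustrative assumptions, not the package's documented schema:

```yaml
# ~/.airllm/config.yaml (illustrative layout; actual keys may differ)
model: gemma-3-1b
device: cuda
```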

System Requirements

  • Python 3.9+
  • CUDA-capable GPU recommended (CPU mode available)
  • 4GB+ VRAM for 1B models
  • 8GB+ VRAM for Mistral 7B (4-bit)
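The VRAM figures are consistent with a weights-only back-of-envelope estimate. A sketch of that arithmetic, where the parameter counts are approximate assumptions:

```python
def weight_vram_gib(params: float, bits_per_weight: int) -> float:
    """GiB needed just to hold the model weights at a given precision."""
    return params * bits_per_weight / 8 / 2**30

# A ~1B-parameter model in fp16: roughly 1.9 GiB of weights,
# so 4 GB+ VRAM leaves room for the KV cache and activations.
one_b_fp16 = weight_vram_gib(1e9, 16)

# Mistral 7B (~7.2e9 params) quantized to 4-bit: roughly 3.4 GiB of weights,
# which is why 8 GB+ VRAM is comfortable in 4-bit mode.
mistral_4bit = weight_vram_gib(7.2e9, 4)
```

Actual usage is higher than these figures because of the KV cache, activations, and framework overhead, which is why the recommendations include headroom.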

License

MIT License
