
PIP-BOY LLM

Vault-Tec Local AI Terminal - Run local LLMs with a retro Fallout PIP-BOY themed interface.

Installation

pip install pip-boy-llm

Optional Dependencies

# For 4-bit quantization (Mistral 7B)
pip install pip-boy-llm[quantization]

# For PDF file support
pip install pip-boy-llm[pdf]

# For Windows readline support
pip install pip-boy-llm[readline]

# Install all optional dependencies
pip install pip-boy-llm[all]

Quick Start

  1. Run the setup wizard (recommended for first-time users):

    pip-boy-setup
    
  2. Start the terminal:

    pip-boy-llm
    
  3. Select a model and start chatting!

Models

Model          Description           Requirements
Gemma 3 1B     Fast, lightweight     HuggingFace login
Llama 3.2 1B   Fast, good quality    HuggingFace login + license
Mistral 7B     Best quality, 4-bit   bitsandbytes (optional)
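The exact loader settings are internal to the package, but as a rough sketch of what a 4-bit Mistral 7B load typically involves (the function name and specific values below are assumptions, not pip-boy-llm's API), the bitsandbytes path boils down to a handful of kwargs:

```python
# Hedged sketch: the settings a 4-bit load of Mistral 7B typically needs when
# using transformers + bitsandbytes (installed via the [quantization] extra).
# The function name and exact values are illustrative, not the package's API.
def four_bit_load_kwargs() -> dict:
    return {
        "load_in_4bit": True,                 # store weights in 4-bit form
        "bnb_4bit_quant_type": "nf4",         # NF4 quantization scheme
        "bnb_4bit_compute_dtype": "float16",  # dequantize to fp16 for matmuls
        "device_map": "auto",                 # let accelerate place layers
    }
```

In transformers these values would normally be passed through a BitsAndBytesConfig given to AutoModelForCausalLM.from_pretrained as quantization_config.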

HuggingFace Login

Some models require HuggingFace authentication:

  1. Create an account at huggingface.co
  2. Generate a token at huggingface.co/settings/tokens
  3. Run pip-boy-setup and enter your token
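The setup wizard stores the token for you, but the steps above can also be done programmatically. A minimal sketch (resolve_hf_token is a hypothetical helper, not part of pip-boy-llm; the cache path is where huggingface-cli login normally writes its token):

```python
import os
from typing import Optional

# Hedged sketch of how a tool can pick up HuggingFace credentials;
# this helper is illustrative, not pip-boy-llm's actual code.
def resolve_hf_token() -> Optional[str]:
    # An explicit environment variable wins ...
    token = os.environ.get("HF_TOKEN")
    if token:
        return token
    # ... otherwise fall back to the token cached by `huggingface-cli login`.
    cache = os.path.expanduser("~/.cache/huggingface/token")
    if os.path.isfile(cache):
        with open(cache) as fh:
            return fh.read().strip() or None
    return None
```

With a token in hand, huggingface_hub's login(token=...) authenticates the session for gated model downloads.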

License Agreements

Llama and Gemma models are gated on HuggingFace: open the model page while signed in, accept the license agreement, and wait until access is granted before the terminal can download the weights.

Commands

Command     Description
/help       Show available commands
/exit       Quit the terminal
/clear      Clear conversation history
/reset      Reset the AI model
/model      Show current model info
@filepath   Include file contents in message
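A slash-command loop like the one above can be dispatched with a small handler. This is a hedged sketch only (the function, return strings, and state shape are assumptions, not pip-boy-llm's implementation):

```python
# Hedged sketch of dispatching slash commands like those listed above;
# the handler and messages are illustrative, not the package's code.
def dispatch(line: str, history: list) -> str:
    cmd = line.strip().split()[0] if line.strip() else ""
    if cmd == "/help":
        return "Commands: /help /exit /clear /reset /model"
    if cmd == "/clear":
        history.clear()            # drop the conversation context
        return "History cleared."
    if cmd == "/exit":
        return "Goodbye."
    return "Unknown command."      # non-commands would go to the model
```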

File References

Include file contents in your messages using @:

> Explain this code: @main.py
> Summarize these files: @src/app.py @src/utils.py
> Review @"path with spaces/file.py"
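Parsing these references, including the quoted form for paths with spaces, comes down to a small regex. A minimal sketch (the pattern and function are illustrative assumptions, not the package's actual parser):

```python
import re

# Hedged sketch of extracting @file references, including @"quoted paths".
# Illustrative only; pip-boy-llm's real parser may differ.
FILE_REF = re.compile(r'@(?:"([^"]+)"|(\S+))')

def extract_file_refs(message: str) -> list:
    """Return referenced paths in order of appearance."""
    return [quoted or bare for quoted, bare in FILE_REF.findall(message)]
```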

Commands Reference

pip-boy-llm

Main terminal interface. Select a model and start chatting.

pip-boy-llm

pip-boy-setup

Setup wizard for first-time configuration:

  • Checks dependencies (PyTorch, Transformers, etc.)
  • Configures HuggingFace authentication
  • Verifies model access
  • Creates config directory

pip-boy-setup

pip-boy-update

Check for package updates:

pip-boy-update

Configuration

Config files are stored in ~/.airllm/:

  • config.yaml - User preferences
  • history.yaml - Chat history
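A minimal config.yaml might look like the following; the keys shown are assumptions for illustration, so treat the file generated by pip-boy-setup as the authoritative schema:

```yaml
# ~/.airllm/config.yaml -- illustrative keys only
model: gemma-3-1b        # last selected model
hf_token_saved: true     # whether a HuggingFace token is configured
history_limit: 50        # messages kept in history.yaml
```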

System Requirements

  • Python 3.9+
  • CUDA-capable GPU recommended (CPU mode available)
  • 4GB+ VRAM for 1B models
  • 8GB+ VRAM for Mistral 7B (4-bit)
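The VRAM thresholds above can be turned into a quick capacity check. A rough helper (the model names and mapping are shorthand for the requirements listed, not identifiers used by the package):

```python
# Approximate minimum VRAM needs (GB) from the requirements listed above;
# the names are shorthand, not pip-boy-llm model identifiers.
MODEL_VRAM_GB = {
    "Gemma 3 1B": 4,
    "Llama 3.2 1B": 4,
    "Mistral 7B (4-bit)": 8,
}

def models_that_fit(vram_gb: float) -> list:
    """Models whose minimum VRAM requirement is within the given budget."""
    return sorted(m for m, need in MODEL_VRAM_GB.items() if vram_gb >= need)
```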

License

MIT License

Download files

Download the file for your platform.

Source Distribution

pip_boy_llm-1.0.6.tar.gz (23.7 kB)

Built Distribution

pip_boy_llm-1.0.6-py3-none-any.whl (26.1 kB)

File details

Details for the file pip_boy_llm-1.0.6.tar.gz.

File metadata

  • Download URL: pip_boy_llm-1.0.6.tar.gz
  • Size: 23.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.4

File hashes

Hashes for pip_boy_llm-1.0.6.tar.gz
Algorithm Hash digest
SHA256 2ca2bdae6aa6c1144fb93bd4f892aa8270d58caccc29abdb09e80d31a4979866
MD5 fe4dc40f1793efd8a6b287abab74ae23
BLAKE2b-256 d63a30eb1b03fa470290bff3bea3c2c9b4c0fec668953552b73d5d06d6303d26

See more details on using hashes here.
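Checking a downloaded archive against the published digest is a one-liner with the standard library. A minimal sketch (the helper name is an assumption; the expected digest is the SHA256 published above for the sdist):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hex SHA256 of a file, read in chunks so large files stay cheap."""
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published digest, e.g. for the sdist:
EXPECTED = "2ca2bdae6aa6c1144fb93bd4f892aa8270d58caccc29abdb09e80d31a4979866"
# assert sha256_of("pip_boy_llm-1.0.6.tar.gz") == EXPECTED
```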

File details

Details for the file pip_boy_llm-1.0.6-py3-none-any.whl.

File metadata

  • Download URL: pip_boy_llm-1.0.6-py3-none-any.whl
  • Size: 26.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.4

File hashes

Hashes for pip_boy_llm-1.0.6-py3-none-any.whl
Algorithm Hash digest
SHA256 58f6a2df69d0f123ed1056956f011a1004ace49f7178c6633c59a1f0d7bf3da5
MD5 1452222ae3654def49b2bcf57652bf22
BLAKE2b-256 c31746da17be2f1772749786d174a69b6bd4c9a7911ebde6330e625d8ae65865

See more details on using hashes here.
