
PIP-BOY LLM

Vault-Tec Local AI Terminal - Run local LLMs with a retro Fallout PIP-BOY themed interface.

Installation

pip install pip-boy-llm

Optional Dependencies

# For 4-bit quantization (Mistral 7B)
pip install pip-boy-llm[quantization]

# For PDF file support
pip install pip-boy-llm[pdf]

# For Windows readline support
pip install pip-boy-llm[readline]

# Install all optional dependencies
pip install pip-boy-llm[all]
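
Each extra maps to an installable package; a quick way to check which optional features are present in the current environment is to probe for those packages. The mapping below is an assumption based on the features named above (only `bitsandbytes` is confirmed by this README; `pypdf` and `pyreadline3` are guesses at the PDF and Windows-readline backends):

```python
import importlib.util

# Assumed extra -> package mapping; only "bitsandbytes" is confirmed
# by the docs above, the other two are plausible guesses.
EXTRAS = {
    "quantization": "bitsandbytes",
    "pdf": "pypdf",
    "readline": "pyreadline3",
}

def available_extras():
    """Return {extra_name: True/False} for each optional feature,
    based on whether its backing package can be imported."""
    return {extra: importlib.util.find_spec(pkg) is not None
            for extra, pkg in EXTRAS.items()}

print(available_extras())
```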

Quick Start

  1. Run the setup wizard (recommended for first-time users):

    pip-boy-setup
    
  2. Start the terminal:

    pip-boy-llm
    
  3. Select a model and start chatting!

Models

Model         Description          Requirements
Gemma 3 1B    Fast, lightweight    HuggingFace login
Llama 3.2 1B  Fast, good quality   HuggingFace login + license
Mistral 7B    Best quality, 4-bit  bitsandbytes (optional)

HuggingFace Login

Some models require HuggingFace authentication:

  1. Create an account at huggingface.co
  2. Generate a token at huggingface.co/settings/tokens
  3. Run pip-boy-setup and enter your token
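
For non-interactive use, tools in the Hugging Face ecosystem also honor the `HF_TOKEN` environment variable. The lookup helper below is hypothetical (pip-boy-setup's actual storage mechanism is not documented here); it only illustrates the usual precedence of an explicit config value over the environment:

```python
import os

def resolve_hf_token(config_token=None):
    """Hypothetical lookup order: explicit config value first,
    then the HF_TOKEN environment variable, else None."""
    return config_token or os.environ.get("HF_TOKEN")

os.environ["HF_TOKEN"] = "hf_example"  # placeholder, not a real token
print(resolve_hf_token())              # falls back to the env var
```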

License Agreements

Llama and Gemma models are gated on Hugging Face: while logged in, open each model's page on huggingface.co and accept its license agreement before downloading.

Commands

Command    Description
/help      Show available commands
/exit      Quit the terminal
/clear     Clear conversation history
/reset     Reset the AI model
/model     Show current model info
@filepath  Include file contents in message
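
A terminal like this typically routes slash commands before sending anything to the model. The dispatcher below is a minimal sketch, not pip-boy-llm's actual implementation (only three of the commands are shown; the rest are elided):

```python
def handle_input(line, history):
    """Sketch of slash-command dispatch; returns a status string for
    commands, or None when the line should go to the model as chat."""
    if line == "/clear":
        history.clear()
        return "History cleared"
    if line == "/exit":
        return "Goodbye"
    if line == "/help":
        return "/help /exit /clear /reset /model"
    return None  # not a command: treat as a chat message

history = ["hello"]
print(handle_input("/clear", history))  # history is now empty
```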

File References

Include file contents in your messages using @:

> Explain this code: @main.py
> Summarize these files: @src/app.py @src/utils.py
> Review @"path with spaces/file.py"

Commands Reference

pip-boy-llm

Main terminal interface. Select a model and start chatting.

pip-boy-llm

pip-boy-setup

Setup wizard for first-time configuration:

  • Checks dependencies (PyTorch, Transformers, etc.)
  • Configures HuggingFace authentication
  • Verifies model access
  • Creates config directory

pip-boy-setup

pip-boy-update

Check for package updates:

pip-boy-update
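
Presumably the update check compares the installed version against the latest release on PyPI. The network step is omitted here; this sketch shows only the version comparison, using plain tuples rather than any particular versioning library:

```python
def parse_version(v):
    """Parse a dotted release string like '1.0.3' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def update_available(installed, latest):
    return parse_version(latest) > parse_version(installed)

print(update_available("1.0.3", "1.1.0"))  # → True
```

Tuple comparison handles multi-digit components correctly ("1.10.0" > "1.9.0"), which naive string comparison would get wrong.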

Configuration

Config files are stored in ~/.airllm/:

  • config.yaml - User preferences
  • history.yaml - Chat history
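
The layout of config.yaml is not documented here; a plausible sketch (all keys hypothetical) might look like:

```yaml
# ~/.airllm/config.yaml — illustrative sketch; actual keys may differ
model: gemma-3-1b   # default model to load
device: cuda        # or "cpu"
```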

System Requirements

  • Python 3.9+
  • CUDA-capable GPU recommended (CPU mode available)
  • 4GB+ VRAM for 1B models
  • 8GB+ VRAM for Mistral 7B (4-bit)
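
The VRAM figures above follow from simple arithmetic on the model weights, plus headroom for activations and the KV cache. A rough back-of-envelope estimate (weights only):

```python
def approx_weight_vram_gb(params_billion, bits_per_param):
    """Rough VRAM for model weights alone; excludes activations,
    KV cache, and framework overhead, which is why the recommended
    totals above are higher."""
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return bytes_total / 1024**3

print(round(approx_weight_vram_gb(1, 16), 2))  # 1B model in fp16
print(round(approx_weight_vram_gb(7, 4), 2))   # Mistral 7B in 4-bit
```

A 1B model in fp16 needs about 1.9 GB for weights and Mistral 7B in 4-bit about 3.3 GB, so the 4 GB and 8 GB recommendations leave working headroom.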

License

MIT License

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pip_boy_llm-1.0.3.tar.gz (23.2 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

pip_boy_llm-1.0.3-py3-none-any.whl (25.7 kB)

Uploaded Python 3

File details

Details for the file pip_boy_llm-1.0.3.tar.gz.

File metadata

  • Download URL: pip_boy_llm-1.0.3.tar.gz
  • Upload date:
  • Size: 23.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.4

File hashes

Hashes for pip_boy_llm-1.0.3.tar.gz
Algorithm Hash digest
SHA256 34da120d057b6e616b21d1baa90976ee2f6f1c57cba4c8184db26df60ad7dccb
MD5 4445713e4c7acd280f6ddf4cd563223c
BLAKE2b-256 ee9b911e7c847aae1e28ddd04aa6038c79efc6f139c2538dc08ea3393c60c515
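
To verify a downloaded archive against the SHA256 digest above, the standard library's hashlib is sufficient; streaming in chunks keeps memory flat for large files:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """SHA-256 digest of in-memory bytes, as a hex string."""
    return hashlib.sha256(data).hexdigest()

def sha256_of_file(path, chunk_size=8192):
    """Stream a file through SHA-256 so large downloads aren't
    loaded into RAM all at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare sha256_of_file("pip_boy_llm-1.0.3.tar.gz") against the
# SHA256 value listed in the table above.
```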


File details

Details for the file pip_boy_llm-1.0.3-py3-none-any.whl.

File metadata

  • Download URL: pip_boy_llm-1.0.3-py3-none-any.whl
  • Upload date:
  • Size: 25.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.4

File hashes

Hashes for pip_boy_llm-1.0.3-py3-none-any.whl
Algorithm Hash digest
SHA256 5e99249ae8db3143942eed4f29366a64aed396f964a91e1984efa9827e261f48
MD5 e8bdf62b6b729eb0800006f6b1e0256d
BLAKE2b-256 e12d078261554a32e019c15b982fe49dcbcd0b3fe20ddb516ce493ee7c0e46eb

