🦙 LLaMA: Open and Efficient Foundation Language Models on a Single GPU
🦙 LLaMA - Run LLM on a Single GPU

📢 pyllama is a hacked version of LLaMA based on the original Facebook implementation, but more convenient to run on a single consumer-grade GPU.
Setup
In a conda env with PyTorch / CUDA available, run:

```bash
pip install pyllama
```
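Before running inference, you can confirm that your environment has a CUDA-enabled PyTorch build:

```python
import torch

# Sanity check for the conda environment: both should succeed on a GPU machine.
print(torch.__version__)
print(torch.cuda.is_available())  # should print True if CUDA is usable
```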
Single GPU Inference
Set the environment variable CKPT_DIR to your LLaMA model folder, for example /llama_data/7B, and TOKENIZER_PATH to your tokenizer's path, such as /llama_data/tokenizer.model. Then run the following command:

```bash
python inference.py --ckpt_dir $CKPT_DIR --tokenizer_path $TOKENIZER_PATH
```
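Putting it together with the example paths above:

```bash
export CKPT_DIR=/llama_data/7B
export TOKENIZER_PATH=/llama_data/tokenizer.model
python inference.py --ckpt_dir $CKPT_DIR --tokenizer_path $TOKENIZER_PATH
```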
*Example: LLaMA running on a single 8 GB GPU.*
Tips

- To keep the KV cache in CPU memory instead of GPU memory, run `export KV_CAHCHE_IN_GPU=0` in the shell (see the sketch after these tips).
- To profile CPU/GPU utilization and latency, run:

```bash
python inference_driver.py --ckpt_dir $CKPT_DIR --tokenizer_path $TOKENIZER_PATH
```

The driver prints a profiling summary for the run.
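For intuition, here is a minimal sketch, not pyllama's actual implementation, of how an environment variable can steer where the key/value cache tensors are allocated; the shapes below are hypothetical:

```python
import os

import torch

# Illustrative only: pyllama's real cache logic lives in its model code.
# KV_CAHCHE_IN_GPU=0 keeps the cache on CPU; any other value keeps it on GPU.
kv_cache_in_gpu = os.environ.get("KV_CAHCHE_IN_GPU", "1") == "1"
device = torch.device("cuda" if kv_cache_in_gpu and torch.cuda.is_available() else "cpu")

# Pre-allocate key/value caches for one attention layer (hypothetical shapes).
max_batch, max_seq_len, n_heads, head_dim = 1, 1024, 32, 128
cache_k = torch.zeros(max_batch, max_seq_len, n_heads, head_dim, device=device)
cache_v = torch.zeros_like(cache_k)  # same shape and device as cache_k
print(f"KV cache allocated on: {device}")
```

Keeping the cache on CPU frees GPU memory at the cost of extra host-device transfers.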
Multiple GPU Inference
The provided example.py can be run on a single- or multi-GPU node with torchrun and will output completions for two pre-defined prompts. Using TARGET_FOLDER as defined in download.sh:

```bash
torchrun --nproc_per_node MP example.py --ckpt_dir $TARGET_FOLDER/model_size --tokenizer_path $TARGET_FOLDER/tokenizer.model
```

Different models require different model-parallel (MP) values:
| Model | MP |
|---|---|
| 7B | 1 |
| 13B | 2 |
| 30B | 4 |
| 65B | 8 |
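For example, to run the 13B checkpoints across two GPUs, following the command pattern above:

```bash
MP=2  # from the table: the 13B model needs a model-parallel value of 2
torchrun --nproc_per_node $MP example.py \
  --ckpt_dir $TARGET_FOLDER/13B \
  --tokenizer_path $TARGET_FOLDER/tokenizer.model
```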
Download
To download the checkpoints and tokenizer, fill out this Google form. Once your request is approved, you will receive links to download the tokenizer and model files.

Edit the download.sh script with the signed URL provided in the email to download the model weights and tokenizer.
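Assuming the standard download.sh workflow from the approval email, the steps look like this:

```bash
# 1. Paste the presigned URL from the approval email into download.sh
# 2. Run the script to fetch the model weights and tokenizer
bash download.sh
```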
Model Card
See MODEL_CARD.md
License
See the LICENSE file.