Microlib for converting LLM weights into sepweight format
Project description
llm_sepweight
The llm_sepweight microlib is designed to manage the weights of large language models (LLMs) by organizing them into a unified format called sepweight.
Every LLM has roughly the same three parts:
- begin - the part of the model which computes the embeddings before the layers
- mid - a number of (most commonly transformer) layers
- end - the part of the model which converts the hidden state into a prediction for the next token
sepweight essentially mirrors the state dict of the LLM into the filesystem, meaning that you will (roughly) have one file for each of these components of the LLM.
This format enables the distributed execution of LLMs by separating the model weights into distinct segments that can be individually managed and accessed as needed.
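As a purely hypothetical illustration of that mirroring, a Hugging Face-style state dict might be split along these lines (the key names below are made-up examples, not something sepweight prescribes):
model.embed_tokens.weight               -> begin
model.layers.0.self_attn.q_proj.weight  -> mid, layer 0
model.layers.17.mlp.up_proj.weight      -> mid, layer 17
model.norm.weight, lm_head.weight       -> end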
This microlib provides methods for:
- dump-ing from different formats to sepweight
- load-ing the state dict needed to run a part of the LLM
The only dependency is torch.
Microlib docs are available at https://microlib.org/llm_sepweight.html
Installation
pip install llm_sepweight
Quick Example
To convert an existing state dict into sepweight, you need to provide:
- decider - a function which will be called for each key in the state dict; it has to decide whether that key should be part of the begin, mid, or end section, and what the new name of the key should be (a sketch of such a decider follows the example below)
- state_dict - just your usual PyTorch state dict
- out_path - the directory in which you want the result to be stored
import llm_sepweight
llm_sepweight.dump(
    decider=decider,
    state_dict=state_dict,
    out_path=out_path,
)
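For illustration, here is a minimal sketch of what a decider could look like for a hypothetical Hugging Face-style state dict. The key prefixes and the exact return value expected by dump are assumptions made for this sketch, not the library's documented contract:
def decider(key):
    # Hypothetical HF-style key names; adapt to your model's state dict.
    if key.startswith("model.embed_tokens."):
        # Embedding weights go to the "begin" section.
        return ["begin", key.removeprefix("model.")]
    if key.startswith("model.layers."):
        # e.g. "model.layers.12.self_attn.q_proj.weight" -> layer 12 of "mid".
        layer, new_key = key.removeprefix("model.layers.").split(".", 1)
        return ["mid", layer, new_key]
    # Final norm, lm_head, etc. go to the "end" section.
    return ["end", key]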
You could have multiple state dicts (for example, coming from multiple files); it's ok to call dump_to_directory with each of them. The result will be the combined state dict of all the state dicts provided for a given out_path.
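For example (a sketch with made-up shard file names), dumping a checkpoint that is split across several .pth files could look like the following, reusing the dump call, decider, and out_path from the quick example above:
import torch
import llm_sepweight

# Hypothetical shard files of a single checkpoint.
shards = ["pytorch_model-00001.pth", "pytorch_model-00002.pth"]
for shard in shards:
    state_dict = torch.load(shard, map_location="cpu")
    llm_sepweight.dump(
        decider=decider,
        state_dict=state_dict,
        out_path=out_path,  # same out_path for every shard
    )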
Goal format
llm_sepweight allows you to convert different formats to its own directory format, which is very simple.
Let's have a look at an example:
├── begin.pth
├── end.pth
├── mid.00000.pth
├── mid.00001.pth
├── mid.00002.pth
├── mid.00003.pth
├── mid.00004.pth
└── mid.00005.pth
All the weights are stored in a directory as ordinary .pth files.
This format is very simple and allows great flexibility. For example, a node running layers 0 to 2 would only need to download the begin, mid.00000, mid.00001, and mid.00002 files.
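As a rough sketch of what such a node could do with plain PyTorch (llm_sepweight also provides load helpers for this; the directory name is made up and each file is assumed to hold the sub-state-dict for its part):
import torch
from pathlib import Path

weights_dir = Path("out_path")  # directory produced by the dump above
# Fetch and load only the files this node actually serves.
needed = ["begin.pth", "mid.00000.pth", "mid.00001.pth", "mid.00002.pth"]
partial_weights = {
    name: torch.load(weights_dir / name, map_location="cpu")
    for name in needed
}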
Why do we need it?
There are all sorts of different formats for storing the weights of an LLM - .pth files, safetensors, H5, arrow, GGUF, etc.
Moreover, the naming of the transformer layers, of the start embedding, and of the final head differs a lot from model to model.
llm_sepweight aims to provide functions through which you can convert different formats into the sepweight format.
The sepweight format is a unified, simple format that allows you to treat the weights of all LLMs in the same way when running nodes in a distributed way.
File details
Details for the file llm_sepweight-0.10.0.tar.gz.
File metadata
- Download URL: llm_sepweight-0.10.0.tar.gz
- Upload date:
- Size: 9.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | c80e97eee5bcce922858d3124cc87d92eb2b40ec67d2f1292dd579b09daaa02e
MD5 | 43ddfa7898b46c5540a86c33d02302f4
BLAKE2b-256 | 19ab738fda6830e0a7d03a52a41eb1916bfd00719ecff3e704d110488a6f925e
File details
Details for the file llm_sepweight-0.10.0-py3-none-any.whl.
File metadata
- Download URL: llm_sepweight-0.10.0-py3-none-any.whl
- Upload date:
- Size: 9.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | 41b81a5a03506929ff9ad50286f01556a122ae35c6fa28f0131c01758dbf1db0
MD5 | 8bee69f00f416586a408f3900358323a
BLAKE2b-256 | f3ee51215aa36f50420c17490dc1666c437b44d05a02c647b4938dc9f5eec266