Run large language models in a heterogeneous decentralized environment with offloading.
The rapid rise of generative AI has boosted demand for large language model (LLM) inference and fine-tuning services. While proprietary models are still favored, advances in open-source LLMs have made them competitive. However, high costs and limited GPU resources hinder their deployment. This work introduces BloomBee, a decentralized offline serving system that leverages idle GPU resources to provide cost-effective access to LLMs.
BloomBee relies on global GPU sharing, which includes many consumer-grade GPUs. If your GPU can hold only a small portion of a large language model, such as Llama 3.1 (405B), you can join a network of servers that each load a different part of the model, and request inference or fine-tuning services from that network.
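To make the sharding idea concrete, here is a minimal sketch of how a model's transformer layers could be split across heterogeneous workers. This is not BloomBee's actual scheduler; the per-worker capacities are invented for illustration (Llama 3.1 405B has 126 transformer layers):

```python
# Sketch: hand out contiguous ranges of transformer layers to workers,
# capped by how many layers each worker's GPU can hold.

def partition_layers(total_layers, capacities):
    """Greedily assign contiguous layer ranges, one per worker."""
    assignments = []
    start = 0
    for cap in capacities:
        count = min(cap, total_layers - start)
        assignments.append((start, start + count))
        start += count
    if start < total_layers:
        raise ValueError(f"{total_layers - start} layers left unassigned")
    return assignments

# Three consumer GPUs, none of which can hold the whole 126-layer model:
print(partition_layers(126, [48, 48, 30]))
# [(0, 48), (48, 96), (96, 126)]
```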
Installation
From PyPI

```shell
pip install bloombee
```
From Source

```shell
git clone https://github.com/yottalabsai/BloomBee.git
cd BloomBee
pip install .
```
How to use BloomBee (Try now in Colab)
1. Start the main server
```shell
python -m bloombee.cli.run_dht --host_maddrs /ip4/0.0.0.0/tcp/31340 --identity_path bootstrap1.id
```
The main server will print its location:
```
Mon 00 01:23:45.678 [INFO] Running a DHT instance. To connect other peers to this one, use --initial_peers /ip4/YOUR_IP_ADDRESS/tcp/31340/p2p/QmefxzDL1DaJ7TcrZjLuz7Xs9sUVKpufyg7f5276ZHFjbQ
```
You can provide this address as --initial_peers to workers or other backbone servers.
If you want your swarm to be accessible outside of your local network, ensure that you have a public IP address or set up port forwarding correctly, so that your peer is reachable from the outside.
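The address passed to --initial_peers is a libp2p multiaddr. As a quick illustration of its structure, here is a sketch that pulls the IP, port, and peer ID out of one with plain string splitting (not the libp2p library):

```python
# Split a multiaddr of the form /ip4/<ip>/tcp/<port>/p2p/<peer_id>
# into its three components.

def parse_multiaddr(addr):
    parts = addr.strip("/").split("/")
    fields = dict(zip(parts[::2], parts[1::2]))
    return fields["ip4"], int(fields["tcp"]), fields["p2p"]

ip, port, peer_id = parse_multiaddr(
    "/ip4/10.52.2.249/tcp/31340/p2p/QmefxzDL1DaJ7TcrZjLuz7Xs9sUVKpufyg7f5276ZHFjbQ"
)
print(ip, port)  # 10.52.2.249 31340
```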
2. Connect workers to the main BloomBee server
Set the main server's location (use the address printed in step 1):

```shell
export BBSERVER=/ip4/10.52.2.249/tcp/31340/p2p/QmefxzDL1DaJ7TcrZjLuz7Xs9sUVKpufyg7f5276ZHFjbQ
```
Start the first worker, holding 16 blocks (16 transformer layers):

```shell
python -m bloombee.cli.run_server huggyllama/llama-7b --initial_peers $BBSERVER --num_blocks 16 --identity_path bootstrap_1.id
```
Start a second worker, holding the other 16 blocks (16 transformer layers):

```shell
python -m bloombee.cli.run_server huggyllama/llama-7b --initial_peers $BBSERVER --num_blocks 16 --identity_path bootstrap_1.id
```
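huggyllama/llama-7b has 32 transformer blocks, so the two workers above (16 blocks each) cover the full model. A quick sketch of that arithmetic, assuming contiguous block ranges are assigned in order:

```python
# Two workers with --num_blocks 16 each, for the 32-block llama-7b model.
TOTAL_BLOCKS = 32        # transformer layers in huggyllama/llama-7b
BLOCKS_PER_WORKER = 16   # --num_blocks passed to each worker

num_workers = TOTAL_BLOCKS // BLOCKS_PER_WORKER
ranges = [(i * BLOCKS_PER_WORKER, (i + 1) * BLOCKS_PER_WORKER)
          for i in range(num_workers)]
print(ranges)  # [(0, 16), (16, 32)]
```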
3. Run inference or fine-tuning jobs
Inference
```shell
cd BloomBee/
python benchmarks/benchmark_inference.py --model huggyllama/llama-7b --initial_peers $BBSERVER --torch_dtype float32 --seq_len 128
```
Fine-tune
```shell
cd BloomBee/
python benchmarks/benchmark_training.py --model huggyllama/llama-7b --initial_peers $BBSERVER --torch_dtype float32 --n_steps 20 --batch_size 32 --seq_len 128
```
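For reference, the fine-tuning benchmark's workload size follows directly from its flags. A back-of-the-envelope sketch (not output from the script itself):

```python
# Tokens processed by benchmark_training.py with the flags above.
batch_size, seq_len, n_steps = 32, 128, 20

tokens_per_step = batch_size * seq_len    # 32 * 128 = 4096 tokens per step
total_tokens = tokens_per_step * n_steps  # 4096 * 20 = 81920 tokens overall
print(tokens_per_step, total_tokens)  # 4096 81920
```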
Acknowledgements
BloomBee is built upon several popular libraries:
- Hivemind - A PyTorch library for decentralized deep learning across the Internet.
- FlexLLMGen - An offloading-based system running on weak GPUs.
- Petals - A library for decentralized LLM fine-tuning and inference without offloading.
File details
Details for the file bloombee-0.1.3.tar.gz.
File metadata
- Download URL: bloombee-0.1.3.tar.gz
- Upload date:
- Size: 114.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.4.2 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.8.0 tqdm/4.30.0 CPython/3.8.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | a138fc98911d10a42432c40a7e34a710815e086e7826cf2f3b0b6e6ea0c7c05f |
| MD5 | 3a7eee2d440fadedf8811beb478dab9b |
| BLAKE2b-256 | b1b2c7b279b7d61f7a5891dd7213347171221cbd3386e416b1fd239fc73ec002 |
File details
Details for the file bloombee-0.1.3-py3-none-any.whl.
File metadata
- Download URL: bloombee-0.1.3-py3-none-any.whl
- Upload date:
- Size: 126.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.4.2 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.8.0 tqdm/4.30.0 CPython/3.8.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 49181a6ff2cea07ffb11a4958e9bc501c405ba8d3eb30ebd70af27b04d8788c8 |
| MD5 | 152a9129c11af5bfb5964730fc892897 |
| BLAKE2b-256 | ef183fe30d6f34b1e34502d89d6e30fdb4d77f196086cd154933815b681743a0 |