# Naifu
naifu-diffusion (or naifu) is designed for training generative models with various configurations and features. The code in the main branch of this repository is under development and subject to change as new features are added.
Other branches in the repository include:

- `sgm` - Uses the sgm library to train SDXL models.
- `main-archived` - Contains the original naifu-diffusion code for training Stable Diffusion 1.x models.
## Installation
To install the necessary dependencies, run:

```bash
git clone https://github.com/mikubill/naifu-diffusion
cd naifu-diffusion
pip install -r requirements.txt
```
## Usage
You can train an image model by running the `trainer.py` script with the appropriate configuration file. The config path can be passed either via the `--config` flag or positionally:

```bash
python trainer.py --config config/<config_file>
# or, equivalently
python trainer.py config/<config_file>
```
Replace `<config_file>` with one of the available configuration files listed below.
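Accepting the config path both positionally and as a flag is a common CLI pattern. A minimal sketch of how such dual parsing is typically done with `argparse` (illustrative only, not naifu's actual parser):

```python
import argparse

# Accept the config path either positionally or via --config;
# the flag wins if both are given.
parser = argparse.ArgumentParser()
parser.add_argument("config_pos", nargs="?", help="config file (positional)")
parser.add_argument("--config", help="config file (flag)")

def resolve_config(argv):
    args = parser.parse_args(argv)
    return args.config or args.config_pos

print(resolve_config(["config/train.yaml"]))              # config/train.yaml
print(resolve_config(["--config", "config/train.yaml"]))  # config/train.yaml
```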
## Available Configurations
Choose the configuration file that matches your training objective and environment.
### Train an SDXL (Stable Diffusion XL) model

```bash
# stabilityai/stable-diffusion-xl-base-1.0
python trainer.py config/train.yaml
```
### Train an SDXL refiner model

```bash
# stabilityai/stable-diffusion-xl-refiner-1.0
python trainer.py config/train_refiner.yaml
```
### Train an original Stable Diffusion 1.4 or 1.5 model

```bash
# runwayml/stable-diffusion-v1-5
# Note: will save in diffusers format
python trainer.py config/train_sd15.yaml
```
### Train an SDXL model with the diffusers backbone

```bash
# stabilityai/stable-diffusion-xl-base-1.0
# Note: will save in diffusers format
python trainer.py config/train_diffusers.yaml
```
### Train an SDXL model with LyCORIS

```bash
# Based on the work available at KohakuBlueleaf/LyCORIS
pip install lycoris_lora toml
python trainer.py config/train_lycoris.yaml
```
### Use the fairscale strategy for sharded distributed data parallel training

```bash
pip install fairscale
python trainer.py config/train_fairscale.yaml
```
### Train an SDXL model with Diffusion-DPO

Paper: Diffusion Model Alignment Using Direct Preference Optimization (arXiv:2311.12908)

```bash
# dataset: yuvalkirstain/pickapic_v2
# Be careful when tuning the resolution and dpo_betas!
# Note: will save in diffusers format
python trainer.py config/train_dpo_hfdataset.yaml
```
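Diffusion-DPO trains on pairs of preferred/rejected images: the model is pushed to fit the preferred image better than a frozen reference model does, relative to the rejected one. A toy numpy sketch of the loss shape (illustrative only; the real objective operates on noise-prediction errors inside the diffusion loop, and the `beta` default here is not naifu's `dpo_betas`):

```python
import numpy as np

def logsigmoid(x):
    # Numerically stable log(sigmoid(x)).
    return -np.logaddexp(0.0, -x)

def diffusion_dpo_loss(err_w, err_ref_w, err_l, err_ref_l, beta=1.0):
    """Toy Diffusion-DPO objective over per-sample denoising MSEs.

    err_w / err_l:         trained model's noise-prediction MSE on the
                           preferred (w) and rejected (l) image at a timestep.
    err_ref_w / err_ref_l: the same MSEs under the frozen reference model.
    """
    # Positive margin = model fits the rejected sample better than the
    # preferred one (relative to the reference), which is penalized.
    margin = (err_w - err_ref_w) - (err_l - err_ref_l)
    return float(np.mean(-logsigmoid(-beta * margin)))

# Model fits the preferred image better than the reference -> lower loss.
print(diffusion_dpo_loss(np.array([0.1]), np.array([0.2]),
                         np.array([0.3]), np.array([0.2])))
```

Larger `beta` sharpens the preference margin, which is why the resolution and `dpo_betas` settings interact and need careful tuning.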
### Train a PixArt-Alpha model

Paper: Fast Training of Diffusion Transformer for Photorealistic Text-to-Image Synthesis (arXiv:2310.00426)

```bash
# PixArt-alpha/PixArt-XL-2-1024-MS
python trainer.py config/train_pixart.yaml
```
### Train an SDXL-LCM model

Paper: Latent Consistency Models: Synthesizing High-Resolution Images with Few-Step Inference (arXiv:2310.04378)

```bash
# stabilityai/stable-diffusion-xl-base-1.0
python trainer.py config/train_lcm.yaml
```
## Preparing Datasets
Each configuration file may have different dataset requirements; check the specific configuration file for any dataset specifications.

You can use your dataset directly for training: simply point the configuration file to its location. To reduce VRAM usage during training, you can pre-encode your dataset into latents with the `encode_latents.py` script:

```bash
# prepare images in <input_path>, then run:
python encode_latents.py -i <input_path> -o <encoded_path>
```
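Pre-encoding helps because the SD/SDXL VAE downsamples images 8x spatially into 4 latent channels, so cached latents are far smaller than raw pixels and the VAE encoder no longer has to run (or sit in VRAM) during training. A rough back-of-the-envelope sketch (shapes only, no actual VAE):

```python
import numpy as np

def latent_shape(h, w, channels=4, factor=8):
    # Shape of the cached latent for an h x w RGB image:
    # the VAE downsamples 8x spatially into 4 latent channels.
    return (channels, h // factor, w // factor)

img = (3, 1024, 1024)            # raw RGB pixel tensor fed to the VAE
lat = latent_shape(1024, 1024)   # latent tensor stored on disk instead

ratio = np.prod(img) / np.prod(lat)
print(lat, f"-> {ratio:.0f}x fewer values than raw pixels")
# (4, 128, 128) -> 48x fewer values than raw pixels
```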
## File details: naifu-0.1.3.tar.gz

File metadata:

- Download URL: naifu-0.1.3.tar.gz
- Upload date:
- Size: 74.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.10.12

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 3240a31aa1aef44f7da48e9568316d21101c870eb0a8f0c4456c8b29e2f2cf2c |
| MD5 | 92b4ad93577a8de00a65303334445d57 |
| BLAKE2b-256 | 69c0d598ff6ff7abde769e52863f4db48c05316b68dde57da59493dd690c6547 |
## File details: naifu-0.1.3-py3-none-any.whl

File metadata:

- Download URL: naifu-0.1.3-py3-none-any.whl
- Upload date:
- Size: 91.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.10.12

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | fe8cf12d7076faa7ae10fc73a053c41b617f923e29d54171b38d9171c8231eb1 |
| MD5 | 26aed6a68df9fcfb7db5547c707666d1 |
| BLAKE2b-256 | f1d7848386dc00def2745cd45295c33e1d55320394a448627dcbd656dea883a1 |