
Project description

Mini-Lightning

Introduction

  1. Mini-Lightning is a lightweight machine learning training library: a mini version of PyTorch Lightning in roughly 1k lines of code. It is faster, more concise, and more flexible.
  2. Existing features: support for DDP (multi-node and multi-GPU), Sync-BN, DP, MP (model parallelism), AMP, gradient accumulation, warmup and lr_scheduler, gradient clipping, TensorBoard, Hugging Face, PEFT, LLMs, torchmetrics, model and result saving, a clean console log, and more. (A plain-PyTorch sketch of a few of these features follows this list.)
  3. Only a minimal set of interfaces is exposed, keeping the library simple and easy to read, use, and extend.
  4. Examples can be found in examples/.
  5. If you encounter any problems or find a bug, please open an issue. Thank you.
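
To make the feature list concrete, below is a plain-PyTorch sketch of three of the features the library automates: AMP, gradient accumulation, and gradient clipping. This illustrates the underlying mechanics only; it is not Mini-Lightning's API, and the model, data, and hyperparameters are placeholders.

# Plain-PyTorch sketch of AMP + gradient accumulation + gradient clipping.
# Requires a CUDA GPU; all names below are illustrative placeholders.
import torch
from torch import nn

model = nn.Linear(10, 2).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler()
loss_fn = nn.CrossEntropyLoss()
n_accumulate = 4  # accumulate gradients over 4 mini-batches

for step in range(100):
    x = torch.randn(8, 10, device="cuda")
    y = torch.randint(0, 2, (8,), device="cuda")
    with torch.cuda.amp.autocast():  # mixed-precision forward pass
        loss = loss_fn(model(x), y) / n_accumulate
    scaler.scale(loss).backward()  # scaled backward to avoid fp16 underflow
    if (step + 1) % n_accumulate == 0:
        scaler.unscale_(optimizer)  # unscale before clipping
        nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
        scaler.step(optimizer)
        scaler.update()
        optimizer.zero_grad()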

Installation

  1. Create a virtual environment with Python >= 3.8.
  2. Install PyTorch >= 1.12 (with the CUDA version matching your system) from the official PyTorch website.
  3. Install mini-lightning:
# from pypi
pip install mini-lightning -U

# Or clone the repository, change to the directory containing setup.py,
# and run the following command.
# (Recommended) This gives you the latest features and bug fixes.
pip install -e .  # -e: editable mode
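
After installing, you can sanity-check the install with the standard library alone; this relies only on importlib.metadata, not on any Mini-Lightning attribute:

# Verify the installation and print the installed version.
from importlib.metadata import version

import mini_lightning  # raises ImportError if the install is broken
print(version("mini-lightning"))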

Examples

  1. First, install Mini-Lightning (see above).
  2. Run the following examples:
### test environment
python examples/test_env.py

### cv
pip install "torchvision>=0.13"
python examples/cv.py
# cv + DP (not recommended; prefer DDP. A short DP sketch follows this block.)
python examples/cv.py  # set device_ids=[0, 1]
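
For reference, DP in plain PyTorch is a one-line wrap, sketched below with a placeholder model; this is not a Mini-Lightning call. DP runs a single process that replicates the model on every forward pass and gathers outputs on the first device, which is why DDP is generally faster.

# Plain-PyTorch DP sketch: one process drives both GPUs (requires 2 GPUs).
import torch
from torch import nn

model = nn.Linear(10, 2).cuda()
dp_model = nn.DataParallel(model, device_ids=[0, 1])  # outputs gathered on cuda:0
out = dp_model(torch.randn(8, 10, device="cuda"))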

### nlp: bert gpt
pip install "transformers>=4.25" "datasets>=2.7" "peft>=0.3"
python examples/nlp_bert_mlm.py
python examples/nlp_bert_seq_cls.py
python examples/nlp_gpt_lm.py
python examples/nlp_gpt_seq_cls.py
# sft (supervised fine-tuning; a minimal PEFT/LoRA sketch follows this block)
python examples/nlp_gpt_zh_sft_adapter.py
python examples/nlp_gpt_zh_sft_lora.py
# llm (model parallelism)
#   Ref: https://modelscope.cn/models/baichuan-inc/baichuan-7B/summary
python examples/nlp_baichuan_sft_lora.py
#   Ref: https://modelscope.cn/models/ZhipuAI/chatglm2-6b/summary
python examples/nlp_chatglm2_sft_lora.py
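
The *_lora.py examples rely on PEFT. Below is a minimal sketch of attaching LoRA adapters to a Hugging Face model; gpt2 is a stand-in, and the LoRA hyperparameters are placeholders that may differ from what the examples actually use.

# Minimal PEFT/LoRA sketch; model name and hyperparameters are placeholders.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("gpt2")
config = LoraConfig(r=8, lora_alpha=16, target_modules=["c_attn"], lora_dropout=0.05)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # only the LoRA adapter weights are trainable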

### dqn
pip install "gym>=0.26.2" "pygame>=2.1.2"
python examples/dqn.py

### gan
pip install "torchvision>=0.13"
python examples/gan.py

### contrastive learning
pip install "torchvision>=0.13" "scikit-learn>=1.2"
python examples/cl.py
# cl+ddp
torchrun --nproc_per_node 2 examples/cl_ddp.py --device_ids 0 1

### gnn
# install torch_geometric first
#   Ref: https://pytorch-geometric.readthedocs.io/en/latest/notes/installation.html
python examples/gnn_node.py
python examples/gnn_edge.py
python examples/gnn_graph.py

### ae
pip install "torchvision>=0.13" "scikit-learn>=1.2"
python examples/ae.py

### vae
pip install "torchvision>=0.13"
python examples/vae.py

### meta learning
pip install "torchvision>=0.13"
python examples/meta_learning.py


########## ddp
# torchrun (Recommended)
#   Ref: https://pytorch.org/docs/stable/elastic/run.html
# spawn
#   Ref: https://pytorch.org/docs/stable/notes/ddp.html
## single-node, multi-gpu
torchrun --nproc_per_node 2 examples/cv_ddp.py --device_ids 0 1
python examples/cv_ddp_spawn.py  # set device_ids=[0, 1]; see the spawn sketch after this block

## multi-node
# default: --master_port 29500; set --master_port explicitly to prevent port conflicts.
torchrun --nnodes 2 --node_rank 0 --master_addr 127.0.0.1 --nproc_per_node 4 examples/cv_ddp.py --device_ids 0 1 2 3
torchrun --nnodes 2 --node_rank 1 --master_addr xxx.xxx.xxx.xxx --nproc_per_node 4 examples/cv_ddp.py --device_ids 0 1 2 3
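
For reference, the core of a spawn-style DDP script in plain PyTorch looks like the sketch below; it is not Mini-Lightning's code, and the model and port are placeholders.

# Plain-PyTorch DDP-spawn sketch: one process per GPU on a single node.
import os

import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch import nn
from torch.nn.parallel import DistributedDataParallel as DDP

def worker(rank: int, world_size: int) -> None:
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"  # change this on port conflicts
    dist.init_process_group("nccl", rank=rank, world_size=world_size)
    torch.cuda.set_device(rank)
    model = DDP(nn.Linear(10, 2).cuda(rank), device_ids=[rank])
    # ... training loop here: gradients are all-reduced across ranks ...
    dist.destroy_process_group()

if __name__ == "__main__":
    world_size = 2  # i.e. device_ids=[0, 1]
    mp.spawn(worker, args=(world_size,), nprocs=world_size)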

TODO

  1. Automatic hyperparameter tuning
  2. Examples: audio, meta-learning, diffusion, autoregressive models, reinforcement learning
  3. Support for multi-GPU testing
  4. Output a .log file

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mini-lightning-0.2.2.tar.gz (24.9 kB)


File details

Details for the file mini-lightning-0.2.2.tar.gz.

File metadata

  • Download URL: mini-lightning-0.2.2.tar.gz
  • Size: 24.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.12

File hashes

Hashes for mini-lightning-0.2.2.tar.gz:

  • SHA256: bc23a354610a81f719b7a6a1e7afd8a9e15e4d72760625be2f36459f077bb88c
  • MD5: 0fe9ac22f71cd2bbd453e6b9c16c595e
  • BLAKE2b-256: 8264c6b96a859841111902a220b71a5117dfccc5cded79e4798b9d1da261d4ed

See the PyPI documentation for more details on using hashes.
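
As a quick illustration, a downloaded file can be checked against the SHA256 digest above using only the standard library; the file path is a placeholder for wherever you saved the sdist:

# Verify the sdist against the SHA256 digest listed above.
import hashlib

expected = "bc23a354610a81f719b7a6a1e7afd8a9e15e4d72760625be2f36459f077bb88c"
with open("mini-lightning-0.2.2.tar.gz", "rb") as f:  # placeholder path
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "MISMATCH")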
