
Mixup Toolbox and Benchmark for Supervised, Semi- and Self-Supervised Learning


OpenMixup


📘Documentation | 🛠️Installation | 🚀Model Zoo | 👀Awesome Mixup | 🔍Awesome MIM | 🆕News

Introduction

The main branch works with PyTorch 1.8 (required by some self-supervised methods) or higher (we recommend PyTorch 1.12). You can still use PyTorch 1.6 for supervised classification methods.

OpenMixup is an open-source toolbox based on PyTorch for supervised, self-, and semi-supervised visual representation learning, with a particular focus on mixup-related methods. OpenMixup is currently being updated to adopt the new features and code structure of OpenMMLab 2.0 (#42).

Major Features
  • Modular Design. OpenMixup follows a code architecture similar to OpenMMLab projects, decomposing the framework into components so that users can easily build customized models by combining different modules (see the config sketch after this list). OpenMixup components can also be ported to OpenMMLab projects (e.g., MMPreTrain).

  • All in One. OpenMixup provides popular backbones, mixup methods, semi-supervised, and self-supervised algorithms. Users can perform image classification (CNN & Transformer) and self-supervised pre-training (contrastive and autoregressive) under the same framework.

  • Standard Benchmarks. OpenMixup supports standard benchmarks of image classification, mixup classification, self-supervised evaluation, and provides smooth evaluation on downstream tasks with open-source projects (e.g., object detection and segmentation on Detectron2 and MMSegmentation).

  • State-of-the-art Methods. OpenMixup maintains awesome lists of popular mixup and self-supervised methods, and is being updated to support more state-of-the-art image classification and self-supervised methods.
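
As an illustration of the modular design mentioned above, an OpenMMLab-style config assembles a model from interchangeable modules. The sketch below is schematic: the type names and fields follow the general OpenMMLab convention and are not guaranteed to match OpenMixup's exact schema.

# Schematic OpenMMLab-style config: a model is built from swappable modules.
model = dict(
    type='MixUpClassification',                  # hypothetical algorithm wrapper
    backbone=dict(type='ResNet', depth=50),      # swappable backbone module
    head=dict(type='ClsHead', num_classes=1000), # swappable classification head
)
# Swapping the backbone is a one-line change, e.g.:
# backbone=dict(type='VisionTransformer', arch='base')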

Table of Contents
  1. Introduction
  2. News and Updates
  3. Installation
  4. Getting Started
  5. Overview of Model Zoo
  6. Change Log
  7. License
  8. Acknowledgement
  9. Citation
  10. Contributors and Contact

News and Updates

[2023-12-23] OpenMixup v0.2.9 is released, with updated features for mixup augmentations, self-supervised learning, and optimizers.

Installation

OpenMixup is compatible with Python 3.6/3.7/3.8/3.9 and PyTorch >= 1.6. Here are quick installation steps for development:

# Create and activate a conda environment with PyTorch 1.12 and CUDA 11.3
conda create -n openmixup python=3.8 pytorch=1.12 cudatoolkit=11.3 torchvision -c pytorch -y
conda activate openmixup
# Install mmcv-full via OpenMIM, which selects a build matching your PyTorch/CUDA setup
pip install openmim
mim install mmcv-full
# Clone the repository and install OpenMixup in development (editable) mode
git clone https://github.com/Westlake-AI/openmixup.git
cd openmixup
python setup.py develop

Please refer to install.md for more detailed installation and dataset preparation.
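
To sanity-check the installation, you can verify the environment from Python. The torch calls below are standard PyTorch API; the openmixup.__version__ attribute is assumed here, following the usual packaging convention:

import torch
import openmixup  # should import without error if the installation succeeded

print(torch.__version__, torch.cuda.is_available())  # PyTorch version and CUDA availability
print(openmixup.__version__)                         # assumed version attribute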

Getting Started

OpenMixup supports Linux and macOS. It enables easy implementation and extensions of mixup data augmentation methods in existing supervised, self-, and semi-supervised visual recognition models. Please see get_started.md for the basic usage of OpenMixup.
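
For background, the core idea of input mixup (Zhang et al., 2018), which OpenMixup generalizes, fits in a few lines of PyTorch. This is a generic sketch for illustration, not OpenMixup's internal API:

import torch

def mixup_batch(x, y, alpha=1.0):
    # Sample a mixing ratio from Beta(alpha, alpha)
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    # Pair each sample with a random partner from the same batch
    index = torch.randperm(x.size(0))
    mixed_x = lam * x + (1.0 - lam) * x[index]
    # Train with the blended loss: lam * CE(out, y) + (1 - lam) * CE(out, y[index])
    return mixed_x, y, y[index], lam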

Training and Evaluation Scripts

Here we provide a script for starting quick end-to-end training with multiple GPUs and a specified CONFIG_FILE.

bash tools/dist_train.sh ${CONFIG_FILE} ${GPUS} [optional arguments]

For example, you can run the script below to train a ResNet-50 classifier on ImageNet with 4 GPUs (the config name encodes the setup: 4 GPUs × 64 images per GPU, a cosine learning-rate schedule, and 100 epochs):

CUDA_VISIBLE_DEVICES=0,1,2,3 PORT=29500 bash tools/dist_train.sh configs/classification/imagenet/resnet/resnet50_4xb64_cos_ep100.py 4

After training, you can test the trained models with the corresponding evaluation script:

bash tools/dist_test.sh ${CONFIG_FILE} ${GPUS} ${PATH_TO_MODEL} [optional arguments]
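
For example, to evaluate the ResNet-50 model trained above with 4 GPUs (the checkpoint path below is illustrative; adjust it to wherever your work_dirs checkpoints are saved):

bash tools/dist_test.sh configs/classification/imagenet/resnet/resnet50_4xb64_cos_ep100.py 4 work_dirs/resnet50_4xb64_cos_ep100/latest.pth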

Development

Please see the Tutorials for more development examples and technical details:

Downstream Tasks for Self-supervised Learning

Useful Tools

(back to top)

Overview of Model Zoo

Please run experiments or find results on each config page. Refer to Mixup Benchmarks for benchmarking results of mixup methods, and see Model Zoos Sup and Model Zoos SSL for a comprehensive collection of mainstream backbones and self-supervised algorithms. We also provide the paper lists Awesome Mixups and Awesome MIM for reference. Config files and links to models are available on the corresponding config pages; checkpoints and training logs are still being updated.

(back to top)

Change Log

Please refer to changelog.md for more details and release history.

License

This project is released under the Apache 2.0 license. See LICENSE for more information.

Acknowledgement

  • OpenMixup is an open-source project for mixup methods and visual representation learning created by researchers in CAIRI AI Lab. We encourage researchers interested in backbone architectures, mixup augmentations, and self-supervised learning methods to contribute to OpenMixup!
  • This project borrows the architecture design and part of the code from MMPreTrain and the official implementations of the supported algorithms.

(back to top)

Citation

If you find this project useful in your research, please consider starring OpenMixup or citing our tech report:

@article{li2022openmixup,
  title = {OpenMixup: A Comprehensive Mixup Benchmark for Visual Classification},
  author = {Siyuan Li and Zedong Wang and Zicheng Liu and Di Wu and Cheng Tan and Stan Z. Li},
  journal = {ArXiv},
  year = {2022},
  volume = {abs/2209.04851}
}

(back to top)

Contributors and Contact

For help, new features, or reporting bugs associated with OpenMixup, please open a GitHub issue or pull request with the tag "help wanted" or "enhancement". For now, the direct contributors include: Siyuan Li (@Lupin1998), Zedong Wang (@Jacky1128), and Zicheng Liu (@pone7). We thank all public contributors and contributors from MMPreTrain (MMSelfSup and MMClassification)!

This repo is currently maintained by the direct contributors listed above.

(back to top)

