
Object Classification Code Base

Project description

Language: 🇺🇸 🇨🇳

«ZCls» is a benchmark codebase for classification models.

Supported Recognizers:

Refer to the roadmap for details.

Table of Contents

  • Background
  • Usage
  • Maintainers
  • Thanks
  • Contributing
  • License

Background

Improving algorithm performance usually means improving an existing model, which inevitably involves code refactoring. This repo was created, on the one hand, to serve as the codebase for new models and optimization methods, and on the other hand, to record comparisons between the custom models and existing implementations (such as the Torchvision models).

Usage

Installation

$ pip install zcls

How to Use

  1. Add the dataset path to the config file, e.g. for CIFAR100:
  DATASET:
    NAME: 'CIFAR100'
    TRAIN_ROOT: './data/cifar'
    TEST_ROOT: './data/cifar'

Note: CIFAR10/CIFAR100/FashionMNIST/ImageNet are currently supported.

  2. Add the environment variable:
$ export PYTHONPATH=/path/to/ZCls
  3. Train:
$ CUDA_VISIBLE_DEVICES=0 python tool/train.py -cfg=configs/benchmarks/r50_cifar100_224_e100_rmsprop.yaml

After training, the corresponding checkpoint can be found under outputs/. To evaluate it, add the model path to the config file (xxx.yaml):

    PRELOADED: ""
  4. Test:
$ CUDA_VISIBLE_DEVICES=0 python tool/test.py -cfg=configs/benchmarks/r50_cifar100_224_e100_rmsprop.yaml
  5. If training was interrupted partway, resume it like this:
$ CUDA_VISIBLE_DEVICES=0 python tool/train.py -cfg=configs/benchmarks/r50_cifar100_224_e100_rmsprop.yaml --resume
  6. Use multiple GPUs to train:
$ CUDA_VISIBLE_DEVICES=0<,1,2,3> python tool/train.py -cfg=configs/benchmarks/r50_cifar100_224_e100_rmsprop.yaml -g=<N>

How to Add a Dataset

Suppose your dataset is organized in the following format:

root/dog/xxx.png
root/dog/xxy.png
root/dog/xxz.png

root/cat/123.png
root/cat/nsdf3.png
root/cat/asd932_.png

Then modify the config file like this:

DATASET:
  NAME: 'GeneralDataset'
  TRAIN_ROOT: /path/to/train_root
  TEST_ROOT: /path/to/test_root
  TOP_K: (1, 5)
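
For reference, this is the same one-folder-per-class layout that torchvision's ImageFolder reads, where folder names become the class labels. The sketch below is only an illustration of that convention, not of ZCls internals; the path and the transform choices are placeholders:

from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])

# One sub-folder per class; the sorted folder names become the labels.
train_set = datasets.ImageFolder('/path/to/train_root', transform=transform)
print(train_set.classes)        # e.g. ['cat', 'dog']
print(train_set.class_to_idx)   # e.g. {'cat': 0, 'dog': 1}
img, label = train_set[0]       # image tensor and integer class index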

Maintainers

  • zhujian - Initial work - zjykzj

Thanks

@misc{ding2021repvgg,
      title={RepVGG: Making VGG-style ConvNets Great Again}, 
      author={Xiaohan Ding and Xiangyu Zhang and Ningning Ma and Jungong Han and Guiguang Ding and Jian Sun},
      year={2021},
      eprint={2101.03697},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

@misc{fan2020pyslowfast,
  author =       {Haoqi Fan and Yanghao Li and Bo Xiong and Wan-Yen Lo and
                  Christoph Feichtenhofer},
  title =        {PySlowFast},
  howpublished = {\url{https://github.com/facebookresearch/slowfast}},
  year =         {2020}
}

@misc{zhang2020resnest,
      title={ResNeSt: Split-Attention Networks}, 
      author={Hang Zhang and Chongruo Wu and Zhongyue Zhang and Yi Zhu and Haibin Lin and Zhi Zhang and Yue Sun and Tong He and Jonas Mueller and R. Manmatha and Mu Li and Alexander Smola},
      year={2020},
      eprint={2004.08955},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

@misc{ding2019acnet,
      title={ACNet: Strengthening the Kernel Skeletons for Powerful CNN via Asymmetric Convolution Blocks}, 
      author={Xiaohan Ding and Yuchen Guo and Guiguang Ding and Jungong Han},
      year={2019},
      eprint={1908.03930},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

@misc{howard2019searching,
      title={Searching for MobileNetV3}, 
      author={Andrew Howard and Mark Sandler and Grace Chu and Liang-Chieh Chen and Bo Chen and Mingxing Tan and Weijun Wang and Yukun Zhu and Ruoming Pang and Vijay Vasudevan and Quoc V. Le and Hartwig Adam},
      year={2019},
      eprint={1905.02244},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

@misc{cao2019gcnet,
      title={GCNet: Non-local Networks Meet Squeeze-Excitation Networks and Beyond}, 
      author={Yue Cao and Jiarui Xu and Stephen Lin and Fangyun Wei and Han Hu},
      year={2019},
      eprint={1904.11492},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

For more acknowledgements, see THANKS.

Contributing

Anyone's participation is welcome! Open an issue or submit PRs.

Small note:

License

Apache License 2.0 © 2020 zjykzj

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

zcls-0.4.0.tar.gz (69.6 kB, Source)

Built Distribution

zcls-0.4.0-py2.py3-none-any.whl (140.1 kB, Python 2 / Python 3)

File details

Details for the file zcls-0.4.0.tar.gz.

File metadata

  • Download URL: zcls-0.4.0.tar.gz
  • Upload date:
  • Size: 69.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.4.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for zcls-0.4.0.tar.gz
  Algorithm     Hash digest
  SHA256        88b187130e8429e5cf8e80a40adccd91300fdef14db6fee39d1b3d94ad889f05
  MD5           edeab1406931778f3606c61ba0b7e42f
  BLAKE2b-256   3d8c7867ad3775a060971ee1e901e8a5361ffe64282c89997bf8fbf44a72675a

See more details on using hashes here.
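
If you download the sdist by hand, a minimal way to check it against the SHA256 above is the generic Python sketch below (standard-library hashlib, not a ZCls utility; it assumes the file was saved under its original name in the current directory):

import hashlib

# Expected digest, copied from the table above.
expected = "88b187130e8429e5cf8e80a40adccd91300fdef14db6fee39d1b3d94ad889f05"

with open("zcls-0.4.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "hash mismatch")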

File details

Details for the file zcls-0.4.0-py2.py3-none-any.whl.

File metadata

  • Download URL: zcls-0.4.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 140.1 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.4.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for zcls-0.4.0-py2.py3-none-any.whl
  Algorithm     Hash digest
  SHA256        162c03c9089a404a17ed3d51d1edf279a6464fe315438f096ff7aba3a4b4d1ee
  MD5           74ad447558411c851c2af6c5a0728cc7
  BLAKE2b-256   ae56db123d2438c078195f72b47d135f163d7c93948aa659b2e61a803fd666cd

See more details on using hashes here.
