
A PyTorch library to help extend Knowledge Distillation research

Project description

KD_Lib



Installation

Stable release

KD_Lib is compatible with Python 3.6 or later and also depends on PyTorch. The easiest way to install KD_Lib is with pip, Python's preferred package installer.

$ pip install KD-Lib

Note that KD_Lib is an active project and routinely publishes new releases. To upgrade KD_Lib to the latest version, use pip as follows.

$ pip install -U KD-Lib

Build from source

If you intend to install the latest unreleased version of the library (i.e., from source), you can do so with:

$ git clone https://github.com/SforAiDl/KD_Lib.git
$ cd KD_Lib
$ python setup.py install

Currently implemented works

Distilling the Knowledge in a Neural Network (the original KD paper): https://arxiv.org/abs/1503.02531
Improved Knowledge Distillation via Teacher Assistant: https://arxiv.org/abs/1902.03393
Relational Knowledge Distillation: https://arxiv.org/abs/1904.05068
Distilling Knowledge from Noisy Teachers: https://arxiv.org/pdf/1610.09650.pdf
Paying More Attention to Attention - Improving the Performance of Convolutional Neural Networks via Attention Transfer: https://arxiv.org/pdf/1612.03928.pdf
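The first paper in the list above (Hinton et al., 2015) introduces the temperature-scaled soft-target loss that the other works build on: the student is trained to match the teacher's softened output distribution, with the KL term scaled by T² so its gradients stay comparable across temperatures. The snippet below is a minimal, dependency-free sketch of that loss in plain Python; the function names are ours for illustration, not KD_Lib's API.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax: higher T gives a softer distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=4.0):
    """KL(teacher_soft || student_soft), scaled by T^2 as in Hinton et al. (2015)."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

# Identical logits give zero loss; diverging logits give a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))      # 0.0
print(distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0]) > 0)  # True
```

In practice this term is combined with the ordinary cross-entropy on the hard labels, weighted by a mixing coefficient.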

History

0.0.1 (2020-05-11)

  • First release on PyPI.



Download files


Source Distribution

KD_Lib-0.0.3.tar.gz (9.3 kB)

