PyTorch implementation of a differentiable top-k function and of balanced and imbalanced top-k losses for deep learning

Project description

This is the code for the paper Stochastic smoothing of the top-K calibrated hinge loss for deep imbalanced classification, published at ICML 2022. If you use this code in your work, please cite the paper:

@inproceedings{pmlr-v162-garcin22a,
  title     = {Stochastic smoothing of the top-K calibrated hinge loss for deep imbalanced classification},
  author    = {Garcin, Camille and Servajean, Maximilien and Joly, Alexis and Salmon, Joseph},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {7208--7222},
  year      = {2022},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
}

The pytopk package

The pytopk package contains the code for the balanced and imbalanced top-k losses as well as a differentiable top-k function.

Installation

It can be installed as follows:

pip install pytopk

Top-k losses

Our losses can be used as standard PyTorch loss functions:

import torch
from pytopk import BalNoisedTopK, ImbalNoisedTopK

# Classification scores for a batch of 2 samples over 3 classes
scores = torch.tensor([[2.0, 1.5, -3.0],
                       [7.5, 4.0, -1.5]])
labels = torch.tensor([0, 2])

k = 2

# Balanced top-k loss: epsilon controls the amount of noise smoothing
criteria_bal = BalNoisedTopK(k=k, epsilon=1.0)
# Imbalanced top-k loss: max_m is the maximum class margin,
# cls_num_list gives the number of training samples per class
criteria_imbal = ImbalNoisedTopK(k=k, epsilon=0.01, max_m=0.3, cls_num_list=[17, 23, 55])

loss_batch_bal = criteria_bal(scores, labels)
loss_batch_imbal = criteria_imbal(scores, labels)
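These losses target top-k classification: a prediction counts as correct when the true label appears among the k highest scores. As a point of reference, here is a minimal NumPy sketch of the top-k accuracy metric itself (an illustration only, not part of the pytopk API), using the same scores and labels as above:

```python
import numpy as np

def topk_accuracy(scores, labels, k):
    """Fraction of samples whose true label is among the k largest scores."""
    # indices of the k largest scores in each row
    topk_idx = np.argsort(scores, axis=1)[:, -k:]
    hits = [label in row for row, label in zip(topk_idx, labels)]
    return float(np.mean(hits))

scores = np.array([[2.0, 1.5, -3.0],
                   [7.5, 4.0, -1.5]])
labels = np.array([0, 2])

# Row 0: label 0 is among the top-2 scores; row 1: label 2 is not.
print(topk_accuracy(scores, labels, k=2))  # 0.5
```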

Smooth top-k function

We also provide a differentiable top-k function for tensors of any size that can be plugged into any neural network architecture:

import torch
from pytopk import NoisedTopK

# Differentiable top-k: returns a smoothed k-th largest value along `dim`
smooth_topk = NoisedTopK(k=3, dim=-1)
x = torch.tensor([[-1.5, 2.0, 0.7, 3.8],
                  [-1.1, -5.4, 0.1, 2.3]], requires_grad=True)
out = smooth_topk(x)
print(out)
print(out)

>> tensor([ 0.4823, -1.4710], grad_fn=<_NoisedTopKBackward>)

out.sum().backward()
print(x.grad)

>> tensor([[0.0000, 0.4000, 0.6000, 0.0000],
           [0.8000, 0.0000, 0.2000, 0.0000]])
