LibAUC: A Deep Learning Library for X-Risk Optimization
| Documentation | Installation | Website | Tutorial | Research | Github |
News
- [2024/04/07]: Bug fixes: we fixed a bug in datasets/folder.py by returning a return_index to support SogCLR/iSogCLR for contrastive learning, fixed incorrect communication with all_gather in GCLoss_v1, and restored gamma to its original value when u is not 0. None of these bugs were present in the experimental code for the paper.
- [2024/02/11]: Bug fix: we fixed a bug in the calculation of AUCM loss and MultiLabelAUCM loss (the margin parameter was missing from the original calculation, which could make the loss negative). This does not affect learning, as the parameter updates are unchanged. Both the source code and the pip package have been updated.
- [2023/06/10]: LibAUC 1.3.0 is now available! In this update, we have made improvements and introduced new features. We also release a new documentation website at https://docs.libauc.org/. Please see the release notes for details.
- [2023/06/10]: We value your thoughts and feedback! Please consider filling out this brief survey to guide our future developments. Thank you!
Why LibAUC?
LibAUC offers an easy way to directly optimize commonly used performance measures and losses through a user-friendly API. It has broad applications in AI, tackling challenges such as Classification of Imbalanced Data (CID), Learning to Rank (LTR), and Contrastive Learning of Representations (CLR). LibAUC provides a unified framework that abstracts the optimization of many compositional loss functions: surrogate losses for AUROC, AUPRC/AP, and partial AUROC suited to CID; surrogate losses for NDCG, top-K NDCG, and listwise losses used in LTR; and global contrastive losses for CLR. Here’s an overview:
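To make the idea of an AUROC surrogate concrete, here is a minimal, self-contained toy sketch (not LibAUC's actual implementation): a pairwise squared-hinge loss that penalizes every positive/negative score pair whose gap falls short of a margin. Minimizing such a pairwise surrogate pushes positives above negatives, which is exactly what AUROC measures.

```python
import numpy as np

def pairwise_auc_surrogate(scores, labels, margin=1.0):
    """Toy squared-hinge surrogate for AUROC: penalizes each
    positive/negative score pair separated by less than `margin`."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    # All positive-minus-negative score differences, shape (n_pos, n_neg).
    diffs = pos[:, None] - neg[None, :]
    # Squared hinge: zero once a pair is separated by at least `margin`.
    return np.mean(np.maximum(0.0, margin - diffs) ** 2)

scores = np.array([0.9, 0.7, 0.2, 0.1])
labels = np.array([1, 1, 0, 0])
print(pairwise_auc_surrogate(scores, labels))  # ≈ 0.135
```

This naive version touches every pair in a batch; the appeal of LibAUC's compositional formulations is that they optimize such objectives without enumerating pairs over the whole dataset.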
Installation
Installing from pip
$ pip install -U libauc
Installing from source
$ git clone https://github.com/Optimization-AI/LibAUC.git
$ cd LibAUC
$ pip install .
For more details, please check the latest release note.
Usage
Example training pipeline for optimizing an X-risk (e.g., AUROC)
>>> # import the loss and optimizer
>>> import torch
>>> from libauc.losses import AUCMLoss
>>> from libauc.optimizers import PESG
>>> # pretrain your model through supervised or self-supervised learning,
>>> # then load the pretrained encoder and randomly initialize the last linear layer
>>> # define the loss and optimizer (see the documentation for the full argument list)
>>> loss_fn = AUCMLoss()
>>> optimizer = PESG(model.parameters(), loss_fn=loss_fn, lr=0.1)
>>> # training
>>> model.train()
>>> for data, targets in trainloader:
...     data, targets = data.cuda(), targets.cuda()
...     logits = model(data)
...     preds = torch.sigmoid(logits)
...     loss = loss_fn(preds, targets)
...     optimizer.zero_grad()
...     loss.backward()
...     optimizer.step()
>>> # update the optimizer's internal parameters (e.g., at the end of each epoch)
>>> optimizer.update_regularizer()
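For context, AUCMLoss corresponds to a min-max margin-based surrogate of AUROC. As a sketch (in the notation of Yuan et al.'s deep AUC maximization work, with model score $h_\mathbf{w}$ and margin $m$; class-prior weighting is omitted here for simplicity), the objective is roughly:

```latex
\min_{\mathbf{w},\,a,\,b}\ \max_{\alpha \ge 0}\;
\mathbb{E}\big[(h_\mathbf{w}(x)-a)^2 \mid y=1\big]
+ \mathbb{E}\big[(h_\mathbf{w}(x)-b)^2 \mid y=-1\big]
+ 2\alpha\big(m - \mathbb{E}[h_\mathbf{w}(x)\mid y=1] + \mathbb{E}[h_\mathbf{w}(x)\mid y=-1]\big)
- \alpha^2
```

The auxiliary variables $a$, $b$, and $\alpha$ are maintained internally by the loss and optimizer, which is why the pipeline above calls `optimizer.update_regularizer()` between epochs rather than optimizing model weights alone.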
Tutorials
X-Risk Minimization
- Optimizing AUCMLoss: [example]
- Optimizing APLoss: [example]
- Optimizing CompositionalAUCLoss: [example]
- Optimizing pAUCLoss: [example]
- Optimizing MIDAMLoss: [example]
- Optimizing NDCGLoss: [example]
- Optimizing GCLoss (Unimodal): [example]
- Optimizing GCLoss (Bimodal): [example]
Other Applications
- Constructing benchmark imbalanced datasets for CIFAR10, CIFAR100, CATvsDOG, STL10
- Using LibAUC with PyTorch learning rate scheduler
- Optimizing AUROC loss on Chest X-Ray dataset (CheXpert)
- Optimizing AUROC loss on Skin Cancer dataset (Melanoma)
- Optimizing multi-label AUROC loss on Chest X-Ray dataset (CheXpert)
- Optimizing AUROC loss on Tabular dataset (Credit Fraud)
- Optimizing AUROC loss for Federated Learning
- Optimizing GCLoss (Bimodal with Cosine Gamma)
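Many of the tutorials above report AUROC on a held-out set. As a sanity check independent of LibAUC, AUROC can be computed directly from the Mann-Whitney U statistic; a minimal NumPy sketch:

```python
import numpy as np

def auroc(scores, labels):
    """AUROC via the Mann-Whitney U statistic: the fraction of
    positive/negative pairs ranked correctly (ties count half)."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    # Pairwise score differences between positives and negatives.
    diffs = pos[:, None] - neg[None, :]
    return (diffs > 0).mean() + 0.5 * (diffs == 0).mean()

scores = np.array([0.9, 0.8, 0.4, 0.3, 0.2])
labels = np.array([1, 1, 1, 0, 0])
print(auroc(scores, labels))  # 1.0: every positive outranks every negative
```

This quadratic-time version is fine for validation-set checks; for large datasets, a rank-based implementation (e.g., scikit-learn's `roc_auc_score`) is the practical choice.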
Citation
If you find LibAUC useful in your work, please cite the following papers:
@inproceedings{yuan2023libauc,
  title={LibAUC: A Deep Learning Library for X-Risk Optimization},
  author={Zhuoning Yuan and Dixian Zhu and Zi-Hao Qiu and Gang Li and Xuanhui Wang and Tianbao Yang},
  booktitle={29th SIGKDD Conference on Knowledge Discovery and Data Mining},
  year={2023}
}

@article{yang2022algorithmic,
  title={Algorithmic Foundations of Empirical X-Risk Minimization},
  author={Yang, Tianbao},
  journal={arXiv preprint arXiv:2206.00439},
  year={2022}
}
Contact
For technical questions, please open a new issue on GitHub. For anything else, please contact us at libaucx@gmail.com or tianbao-yang@tamu.edu.
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file libauc-1.4.0.tar.gz
File metadata
- Download URL: libauc-1.4.0.tar.gz
- Upload date:
- Size: 92.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.11.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | aabede0f32d16eb8e1bd7491af6e45c0326fb488e7943c7a38f5932fd6c9d37f
MD5 | d576f1e13a3709ceaedd184be78a6c7e
BLAKE2b-256 | 2a01dd383f7983b2c82ee4ef60013ff1fd2ef46502823461dbbab4fe630162af
File details
Details for the file libauc-1.4.0-py3-none-any.whl
File metadata
- Download URL: libauc-1.4.0-py3-none-any.whl
- Upload date:
- Size: 130.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.11.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | 6034f0943c775cbc4322af23aa922b36bc9789053e25aeab6c3cc9c2bfc1ce4b
MD5 | 1bc0bcda62d93674405bdb2938fbf75a
BLAKE2b-256 | e21b0f89637169e8b56fbd3f873460a45549f30b48cf780c3b3d6b099db407ec