
LibAUC: A Deep Learning Library for X-Risk Optimization

Project description

Logo by Zhuoning Yuan


Website | Updates | Installation | Tutorial | Research | Github

We continuously update our library by making improvements and adding new features. If you use or like our library, please star ⭐ this repo. Thank you!

What is X-Risk?

X-risk refers to a family of compositional measures/losses in which each data point is explicitly or implicitly compared with a set of data points to define the risk function (a sketch of this compositional form is given after the list below). It covers a family of widely used measures/losses, which can be organized into four interconnected categories:

  • Areas under the curves, including areas under ROC curves (AUROC), areas under Precision-Recall curves (AUPRC), and one-way and two-way partial areas under ROC curves.
  • Ranking measures/objectives, including p-norm push for bipartite ranking, listwise losses for learning to rank (e.g., ListNet), mean average precision (mAP), normalized discounted cumulative gain (NDCG), etc.
  • Performance at the top, including top push, top-K variants of mAP and NDCG, Recall at top K positions (Rec@K), Precision at a certain Recall level (Prec@Rec), etc.
  • Contrastive objectives, including supervised contrastive objectives (e.g., NCA), and global self-supervised contrastive objectives improving upon SimCLR and CLIP.
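
As a rough illustration (not the library's exact notation), these measures share the compositional structure sketched below in LaTeX, where each anchor point is coupled with a comparison set:

% Illustrative sketch of the generic compositional form of an X-risk.
% g compares the anchor z_i against a set S_i of other points (explicitly
% or implicitly), and f aggregates each comparison into the overall risk;
% f, g, z_i, S_i are generic placeholders, not LibAUC API.
\[
  R_X(\mathbf{w}) \;=\; \frac{1}{n} \sum_{i=1}^{n}
    f\big( g(\mathbf{w};\, z_i,\, S_i) \big)
\]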

Key Features

  • Easy Installation - Easy to install, and LibAUC code can be dropped into existing training pipelines built on deep learning frameworks such as PyTorch.
  • Broad Applications - Users can train different neural network architectures (e.g., MLP, CNN, GNN, Transformer) suited to their data types.
  • Efficient Algorithms - Stochastic algorithms with provable theoretical convergence that support learning with millions of data points without requiring large batch sizes.
  • Hands-on Tutorials - Hands-on tutorials are provided for optimizing a variety of measures and objectives belonging to the family of X-risks.

Installation

$ pip install libauc==1.1.9rc2

The latest version, 1.1.9rc2, is now available. You can also download the source code for previous versions here.
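
To verify the installation, a quick sanity check is to import the package and print its version (this assumes the release exposes libauc.__version__, which most releases do):

$ python -c "import libauc; print(libauc.__version__)"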

Usage

Example training pipeline for optimizing an X-risk (e.g., AUROC):

>>> # import the AUC-margin loss and the PESG optimizer from LibAUC
>>> import torch
>>> from libauc.losses import AUCMLoss
>>> from libauc.optimizers import PESG
...
>>> # define loss & optimizer
>>> # (in practice the PESG constructor is also given your model and the
>>> #  loss's learnable variables; see the LibAUC tutorials for the exact arguments)
>>> Loss = AUCMLoss()
>>> optimizer = PESG()
...
>>> # training loop; model and trainloader are your own network and DataLoader
>>> model.train()
>>> for data, targets in trainloader:
...     data, targets = data.cuda(), targets.cuda()
...     logits = model(data)
...     preds = torch.sigmoid(logits)   # squash logits to (0, 1) scores for AUCMLoss
...     loss = Loss(preds, targets)
...     optimizer.zero_grad()
...     loss.backward()
...     optimizer.step()
...
>>> # update the optimizer's internal parameters (typically done once per epoch)
>>> optimizer.update_regularizer()
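
Since the objective above targets AUROC, a natural check is to evaluate ROC AUC on a held-out set. Below is a minimal sketch, assuming a validation DataLoader named valloader and using scikit-learn's roc_auc_score (not part of LibAUC):

>>> # evaluation sketch: collect sigmoid scores and compute AUROC with scikit-learn
>>> from sklearn.metrics import roc_auc_score
>>> model.eval()
>>> all_preds, all_targets = [], []
>>> with torch.no_grad():
...     for data, targets in valloader:
...         preds = torch.sigmoid(model(data.cuda()))
...         all_preds.append(preds.cpu())
...         all_targets.append(targets)
...
>>> auroc = roc_auc_score(torch.cat(all_targets).numpy().ravel(),
...                       torch.cat(all_preds).numpy().ravel())
>>> print(f"validation AUROC: {auroc:.4f}")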

Tutorials

For tutorials, please visit https://github.com/Optimization-AI/LibAUC.

Citation

If you find LibAUC useful in your work, please acknowledge our library and cite the following papers:

@misc{libauc2022,
      title={LibAUC: A Deep Learning Library for X-Risk Optimization},
      author={Zhuoning Yuan and Zi-Hao Qiu and Gang Li and Dixian Zhu and Zhishuai Guo and Quanqi Hu and Bokun Wang and Qi Qi and Yongjian Zhong and Tianbao Yang},
      year={2022}
}
@article{dox22,
      title={Algorithmic Foundation of Deep X-risk Optimization},
      author={Tianbao Yang},
      journal={CoRR},
      year={2022}
}

Contact

For technical questions, please open a new issue on GitHub. For any other questions, please contact Zhuoning Yuan [yzhuoning@gmail.com] or Tianbao Yang [tianbao-yang@uiowa.edu].

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distribution

libauc-1.1.9rc2-py3-none-any.whl (71.2 kB)

Uploaded: Python 3
