A PyTorch implementation of Conditionals for Ordinal Regression
Project description
CONDOR PyTorch implementation for ordinal regression with deep neural networks.
Documentation: https://GarrettJenkinson.github.io/condor_pytorch
About
CONDOR, short for CONDitionals for Ordinal Regression, is a method for ordinal regression with deep neural networks, which addresses the rank inconsistency issue of other ordinal regression frameworks.
It is compatible with any state-of-the-art deep neural network architecture, requiring only modification of the output layer, the labels, and the loss function.
This repository implements the CONDOR functionality (neural network layer, loss function, and dataset utilities) for convenient use. Examples are provided via the "Tutorials" that can be found on the documentation website at https://GarrettJenkinson.github.io/condor_pytorch.
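To make the change concrete, here is a minimal sketch in plain PyTorch. It is not the condor_pytorch API (the Tutorials cover the library's own layer, loss, and dataset utilities); it only illustrates the kind of modification CONDOR requires: a final layer with K - 1 outputs, labels converted to extended binary levels, a loss computed on those levels, and chained conditional sigmoids that yield rank-consistent cumulative probabilities. The backbone, feature size, and use of plain binary cross-entropy are illustrative assumptions, not the library's implementation.
import torch
import torch.nn as nn

NUM_CLASSES = 5  # K ordinal levels -> K - 1 binary thresholds

# Any backbone works; only the width of the final layer changes (K - 1 units).
model = nn.Sequential(
    nn.Linear(20, 32),
    nn.ReLU(),
    nn.Linear(32, NUM_CLASSES - 1),
)

# Toy batch: 8 feature vectors with integer labels in {0, ..., K - 1}.
x = torch.randn(8, 20)
y = torch.randint(0, NUM_CLASSES, (8,))

# Extended binary "levels": levels[i, k] = 1 if y_i > k, else 0.
levels = (y.unsqueeze(1) > torch.arange(NUM_CLASSES - 1)).float()

# Each sigmoid(logit_k) plays the role of a conditional P(y > k | y > k - 1).
logits = model(x)

# Plain binary cross-entropy on the levels, used here only as a stand-in
# for the library's ordinal loss.
loss = nn.functional.binary_cross_entropy_with_logits(logits, levels)
loss.backward()

# Chaining the conditionals gives cumulative P(y > k) values that are
# monotone non-increasing by construction, i.e. rank consistent.
with torch.no_grad():
    cum_probs = torch.cumprod(torch.sigmoid(model(x)), dim=1)
    predicted_rank = (cum_probs > 0.5).sum(dim=1)
print(loss.item(), predicted_rank)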
We also have CONDOR implemented for TensorFlow.
Installation or Docker
You can install the latest stable release of condor_pytorch directly from the Python Package Index (PyPI) with pip by running the following command:
pip install condor-pytorch
The dependencies can also be installed with pip using the included requirements.txt:
pip install -r requirements.txt
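After installation, a quick import check confirms the environment is set up. This assumes the package exposes the conventional __version__ attribute; if it does not, a bare import is enough.
# Sanity check after installation (the __version__ attribute is assumed
# to follow the usual packaging convention).
import condor_pytorch
print(condor_pytorch.__version__)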
We also provide Dockerfiles to help you get up and running quickly with condor_pytorch.
The CPU image, with the tutorial Jupyter notebooks built in, can be built and run as follows.
# Build the Docker image (only needed once)
docker build -t cpu_pytorch -f cpu.Dockerfile ./
# Run the image to serve a Jupyter notebook
docker run -it -p 8888:8888 --rm cpu_pytorch
# Run bash inside the container (its Python environment has all dependencies)
docker run -u $(id -u):$(id -g) -it -p 8888:8888 --rm cpu_pytorch bash
An NVIDIA-based, GPU-optimized container can be built and run as follows (without interactive notebook capabilities).
# Build the image (only needed once)
docker build -t gpu_pytorch -f gpu.Dockerfile ./
# Use the image after building it
docker run -it -p 8888:8888 --rm gpu_pytorch
Cite as
If you use CONDOR as part of your workflow in a scientific publication, please consider citing it with the following BibTeX entry:
@article{condor2021,
  title    = "Universally rank consistent ordinal regression in neural networks",
  journal  = "arXiv",
  volume   = "2110.07470",
  year     = "2021",
  url      = "https://arxiv.org/abs/2110.07470",
  author   = "Garrett Jenkinson and Kia Khezeli and Gavin R. Oliver and John Kalantari and Eric W. Klee",
  keywords = "Deep learning, Ordinal regression, neural networks, Machine learning, Biometrics"
}
Acknowledgments: Many thanks to the CORAL ordinal and CORAL PyTorch authors, whose repositories provided a roadmap for this codebase.
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file condor_pytorch-1.1.0.tar.gz.
File metadata
- Download URL: condor_pytorch-1.1.0.tar.gz
- Upload date:
- Size: 12.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.11.1 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.9.10
File hashes
Algorithm | Hash digest
---|---
SHA256 | dd76af6bb3bd4cf850af05713d7c0ebf97ae4756375c9b4fd8e2bb6907612a2c
MD5 | daaf0a9f4a15aad73f49547914472277
BLAKE2b-256 | 4c4ff13a6ed16290e2800b46139114b925386e63efd3d7d8356094b317c95032
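If you download the sdist manually, you can check its integrity against the SHA256 digest above. A minimal sketch using only the Python standard library, assuming the file is saved under the same name shown in the download URL:
# Verify the downloaded sdist against the SHA256 digest listed above.
import hashlib

expected = "dd76af6bb3bd4cf850af05713d7c0ebf97ae4756375c9b4fd8e2bb6907612a2c"
with open("condor_pytorch-1.1.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "hash mismatch")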
File details
Details for the file condor_pytorch-1.1.0-py3-none-any.whl.
File metadata
- Download URL: condor_pytorch-1.1.0-py3-none-any.whl
- Upload date:
- Size: 13.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.11.1 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.9.10
File hashes
Algorithm | Hash digest
---|---
SHA256 | 785f6994c911d10a5a9b8dccea691a43d26ea9b38057ff4b7e015bc5be915c08
MD5 | dcdd09e02ad4dc4466eb66faf0564b43
BLAKE2b-256 | 34a4948f41d05b3e840e152d41fac8b4cad5586a000d83684dae6ec8957058bb