humancompatible-train: a package for constrained machine learning
The toolkit implements algorithms for constrained training of neural networks based on PyTorch, and inspired by PyTorch's API.
Table of Contents
- Basic installation instructions
- Using the toolkit
- Extending the toolkit
- License and terms of use
- References
humancompatible-train is still under active development! If you find bugs or have feature requests, please file a GitHub issue.
Installation
Use
```shell
pip install humancompatible-train
```
The only dependencies of this package are numpy and torch.
Using the toolkit
The algorithms are intended for use in tandem with standard PyTorch optimizers: the dual optimizer computes the Lagrangian and keeps track of the dual variables, while a regular optimizer updates the model parameters.
In general, your code using humancompatible-train would look something like this:
```python
optimizer = torch.optim.Adam(model.parameters(), ...)
dual_optimizer = humancompatible.train.dual_optim.ALM(...)

for inputs, labels in dataloader:
    # evaluate the objective
    outputs = model(inputs)
    loss = criterion(outputs, labels)
    # evaluate the tensor of constraints
    constraints = <eval_your_constraints>(inputs, labels)
    # evaluate the Lagrangian and update the dual variables
    lgr = dual_optimizer.forward_update(loss, constraints)
    # backward pass and step
    lgr.backward()
    optimizer.step()
    optimizer.zero_grad()
```
The key difference from a plain PyTorch training loop is that the Lagrangian is computed with lgr = dual_optimizer.forward_update(loss, constraints), and the backward pass runs through lgr.backward() instead of loss.backward().
Our idea is to
- Deviate minimally from the usual PyTorch workflow
- Make different stochastically-constrained stochastic optimization algorithms nearly interchangeable in code.
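To make the dual-variable bookkeeping concrete: for inequality constraints c(x) <= 0, ALM-style methods combine the loss and the constraints into an augmented Lagrangian and take a projected ascent step on the multipliers. Below is a minimal pure-Python sketch of one common variant; the function names and the penalty parameter rho are invented for illustration, and the package itself operates on torch tensors rather than lists.

```python
def alm_dual_update(lam, constraints, rho):
    # Projected dual ascent: lambda_i <- max(0, lambda_i + rho * c_i(x)),
    # so multipliers grow only while their constraints are violated.
    return [max(0.0, l + rho * c) for l, c in zip(lam, constraints)]

def augmented_lagrangian(loss, lam, constraints, rho):
    # One common form:
    #   f(x) + sum_i lambda_i * c_i(x) + (rho / 2) * sum_i max(0, c_i(x))^2
    penalty = sum(max(0.0, c) ** 2 for c in constraints)
    return loss + sum(l * c for l, c in zip(lam, constraints)) + 0.5 * rho * penalty

# First constraint violated (0.3 > 0), second satisfied (-0.1 <= 0):
lam = alm_dual_update([0.0, 0.0], [0.3, -0.1], rho=1.0)
print(lam)  # the second multiplier stays at zero
```

Because the multiplier on a satisfied constraint is projected back to zero, the penalty pressure concentrates on the constraints that are actually violated.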
Code examples
Notebooks demonstrating the new API are available in the examples folder.
The example notebooks have additional dependencies for data and plotting, such as fairret. To install those, run
```shell
pip install humancompatible-train[examples]
```
Extending the toolkit
To add a new algorithm, subclass the PyTorch Optimizer class and follow the API outlined above: expose a forward_update(loss, constraints) method that returns the Lagrangian and updates the dual variables.
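As a sketch of what such a class needs to provide, here is a toy (non-torch) dual optimizer exposing the forward_update method used in the training loop above. Everything except the forward_update name is invented for illustration; a real implementation would subclass torch.optim.Optimizer and keep the multipliers as tensors.

```python
class ToyDualOptimizer:
    def __init__(self, n_constraints, dual_lr=0.1):
        self.lam = [0.0] * n_constraints  # dual variables, one per constraint
        self.dual_lr = dual_lr            # dual step size (hypothetical parameter)

    def forward_update(self, loss, constraints):
        # Form the Lagrangian with the *current* multipliers...
        lagrangian = loss + sum(l * c for l, c in zip(self.lam, constraints))
        # ...then take a projected ascent step on the duals.
        self.lam = [max(0.0, l + self.dual_lr * c)
                    for l, c in zip(self.lam, constraints)]
        return lagrangian

opt = ToyDualOptimizer(n_constraints=1)
opt.forward_update(loss=1.0, constraints=[0.5])  # duals start at zero
opt.forward_update(loss=1.0, constraints=[0.5])  # now lambda > 0 penalizes the violation
```

In the real package the returned Lagrangian is a torch tensor, so lgr.backward() propagates gradients to the model parameters while the dual variables are updated in place.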
Reproducing the Benchmark
The code for benchmarking constrained regularization algorithms is available in the benchmark directory.
Installation instructions
- Create a virtual environment.

  bash (Linux):

  ```shell
  python3.11 -m venv fairbenchenv
  source fairbenchenv/bin/activate
  ```

  cmd (Windows):

  ```shell
  python -m venv fairbenchenv
  fairbenchenv\Scripts\activate.bat
  ```

- Install from source:

  ```shell
  git clone https://github.com/humancompatible/train.git
  cd train
  pip install -r requirements.txt
  pip install .
  ```
Usage instructions
The benchmark offers two families of datasets (Folktables and Dutch), several pre-defined constraints, and several constrained optimization algorithms: ALM (smoothed and non-smoothed), SPBM, and Switching Subgradient. We are currently working to add Stochastic Ghost to the new framework as well.
To run an experiment, run:
```shell
python run_benchmark.py --dataset <DATASET> [folktables, dutch] --task <TYPE OF CONSTRAINT> [loss, equalized_odds_pairwise, equalized_odds_vec, weight_norm] --n_runs <NUMBER OF RUNS OF EACH METHOD> --n_epochs <NUMBER OF EPOCHS PER RUN>
```
The constraint options are:
- loss: constraint(s) on the absolute difference between the classification loss on each group and the overall classification loss;
- equalized_odds_pairwise: constraint(s) on the absolute difference in positive rate between each pair of groups;
- equalized_odds_vec: constraint on the positive rate of each group, as defined by fairret.NormLoss;
- weight_norm: constraint on the Frobenius norm of the weights and biases of each layer of the neural network.
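For instance, the loss option above corresponds to one constraint per group of the form |L_g - L| - eps <= 0, where L_g is the group's loss, L the overall loss, and eps a tolerance. A small self-contained sketch (the helper name, eps, and the toy data are made up for illustration; the benchmark's actual implementation lives in the benchmark directory):

```python
def group_loss_constraints(losses, groups, eps=0.05):
    # c_g = |mean loss on group g - overall mean loss| - eps;
    # c_g <= 0 means group g's loss stays within eps of the overall loss.
    overall = sum(losses) / len(losses)
    out = []
    for g in sorted(set(groups)):
        group_losses = [l for l, gi in zip(losses, groups) if gi == g]
        out.append(abs(sum(group_losses) / len(group_losses) - overall) - eps)
    return out

# Group 0 averages 0.2 loss, group 1 averages 0.4; the overall mean is 0.3,
# so both groups sit 0.1 away from it and exceed the eps = 0.05 budget.
print(group_loss_constraints([0.2, 0.2, 0.4, 0.4], [0, 0, 1, 1]))
```

A tensor of such per-group values is exactly what the training loop passes to forward_update as the constraints argument.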
The benchmarking code (all of which is contained in the benchmark directory) is easy to parse and extend with other datasets and constraints.
Future work
- Add more algorithms
- Add more examples from different fields where constrained training of DNNs is employed
References
If you use this work, we encourage you to cite our papers:
```bibtex
@inproceedings{kliachkin2026benchmarking,
  title={Benchmarking Stochastic Approximation Algorithms for Fairness-Constrained Training of Deep Neural Networks},
  author={Kliachkin, Andrii and Lep{\v{s}}ov{\'a}, Jana and Bareilles, Gilles and Mare{\v{c}}ek, Jakub},
  booktitle={14th International Conference on Learning Representations},
  url={https://arxiv.org/abs/2507.04033},
  year={2026}
}

@inproceedings{kliachkin2025humancompatible,
  title={humancompatible.train: Implementing Optimization Algorithms for Stochastically-Constrained Stochastic Optimization Problems},
  author={Kliachkin, Andrii and Lep{\v{s}}ov{\'a}, Jana and Bareilles, Gilles and Mare{\v{c}}ek, Jakub},
  booktitle={NeurIPS Workshop on Constrained Optimization for Machine Learning},
  year={2025}
}
```
[1] Ding, Hardt, Miller et al. (2021). Retiring Adult: New Datasets for Fair Machine Learning. Curran Associates, Inc.
[2] Facchinei & Kungurtsev (2023). Stochastic Approximation for Expectation Objective and Expectation Inequality-Constrained Nonconvex Optimization. arXiv.
[3] Huang, Zhang & Alacaoglu (2025). Stochastic Smoothed Primal-Dual Algorithms for Nonconvex Optimization with Linear Inequality Constraints. arXiv.
[4] Huang & Lin (2023). Oracle Complexity of Single-Loop Switching Subgradient Methods for Non-Smooth Weakly Convex Functional Constrained Optimization. Curran Associates, Inc.