A fair loss function
Project description
A fair PyTorch loss function
The goal of this loss function is to take fairness into account during the training of a PyTorch model. It works by adding a fairness measure to a regular loss value.
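As a rough sketch of the idea (hypothetical, not the library's actual implementation), a fairness penalty can be built by summing, over each value of the sensitive attribute, the gap between a score computed on that group and the score on the whole batch, then adding that penalty to the regular loss:

```python
import torch

def fair_penalty(sensitive, y_pred, y_true, score_fn):
    """Hypothetical sketch: sum, over each unique value of the
    sensitive attribute, of the absolute gap between the score on
    that group and the score on the full batch."""
    overall = score_fn(y_pred, y_true)
    penalty = torch.tensor(0.0)
    for value in torch.unique(sensitive):
        mask = sensitive == value               # rows belonging to this group
        group = score_fn(y_pred[mask], y_true[mask])
        penalty = penalty + torch.abs(overall - group)
    return penalty

def accuracy(y_pred, y_true):
    # Toy score: fraction of rounded predictions matching targets
    return (y_pred.round() == y_true).float().mean()

# Tiny worked example with a binary sensitive attribute
sensitive = torch.tensor([0, 0, 1, 1])
y_pred = torch.tensor([[0.9], [0.1], [0.8], [0.2]])
y_true = torch.tensor([[1.0], [0.0], [1.0], [1.0]])

base_loss = torch.nn.functional.mse_loss(y_pred, y_true)
total = base_loss + fair_penalty(sensitive, y_pred, y_true, accuracy)
```

A model that scores equally well on every group pays no penalty; the more a group's score diverges from the batch-wide score, the larger the added term.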
Installation
pip install fair-loss
Example
import torch
import torch.nn.functional as F
import numpy as np
from fair_loss import FairLoss, accuracy

# A small model and some random integer-valued features
model = torch.nn.Sequential(torch.nn.Linear(5, 1), torch.nn.ReLU())
data = np.random.randint(5, size=(100, 5)).astype("float")
data = torch.tensor(data, requires_grad=True, dtype=torch.float)

# Targets do not need gradients, and must match the prediction dtype
y_true = np.random.randint(5, size=(100, 1)).astype("float")
y_true = torch.tensor(y_true, dtype=torch.float)

y_pred = model(data)

# Let's say the sensitive attribute is in the second dimension
dim = 1
loss = F.mse_loss(y_pred, y_true)
loss = FairLoss(data[:, dim], loss, y_pred, y_true, accuracy)
loss.backward()
Documentation
See the documentation.
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distributions
No source distribution files are available for this release. See the tutorial on generating distribution archives.
Built Distribution
Hashes for fair_loss-0.3-py2.py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | c407ce321c0b4fb1768ce1ea594a3c676fcbb6bae29652f3cc44c497bd4bb347
MD5 | 67a01ebf1445534244fa85d91cd200a4
BLAKE2b-256 | bd1a7472ed64c41a6232b91109adf6c8764efd74d8902ba19a9e286759b35290