HL Gauss - Pytorch (wip)
The Gaussian Histogram Loss (HL-Gauss) proposed by Imani et al., with a few convenient wrappers, in Pytorch.
A team at DeepMind reported a number of positive findings for its use in value-based RL, in the Stop Regressing paper cited below.
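For intuition: HL-Gauss turns each scalar regression target into a soft histogram over bins by integrating a Gaussian centered on the target across the bin edges, then trains with cross entropy against that distribution. A minimal sketch of the target construction (following the standard formulation from the papers, not this package's internals):

import torch

def hl_gauss_target_probs(target, min_value, max_value, num_bins, sigma):
    # bin edges uniformly partitioning [min_value, max_value]
    edges = torch.linspace(min_value, max_value, num_bins + 1)

    # mass of a Gaussian N(target, sigma^2) falling into each bin, via CDF differences
    cdf = 0.5 * (1. + torch.special.erf((edges - target) / (sigma * 2 ** 0.5)))
    probs = cdf[1:] - cdf[:-1]

    # renormalize the mass truncated outside the support
    return probs / probs.sum()

# e.g. a target of 2.3 spreads its mass over the bins nearest 2.3
probs = hl_gauss_target_probs(torch.tensor(2.3), 0., 5., 32, 0.5)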
Install
$ pip install hl-gauss-pytorch
Usage
The HLGaussLoss module, as defined in Appendix A of the Stop Regressing paper
import torch
from hl_gauss_pytorch import HLGaussLoss

hl_gauss = HLGaussLoss(
    min_value = 0.,
    max_value = 5.,
    num_bins = 32,
    sigma = 0.5
)

logits = torch.randn(3, 16, 32).requires_grad_() # last dimension must equal num_bins
targets = torch.randint(0, 5, (3, 16)).float()   # continuous targets within [min_value, max_value]

loss = hl_gauss(logits, targets)
loss.backward()

# after much training

pred_target = hl_gauss(logits) # (3, 16) - called without targets, returns the predicted continuous values
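At inference, the scalar prediction presumably comes from taking the expectation of the softmax distribution over the bin centers. A minimal sketch of that readout, under the same uniform-binning assumption as above:

import torch

min_value, max_value, num_bins = 0., 5., 32

edges = torch.linspace(min_value, max_value, num_bins + 1)
centers = (edges[:-1] + edges[1:]) / 2 # midpoints of the uniform bins

logits = torch.randn(3, 16, num_bins)

# expectation of the categorical distribution over bin centers
pred = (logits.softmax(dim = -1) * centers).sum(dim = -1) # (3, 16)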
For a convenient layer that projects from an embedding to the bin logits, import HLGaussLayer
import torch
from hl_gauss_pytorch import HLGaussLayer

hl_gauss_layer = HLGaussLayer(
    dim = 256, # input embedding dimension
    min_value = 0.,
    max_value = 5.,
    num_bins = 32,
    sigma = 0.5,
)

embed = torch.randn(7, 256)
targets = torch.randint(0, 5, (7,)).float() # continuous targets within [min_value, max_value]

loss = hl_gauss_layer(embed, targets)
loss.backward()

# after much training

pred_target = hl_gauss_layer(embed) # (7,) - without targets, returns the predicted values
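Conceptually, such a layer is just a linear projection from the embedding to num_bins logits, followed by HLGaussLoss. A hypothetical re-composition from the public pieces (ToyHLGaussLayer is illustrative only, not the package's actual implementation):

import torch
from torch import nn
from hl_gauss_pytorch import HLGaussLoss

# hypothetical sketch of the layer's composition
class ToyHLGaussLayer(nn.Module):
    def __init__(self, dim, min_value, max_value, num_bins, sigma):
        super().__init__()
        self.to_logits = nn.Linear(dim, num_bins) # embedding -> bin logits
        self.hl_gauss = HLGaussLoss(
            min_value = min_value,
            max_value = max_value,
            num_bins = num_bins,
            sigma = sigma
        )

    def forward(self, embed, targets = None):
        logits = self.to_logits(embed)

        if targets is None:
            return self.hl_gauss(logits) # predicted continuous values

        return self.hl_gauss(logits, targets) # scalar loss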
For ablating the proposal, you can have HLGaussLayer fall back to regular regression by setting disable = True, keeping the rest of the code above unchanged
HLGaussLayer(..., disable = True)
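For example, reusing the configuration from above (a sketch, assuming the disabled layer keeps the same call signature and trains with a plain regression objective such as MSE):

hl_gauss_layer = HLGaussLayer(
    dim = 256,
    min_value = 0.,
    max_value = 5.,
    num_bins = 32,
    sigma = 0.5,
    disable = True # plain regression head instead of the classification loss
)

embed = torch.randn(7, 256)
targets = torch.randint(0, 5, (7,)).float()

loss = hl_gauss_layer(embed, targets) # presumably a standard regression loss when disabled
loss.backward()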
Citations
@article{Imani2024InvestigatingTH,
title = {Investigating the Histogram Loss in Regression},
author = {Ehsan Imani and Kai Luedemann and Sam Scholnick-Hughes and Esraa Elelimy and Martha White},
journal = {ArXiv},
year = {2024},
volume = {abs/2402.13425},
url = {https://api.semanticscholar.org/CorpusID:267770096}
}
@inproceedings{Imani2018ImprovingRP,
title = {Improving Regression Performance with Distributional Losses},
author = {Ehsan Imani and Martha White},
booktitle = {International Conference on Machine Learning},
year = {2018},
url = {https://api.semanticscholar.org/CorpusID:48365278}
}
@article{Farebrother2024StopRT,
title = {Stop Regressing: Training Value Functions via Classification for Scalable Deep RL},
author = {Jesse Farebrother and Jordi Orbay and Quan Ho Vuong and Adrien Ali Taiga and Yevgen Chebotar and Ted Xiao and Alex Irpan and Sergey Levine and Pablo Samuel Castro and Aleksandra Faust and Aviral Kumar and Rishabh Agarwal},
journal = {ArXiv},
year = {2024},
volume = {abs/2403.03950},
url = {https://api.semanticscholar.org/CorpusID:268253088}
}