HL Gauss - Pytorch

The Gaussian Histogram Loss (HL-Gauss) proposed by Imani et al., with a few convenient wrappers, in PyTorch.

A team at DeepMind wrote a paper reporting strongly positive findings for its use in value-based RL.

Put into action here, it seems to work well.
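
For intuition, HL-Gauss turns a scalar regression target into a soft categorical label: the probability mass of a Gaussian centered at the target, integrated over each histogram bin, trained with cross entropy. A minimal sketch of that idea, independent of this package's internals (the function name here is illustrative):

import torch
import torch.nn.functional as F

def hl_gauss_target(target, min_value = 0., max_value = 5., num_bins = 32, sigma = 0.5):
    # bin edges spanning [min_value, max_value]
    support = torch.linspace(min_value, max_value, num_bins + 1)
    # mass of N(target, sigma^2) in each bin, via differences of the Gaussian CDF at the edges
    cdf = torch.special.erf((support - target.unsqueeze(-1)) / (sigma * 2 ** 0.5))
    return (cdf[..., 1:] - cdf[..., :-1]) / 2  # (..., num_bins), sums to ~1 if target is well inside the range

targets = torch.tensor([2.3])
soft_labels = hl_gauss_target(targets)        # (1, 32)
logits = torch.randn(1, 32)
loss = F.cross_entropy(logits, soft_labels)   # cross entropy against the soft labels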

Install

$ pip install hl-gauss-pytorch

Usage

The HLGaussLoss module, as defined in Appendix A of the Stop Regressing paper:

import torch
from hl_gauss_pytorch import HLGaussLoss

hl_gauss = HLGaussLoss(
    min_value = 0.,
    max_value = 5.,
    num_bins = 32,
    sigma = 0.5,
    clamp_to_range = True # clamps targets into [min_value, max_value]; without it, any target falling outside the bins contributes zero loss under the current logic
)

logits = torch.randn(3, 16, 32).requires_grad_()
targets = torch.randint(0, 5, (3, 16)).float()

loss = hl_gauss(logits, targets)
loss.backward()

# after much training

pred_target = hl_gauss(logits) # (3, 16)
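
Calling the module without targets returns scalar predictions. A common decoding rule for histogram losses, and presumably what this wrapper does, is the softmax-weighted mean of the bin centers — a sketch under that assumption:

probs = logits.softmax(dim = -1)              # (3, 16, 32)
edges = torch.linspace(0., 5., 32 + 1)
centers = (edges[:-1] + edges[1:]) / 2        # (32,) bin centers
pred_target = (probs * centers).sum(dim = -1) # (3, 16) expected value over bins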

For a convenient layer that projects from an embedding to logits, import HLGaussLayer:

import torch
from hl_gauss_pytorch import HLGaussLayer

hl_gauss_layer = HLGaussLayer(
    dim = 256, # input embedding dimension
    hl_gauss_loss = dict(
        min_value = 0.,
        max_value = 5.,
        num_bins = 32,
        sigma = 0.5,
    )
)

embed = torch.randn(7, 256)
targets = torch.randint(0, 5, (7,)).float()

loss = hl_gauss_layer(embed, targets)
loss.backward()

# after much training

pred_target = hl_gauss_layer(embed) # (7,)
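
Conceptually, the layer amounts to a linear projection from the embedding dimension to num_bins logits, followed by the loss above. A hedged sketch of equivalent wiring (the actual module may differ in details):

import torch
from torch import nn
from hl_gauss_pytorch import HLGaussLoss

to_logits = nn.Linear(256, 32)   # dim -> num_bins
hl_gauss = HLGaussLoss(min_value = 0., max_value = 5., num_bins = 32, sigma = 0.5)

embed = torch.randn(7, 256)
targets = torch.randint(0, 5, (7,)).float()

loss = hl_gauss(to_logits(embed), targets)   # training
pred_target = hl_gauss(to_logits(embed))     # inference, (7,)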

To ablate the proposal, you can make the HLGaussLayer fall back to regular regression by setting use_regression = True, keeping the rest of the code above unchanged:

HLGaussLayer(..., use_regression = True)
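
Under that flag, the layer presumably swaps the histogram head for a single scalar output trained with MSE. A minimal sketch of that baseline (illustrative, not the module's actual code):

import torch
from torch import nn
import torch.nn.functional as F

to_pred = nn.Linear(256, 1)            # dim -> scalar prediction
embed = torch.randn(7, 256)
targets = torch.randint(0, 5, (7,)).float()

pred = to_pred(embed).squeeze(-1)      # (7,)
loss = F.mse_loss(pred, targets)       # plain regression baseline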

Citations

@article{Imani2024InvestigatingTH,
    title   = {Investigating the Histogram Loss in Regression},
    author  = {Ehsan Imani and Kai Luedemann and Sam Scholnick-Hughes and Esraa Elelimy and Martha White},
    journal = {ArXiv},
    year    = {2024},
    volume  = {abs/2402.13425},
    url     = {https://api.semanticscholar.org/CorpusID:267770096}
}
@inproceedings{Imani2018ImprovingRP,
    title   = {Improving Regression Performance with Distributional Losses},
    author  = {Ehsan Imani and Martha White},
    booktitle = {International Conference on Machine Learning},
    year    = {2018},
    url     = {https://api.semanticscholar.org/CorpusID:48365278}
}
@article{Farebrother2024StopRT,
    title   = {Stop Regressing: Training Value Functions via Classification for Scalable Deep RL},
    author  = {Jesse Farebrother and Jordi Orbay and Quan Ho Vuong and Adrien Ali Taiga and Yevgen Chebotar and Ted Xiao and Alex Irpan and Sergey Levine and Pablo Samuel Castro and Aleksandra Faust and Aviral Kumar and Rishabh Agarwal},
    journal = {ArXiv},
    year    = {2024},
    volume  = {abs/2403.03950},
    url     = {https://api.semanticscholar.org/CorpusID:268253088}
}
