
pytorch_runstats

Running/online statistics for PyTorch.

torch_runstats implements memory-efficient online reductions on tensors.

Notable features:

  • Arbitrary sample shapes beyond single scalars
  • Reduction over arbitrary dimensions of each sample
  • "Batched"/"binned" reduction into multiple running tallies using a per-sample bin index. This can be useful, for example, in accumulating statistics over samples by some kind of "type" index or for accumulating statistics per-graph in a pytorch_geometric-like batching scheme. (This feature uses and is similar to torch_scatter.)

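As a rough illustration, here is a minimal usage sketch. It assumes the RunningStats and Reduction API described in the docs (accumulate_batch, current_result, reset, and the accumulate_by bin index); consult the documentation for the exact signatures.

```python
import torch
from torch_runstats import RunningStats, Reduction

# Running mean over samples of shape (3,), accumulated batch by batch.
rs = RunningStats(dim=(3,), reduction=Reduction.MEAN)
rs.accumulate_batch(torch.randn(10, 3))   # first batch of 10 samples
rs.accumulate_batch(torch.randn(4, 3))    # a second, smaller batch
running_mean = rs.current_result()        # running mean over all 14 samples, shape (3,)
rs.reset()                                # clear the accumulated state

# "Binned" accumulation: a per-sample integer index routes each sample
# into its own running tally, e.g. statistics per "type" or per graph.
rs_binned = RunningStats(dim=(3,), reduction=Reduction.MEAN)
samples = torch.randn(6, 3)
bins = torch.tensor([0, 0, 1, 1, 2, 2])   # hypothetical per-sample "type" index
rs_binned.accumulate_batch(samples, accumulate_by=bins)
per_bin_mean = rs_binned.current_result() # one row of running means per bin
```
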
Note: the implementation currently makes heavy use of in-place operations for performance and memory efficiency. This probably does not play nicely with the autograd engine; this is currently likely the wrong library for accumulating running statistics you want to backward through. (See TorchMetrics for a possible alternative.)

For more information, please see the docs.

Install

torch_runstats requires PyTorch and torch_scatter, but neither is listed in install_requires for pip, since both require manual installation for correct CUDA compatibility.
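
A typical install therefore looks roughly like the sketch below. The exact PyTorch and torch_scatter install commands depend on your CUDA version (see their respective docs), and the PyPI package name is assumed to match the distribution filename.

```bash
# Install PyTorch and torch_scatter first, following their own instructions
# for the build matching your CUDA version, then install this package:
pip install torch_runstats
```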

License

pytorch_runstats is distributed under an MIT license.

