
Project description

Run batched

This is a very basic module that lets you run a PyTorch model on NumPy arrays in batches. It handles the batching for you, so you can pass in a NumPy array and get a NumPy array back. It also works when the input or return value is a dict of NumPy arrays.
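The idea described above can be sketched in plain NumPy: split the input along axis 0, apply the model function to each batch, and concatenate the outputs, with dict inputs and outputs handled key by key. This is a minimal illustrative re-creation of the behavior, not the library's actual API; the function name `run_batched_sketch` and its signature are assumptions.

```python
import numpy as np

def run_batched_sketch(fn, x, batch_size):
    """Illustrative sketch (not the library's real API): apply `fn` to `x`
    in batches along axis 0 and concatenate the results.

    `x` may be a NumPy array or a dict of NumPy arrays; `fn` may likewise
    return an array or a dict of arrays.
    """
    if isinstance(x, dict):
        n = len(next(iter(x.values())))
        take = lambda i: {k: v[i:i + batch_size] for k, v in x.items()}
    else:
        n = len(x)
        take = lambda i: x[i:i + batch_size]
    outs = [fn(take(i)) for i in range(0, n, batch_size)]
    if isinstance(outs[0], dict):
        return {k: np.concatenate([o[k] for o in outs]) for k in outs[0]}
    return np.concatenate(outs)

# Stand-in "model" that doubles its input, run over 10 rows in batches of 4.
data = np.arange(10, dtype=np.float32)
result = run_batched_sketch(lambda b: b * 2, data, batch_size=4)

# Dict input and dict output work the same way, key by key.
d = {"x": np.ones((6, 2))}
out = run_batched_sketch(lambda b: {"y": b["x"].sum(axis=1)}, d, batch_size=3)
```

In real use, `fn` would wrap the PyTorch forward pass (converting each batch to a tensor and the output back to NumPy); the batching logic itself is independent of the framework.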

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

run_batched-1.0.0.tar.gz (16.0 kB)

Uploaded Source

File details

Details for the file run_batched-1.0.0.tar.gz.

File metadata

  • Download URL: run_batched-1.0.0.tar.gz
  • Upload date:
  • Size: 16.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.7

File hashes

Hashes for run_batched-1.0.0.tar.gz
Algorithm Hash digest
SHA256 6220cb3a1ce25a50fc7c32c933c86c05596b625dcd98273636df57c0e4e1ecfe
MD5 05c3725931ce7dcf35d917577d884c6d
BLAKE2b-256 1af5a3f4733deac2469f09802e94f4cb40597b1c9694c809ba7490cdc58fca77

