
Dynamic Batching with PyTorch

Project description

TorchFold

Blog post: http://near.ai/articles/2017-09-06-PyTorch-Dynamic-Batching/

Analogous to TensorFlow Fold, TorchFold implements dynamic batching behind a very simple interface. Replace every direct call to an nn module in your computation with f.add('function name', arguments). TorchFold records these calls into an optimized computation graph; when you call f.apply, it dynamically batches the recorded operations and executes them on the given nn module.
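
For instance, where plain PyTorch would call a module method directly, with TorchFold you record the call instead. A minimal illustration (the names model, leaf, and value are placeholders, not part of the library):

    # Direct call: runs the module immediately on a single example.
    h = model.leaf(value)

    # With TorchFold: only records the call; it is executed later by f.apply,
    # batched together with all other queued 'leaf' calls.
    h = f.add('leaf', value)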

Installation

We recommend using the pip package manager:

pip install torchfold

Example

    import torchfold
    from torch import nn

    f = torchfold.Fold()

    def dfs(node):
        # Record operations on the fold instead of executing them right away.
        if is_leaf(node):
            return f.add('leaf', node)
        else:
            prev = f.add('init')
            for child in children(node):
                # Recurse into the child first, then fold its result into the
                # running state.
                prev = f.add('child', prev, dfs(child))
            return prev

    class Model(nn.Module):
        def __init__(self, ...):
            ...

        def leaf(self, leaf):
            # Receives all queued 'leaf' arguments as a single batch.
            ...

        def child(self, prev, child):
            # Receives all queued 'child' arguments as a single batch.
            ...

    res = dfs(my_tree)
    model = Model(...)
    # Dynamically batch the recorded operations and execute them on `model`.
    f.apply(model, [[res]])
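
Below is a fuller, self-contained sketch of the same pattern for binary trees of word ids. Only Fold(), add(), and apply() come from TorchFold; the tree encoding, the TreeModel class, its layer sizes, and the assumption that integer arguments are batched into a LongTensor (as in the library's tree-LSTM example) are illustrative choices, not part of the package:

    import torch
    import torch.nn as nn
    import torchfold

    class TreeModel(nn.Module):
        """Toy model: embeds leaf word ids and merges child states pairwise."""
        def __init__(self, num_words=100, hidden=32):
            super().__init__()
            self.embed = nn.Embedding(num_words, hidden)
            self.combine = nn.Linear(2 * hidden, hidden)

        def leaf(self, word_ids):
            # All queued 'leaf' arguments arrive batched in one LongTensor.
            return self.embed(word_ids)

        def children(self, left, right):
            # All queued 'children' arguments arrive batched along dim 0.
            return torch.relu(self.combine(torch.cat([left, right], 1)))

    fold = torchfold.Fold()

    def encode(node):
        # A leaf is an int word id; an internal node is a [left, right] pair.
        if isinstance(node, int):
            return fold.add('leaf', node)
        left, right = node
        return fold.add('children', encode(left), encode(right))

    trees = [[[1, 2], 3], [4, [5, 6]]]
    roots = [encode(t) for t in trees]

    model = TreeModel()
    # All leaves across both trees run as one batch, then all merges whose
    # inputs are ready, and so on; result[0] holds the two root vectors.
    result = fold.apply(model, [roots])
    print(result[0].shape)  # expected: torch.Size([2, 32])

Even though the two trees have different shapes, operations that become ready at the same step are grouped into single batched calls, which is where dynamic batching gets its speedup.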

Release history

0.1.0 (this version)

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Filename (size)                             File type  Python version
torchfold-0.1.0-py3-none-any.whl (5.4 kB)   Wheel      py3
torchfold-0.1.0.tar.gz (4.7 kB)             Source     None
