
# Split_layer

This is a simple package that prints each layer's forward and backward time for PyTorch models.

## Usage

You can use this tool in three steps:

  1. Install split_layer by running `pip3 install split_layer -U --user`.

  2. Find the file that defines the structure of your network and add the following code:

```python
from split_layer import split_layer_dec

@split_layer_dec(__file__)
class Net():
```

Notice: make sure the forward function's input parameter and every intermediate output are named `x`. For example:

```python
x = F.relu(self.conv1(x))
return x
```

  3. Replace `loss.backward()` with something like `net.backward(outputs)`. Then you can run your training code as usual. A complete sketch follows this list.
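Putting the three steps together, here is a minimal end-to-end sketch. It assumes the decorator works on an ordinary `nn.Module` subclass; the toy CNN, the input shape, and all names other than `x` are illustrative assumptions, and only `split_layer_dec(__file__)` and `net.backward(outputs)` come from split_layer itself.

```python
# Minimal sketch: the architecture and dummy batch are assumptions;
# only split_layer_dec and net.backward(...) come from split_layer.
import torch
import torch.nn as nn
import torch.nn.functional as F
from split_layer import split_layer_dec

@split_layer_dec(__file__)
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, 3, padding=1)
        self.fc1 = nn.Linear(16 * 28 * 28, 10)

    def forward(self, x):
        # Every intermediate result is assigned to x, as required above.
        x = F.relu(self.conv1(x))
        x = x.view(x.size(0), -1)
        x = self.fc1(x)
        return x

net = Net()
inputs = torch.randn(8, 1, 28, 28)  # dummy batch: 8 single-channel 28x28 images

outputs = net(inputs)
net.backward(outputs)  # replaces loss.backward(); prints per-layer timings
```

Passing the outputs to `net.backward` presumably lets the package drive the backward pass layer by layer and time each step, following the approach in the linked discussion.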

## Others

It is built according to the accepted answer to this [question](https://discuss.pytorch.org/t/how-to-split-backward-process-wrt-each-layer-of-neural-network/7190). For now, it is not flexible enough and does NOT support DataParallel (DP) or DistributedDataParallel (DDP) models. We will develop it further in the future.

## Requirements

Make sure `inspect` and `torch` have been installed.
