Print the per-layer forward and backward time of PyTorch models. Reference: https://discuss.pytorch.org/t/how-to-split-backward-process-wrt-each-layer-of-neural-network/7190
# Split_layer

This is a simple package that prints the per-layer forward and backward time of PyTorch models.
## Usage
You can use this tool in three steps:
Install split_layer by running `pip3 install split_layer -U --user`.
Find the file that defines the structure of your network and add the following code:

```python
from split_layer import split_layer_dec

@split_layer_dec(__file__)
class Net():
    ...
```

Notice: make sure the input parameter of the `forward` function and every intermediate output are named `x`. For example:

```python
x = F.relu(self.conv1(x))
return x
```
Replace `loss.backward()` with something like `net.backward(outputs)`. Then you can run your training code as usual.
## Others

This package is built according to the accepted answer of this [question](https://discuss.pytorch.org/t/how-to-split-backward-process-wrt-each-layer-of-neural-network/7190). It is not yet flexible, and it does NOT support DP (DataParallel) or DDP (DistributedDataParallel) models. We will develop it further in the future.
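The per-layer timing idea that the linked forum answer describes, and that this package automates, can be sketched directly with autograd. The three-layer model below is a made-up example, not part of split_layer: each layer's forward pass is timed separately, and the backward pass is split by propagating the gradient through the saved intermediate outputs in reverse order with `torch.autograd.grad`.

```python
# Sketch of splitting forward/backward timing per layer (hypothetical example;
# only the general technique, not split_layer's actual implementation).
import time

import torch
import torch.nn as nn

layers = nn.ModuleList([nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4)])
inp = torch.randn(2, 8, requires_grad=True)

# Forward: keep every intermediate output so the backward pass can be split.
x, outputs, fwd_times = inp, [], []
for layer in layers:
    t0 = time.perf_counter()
    x = layer(x)
    fwd_times.append(time.perf_counter() - t0)
    outputs.append(x)

# Backward: propagate the gradient one layer at a time, in reverse order.
grad = torch.ones_like(outputs[-1])
bwd_times = []
for i in range(len(outputs) - 1, 0, -1):
    t0 = time.perf_counter()
    grad = torch.autograd.grad(outputs[i], outputs[i - 1],
                               grad_outputs=grad, retain_graph=True)[0]
    bwd_times.append(time.perf_counter() - t0)
t0 = time.perf_counter()
grad = torch.autograd.grad(outputs[0], inp, grad_outputs=grad)[0]
bwd_times.append(time.perf_counter() - t0)

# bwd_times was collected last-layer-first, so reverse it to match layers.
for layer, ft, bt in zip(layers, fwd_times, reversed(bwd_times)):
    print(f"{layer.__class__.__name__}: forward {ft:.6f}s, backward {bt:.6f}s")
```

Note the `retain_graph=True`: without it, the first `torch.autograd.grad` call would free the computation graph and the remaining per-layer backward steps would fail.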
## Requirements

Make sure `inspect` and `torch` are installed (`inspect` is part of the Python standard library).
Hashes for split_layer-0.0.10-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 42bf3a7d43fd8e4aba58e269425b5f0cd644dabf849c0be2c2a718a28c0bef8e
MD5 | 3413b778772992666d93094c5dfe1224
BLAKE2b-256 | 2b94465bdf27f36a7a2441f61c4b67d446352373b3950f441fb909b89879bc44