model_constructor
Constructor to create PyTorch models.
Install
pip install model-constructor
How to use
model = Net()
Resnet as example
Let's create resnet18 and resnet34 (the default Net() is a resnet18):
resnet18 = Net(block=BasicBlock, blocks=[2, 2, 2, 2])
resnet34 = Net(block=BasicBlock, blocks=[3, 4, 6, 3])
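The numbers in `blocks` are the block counts per stage; with `BasicBlock` (two convs per block) they determine the network depth. A quick pure-Python sanity check of how the counts map to the ResNet names:

```python
# Sanity check: with BasicBlock (2 convs each), total weighted-layer depth is
# 2 * sum(blocks) + 2 (the stem conv plus the final linear layer).
def resnet_depth(blocks, convs_per_block=2):
    return convs_per_block * sum(blocks) + 2

print(resnet_depth([2, 2, 2, 2]))  # 18 -> resnet18
print(resnet_depth([3, 4, 6, 3]))  # 34 -> resnet34
```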
Predefined Resnet models - 18, 34, 50.
from model_constructor.resnet import *
model = resnet34(num_classes=10)
model = resnet50(num_classes=10)
Predefined Xresnet from fastai v1.
This is a simplified version of xresnet from fastai v1. I refactored it to make it easier to understand and to experiment with models: for example, changing activation functions, trying different stems, swapping the batchnorm and activation order, etc. Fastai v2 has a much more powerful implementation.
from model_constructor.xresnet import *
model = xresnet50()
Some examples
We can experiment with models by changing parts of them. Only the base functionality is shown here, but it can easily be extended.
Here are some examples:
Custom stem
Stem with 3 conv layers
from functools import partial

model = Net(stem=partial(Stem, stem_sizes=[32, 32]))
model.stem
Stem(
sizes: [3, 32, 32, 64]
(conv0): ConvLayer(
(conv): Conv2d(3, 32, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
(bn): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(act_fn): ReLU(inplace=True)
)
(conv1): ConvLayer(
(conv): Conv2d(32, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(act_fn): ReLU(inplace=True)
)
(conv2): ConvLayer(
(conv): Conv2d(32, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(act_fn): ReLU(inplace=True)
)
(pool): MaxPool2d(kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)
)
model = Net(stem_sizes=[32, 64])
model.stem
Stem(
sizes: [3, 32, 64, 64]
(conv0): ConvLayer(
(conv): Conv2d(3, 32, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
(bn): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(act_fn): ReLU(inplace=True)
)
(conv1): ConvLayer(
(conv): Conv2d(32, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(act_fn): ReLU(inplace=True)
)
(conv2): ConvLayer(
(conv): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(act_fn): ReLU(inplace=True)
)
(pool): MaxPool2d(kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)
)
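As the printed `sizes` show, the stem's conv channels are simply `[3, *stem_sizes, 64]`. A small sketch (not the library's code) of how those sizes pair up into (in, out) channels for each conv:

```python
# Sketch: expand stem_sizes into per-conv (in, out) channel pairs,
# matching the `sizes` printed by model.stem above.
def stem_channel_pairs(stem_sizes, in_chans=3, out_chans=64):
    sizes = [in_chans, *stem_sizes, out_chans]
    return list(zip(sizes[:-1], sizes[1:]))

print(stem_channel_pairs([32, 32]))  # [(3, 32), (32, 32), (32, 64)]
print(stem_channel_pairs([32, 64]))  # [(3, 32), (32, 64), (64, 64)]
```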
Activation function before Normalization
model = Net(bn_1st=False)
model.stem
Stem(
sizes: [3, 64]
(conv0): ConvLayer(
(conv): Conv2d(3, 64, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
(bn): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(act_fn): ReLU(inplace=True)
)
(pool): MaxPool2d(kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)
)
Change activation function
import torch.nn as nn

act_fn = nn.LeakyReLU(inplace=True)
model = Net(act_fn=act_fn)
model.stem
Stem(
sizes: [3, 64]
(conv0): ConvLayer(
(conv): Conv2d(3, 64, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
(bn): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(act_fn): LeakyReLU(negative_slope=0.01, inplace=True)
)
(pool): MaxPool2d(kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)
)
model.body.layer_0.block_0.conv.conv_0
ConvLayer(
(conv): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(act_fn): LeakyReLU(negative_slope=0.01, inplace=True)
(bn): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
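Note that in the block printed above `act_fn` appears before `bn`: this is the ordering that the `bn_1st` flag from the earlier example controls. A toy sketch of that ordering logic (my illustration, not the library's implementation):

```python
# Sketch: bn_1st decides whether BatchNorm or the activation
# follows the conv inside each ConvLayer.
def conv_layer_order(bn_1st=True):
    layers = ["conv"]
    layers += ["bn", "act_fn"] if bn_1st else ["act_fn", "bn"]
    return layers

print(conv_layer_order(bn_1st=True))   # ['conv', 'bn', 'act_fn']
print(conv_layer_order(bn_1st=False))  # ['conv', 'act_fn', 'bn']
```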