
nb is a neural network builder for quick network prototyping

Project description

NB

Neural network Blocks (aka NB, or neural network builder). This library provides a large set of ready-made blocks that you can import to quickly build powerful models. SOTA tricks and connections such as CSP, ASFF, Attention, BaseConv, and Hardswish are all included so you can prototype your model quickly.

nb is an idea that comes from engineering: we build models from common blocks and explore new ideas with SOTA tricks, and all of those things can be gathered into one single place for quick model design and prototyping.

This project is under construction for now. I will update it quickly whenever I find new blocks that really work in a model, and every newly added block will be recorded in the Updates section.

Install

nb can be installed from PyPI; remember that the package name is nbnb:

sudo pip3 install nbnb

Usage

Here is an example of using NB to build YoloV5!

Update: a YoloV5-ASFF version has also been added to the examples!

import torch
from torch import nn
from nb.torch.blocks.bottleneck_blocks import SimBottleneckCSP
from nb.torch.blocks.trans_blocks import Focus
from nb.torch.blocks.head_blocks import SPP
from nb.torch.blocks.conv_blocks import ConvBase
from nb.torch.utils import device

class YoloV5(nn.Module):

    def __init__(self, num_cls=80, ch=3, anchors=None):
        super(YoloV5, self).__init__()
        assert anchors is not None, 'anchors must be provided'

        # divide channel counts and block depths by these factors
        cd = 2  # channel divisor
        wd = 3  # depth divisor (number of bottlenecks in each CSP block)

        self.focus = Focus(ch, 64//cd)
        self.conv1 = ConvBase(64//cd, 128//cd, 3, 2)
        self.csp1 = SimBottleneckCSP(128//cd, 128//cd, n=3//wd)
        self.conv2 = ConvBase(128//cd, 256//cd, 3, 2)
        self.csp2 = SimBottleneckCSP(256//cd, 256//cd, n=9//wd)
        self.conv3 = ConvBase(256//cd, 512//cd, 3, 2)
        self.csp3 = SimBottleneckCSP(512//cd, 512//cd, n=9//wd)
        self.conv4 = ConvBase(512//cd, 1024//cd, 3, 2)
        self.spp = SPP(1024//cd, 1024//cd)
        self.csp4 = SimBottleneckCSP(1024//cd, 1024//cd, n=3//wd, shortcut=False)

        # PANet
        self.conv5 = ConvBase(1024//cd, 512//cd)
        self.up1 = nn.Upsample(scale_factor=2)
        self.csp5 = SimBottleneckCSP(1024//cd, 512//cd, n=3//wd, shortcut=False)

        self.conv6 = ConvBase(512//cd, 256//cd)
        self.up2 = nn.Upsample(scale_factor=2)
        self.csp6 = SimBottleneckCSP(512//cd, 256//cd, n=3//wd, shortcut=False)

        self.conv7 = ConvBase(256//cd, 256//cd, 3, 2)
        self.csp7 = SimBottleneckCSP(512//cd, 512//cd, n=3//wd, shortcut=False)

        self.conv8 = ConvBase(512//cd, 512//cd, 3, 2)
        self.csp8 = SimBottleneckCSP(1024//cd, 1024//cd, n=3//wd, shortcut=False)

    def _build_backbone(self, x):
        x = self.focus(x)
        x = self.conv1(x)
        x = self.csp1(x)
        x_p3 = self.conv2(x)  # P3
        x = self.csp2(x_p3)
        x_p4 = self.conv3(x)  # P4
        x = self.csp3(x_p4)
        x_p5 = self.conv4(x)  # P5
        x = self.spp(x_p5)
        x = self.csp4(x)
        return x_p3, x_p4, x_p5, x

    def _build_head(self, p3, p4, p5, feas):
        h_p5 = self.conv5(feas)  # head P5
        x = self.up1(h_p5)
        x_concat = torch.cat([x, p4], dim=1)
        x = self.csp5(x_concat)

        h_p4 = self.conv6(x)  # head P4
        x = self.up2(h_p4)
        x_concat = torch.cat([x, p3], dim=1)
        x_small = self.csp6(x_concat)

        x = self.conv7(x_small)
        x_concat = torch.cat([x, h_p4], dim=1)
        x_medium = self.csp7(x_concat)

        x = self.conv8(x_medium)
        x_concat = torch.cat([x, h_p5], dim=1)
        x_large = self.csp8(x_concat)
        return x_small, x_medium, x_large

    def forward(self, x):
        p3, p4, p5, feas = self._build_backbone(x)
        xs, xm, xl = self._build_head(p3, p4, p5, feas)
        return xs, xm, xl
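
To quickly sanity check the model above, here is a small (hypothetical) smoke test; the anchor values are just placeholders, since the class above only asserts that anchors is not None:

# placeholder anchors: any non-None value satisfies the assert above
anchors = [[10, 13, 16, 30, 33, 23],
           [30, 61, 62, 45, 59, 119],
           [116, 90, 156, 198, 373, 326]]
model = YoloV5(num_cls=80, ch=3, anchors=anchors)

x = torch.randn(1, 3, 640, 640)
xs, xm, xl = model(x)  # small, medium and large scale feature maps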

A simple example of building a conv layer:

from nb.torch.base.conv_block import ConvBase
a = ConvBase(128, 256, 3, 1, 2, norm_cfg=dict(type="BN"), act_cfg=dict(type="Hardswish"))

Note that the reason we use cfg dicts to specify the norm and activation is so users can switch their model configuration dynamically, e.g. in YAML format, rather than hard-coding it; a small sketch of this idea is shown below.
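
For example, here is a hedged sketch of driving the same layer from a YAML snippet (the YAML keys below are just an illustration, not a fixed schema of nb):

import yaml  # PyYAML
from nb.torch.base.conv_block import ConvBase

# hypothetical config a user might keep outside the code
cfg_text = """
norm_cfg: {type: BN}
act_cfg: {type: Hardswish}
"""
cfg = yaml.safe_load(cfg_text)

# swap Hardswish for another activation by editing the YAML only
layer = ConvBase(128, 256, 3, 1, 2,
                 norm_cfg=cfg["norm_cfg"],
                 act_cfg=cfg["act_cfg"])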

A simple example of using GhostNet:

from nb.torch.backbones.ghostnet import GhostNet

m = GhostNet(num_classes=8)

# if you want FPN output
m = GhostNet(fpn_levels=[4, 5, 6])
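
A hedged sketch of a forward pass with the classification variant (the exact output format, especially with fpn_levels set, may differ):

import torch

m = GhostNet(num_classes=8)
m.eval()
with torch.no_grad():
    logits = m(torch.randn(1, 3, 224, 224))  # expected: class scores for 8 classes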

A simple example of using MobilenetV3:

from nb.torch.backbones.mobilenetv3_new import MobilenetV3_Small
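
A hedged sketch of instantiating it; the num_classes argument is an assumption mirrored from the GhostNet example and may not match the actual constructor:

import torch

m = MobilenetV3_Small(num_classes=8)  # num_classes is assumed here, check the real signature
out = m(torch.randn(1, 3, 224, 224))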

Updates

  • 2021.01.14: Added SiLU, introduced in PyTorch 1.7. You can now build an activation layer with:

    from nb.torch.base import build_activation_layer
    act = build_activation_layer(act_cfg=dict(type='SiLU'))
    

    A PANet module is also provided now. BiFPN is on the way. We will also provide more examples of how to use it!

  • 2020.09.28: ASFF module added to nb. We now have an ASFF version of YoloV5! Experiments will be added here once we confirm that the ASFF module improves model performance.

  • 2020.09.22: New GhostNet and MobileNetV3 backbones included. Either of them can be used as a drop-in replacement for your application's backbone.

  • 2020.09.14: We released a preliminary version, 0.0.4, with which you can easily build a simple YoloV5 with nb!

    pip install nbnb
    
  • 2020.09.12: New backbone SpineNet added:

    SpineNet is a backbone designed specifically for detection: it is a backbone, but it can also do an FPN's job! For more info, please refer to Google's paper.

    from nb.torch.backbones.spinenet import SpineNet
    
    model = SpineNet()
    
  • 2020.09.11: Newly added blocks:

    resnet.Bottleneck
    resnet.BasicBlock
    
    ConvBase
    

Support Matrix

We list all convs and blocks supported in nb here:

Copyright

@Lucas Jin all rights reserved.

