
AugShuffleNet: Communicate More, Compute Less - Pytorch

Project description

AugShuffleNet: Communicate More, Compute Less

AugShuffleNet1_0x has slightly different channels than the paper due to constraints imposed by the default r.

See: https://arxiv.org/abs/2203.06589

Usage

import torch
from augshufflenet_pytorch import AugShuffleNet0_5x, AugShuffleNet1_0x, AugShuffleNet1_5x, AugShuffleNet


model = AugShuffleNet0_5x(input_channels=3)
x = model(torch.randn(1, 3, 64, 64)) # [1, 192]

# Equivalent to 0_5x
model = AugShuffleNet(stages_repeats=[3, 7, 3], stages_out_channels=[24, 48, 96, 192], input_channels=3, r=0.375)
x = model(torch.randn(1, 3, 64, 64)) # [1, 192]
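
The models return a pooled feature vector whose width equals the last entry of stages_out_channels ([1, 192] above), so a task-specific head has to be attached for classification tasks such as CIFAR-10. A minimal sketch, assuming the backbone output is exactly that feature vector; the wrapper class and Linear head below are illustrative, not part of the package:

import torch
import torch.nn as nn
from augshufflenet_pytorch import AugShuffleNet0_5x

# Hypothetical classifier wrapper (not part of augshufflenet-pytorch)
class AugShuffleClassifier(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.backbone = AugShuffleNet0_5x(input_channels=3)
        self.head = nn.Linear(192, num_classes)  # 192 = last stage width of the 0_5x config

    def forward(self, x):
        return self.head(self.backbone(x))

logits = AugShuffleClassifier()(torch.randn(1, 3, 64, 64))  # [1, 10]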

NOTE: each of int(out_channels * r) and out_channels needs to be divisible by 2
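
A quick way to sanity-check a custom configuration against this rule (an illustrative snippet, not part of the package; it assumes the rule applies to the per-stage out_channels rather than the stem width):

for c in (48, 96, 192):          # stage widths of the 0_5x config above
    split = int(c * 0.375)       # channels routed through the r-split
    assert c % 2 == 0 and split % 2 == 0, (c, split)
# 48 -> 18, 96 -> 36, 192 -> 72: all even, so r=0.375 is valid here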

Citation

@misc{ye_augshufflenet_2022,
	title = {{AugShuffleNet}: {Communicate} {More}, {Compute} {Less}},
	shorttitle = {{AugShuffleNet}},
	url = {http://arxiv.org/abs/2203.06589},
	doi = {10.48550/arXiv.2203.06589},
	abstract = {As a remarkable compact model, ShuffleNetV2 offers a good example to design efficient ConvNets but its limit is rarely noticed. In this paper, we rethink the design pattern of ShuffleNetV2 and find that the channel-wise redundancy problem still constrains the efficiency improvement of Shuffle block in the wider ShuffleNetV2. To resolve this issue, we propose another augmented variant of shuffle block in the form of bottleneck-like structure and more implicit short connections. To verify the effectiveness of this building block, we further build a more powerful and efficient model family, termed as AugShuffleNets. Evaluated on the CIFAR-10 and CIFAR-100 datasets, AugShuffleNet consistently outperforms ShuffleNetV2 in terms of accuracy with less computational cost and fewer parameter count.},
	urldate = {2023-06-09},
	publisher = {arXiv},
	author = {Ye, Longqing},
	month = aug,
	year = {2022},
	note = {arXiv:2203.06589 [cs]},
	keywords = {Computer Science - Computer Vision and Pattern Recognition, Computer Science - Machine Learning},
}

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

augshufflenet-pytorch-0.0.2.tar.gz (9.2 kB)

Uploaded Source

File details

Details for the file augshufflenet-pytorch-0.0.2.tar.gz.

File metadata

  • Download URL: augshufflenet-pytorch-0.0.2.tar.gz
  • Upload date:
  • Size: 9.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.6

File hashes

Hashes for augshufflenet-pytorch-0.0.2.tar.gz
  • SHA256: ebc60812de5935f68d2f3aa208e7d35bacf7cd025cdce805b331b43e0021ce1b
  • MD5: a76602bcb6fe33537bd7d5ffc06b4b47
  • BLAKE2b-256: 58b208103c308fd90c43e0389c8a36affbff3b003ddc36e3927810241c19165b

