NASBench-PyTorch is a PyTorch implementation of the NAS-Bench-101 search space, including training of the networks**. The original implementation is written in TensorFlow, and this project contains some files from the original repository (in the directory nasbench_pytorch/model/).

Important: if you want to reproduce the original results, please refer to the Reproducibility section.


A PyTorch implementation of the training pipeline for the NAS-Bench-101 dataset: NAS-Bench-101: Towards Reproducible Neural Architecture Search. The dataset contains 423,624 unique neural networks, exhaustively generated and evaluated from a fixed graph-based search space.
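The fixed search space described in the paper constrains each cell to at most 7 vertices, at most 9 edges, and three intermediate operation choices. The sketch below illustrates these constraints as a validity check; `is_valid_spec` and the example matrix are illustrative helpers, not part of the package:

```python
# Illustrative check of the NAS-Bench-101 search-space constraints: a cell is a
# DAG with at most 7 vertices and 9 edges, encoded as an upper-triangular 0/1
# adjacency matrix plus a list of per-vertex operations.
ALLOWED_OPS = {'conv3x3-bn-relu', 'conv1x1-bn-relu', 'maxpool3x3'}

def is_valid_spec(adjacency, ops):
    n = len(adjacency)
    if n != len(ops) or n > 7:
        return False
    # The first and last vertices are the fixed input/output of the cell.
    if ops[0] != 'input' or ops[-1] != 'output':
        return False
    if any(op not in ALLOWED_OPS for op in ops[1:-1]):
        return False
    # Edges may only go from a lower-indexed to a higher-indexed vertex.
    if any(adjacency[i][j] and j <= i for i in range(n) for j in range(n)):
        return False
    return sum(map(sum, adjacency)) <= 9

# A 5-vertex example cell (hand-made, not taken from the dataset):
adj = [[0, 1, 1, 0, 0],
       [0, 0, 0, 1, 0],
       [0, 0, 0, 1, 1],
       [0, 0, 0, 0, 1],
       [0, 0, 0, 0, 0]]
ops = ['input', 'conv3x3-bn-relu', 'maxpool3x3', 'conv1x1-bn-relu', 'output']
print(is_valid_spec(adj, ops))  # True
```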


You need to have PyTorch installed.

You can install the package by running pip install nasbench_pytorch. Alternatively, you can install it from the source code:

  1. Clone this repo
git clone
cd NASBench-PyTorch
  2. Install the project
pip install -e .

The example script shows how to train a network. To see the available parameters, run:

python --help

Train a network by hash

To train a network whose architecture is queried from NAS-Bench-101 using its unique hash, install the original nasbench repository. Follow the instructions in its README; note that you need to install TensorFlow. If you need TensorFlow 2.x, install this fork of the repository instead.

Then, you can get the PyTorch architecture of a network like this:

from nasbench_pytorch.model import Network as NBNetwork
from nasbench import api

nasbench_path = '$path_to_downloaded_nasbench'  # path to the downloaded .tfrecord file
nb = api.NASBench(nasbench_path)

net_hash = '$some_hash'  # you can get hashes using nasbench.hash_iterator()
m = nb.get_metrics_from_hash(net_hash)
ops = m[0]['module_operations']
adjacency = m[0]['module_adjacency']

net = NBNetwork((adjacency, ops))
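The `(adjacency, ops)` pair queried above describes the cell as a DAG. A small sketch of how to read that encoding — `describe_cell` is a hypothetical helper, and the sample values below only mimic the shape of the query results, they are not an actual dataset entry:

```python
# Hypothetical helper: turn an (adjacency, ops) pair into a readable edge
# list, pairing each connection with its operation labels.
def describe_cell(adjacency, ops):
    edges = []
    for i, row in enumerate(adjacency):
        for j, connected in enumerate(row):
            if connected:
                edges.append((ops[i], ops[j]))
    return edges

# Sample values mimicking the shape of NAS-Bench-101 query results:
adjacency = [[0, 1, 0],
             [0, 0, 1],
             [0, 0, 0]]
ops = ['input', 'conv3x3-bn-relu', 'output']
print(describe_cell(adjacency, ops))
# [('input', 'conv3x3-bn-relu'), ('conv3x3-bn-relu', 'output')]
```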

Then, you can train it just like the example network.


Example architecture (picture from the original repository)


The code should closely match the TensorFlow version (including the hyperparameters), but there are some differences:

  • The RMSProp implementations in TensorFlow and PyTorch differ

    • For more information, refer to here and here.
    • Optionally, you can install pytorch-image-models, where a TensorFlow-like RMSProp is implemented
      • pip install timm
    • Then, pass --optimizer rmsprop_tf to the training script to use it
  • You can turn gradient clipping off by setting --grad_clip_off True

  • The original training was on TPUs; this code supports only GPU and CPU training

  • Input data augmentation methods are the same, but due to randomness they are not applied in the same manner

    • Cause: Batches and images cannot be shuffled as in the original TPU training, and the augmentation seed is also different
  • Results may still differ due to TensorFlow/PyTorch implementation differences

Refer to this issue for more information and for comparison with API results.
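The RMSProp discrepancy above comes down (among other things) to where epsilon enters the denominator: TensorFlow adds it inside the square root, while PyTorch adds it outside. A minimal numeric sketch in plain Python, with illustrative hyperparameter values:

```python
import math

# One RMSProp step on a scalar, showing where epsilon enters the denominator
# in each framework (values are illustrative, not the paper's hyperparameters):
lr, rho, eps = 0.1, 0.9, 1e-8
grad, sq_avg = 1e-3, 0.0

sq_avg = rho * sq_avg + (1 - rho) * grad ** 2    # running avg of squared grads

tf_step = lr * grad / math.sqrt(sq_avg + eps)    # TensorFlow: eps inside sqrt
pt_step = lr * grad / (math.sqrt(sq_avg) + eps)  # PyTorch: eps outside sqrt

print(tf_step, pt_step)  # the updates differ noticeably for small gradients
```

For small gradients the two denominators diverge, so the two frameworks take visibly different steps even with identical hyperparameters — which is why the timm rmsprop_tf optimizer exists.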


Modified from NASBench: A Neural Architecture Search Dataset and Benchmark. Some files are copied directly from the original repo. The original license can be found here.

**Please note that this repo is only used to train one possible architecture in the search space, not to generate all possible graphs and train them.
