
NASBench-PyTorch

NASBench-PyTorch is a PyTorch implementation of the NAS-Bench-101 search space, including training of the networks**. The original implementation is written in TensorFlow, and this project contains some files from the original repository (in the directory nasbench_pytorch/model/).

Important: if you want to reproduce the original results, please refer to the Reproducibility section.

Overview

A PyTorch implementation of training networks from the NAS-Bench-101 dataset: NAS-Bench-101: Towards Reproducible Neural Architecture Search. The dataset contains 423,624 unique neural networks, exhaustively generated and evaluated from a fixed graph-based search space.

Usage

You need to have PyTorch installed.

You can install the package by running pip install nasbench_pytorch. Alternatively, install it from source:

  1. Clone this repo
git clone https://github.com/romulus0914/NASBench-PyTorch
cd NASBench-PyTorch
  2. Install the project
pip install -e .

The file main.py contains an example training of a network. To see the different parameters, run:

python main.py --help

Train a network by hash

To train a network whose architecture is queried from NAS-Bench-101 using its unique hash, install the original nasbench repository. Follow the instructions in its README; note that you need to install TensorFlow. If you need TensorFlow 2.x, install this fork of the repository instead.

Then, you can get the PyTorch architecture of a network like this:

from nasbench_pytorch.model import Network as NBNetwork
from nasbench import api


nasbench_path = '$path_to_downloaded_nasbench'
nb = api.NASBench(nasbench_path)

net_hash = '$some_hash'  # you can get hashes using nasbench.hash_iterator()
m = nb.get_metrics_from_hash(net_hash)
ops = m[0]['module_operations']
adjacency = m[0]['module_adjacency']

net = NBNetwork((adjacency, ops))

Then, you can train it just like the example network in main.py.

Architecture

Example architecture (picture from the original repository).

Reproducibility

The code should closely match the TensorFlow version (including the hyperparameters), but there are some differences:

  • The RMSProp implementations in TensorFlow and PyTorch differ

    • For more information refer to here and here.
    • Optionally, you can install pytorch-image-models where a TensorFlow-like RMSProp is implemented
      • pip install timm
    • Then, pass --optimizer rmsprop_tf to main.py to use it
  • You can turn gradient clipping off by setting --grad_clip_off True

  • The original training was done on TPUs; this code supports only GPU and CPU training

  • Input data augmentation methods are the same, but due to randomness they are not applied in the same manner

    • Cause: Batches and images cannot be shuffled as in the original TPU training, and the augmentation seed is also different
  • Results may still differ due to TensorFlow/PyTorch implementation differences
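The RMSProp difference mentioned above comes down to where epsilon is added: TensorFlow puts it inside the square root, PyTorch outside. The plain-Python sketch below illustrates this; the hyperparameter values are illustrative and not the repo's defaults, and `rmsprop_step` is our own helper, not part of either framework.

```python
# Minimal sketch (no frameworks) of the epsilon placement that makes
# TensorFlow's and PyTorch's RMSProp diverge. Values are illustrative.
import math

def rmsprop_step(param, grad, sq_avg, lr=0.01, alpha=0.9, eps=1.0, tf_style=False):
    # Exponential moving average of squared gradients (same in both frameworks).
    sq_avg = alpha * sq_avg + (1 - alpha) * grad ** 2
    if tf_style:
        denom = math.sqrt(sq_avg + eps)   # TensorFlow: eps inside the sqrt
    else:
        denom = math.sqrt(sq_avg) + eps   # PyTorch: eps outside the sqrt
    return param - lr * grad / denom, sq_avg

p_tf, _ = rmsprop_step(1.0, 0.5, 0.0, tf_style=True)
p_pt, _ = rmsprop_step(1.0, 0.5, 0.0, tf_style=False)
print(p_tf, p_pt)  # the two updates differ, especially early in training
```

This is why passing --optimizer rmsprop_tf (backed by timm's TensorFlow-style RMSProp) brings the training closer to the original pipeline.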

Refer to this issue for more information and for comparison with API results.
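The gradient clipping toggled by --grad_clip_off rescales gradients whose joint L2 norm exceeds a threshold (as PyTorch's torch.nn.utils.clip_grad_norm_ does). A plain-Python sketch, assuming global-norm clipping; the helper name and numbers are ours, not the repo's API:

```python
# Illustrative sketch of gradient clipping by global norm: if the joint
# L2 norm of all gradients exceeds max_norm, scale them all down uniformly.
import math

def clip_by_global_norm(grads, max_norm):
    """Scale a list of gradients so their joint L2 norm is at most max_norm."""
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm
        grads = [g * scale for g in grads]
    return grads

clipped = clip_by_global_norm([3.0, 4.0], max_norm=1.0)  # norm 5 -> rescaled
print(clipped)  # close to [0.6, 0.8], i.e. norm 1.0
```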

Disclaimer

Modified from NASBench: A Neural Architecture Search Dataset and Benchmark. graph_util.py and model_spec.py are directly copied from the original repo. Original license can be found here.

**Please note that this repo is only used to train one possible architecture in the search space, not to generate all possible graphs and train them.
