
License: MIT

WavPool

A network block with built-in spatial and scale decomposition.

Modern deep neural networks comprise many operational layers, such as dense or convolutional layers, which are often collected into blocks. In this work, we introduce a new, wavelet-transform-based network architecture that we call the multi-resolution perceptron: by adding a pooling layer, we create a new network block, the WavPool. The first step of the multi-resolution perceptron is transforming the data into its multi-resolution decomposition form by convolving the input data with filters of fixed coefficients but increasing size. Following image-processing techniques, we are able to make scale and spatial information simultaneously accessible to the network without increasing the size of the data vector. WavPool outperforms a similar multilayer perceptron while using fewer parameters, and outperforms a comparable convolutional neural network by over 10% in accuracy on CIFAR-10.
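The multi-resolution decomposition described above can be illustrated with a minimal Haar wavelet example. This is a sketch in plain NumPy, not the project's actual implementation (which lives in wavpool.py and wavelet_layer.py); it shows how recursive decomposition exposes scale and spatial information while keeping the total coefficient count equal to the input size:

```python
import numpy as np

def haar_step(x):
    """One level of a 2D Haar decomposition.

    Splits an (H, W) array (H, W even) into four (H/2, W/2) subbands:
    approximation (LL) plus horizontal, vertical, diagonal details.
    """
    a = (x[0::2, 0::2] + x[0::2, 1::2] + x[1::2, 0::2] + x[1::2, 1::2]) / 4
    h = (x[0::2, 0::2] + x[0::2, 1::2] - x[1::2, 0::2] - x[1::2, 1::2]) / 4
    v = (x[0::2, 0::2] - x[0::2, 1::2] + x[1::2, 0::2] - x[1::2, 1::2]) / 4
    d = (x[0::2, 0::2] - x[0::2, 1::2] - x[1::2, 0::2] + x[1::2, 1::2]) / 4
    return a, (h, v, d)

def multires_decomposition(x, levels):
    """Recursively decompose: detail subbands at each scale + final approximation."""
    details = []
    for _ in range(levels):
        x, hvd = haar_step(x)
        details.append(hvd)
    return x, details

# A 32x32 image (CIFAR-10 sized) decomposed over 3 scales keeps the
# total number of coefficients equal to the input size.
img = np.random.rand(32, 32)
approx, details = multires_decomposition(img, levels=3)
total = approx.size + sum(h.size + v.size + d.size for (h, v, d) in details)
assert total == img.size  # scale + spatial info without growing the data vector
```

The key property is the last assertion: each level halves the spatial resolution while adding three detail subbands, so the decomposition reorganizes the data vector rather than enlarging it.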

This codebase contains the experimental work supporting the paper. It is intended as additional material for replicating our results.

Installation

Our project can be installed from PyPI with pip:

pip install wavpool

This project is built with Python Poetry, which is our preferred method for installing from source.

To install all the dependencies required for this project, run:

pip install poetry
poetry install
poetry shell

(There is no need to run poetry init; that command creates a new pyproject.toml, and this repository already provides one.)

We also supply distribution files (found in dist/), or you may use the provided pyproject.toml to install with the method of your choice.

Contents

Data Generators

The PyTorch data-generator objects for the experiments done in this paper, wrapped to work with the training framework but functionally unmodified. We include CIFAR-10 (cifar_generator.py), Fashion MNIST (fashion_mnist_generator.py), and MNIST (mnist_generator.py).

Training

Training loops used in the experiments.

finetune_networks.py generates a set of parameters optimal for a network/task combination.

train_model.py executes the training loop for a network/task/parameter combination.
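The project's actual training loop is in train_model.py and is not reproduced here; as an illustration only, the general shape of such a loop (fit a model for a fixed parameter combination, track the loss) can be sketched with a toy NumPy linear model:

```python
import numpy as np

def train_model(x, y, lr=0.1, epochs=200):
    """Toy stand-in for a network/task/parameter training loop:
    fit y = x @ w + b by gradient descent on mean squared error."""
    rng = np.random.default_rng(0)
    w, b = rng.normal(size=x.shape[1]), 0.0
    for _ in range(epochs):
        pred = x @ w + b
        err = pred - y
        w -= lr * (x.T @ err) / len(y)   # dMSE/dw
        b -= lr * err.mean()             # dMSE/db
        loss = (err ** 2).mean()
    return w, b, loss

# Synthetic task: recover known weights from noiseless linear data.
rng = np.random.default_rng(1)
x = rng.normal(size=(256, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = x @ true_w + 0.3
w, b, loss = train_model(x, y)
```

The real loop differs in the obvious ways (PyTorch modules, batched data generators, the hyperparameters chosen by finetune_networks.py), but the fit/evaluate structure is the same.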

Models

wavpool.py Our implementation of the novel WavPool block

vanillaCNN.py Standard two layer CNN containing 2D Convolutions, batch norms, and a dense output

vanillaMLP.py Standard two hidden layer MLP

wavelet_layer.py The MicroWav multi-resolution (MLR) analysis layer

wavMLP.py Single MicroWav layer network with an additional dense layer and output. Not included in the paper.
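The idea behind these blocks (scale-wise dense layers over a wavelet decomposition, followed by pooling) can be sketched schematically. This is a NumPy illustration under the assumption of a Haar decomposition, with hypothetical sizes; the project's actual layer definitions are in wavelet_layer.py and wavpool.py:

```python
import numpy as np

def haar_level(x):
    """One Haar level: return (approximation, flattened detail coefficients)."""
    a = (x[0::2, 0::2] + x[0::2, 1::2] + x[1::2, 0::2] + x[1::2, 1::2]) / 4
    h = (x[0::2, 0::2] + x[0::2, 1::2] - x[1::2, 0::2] - x[1::2, 1::2]) / 4
    v = (x[0::2, 0::2] - x[0::2, 1::2] + x[1::2, 0::2] - x[1::2, 1::2]) / 4
    d = (x[0::2, 0::2] - x[0::2, 1::2] - x[1::2, 0::2] + x[1::2, 1::2]) / 4
    return a, np.concatenate([h.ravel(), v.ravel(), d.ravel()])

def wavpool_block(x, hidden=16, levels=3, seed=0):
    """Schematic WavPool-like block: one dense layer per scale, then pooling."""
    rng = np.random.default_rng(seed)
    per_scale = []
    for _ in range(levels):
        x, coeffs = haar_level(x)
        # One (randomly initialized) dense layer per scale -- in the real
        # block these weights are learned.
        w = rng.normal(scale=0.1, size=(coeffs.size, hidden))
        per_scale.append(np.tanh(coeffs @ w))
    stacked = np.stack(per_scale)   # (levels, hidden): one embedding per scale
    return stacked.max(axis=0)      # pool across scales -> (hidden,)

out = wavpool_block(np.random.rand(32, 32))
assert out.shape == (16,)
```

Each scale gets its own small dense transform, and the pooling step is what turns the multi-resolution perceptron into the WavPool block.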

Notebooks

Visualizations of the experiments, including the plotting code for the figures in the paper and code to produce weights.

run_experiments.py

Takes a configuration and trains a model. The current script runs the optimization and subsequent training and testing of a WavPool on CIFAR-10, Fashion MNIST, and MNIST.
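The configuration schema is not documented here; purely as an illustration, such a configuration might pair a dataset, a model, and a set of tuned hyperparameters. All field names below are hypothetical, not the project's actual schema:

```python
# Hypothetical configuration shape -- field names are illustrative only.
experiment = {
    "dataset": "cifar10",          # cifar10 | fashion_mnist | mnist
    "model": "wavpool",            # wavpool | vanilla_cnn | vanilla_mlp
    "hyperparameters": {           # e.g. produced by finetune_networks.py
        "learning_rate": 1e-3,
        "batch_size": 128,
        "epochs": 50,
    },
}
```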

Acknowledgement

We acknowledge the Deep Skies Lab as a community of multi-domain experts and collaborators who've facilitated an environment of open discussion, idea-generation, and collaboration. This community was important for the development of this project. We thank Aleksandra Ciprijanovic, Andrew Hearin, and Shubhendu Trivedi for comments on the manuscript. This manuscript has been authored by Fermi Research Alliance, LLC under Contract No. DE-AC02-07CH11359 with the U.S. Department of Energy, Office of Science, Office of High Energy Physics.

FERMILAB-CONF-23-278-CSAID

Citation

If you use our work or our code, we request that you cite the arXiv paper!

@misc{mcdermott2023wavpool,
      title={WavPool: A New Block for Deep Neural Networks}, 
      author={Samuel D. McDermott and M. Voetberg and Brian Nord},
      year={2023},
      eprint={2306.08734},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

