A collection of tools for neural compression enthusiasts.

NeuralCompression

What's New

About

NeuralCompression is a Python repository dedicated to research on neural networks that compress data. The repository includes tools such as JAX-based entropy coders, image compression models, video compression models, and metrics for image and video evaluation.

NeuralCompression is alpha software. The project is under active development. The API will change as we make releases, potentially breaking backwards compatibility.

Installation

NeuralCompression is a project currently under development. You can install the repository in development mode.

Development Installation

To match your local environment to the test environment, first run

pip install -r requirements.txt

Then, you can install the package in development mode by running

pip install -e .

If you do not plan to extend the package or run its tests, the second step alone is enough to install it.
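
Either way, a quick way to check that the installation worked is to confirm the package imports without errors:

python -c "import neuralcompression"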

Repository Structure

We use a 2-tier repository structure. The neuralcompression package contains a core set of tools for doing neural compression research. Code committed to the core package requires stricter linting, high code quality, and rigorous review. The projects folder contains code for reproducing papers and training baselines. Code in this folder is not linted as aggressively, type annotations are not enforced, and unit tests may be omitted.

The 2-tier structure enables rapid iteration and reproduction via code in projects that is built on a backbone of high-quality code in neuralcompression.

neuralcompression

  • neuralcompression - base package
    • data - PyTorch data loaders for various data sets
    • entropy_coders - lossless compression algorithms in JAX
      • craystack - an implementation of the rANS algorithm with the craystack API
    • functional - methods for image warping, information cost, etc.
    • layers - building blocks for compression models
    • metrics - torchmetrics classes for assessing model performance
    • models - complete compression models

projects
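
As a rough sketch of how the two tiers fit together, code under projects imports its building blocks from the core package. The subpackage names below come from the structure above; the specific classes and functions inside each one should be looked up in the source.

import neuralcompression.data            # PyTorch data loaders
import neuralcompression.entropy_coders  # lossless coders implemented in JAX
import neuralcompression.functional      # image warping, information cost, and similar operations
import neuralcompression.layers          # building blocks for compression models
import neuralcompression.metrics         # torchmetrics classes for evaluation
import neuralcompression.models          # complete compression models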

Getting Started

For an example of package usage, see the Scale Hyperprior project, which shows how to train an image compression model with PyTorch Lightning. See DVC for a video compression example.
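
For orientation, the overall shape of such a training script is sketched below with a deliberately tiny, self-contained toy model and random data; it is not the Scale Hyperprior itself, and a real project would instead use models, metrics, and data loaders from neuralcompression.

import torch
import pytorch_lightning as pl
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

class ToyCompressionModel(pl.LightningModule):
    # Tiny stand-in for a real compression model such as the scale hyperprior.
    def __init__(self, rate_weight=0.01):
        super().__init__()
        self.rate_weight = rate_weight
        self.encoder = nn.Conv2d(3, 8, kernel_size=5, stride=2, padding=2)
        self.decoder = nn.ConvTranspose2d(8, 3, kernel_size=5, stride=2, padding=2, output_padding=1)

    def training_step(self, batch, batch_idx):
        (x,) = batch
        latent = self.encoder(x)
        x_hat = self.decoder(latent)
        distortion = nn.functional.mse_loss(x_hat, x)
        rate_proxy = latent.abs().mean()  # crude placeholder for an entropy model's rate term
        loss = distortion + self.rate_weight * rate_proxy
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# Random images stand in for a real dataset; see neuralcompression.data for actual loaders.
images = TensorDataset(torch.rand(16, 3, 64, 64))
trainer = pl.Trainer(max_epochs=1, logger=False, enable_checkpointing=False)
trainer.fit(ToyCompressionModel(), DataLoader(images, batch_size=4))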

Contributions

Please read our CONTRIBUTING guide and our CODE_OF_CONDUCT prior to submitting a pull request.

We test all pull requests and rely on those tests during review, so please make sure any new code is tested. Tests for neuralcompression go in the tests folder at the root of the repository. Tests for individual projects go in each project's own tests folder.

We use black for formatting, isort for import sorting, flake8 for linting, and mypy for type checking. We enforce these on the neuralcompression package, but not in the projects folder.
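
You can run the same tools locally before opening a pull request; the exact arguments and configuration live in the repository, but invocations along these lines are typical:

black neuralcompression
isort neuralcompression
flake8 neuralcompression
mypy neuralcompression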

License

NeuralCompression is MIT licensed, as found in the LICENSE file.

Cite

If you find NeuralCompression useful in your work, feel free to cite

@misc{muckley2021neuralcompression,
    author={Matthew Muckley and Jordan Juravsky and Daniel Severo and Mannat Singh and Quentin Duval and Karen Ullrich},
    title={NeuralCompression},
    howpublished={\url{https://github.com/facebookresearch/NeuralCompression}},
    year={2021}
}
