A performant implementation of the principle of Maximum Coding Rate Reduction (MCR2).
Project description
Maximal Coding Rate Reduction
This repository is an unofficial implementation of the following papers:
- Learning Diverse and Discriminative Representations via the Principle of Maximal Coding Rate Reduction (2020)
- ReduNet: A White-box Deep Network from the Principle of Maximizing Rate Reduction (2021)
- More on the way.
This also serves as the host repository for the Pip package.
What is Maximal Coding Rate Reduction?
Our goal is to learn a mapping that takes high-dimensional data lying on a low-dimensional manifold to low-dimensional linear subspaces with the following three properties:
- Between-Class Discriminative: Features of samples from different classes/clusters should be highly uncorrelated and belong to different low-dimensional linear subspaces.
- Within-Class Compressible: Features of samples from the same class/cluster should be relatively correlated, in the sense that they belong to a single low-dimensional linear subspace.
- Maximally Diverse Representation: The dimension (or variance) of the features for each class/cluster should be as large as possible, as long as they stay uncorrelated with the other classes.
To achieve this, we propose an objective function called Maximal Coding Rate Reduction (MCR2). In our paper, we provide not only theoretical guarantees of the desired properties upon convergence, but also practical benefits such as robustness to label corruption and empirical results such as state-of-the-art unsupervised clustering performance. For more details on algorithm design, please refer to our paper.
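The objective above can be sketched in a few lines of NumPy. The coding rate R(Z) measures the volume (in bits, up to distortion eps) spanned by features Z, and MCR2 maximizes the difference between the rate of all features and the average rate of each class's features. This is a minimal illustration of the formulas, not this package's API; the function names and the default eps are our own choices.

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """R(Z): coding rate of features Z (d x n, one sample per column)
    up to distortion eps, via 1/2 * logdet(I + d/(n*eps^2) * Z Z^T)."""
    d, n = Z.shape
    I = np.eye(d)
    return 0.5 * np.linalg.slogdet(I + (d / (n * eps**2)) * Z @ Z.T)[1]

def rate_reduction(Z, labels, eps=0.5):
    """Delta R = R(Z) - sum_j (n_j / (2n)) * logdet-term for class j:
    the quantity the MCR2 objective maximizes."""
    d, n = Z.shape
    Rc = 0.0
    for j in np.unique(labels):
        Zj = Z[:, labels == j]
        nj = Zj.shape[1]
        I = np.eye(d)
        # class-conditional rate, weighted by the class's share of samples
        Rc += (nj / (2.0 * n)) * np.linalg.slogdet(
            I + (d / (nj * eps**2)) * Zj @ Zj.T)[1]
    return coding_rate(Z, eps) - Rc

# Two classes lying in orthogonal one-dimensional subspaces: the
# rate reduction is strictly positive, as the theory predicts.
Z = np.zeros((4, 4))
Z[0, 0] = Z[0, 1] = 1.0   # class 0 spans e1
Z[1, 2] = Z[1, 3] = 1.0   # class 1 spans e2
print(rate_reduction(Z, np.array([0, 0, 1, 1])))
```

Maximizing this quantity expands the total feature volume (diversity) while compressing each class to its own subspace (discrimination).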
What is ReduNet?
Our goal is to build a neural network for representation learning with the following properties:
- Interpretable: We should be able to interpret each network operator and assign precise meaning to each layer and parameter.
- Forward-Propagation Only: The network should be trained using more interpretable forward-propagation methods, as opposed to back-propagation, which tends to create black boxes.
- Use MCR2: The network should seek to optimize the MCR2 loss function, as its purpose is distribution learning.
To achieve this, we propose a neural network architecture and algorithms called ReduNet. In our paper, we provide not only theoretical interpretations and a precise derivation of each operator in the network, but also connections to other architectures that form naturally as components of ReduNet. We also provide empirical justification for the power of ReduNet. For more details on algorithm design, please refer to our paper.
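As a rough illustration of the idea, a ReduNet layer can be viewed as one projected gradient-ascent step on the rate-reduction objective, built from closed-form "expansion" and "compression" operators rather than learned weights. The sketch below is our own simplified rendering of that structure for the supervised case (hard class labels); the function name, the step size eta, and the defaults are assumptions, and the real package's interface may differ.

```python
import numpy as np

def redunet_layer(Z, labels, eps=0.5, eta=0.5):
    """One forward-only ReduNet-style update on features Z (d x n,
    unit-norm columns). The expansion operator E pushes all features
    apart; each class's compression operator Cj pulls that class's
    features together; the result is renormalized to the sphere."""
    d, n = Z.shape
    I = np.eye(d)
    alpha = d / (n * eps**2)
    E = alpha * np.linalg.inv(I + alpha * Z @ Z.T)   # expansion operator
    update = E @ Z
    for j in np.unique(labels):
        mask = labels == j
        Zj = Z[:, mask]
        nj = Zj.shape[1]
        alpha_j = d / (nj * eps**2)
        Cj = alpha_j * np.linalg.inv(I + alpha_j * Zj @ Zj.T)  # compression
        update[:, mask] -= (nj / n) * (Cj @ Zj)
    Z_new = Z + eta * update
    return Z_new / np.linalg.norm(Z_new, axis=0, keepdims=True)

# One layer applied to random unit-norm features for two classes.
rng = np.random.default_rng(0)
Z = rng.standard_normal((8, 20))
Z /= np.linalg.norm(Z, axis=0)
Z1 = redunet_layer(Z, np.array([0] * 10 + [1] * 10))
```

Because every operator is computed in closed form from the current features, the whole network is constructed layer by layer in the forward pass, with no back-propagation.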
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file mcr2-1.0.0.tar.gz.
File metadata
- Download URL: mcr2-1.0.0.tar.gz
- Upload date:
- Size: 7.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.8
File hashes
Algorithm | Hash digest
---|---
SHA256 | f5e606d0a3bfbd8ecef79a019f6e453f396589ae9a43f3a53b137e9475234d81
MD5 | 3a2b06225fb4fa84494f6700309bffb9
BLAKE2b-256 | 89524216db3928c3502265c5624aceea9bab01b29e3fb6de01239a889b6acefc
File details
Details for the file mcr2-1.0.0-py3-none-any.whl.
File metadata
- Download URL: mcr2-1.0.0-py3-none-any.whl
- Upload date:
- Size: 8.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.8
File hashes
Algorithm | Hash digest
---|---
SHA256 | b21e271372107be62b4e4afb621ea9c87c86a801ae9a370996bc461a5a4bbe1d
MD5 | f73a2783d1cb93a749301962e8ee5355
BLAKE2b-256 | 64b220aab692225c8f4db41ce8949a7f475869dae403e602152e84a5d23e74f9