A performant implementation of the principle of Maximal Coding Rate Reduction (MCR2).
Project description
Maximal Coding Rate Reduction
This repository is an unofficial implementation of the following papers:
- Learning Diverse and Discriminative Representations via the Principle of Maximal Coding Rate Reduction (2020)
- ReduNet: A White-box Deep Network from the Principle of Maximizing Rate Reduction (2021)
- More on the way.
This also serves as the host repository for the Pip package.
What is Maximal Coding Rate Reduction?
Our goal is to learn a mapping that takes high-dimensional data lying on a low-dimensional manifold to low-dimensional linear subspaces with the following three properties:
- Between-Class Discriminative: Features of samples from different classes/clusters should be highly uncorrelated and belong to different low-dimensional linear subspaces
- Within-Class Compressible: Features of samples from the same class/cluster should be relatively correlated, in the sense that they belong to a low-dimensional linear subspace
- Maximally Diverse Representation: The dimension (or variance) of the features for each class/cluster should be as large as possible, as long as they stay uncorrelated with those of the other classes
To achieve this, we propose an objective function called Maximal Coding Rate Reduction (MCR2). In our paper, we provide not only theoretical guarantees that the desired properties hold upon convergence, but also practical benefits such as robustness to label corruption and empirical results such as state-of-the-art unsupervised clustering performance. For more details on the algorithm design, please refer to our paper.
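To make the objective concrete, below is a minimal PyTorch sketch of the rate-reduction quantity from the paper: the coding rate R(Z, ε) = ½ log det(I + d/(nε²) ZᵀZ) of all n d-dimensional features, minus the class-conditional coding rates weighted by class frequency. The function names (`coding_rate`, `mcr2_loss`) and the default ε here are illustrative assumptions, not the published package's API.

```python
import torch


def coding_rate(Z, eps=0.5):
    """R(Z, eps): rate needed to code the row-features Z (n x d) up to distortion eps."""
    n, d = Z.shape
    I = torch.eye(d, device=Z.device, dtype=Z.dtype)
    return 0.5 * torch.logdet(I + d / (n * eps ** 2) * Z.T @ Z)


def mcr2_loss(Z, labels, eps=0.5, num_classes=None):
    """Negative rate reduction -(R(Z) - sum_j (n_j / n) R(Z_j)); minimize this quantity."""
    n = Z.shape[0]
    if num_classes is None:
        num_classes = int(labels.max().item()) + 1
    expand = coding_rate(Z, eps)            # rate of the whole feature set
    compress = Z.new_zeros(())              # class-conditional rates, frequency-weighted
    for j in range(num_classes):
        Zj = Z[labels == j]
        if Zj.shape[0] > 0:
            compress = compress + (Zj.shape[0] / n) * coding_rate(Zj, eps)
    return -(expand - compress)
```

In practice the features are typically constrained to the unit sphere (per-sample normalization) before the rates are computed, and the negative rate reduction is then minimized with a standard optimizer.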
What is ReduNet?
Our goal is to build a neural network for representation learning with the following properties:
- Interpretable: We should be able to interpret each network operator and assign precise meaning to each layer and parameter.
- Forward-Propagation Only: The network should be trained using more interpretable forward-propagation methods, as opposed to back-propagation, which tends to produce black boxes.
- Use MCR2: The network should seek to optimize the MCR2 loss function, as its purpose is distribution learning.
To achieve this, we propose a neural network architecture and algorithms called ReduNet. In our paper, we provide not only theoretical interpretations and a precise derivation of each operator in the network, but also connections to other architectures that arise naturally as components of ReduNet. We also provide empirical justification for the power of ReduNet. For more details on the algorithm design, please refer to our paper.
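As a rough illustration of the forward-propagation-only construction, the sketch below builds a single ReduNet-style layer directly from data, following the gradient-ascent view in the paper: an expansion operator E = α(I + α ZᵀZ)⁻¹ computed from all features, per-class compression operators Cʲ computed from each class, and an incremental feature update followed by projection back onto the sphere. The row-feature convention, the step size η, and ε below are illustrative assumptions, not the published package's API.

```python
import torch


def redunet_layer(Z, Pi, eps=0.5, eta=0.5):
    """One ReduNet-style update on row-features Z (n x d) with one-hot memberships Pi (n x k).

    Sketch of Z <- normalize(Z + eta * (Z E - sum_j gamma_j Pi_j (Z C_j))).
    """
    n, d = Z.shape
    k = Pi.shape[1]
    I = torch.eye(d, device=Z.device, dtype=Z.dtype)

    alpha = d / (n * eps ** 2)
    E = alpha * torch.linalg.inv(I + alpha * Z.T @ Z)   # expansion operator (d x d)

    update = Z @ E
    for j in range(k):
        nj = Pi[:, j].sum()
        if nj == 0:
            continue
        gamma = nj / n
        alpha_j = d / (nj * eps ** 2)
        Zj_gram = Z.T @ (Pi[:, j : j + 1] * Z)          # class-j second moment (d x d)
        Cj = alpha_j * torch.linalg.inv(I + alpha_j * Zj_gram)  # compression operator
        update = update - gamma * (Pi[:, j : j + 1] * (Z @ Cj))

    Z_next = Z + eta * update
    return Z_next / Z_next.norm(dim=1, keepdim=True)    # project back onto the unit sphere
```

Because E and the Cʲ are computed in closed form from the current features, stacking such layers builds the network purely by forward propagation; no back-propagation is needed to obtain the layer parameters.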
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file mcr2-0.0.10.tar.gz.
File metadata
- Download URL: mcr2-0.0.10.tar.gz
- Upload date:
- Size: 8.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | 013465abb516007a5a90d228b3e1ded988b4d79395c78a8daef9e101d98227b2
MD5 | 7fc875d8fc2b8306aa8422fff930929e
BLAKE2b-256 | 83920f1f9bf4c988192e67e6d54ace541d33b51bd9613a6801dac9ac25dfebbb
File details
Details for the file mcr2-0.0.10-py3-none-any.whl.
File metadata
- Download URL: mcr2-0.0.10-py3-none-any.whl
- Upload date:
- Size: 10.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | 583f8c08cdd1932f7ed202d382eacd557d2d0d71646771db5a331e53949d5312
MD5 | 83499f8d406588c9395f0ad56ab722ac
BLAKE2b-256 | 4c3c0d6e17e3303a38df994ceff7ea0a1213eb318980881cf352abce98b1d0f2