VAE disentanglement framework built with PyTorch Lightning.
🧶 Disent
⚠️ W.I.P
A modular disentangled representation learning framework for PyTorch
Visit the docs for more info.
Overview
Disent is a modular disentangled representation learning framework for auto-encoders, built upon PyTorch Lightning, with its early roots in Google's TensorFlow-based disentanglement-lib. The framework consists of various composable components that can be used to build and benchmark disentanglement pipelines.
The name of the framework is derived from both disentanglement and scientific dissent.
Citing Disent
Please use the following citation if you use Disent in your research:
@Misc{Michlo2021Disent,
author = {Nathan Juraj Michlo},
title = {Disent - A modular disentangled representation learning framework for pytorch},
howpublished = {Github},
year = {2021},
url = {https://github.com/nmichlo/disent}
}
Getting Started
WARNING: Disent is still under active development. Features and APIs are not considered stable and should be expected to change. Only a very limited set of tests currently exists; these will be expanded over time.
The easiest way to use disent is by running experiments/hydra_system.py and changing the root config in experiments/config/config.yaml. Configurations are managed with Hydra Config.
PyPI:
- Install with: pip install disent (this will most likely be outdated)
- Visit the docs!
Source:
- Clone with: git clone --branch dev https://github.com/nmichlo/disent.git
- Change your working directory to the root of the repo: cd disent
- Install the requirements for Python 3.8 with: pip3 install -r requirements.txt
- Run the default experiment after configuring experiments/config/config.yaml by running: PYTHONPATH=. python3 experiments/run.py
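Hydra composes the final run configuration from the root config plus any command-line overrides. The fragment below is purely illustrative of that pattern; the config-group names and keys shown here are assumptions for the sake of the example, not disent's actual configuration:

```yaml
# Hypothetical root config in the style of experiments/config/config.yaml.
# Group names below are illustrative assumptions, not disent's real keys.
defaults:
  - framework: adavae
  - dataset: shapes3d

trainer:
  max_epochs: 100
```

With Hydra, any such value can also be overridden from the command line instead of editing the file, e.g. appending framework=betavae to the run command (again assuming such a group exists).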
Features
Disent includes implementations of modules, metrics and datasets from various papers. However, modules marked with a "🧵" were newly introduced in disent for nmichlo's MSc research!
Frameworks
- Unsupervised:
- Weakly Supervised:
  - Ada-GVAE: AdaVae(..., average_mode='gvae') (usually better than the Ada-ML-VAE)
  - Ada-ML-VAE: AdaVae(..., average_mode='ml-vae')
- Supervised:
- Experimental:
  - 🧵 Ada-TVAE
  - various others not worth mentioning
Many popular disentanglement frameworks still need to be added; please submit an issue if you have a request for an additional framework.
- todo:
- FactorVAE
- InfoVAE
- BetaTCVAE
- DIPVAE
- GroupVAE
- MLVAE
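The Ada-GVAE trick referenced above is to estimate which latent dimensions a pair of observations shares, and to average only those. The sketch below illustrates the idea in plain NumPy, operating directly on the parameters of two diagonal-Gaussian posteriors; it is a simplified illustration, not the actual disent implementation:

```python
import numpy as np

def kl_per_dim(mu0, var0, mu1, var1):
    # Per-dimension KL divergence between two diagonal Gaussians.
    return 0.5 * (var0 / var1 + (mu1 - mu0) ** 2 / var1 - 1
                  + np.log(var1) - np.log(var0))

def ada_gvae_average(mu0, var0, mu1, var1):
    """Sketch of Ada-GVAE 'gvae' averaging: dimensions whose KL lies
    below the midpoint between the smallest and largest per-dimension
    KL are treated as shared between the pair, and both posteriors are
    replaced by the element-wise average on those dimensions."""
    kl = kl_per_dim(mu0, var0, mu1, var1)
    thresh = 0.5 * (kl.min() + kl.max())   # halfway threshold
    shared = kl < thresh                   # mask of shared dimensions
    ave_mu = 0.5 * (mu0 + mu1)
    ave_var = 0.5 * (var0 + var1)
    return (np.where(shared, ave_mu, mu0), np.where(shared, ave_var, var0),
            np.where(shared, ave_mu, mu1), np.where(shared, ave_var, var1))
```

Dimensions that differ strongly between the pair (here, the one with a large mean gap) keep their own parameters, while near-identical dimensions are forced to agree, which is what encourages the shared factors to occupy the same latents.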
Metrics
- Disentanglement:
- FactorVAE Score
- DCI
- MIG
- SAP
- Unsupervised Scores
- 🧵 Flatness Score
Some popular metrics still need to be added; please submit an issue if you wish to add your own or have a request for an additional metric.
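As an illustration of what these metrics measure, MIG (the Mutual Information Gap) can be sketched in plain NumPy: for each ground-truth factor, take the gap between the two latents most informative about it, normalised by the factor's entropy. This is a simplified histogram-based estimate for illustration, not disent's implementation:

```python
import numpy as np

def discrete_mutual_info(x, y, bins=20):
    # MI (in nats) between a continuous latent x, discretised into
    # histogram bins, and a discrete ground-truth factor y.
    x_binned = np.digitize(x, np.histogram_bin_edges(x, bins=bins)[1:-1])
    joint = np.zeros((bins, y.max() + 1))
    for xi, yi in zip(x_binned, y):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def entropy(y):
    p = np.bincount(y) / len(y)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def mig_score(latents, factors, bins=20):
    """MIG: for each factor, the normalised gap between the two most
    informative latents, averaged over all factors."""
    scores = []
    for k in range(factors.shape[1]):
        mi = np.array([discrete_mutual_info(latents[:, j], factors[:, k], bins)
                       for j in range(latents.shape[1])])
        top2 = np.sort(mi)[::-1][:2]
        scores.append((top2[0] - top2[1]) / entropy(factors[:, k]))
    return float(np.mean(scores))
```

A score near 1 means each factor is captured almost exclusively by a single latent; a score near 0 means the information about a factor is spread across several latents.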
Datasets:
Various common datasets used in disentanglement research are implemented, as well as new synthetic datasets that are generated programmatically on the fly. These are convenient and lightweight, requiring no storage space.
- Ground Truth:
  - Cars3D
  - dSprites
  - MPI3D
  - SmallNORB
  - Shapes3D
- Ground Truth Non-Overlapping (Synthetic):
  - 🧵 XYBlocks: 3 blocks of decreasing size that move across a grid. Blocks can be one of three colors (R, G, B). If a smaller block overlaps a larger one and is the same color, the block is XORed to black.
  - 🧵 XYSquares: 3 squares (R, G, B) that move across a non-overlapping grid. Observations have no channel-wise loss overlap.
  - 🧵 XYObject: A simplistic version of dSprites with a single square.
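As an illustration of how such on-the-fly ground-truth datasets work, the sketch below renders observations for a hypothetical single-square dataset in the spirit of XYObject. Every observation is fully determined by its factors (x, y), so nothing needs to be stored on disk. The function names, grid size and square size are illustrative assumptions, not disent's actual implementation:

```python
import numpy as np

def render_square(x, y, size=3, grid=16):
    """Render a single white square at grid position (x, y).

    Illustrative sketch only: each observation is generated from its
    ground-truth factors on the fly, rather than loaded from storage.
    """
    obs = np.zeros((grid, grid), dtype=np.float32)
    obs[y:y + size, x:x + size] = 1.0
    return obs

def factor_traversal(size=3, grid=16):
    # Lazily enumerate the full dataset: one observation per valid
    # combination of the ground-truth factors (x, y).
    for y in range(grid - size + 1):
        for x in range(grid - size + 1):
            yield (x, y), render_square(x, y, size, grid)
```

Because the factor-to-observation mapping is deterministic and exhaustive, metrics that need ground-truth factor labels (FactorVAE score, MIG, etc.) can be computed exactly on such datasets.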
Input Transforms + Input/Target Augmentations
- Input-based transforms are supported.
- Input and target CPU- and GPU-based augmentations are supported.
Why?
- Created as part of my Computer Science MSc, scheduled for completion in 2021.
- I needed custom high-quality implementations of various VAEs.
- A PyTorch version of disentanglement_lib.
- I didn't have time to wait for Weakly-Supervised Disentanglement Without Compromises to release their code as part of disentanglement_lib. (As of September 2020 it has been released, but has unresolved discrepancies.)
- disentanglement_lib still uses outdated TensorFlow 1.0, and the flow of data is unintuitive because of its use of Gin Config.