A library for running multiview autoencoder models

Project description


Multi-view-AE: Multi-modal representation learning using autoencoders

multi-view-AE is a collection of multi-modal autoencoder models for learning joint representations from multiple modalities of data. The package is structured so that every model exposes the same fit, predict_latents and predict_reconstruction methods. All models are built in PyTorch and PyTorch Lightning.

For more information on implemented models and how to use the package, please see the documentation.
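As a quick illustration of that shared interface, the minimal sketch below trains one of the models on two random views. The constructor arguments (input_dim, z_dim) and fit keywords shown here are illustrative assumptions; check the documentation for the exact signatures of each model class.

    import numpy as np
    from multiviewae import mVAE

    # Two toy views with 200 samples each; dimensions are arbitrary and for illustration only.
    view_1 = np.random.rand(200, 20)
    view_2 = np.random.rand(200, 20)

    # Every model exposes the same three methods: fit, predict_latents, predict_reconstruction.
    model = mVAE(input_dim=[20, 20], z_dim=2)          # assumed constructor arguments
    model.fit(view_1, view_2, max_epochs=10, batch_size=20)

    latents = model.predict_latents(view_1, view_2)                  # latent representations
    reconstructions = model.predict_reconstruction(view_1, view_2)   # reconstructions of each view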

Models Implemented

Below is a table of the models implemented in this repository, with links to the original papers where available. The entries in the Model class column are the class names used in code; see the short sketch after the table.

| Model class | Model name | Number of views | Original work |
| --- | --- | --- | --- |
| mcVAE | Multi-Channel Variational Autoencoder (mcVAE) | >=1 | link |
| AE | Multi-view Autoencoder | >=1 | |
| AAE | Multi-view Adversarial Autoencoder with separate latent representations | >=1 | |
| DVCCA | Deep Variational CCA | 2 | link |
| jointAAE | Multi-view Adversarial Autoencoder with joint latent representation | >=1 | |
| wAAE | Multi-view Adversarial Autoencoder with joint latent representation and Wasserstein loss | >=1 | |
| mmVAE | Variational mixture-of-experts autoencoder (MMVAE) | >=1 | link |
| mVAE | Multimodal Variational Autoencoder (MVAE) | >=1 | link |
| me_mVAE | Multimodal Variational Autoencoder (MVAE) with separate ELBO terms for each view | >=1 | link |
| JMVAE | Joint Multimodal Variational Autoencoder (JMVAE-kl) | 2 | link |
| MVTCAE | Multi-View Total Correlation Auto-Encoder (MVTCAE) | >=1 | link |
| MoPoEVAE | Mixture-of-Products-of-Experts VAE | >=1 | link |
| mmJSD | Multimodal Jensen-Shannon divergence model (mmJSD) | >=1 | link |
| weighted_mVAE | Generalised Product-of-Experts Variational Autoencoder (gPoE-MVAE) | >=1 | link |
| VAE_barlow | Multi-view Variational Autoencoder with Barlow twins loss between latents | 2 | link, link |
| AE_barlow | Multi-view Autoencoder with Barlow twins loss between latents | 2 | link, link |
| DMVAE | Disentangled multi-modal variational autoencoder | >=1 | link |
| weighted_DMVAE | Disentangled multi-modal variational autoencoder with gPoE joint posterior | >=1 | |
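
As a rough sketch (assuming, as in the example above, that every class in the table is exposed at the top level of the multiviewae package), a different model can be swapped in without changing the rest of the workflow:

    from multiviewae import MVTCAE  # or any other class from the table above

    # Same interface as before; input_dim and z_dim are assumed constructor arguments
    # and may vary by model (e.g. DVCCA and JMVAE require exactly 2 views).
    model = MVTCAE(input_dim=[20, 20], z_dim=2)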

Installation

To install our package via pip:

pip install multiviewae

Alternatively, clone this repository and move into the folder:

git clone https://github.com/alawryaguila/multi-view-AE
cd multi-view-AE

Create a custom conda environment:

conda create --name mvae python=3.9

Activate the environment:

conda activate mvae

Install the multi-view-AE package:

pip install ./
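
To confirm the install succeeded, one option (a standard-library check, not part of the package itself) is to query the installed distribution from Python:

    from importlib.metadata import version

    # Prints the installed version of the multiviewae distribution, e.g. 1.1.6
    print(version("multiviewae"))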

Citation

If you have used multi-view-AE in your research, please consider citing our JOSS paper:

Aguila et al., (2023). Multi-view-AE: A Python package for multi-view autoencoder models. Journal of Open Source Software, 8(85), 5093, https://doi.org/10.21105/joss.05093

Bibtex entry:

@article{Aguila2023,
  doi = {10.21105/joss.05093},
  url = {https://doi.org/10.21105/joss.05093},
  year = {2023},
  publisher = {The Open Journal},
  volume = {8},
  number = {85},
  pages = {5093},
  author = {Ana Lawry Aguila and Alejandra Jayme and Nina Montaña-Brown and Vincent Heuveline and Andre Altmann},
  title = {Multi-view-AE: A Python package for multi-view autoencoder models},
  journal = {Journal of Open Source Software}
}

Contribution guidelines

Contribution guidelines are available at https://multi-view-ae.readthedocs.io/en/latest/

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

multiviewae-1.1.6.tar.gz (42.4 kB)


Built Distribution

multiviewae-1.1.6-py3-none-any.whl (74.8 kB)


File details

Details for the file multiviewae-1.1.6.tar.gz.

File metadata

  • Download URL: multiviewae-1.1.6.tar.gz
  • Upload date:
  • Size: 42.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.6.1 CPython/3.8.18 Linux/6.2.0-1015-azure

File hashes

Hashes for multiviewae-1.1.6.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 98f38360a228e1b6ea3f5fbf0435b04092f83b33eb8b7489ce2051e2d4bc5c8f |
| MD5 | f27ca3634f1a060d2ecfc05bdef25a08 |
| BLAKE2b-256 | 06d4d30783a7ba08d75491c7549528740d06c90daf506a50c20e993e9641377f |

See more details on using hashes here.
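
As a small sketch, the SHA256 digest of a downloaded archive can be checked against the value listed above using only the standard library (the filename below is the source distribution listed on this page; the same approach applies to the wheel):

    import hashlib

    # Compute the SHA256 of the downloaded archive and compare it with the table above.
    with open("multiviewae-1.1.6.tar.gz", "rb") as f:
        print(hashlib.sha256(f.read()).hexdigest())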

File details

Details for the file multiviewae-1.1.6-py3-none-any.whl.

File metadata

  • Download URL: multiviewae-1.1.6-py3-none-any.whl
  • Upload date:
  • Size: 74.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.6.1 CPython/3.8.18 Linux/6.2.0-1015-azure

File hashes

Hashes for multiviewae-1.1.6-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | adc85973ec797aa19c0c5aff2c24dca43ce28f3733b00fb3e426a7f4166c1bdd |
| MD5 | 51b1b84e3da44adc21e37543b539bf5f |
| BLAKE2b-256 | 1891ae5da6564cb124226e6664d3cf8171d84acf152fc7700f71aceb3d94b292 |

See more details on using hashes here.
