Package for Synthetic Data Generation using Distributional Learning of VAE

Project description

DistVAE-Tabular

DistVAE is a novel approach to distributional learning in the VAE framework, focusing on accurately capturing the underlying distribution of the observed dataset through nonparametric CDF estimation.

We utilize the continuous ranked probability score (CRPS), a strictly proper scoring rule, as the reconstruction loss while preserving the mathematical derivation of the lower bound of the data log-likelihood. Additionally, we introduce a synthetic data generation mechanism that effectively preserves differential privacy.
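As a rough, self-contained illustration (not the package's internal implementation), the CRPS for a forecast distribution represented by Monte Carlo samples can be estimated with the standard energy-form identity CRPS(F, y) = E|X - y| - (1/2) E|X - X'|:

```python
import numpy as np

def crps_samples(samples, y):
    """Sample-based CRPS estimate: E|X - y| - 0.5 * E|X - X'|."""
    samples = np.asarray(samples, dtype=float)
    term1 = np.abs(samples - y).mean()               # mean distance to the observation
    term2 = np.abs(samples[:, None] - samples[None, :]).mean()  # mean pairwise distance
    return term1 - 0.5 * term2

# A forecast concentrated exactly at the observation scores zero (perfect forecast)
print(crps_samples([1.0, 1.0, 1.0], 1.0))  # 0.0
```

Being a strictly proper scoring rule, the CRPS is minimized in expectation only when the forecast distribution equals the true data distribution, which is what makes it suitable as a reconstruction loss.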

For a detailed explanation of the method, check our paper! (link)

1. Installation

Install using pip:

pip install distvae-tabular

2. Usage

from distvae_tabular import distvae
distvae.DistVAE # DistVAE model
distvae.generate_data # generate synthetic data

Example

"""device setting"""
import torch
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

"""load dataset and specify column types"""
import pandas as pd
data = pd.read_csv('./loan.csv') 
continuous_features = [
    'Age',
    'Experience',
    'Income', 
    'CCAvg',
    'Mortgage',
]
categorical_features = [
    'Family',
    'Personal Loan',
    'Securities Account',
    'CD Account',
    'Online',
    'CreditCard'
]
integer_features = [
    'Age',
    'Experience',
    'Income', 
    'Mortgage'
]

"""DistVAE"""
from distvae_tabular import distvae

model = distvae.DistVAE(  # named "model" to avoid shadowing the imported distvae module
    data=data, # the observed tabular dataset
    continuous_features=continuous_features, # the list of continuous columns of data
    categorical_features=categorical_features, # the list of categorical columns of data
    integer_features=integer_features, # the list of integer-type columns of data
    
    seed=42, # seed for reproducible results
    latent_dim=4, # the latent dimension size
    beta=0.1, # scale parameter of the asymmetric Laplace distribution
    hidden_dim=128, # the number of nodes in the MLP
    
    epochs=5, # the number of epochs (small, for a quick check)
    batch_size=256, # the batch size
    lr=0.001, # learning rate
    
    step=0.1, # interval size between knots
    threshold=1e-8, # threshold for clipping alpha_tilde (numerical stability)
    device="cpu"
)

"""training"""
model.train()

"""generate synthetic data"""
syndata = model.generate_data(100)
syndata

"""generate synthetic data with Differential Privacy"""
syndata = model.generate_data(100, lambda_=0.1)
syndata
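As a quick sanity check on the generated table, you can compare marginal statistics of the real and synthetic data. The helper below is a hypothetical utility for illustration, not part of distvae-tabular:

```python
import pandas as pd

def compare_marginals(real: pd.DataFrame, synthetic: pd.DataFrame, columns):
    """Side-by-side mean/std of selected continuous columns."""
    rows = []
    for col in columns:
        rows.append({
            "column": col,
            "real_mean": real[col].mean(),
            "syn_mean": synthetic[col].mean(),
            "real_std": real[col].std(),
            "syn_std": synthetic[col].std(),
        })
    return pd.DataFrame(rows)

# e.g. compare_marginals(data, syndata, continuous_features)
```

Close real/synthetic marginals are a necessary (though not sufficient) indicator of synthesis quality; joint dependence between columns should be checked separately.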

Citation

If you use this code or package, please cite our associated paper:

@article{an2024distributional,
  title={Distributional learning of variational AutoEncoder: application to synthetic data generation},
  author={An, Seunghwan and Jeon, Jong-June},
  journal={Advances in Neural Information Processing Systems},
  volume={36},
  year={2024}
}
