A hybrid model combining VAE, GAN, and LightGBM for boosting performance in high-energy physics or data analysis tasks.

Project description

VaganBoost: Hybrid VAE-GAN + LightGBM for Advanced Classification 0.7.6

Introduction

VAGANBoost is a hybrid generative model combining Variational Autoencoders (VAE) and Generative Adversarial Networks (GAN) with boosting techniques to enhance high-energy gamma-ray analysis.

Overview

  • Implements cVAE+cGAN and cGAN+cVAE+RandomForest models
  • Designed for high-energy physics applications
  • Utilizes deep learning and gradient boosting techniques

Key Features

  • Hybrid Architecture: Combines deep generative models with gradient boosting
  • VAE-GAN Integration: Joint latent space learning for improved feature representation
  • LightGBM Classifier: State-of-the-art gradient boosting for final classification
  • Automatic Feature Fusion: Combines VAE latent features with GAN-generated features
  • Visualization Tools: Built-in metrics visualization and feature analysis
  • PyTorch Backend: GPU-accelerated training with seamless CUDA support

Key Features Table

| Feature | Description | Benefit |
|---------|-------------|---------|
| VAE-GAN Fusion | Combines the reconstruction power of VAEs with GANs' generative capabilities | Enhanced feature learning |
| LightGBM Integration | Gradient boosting on learned features | Superior classification performance |
| Automatic GPU Support | Seamless CUDA integration | Faster training on supported hardware |
| Dynamic Feature Fusion | Combines latent and generated features | Improved representation learning |
| Visualization Suite | Built-in metrics plotting | Easy model evaluation |
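The "Dynamic Feature Fusion" row presumably amounts to concatenating the VAE's latent vector with GAN-derived features before they are passed to the booster. A minimal sketch under that assumption, using the Quick Start's default sizes (64 latent dimensions, 100 GAN input dimensions; the array names here are illustrative, not VaganBoost's internals):

```python
import numpy as np

# Hypothetical feature blocks for 256 samples:
latent = np.random.randn(256, 64)      # VAE latent features
generated = np.random.randn(256, 100)  # GAN-derived features

# Fuse along the feature axis to build the classifier input
fused = np.concatenate([latent, generated], axis=1)
print(fused.shape)  # (256, 164)
```

The fused matrix would then serve as the input to the LightGBM stage.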

Troubleshooting

Common Issues:

  1. CUDA Out of Memory: Reduce batch size or input dimensions
  2. Poor Classification Performance:
    • Increase VAE latent dimensions
    • Adjust GAN-LightGBM feature ratio
  3. Training Instability:
    model = VaganBoost(
        ...,
        vae_kl_weight=0.5,  # Adjust KL loss weight
        gan_gp_weight=10.0  # Add gradient penalty
    )
    
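The `vae_kl_weight` shown above scales the KL-divergence term of the VAE objective, so lowering it relaxes the latent prior and can stabilize training. As a rough sketch (not VaganBoost's actual internals), the closed-form KL term for a diagonal-Gaussian encoder looks like:

```python
import numpy as np

def kl_divergence(mu, logvar):
    # Closed-form KL(N(mu, sigma^2) || N(0, I)) for a diagonal Gaussian,
    # summed over latent dimensions for each sample.
    return -0.5 * np.sum(1.0 + logvar - mu**2 - np.exp(logvar), axis=1)

def vae_loss(recon_err, mu, logvar, vae_kl_weight=0.5):
    # vae_kl_weight rescales only the KL term, trading reconstruction
    # fidelity against closeness to the standard-normal prior.
    return recon_err + vae_kl_weight * kl_divergence(mu, logvar).mean()
```

A posterior that already matches the prior (`mu=0`, `logvar=0`) contributes zero KL, so the weight has no effect in that limit.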

Installation

Prerequisites

  • Python 3.6+
  • NVIDIA GPU (recommended) with CUDA 11.0+

Install via pip

pip install vaganboost

From source

git clone https://github.com/AliBavarchee/vaganboost.git
cd vaganboost
pip install -e .

Quick Start

Basic Usage

from vaganboost import VaganBoost, load_data, split_data, normalize_data

# Prepare data
X, y = load_data("data.csv", target_column="label")
X_train, X_test, y_train, y_test = split_data(X, y, test_size=0.2)
X_train_norm, X_test_norm = normalize_data(X_train, X_test)

# Initialize model
model = VaganBoost(
    vae_input_dim=X_train_norm.shape[1],
    vae_latent_dim=64,
    gan_input_dim=100,
    num_class=4,
    device="cuda"
)

# Train components
model.train_vae(X_train_norm, epochs=100)
model.train_gan(X_train_norm, epochs=50)
model.train_lgbm(X_train_norm, y_train)

# Evaluate
accuracy = model.evaluate(X_test_norm, y_test)
print(f"Test Accuracy: {accuracy:.2%}")
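`normalize_data` is not documented here; assuming it performs standard scaling fitted on the training split only (the usual leak-free convention), a stand-in would look like this (`normalize_pair` is a hypothetical name, not the library's API):

```python
import numpy as np

def normalize_pair(X_train, X_test, eps=1e-8):
    # Fit mean/std on the training split only, then apply to both splits,
    # so no test-set statistics leak into training.
    mu = X_train.mean(axis=0)
    sigma = X_train.std(axis=0) + eps
    return (X_train - mu) / sigma, (X_test - mu) / sigma
```

After this step the training features are zero-mean and unit-variance per column, which the VAE and GAN components generally expect.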

Advanced Configuration

# Custom LightGBM parameters
lgbm_params = {
    'objective': 'multiclass',
    'num_class': 4,
    'metric': 'multi_logloss',
    'num_leaves': 63,
    'learning_rate': 0.1,
    'feature_fraction': 0.7
}

model = VaganBoost(
    vae_input_dim=128,
    vae_latent_dim=64,
    gan_input_dim=100,
    num_class=4,
    lgbm_params=lgbm_params,
    device="cuda"
)
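The `multi_logloss` metric configured above is the mean negative log-likelihood of the true class under the predicted class probabilities. A small reference implementation of that metric (not LightGBM's code, just the formula it evaluates):

```python
import numpy as np

def multi_logloss(y_true, proba, eps=1e-15):
    # Mean negative log-probability assigned to the true class.
    # proba has shape (n_samples, num_class); rows sum to 1.
    p = np.clip(proba[np.arange(len(y_true)), y_true], eps, 1 - eps)
    return float(-np.log(p).mean())
```

A uniform 4-class prediction scores ln(4) ≈ 1.386; confident correct predictions drive the loss toward 0.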

Documentation

Core Components

| Module | Description |
|--------|-------------|
| data_utils | Data loading, splitting, and normalization |
| models | VAE, GAN, and LightGBM implementations |
| train | Joint training procedures |
| utils | Visualization and evaluation tools |

Dependencies

See requirements.txt for required packages.

License


This project is licensed under the MIT License.


Contact: Ali Bavarchee - ali.bavarchee@gmail.com

Project Link: https://github.com/AliBavarchee/vaganboost

LinkedIn: https://www.linkedin.com/in/ali-bavarchee-qip/


Download files

Download the file for your platform.

Source Distribution

vaganboost-0.7.7.tar.gz (14.5 kB)


Built Distribution


vaganboost-0.7.7-py3-none-any.whl (16.1 kB)


File details

Details for the file vaganboost-0.7.7.tar.gz.

File metadata

  • Download URL: vaganboost-0.7.7.tar.gz
  • Upload date:
  • Size: 14.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.12.7

File hashes

Hashes for vaganboost-0.7.7.tar.gz

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | fbdae0167a8f53391a1d3798c3d005280adf7f1a153dcfe300766153e378ca39 |
| MD5 | 92af32254e328231b074650712b0cc7c |
| BLAKE2b-256 | 9b5df9f8613443c22645c6b839089041e7838940effb1a781033e739c5848b63 |


File details

Details for the file vaganboost-0.7.7-py3-none-any.whl.

File metadata

  • Download URL: vaganboost-0.7.7-py3-none-any.whl
  • Upload date:
  • Size: 16.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.12.7

File hashes

Hashes for vaganboost-0.7.7-py3-none-any.whl

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | 78362f7fc7798d4d440a13d00b02981c75637782f57accd6995514f243f07939 |
| MD5 | 09e84a7a7045eb4e28c7ea53f4612fe5 |
| BLAKE2b-256 | 72f09b821fb5093b988ab387fe95310cf3a014c4abdc6d5bbad6c6e8d0fde2ff |

