PIDGAN

GAN-based models to flash-simulate the LHCb PID detectors


Generative Adversarial Networks

| Algorithms* | Lipschitzianity** | Design inspired by | Tutorial |
|:-----------:|:-----------------:|:------------------:|:--------:|
| GAN         | ❌ | [1], [8], [9]      | Open In Colab |
| BceGAN      | ❌ | [2], [8], [9]      | Open In Colab |
| LSGAN       | ❌ | [3], [8], [9]      | Open In Colab |
| WGAN        | ✅ | [4], [9]           | Open In Colab |
| WGAN_GP     | ✅ | [5], [9]           | Open In Colab |
| CramerGAN   | ✅ | [6], [9]           | Open In Colab |
| WGAN_ALP    | ✅ | [7], [9]           | Open In Colab |
| BceGAN_GP   | ✅ | [2], [5], [9]      | Open In Colab |
| BceGAN_ALP  | ✅ | [2], [7], [9]      | Open In Colab |
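The main difference between the rows above is the adversarial objective the discriminator (or critic) minimizes. The following NumPy sketch is not pidgan code; it just contrasts, on toy scores, the binary cross-entropy loss of the original GAN [1], the least-squares loss of LSGAN [3], and the Wasserstein critic loss [4]:

```python
import numpy as np

# Toy discriminator outputs for a batch of real and generated samples.
# For BCE/LS losses these are probabilities in (0, 1); a Wasserstein
# critic would output unbounded scores, but we reuse the same numbers
# here purely for illustration.
d_real = np.array([0.9, 0.8, 0.7])
d_fake = np.array([0.2, 0.4, 0.1])

# GAN [1]: binary cross-entropy discriminator loss.
bce_loss = -np.mean(np.log(d_real)) - np.mean(np.log(1.0 - d_fake))

# LSGAN [3]: least-squares loss with targets 1 (real) and 0 (fake).
ls_loss = 0.5 * np.mean((d_real - 1.0) ** 2) + 0.5 * np.mean(d_fake**2)

# WGAN [4]: critic loss is a difference of mean scores; minimizing it
# (under a Lipschitz constraint) estimates the Wasserstein distance.
wass_loss = np.mean(d_fake) - np.mean(d_real)
```

The BCE and least-squares losses are always non-negative, while the Wasserstein loss can go negative; that is expected, since the critic is estimating a distance rather than a classification error.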

*each GAN algorithm is designed to take conditions as input [10]

**the GAN training is regularized to ensure that the discriminator encodes a 1-Lipschitz function
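One common way to enforce the 1-Lipschitz constraint mentioned in the second footnote is the gradient penalty of WGAN-GP [5]: the norm of the discriminator's gradient is pushed toward 1 at points interpolated between real and generated samples. The sketch below is not pidgan's implementation; it uses a toy discriminator with a hand-written analytic gradient so the penalty can be computed in plain NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discriminator D(x) = tanh(x @ w) with an analytic gradient.
w = rng.normal(size=3)

def disc(x):
    return np.tanh(x @ w)

def disc_grad(x):
    # dD/dx = (1 - tanh(x @ w)^2) * w, one gradient row per sample.
    return (1.0 - np.tanh(x @ w) ** 2)[:, None] * w

# Real and generated batches (stand-ins for detector responses).
x_real = rng.normal(size=(128, 3))
x_fake = rng.normal(loc=2.0, size=(128, 3))

# Random interpolation between real and fake samples (WGAN-GP recipe [5]).
eps = rng.uniform(size=(128, 1))
x_hat = eps * x_real + (1.0 - eps) * x_fake

# Two-sided penalty: push the gradient norm toward 1 at the interpolates.
grad_norm = np.linalg.norm(disc_grad(x_hat), axis=1)
penalty = np.mean((grad_norm - 1.0) ** 2)

# The penalty is added to the critic loss with a weight (10 in [5]).
critic_loss = np.mean(disc(x_fake)) - np.mean(disc(x_real)) + 10.0 * penalty
```

In a real training loop the gradient would come from automatic differentiation (e.g. `tf.GradientTape`) rather than a closed-form expression, but the penalty term has exactly this shape.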

Generators

| Players      | Inherit from   | Design inspired by |
|:------------:|:--------------:|:------------------:|
| Generator    | tf.keras.Model | [1], [10]          |
| ResGenerator | Generator      | [1], [10], [11]    |

Discriminators

| Players          | Inherit from     | Design inspired by   |
|:----------------:|:----------------:|:--------------------:|
| Discriminator    | tf.keras.Model   | [1], [9], [10]       |
| ResDiscriminator | Discriminator    | [1], [9], [10], [11] |
| AuxDiscriminator | ResDiscriminator | [1], [9], [10], [11], [12] |

Other players

| Players            | Inherit from     |
|:------------------:|:----------------:|
| Classifier         | Discriminator    |
| ResClassifier      | ResDiscriminator |
| AuxClassifier      | AuxDiscriminator |
| MultiClassifier    | Discriminator    |
| MultiResClassifier | ResDiscriminator |
| AuxMultiClassifier | AuxDiscriminator |
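The two tables above describe a single inheritance chain on the discriminator side: residual players extend the plain ones [11], auxiliary players extend the residual ones [12], and the classifiers reuse the same architectures for classification tasks. The class names below come from the tables, but the bodies are illustrative stand-ins, not pidgan's implementation:

```python
# Schematic stand-ins mirroring the inheritance chains in the tables above.
# In pidgan the root class inherits from tf.keras.Model; here we use plain
# Python classes just to show the hierarchy.

class Discriminator:
    def describe(self):
        return "plain MLP discriminator"

class ResDiscriminator(Discriminator):
    def describe(self):
        return "MLP with residual (skip) connections [11]"

class AuxDiscriminator(ResDiscriminator):
    def describe(self):
        return "residual MLP fed with extra auxiliary features [12]"

class Classifier(Discriminator):
    # Same architecture as its parent, repurposed for classification.
    pass

aux = AuxDiscriminator()
assert isinstance(aux, Discriminator)  # everything descends from the root
```

Because every player descends from the same root, the GAN algorithms in the first table can accept any of them interchangeably as the adversarial component.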

References

  1. I.J. Goodfellow et al., "Generative Adversarial Networks", arXiv:1406.2661
  2. A. Radford, L. Metz, S. Chintala, "Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks", arXiv:1511.06434
  3. X. Mao et al., "Least Squares Generative Adversarial Networks", arXiv:1611.04076
  4. M. Arjovsky, S. Chintala, L. Bottou, "Wasserstein GAN", arXiv:1701.07875
  5. I. Gulrajani et al., "Improved Training of Wasserstein GANs", arXiv:1704.00028
  6. M.G. Bellemare et al., "The Cramer Distance as a Solution to Biased Wasserstein Gradients", arXiv:1705.10743
  7. D. Terjék, "Adversarial Lipschitz Regularization", arXiv:1907.05681
  8. M. Arjovsky, L. Bottou, "Towards Principled Methods for Training Generative Adversarial Networks", arXiv:1701.04862
  9. T. Salimans et al., "Improved Techniques for Training GANs", arXiv:1606.03498
  10. M. Mirza, S. Osindero, "Conditional Generative Adversarial Nets", arXiv:1411.1784
  11. K. He et al., "Deep Residual Learning for Image Recognition", arXiv:1512.03385
  12. A. Rogachev, F. Ratnikov, "GAN with an Auxiliary Regressor for the Fast Simulation of the Electromagnetic Calorimeter Response", arXiv:2207.06329

Credits

Most of the GAN algorithms are an evolution of those provided by the mbarbetti/tf-gen-models repository. The BceGAN model is loosely inspired by the TensorFlow tutorial Deep Convolutional Generative Adversarial Network and the Keras tutorial Conditional GAN. The WGAN_ALP model is an adaptation of the one provided by the dterjek/adversarial_lipschitz_regularization repository.
