PIDGAN
GAN-based models to flash-simulate the LHCb PID detectors
Generative Adversarial Networks
Algorithms* | Avail | Test | Lipschitzianity** | Design inspired by | Tutorial
---|---|---|---|---|---
GAN | ✅ | ✅ | ❌ | 1, 8, 9 |
BceGAN | ✅ | ✅ | ❌ | 2, 8, 9 |
LSGAN | ✅ | ✅ | ❌ | 3, 8, 9 |
WGAN | ✅ | ✅ | ✅ | 4, 9 |
WGAN_GP | ✅ | ✅ | ✅ | 5, 9 |
CramerGAN | ✅ | ✅ | ✅ | 6, 9 |
WGAN_ALP | ✅ | ✅ | ✅ | 7, 9 |
BceGAN_GP | ✅ | ✅ | ✅ | 2, 5, 9 |
BceGAN_ALP | ✅ | ✅ | ✅ | 2, 7, 9 |
*Each GAN algorithm is designed to take conditions as input [10].
**Training is regularized so that the discriminator encodes a 1-Lipschitz function.
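The Lipschitz-regularized algorithms (WGAN_GP, BceGAN_GP, and relatives) penalize the discriminator whenever the norm of its gradient deviates from 1 at points interpolated between real and generated samples [5]. The following is a minimal NumPy sketch of that gradient-penalty idea only, not pidgan's actual implementation; the linear critic and all names are illustrative assumptions chosen so the gradient is analytic.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy linear critic f(x) = w . x, whose gradient w.r.t. x is simply w.
# With ||w|| = 1 the critic is exactly 1-Lipschitz.
w = np.array([0.6, 0.8])

def critic_grad(x):
    """Gradient of the linear critic at any input point x."""
    return w

# Interpolate between "real" and "generated" batches, as prescribed
# by the gradient-penalty recipe [5].
x_real = rng.normal(size=(16, 2))
x_fake = rng.normal(size=(16, 2))
eps = rng.uniform(size=(16, 1))
x_hat = eps * x_real + (1.0 - eps) * x_fake

# Penalty: squared deviation of the gradient norm from 1 at each
# interpolated point, averaged over the batch.
grad_norms = np.array([np.linalg.norm(critic_grad(x)) for x in x_hat])
penalty = np.mean((grad_norms - 1.0) ** 2)
print(penalty)  # ~0.0 here, since ||w|| = 1
```

In the real training loop the penalty is added, with a tunable weight, to the discriminator loss, pushing the learned critic toward the 1-Lipschitz regime required by the Wasserstein formulation [4].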
Generators
Players | Avail | Test | Inherit from | Design inspired by
---|---|---|---|---
Generator | ✅ | ✅ | tf.keras.Model | 1, 10
ResGenerator | ✅ | ✅ | Generator | 1, 10, 11
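ResGenerator augments the fully connected Generator with residual (skip) connections in the style of ResNet [11], so each hidden block learns a correction to its input rather than a full mapping. A hedged NumPy sketch of a single residual block; the layer sizes, activation, and helper names are illustrative assumptions, not pidgan's API:

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w, b):
    """Fully connected layer followed by a leaky-ReLU activation."""
    h = x @ w + b
    return np.where(h > 0.0, h, 0.1 * h)

hidden_dim = 128
x = rng.normal(size=(32, hidden_dim))                 # batch of hidden activations
w = rng.normal(size=(hidden_dim, hidden_dim)) * 0.01  # small random weights
b = np.zeros(hidden_dim)

# Residual block: the input is added back to the transformed output,
# which eases gradient flow through deep stacks [11].
out = x + dense(x, w, b)
print(out.shape)  # (32, 128)
```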
Discriminators
Players | Avail | Test | Inherit from | Design inspired by
---|---|---|---|---
Discriminator | ✅ | ✅ | tf.keras.Model | 1, 9, 10
ResDiscriminator | ✅ | ✅ | Discriminator | 1, 9, 10, 11
AuxDiscriminator | ✅ | ✅ | ResDiscriminator | 1, 9, 10, 11, 12
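Following the auxiliary-input idea of [12], AuxDiscriminator enriches the raw discriminator inputs with additional features derived from them before the dense stack. A conceptual NumPy sketch; the observables and the ratio feature below are hypothetical choices for illustration, not the features pidgan computes:

```python
import numpy as np

rng = np.random.default_rng(1)

# Batch of discriminator inputs: two illustrative detector observables
# per candidate (column meanings are hypothetical).
x = rng.uniform(1.0, 10.0, size=(8, 2))

# Auxiliary feature derived from the raw inputs, e.g. their ratio.
aux = (x[:, 1] / x[:, 0]).reshape(-1, 1)

# Augmented input fed to the downstream dense layers.
x_aug = np.concatenate([x, aux], axis=1)
print(x_aug.shape)  # (8, 3)
```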
Other players
Players | Avail | Test | Inherit from
---|---|---|---
Classifier | ✅ | ✅ | Discriminator
ResClassifier | ✅ | ✅ | ResDiscriminator
AuxClassifier | ✅ | ✅ | AuxDiscriminator
MultiClassifier | ✅ | ✅ | Discriminator
MultiResClassifier | ✅ | ✅ | ResDiscriminator
AuxMultiClassifier | ✅ | ✅ | AuxDiscriminator
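As the names suggest, the Multi* classifiers differ from their binary counterparts mainly in the output head: a single sigmoid unit for binary classification versus a softmax over all classes for the multi-class case. A NumPy sketch of the two heads, with an assumed class count of three; this illustrates the standard activations, not pidgan's exact layers:

```python
import numpy as np

rng = np.random.default_rng(2)
logits = rng.normal(size=(4, 3))  # batch of 4, three hypothetical classes

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # stabilized exponentials
    return e / e.sum(axis=1, keepdims=True)

p_binary = sigmoid(logits[:, :1])  # binary head: one probability per sample
p_multi = softmax(logits)          # multi-class head: one distribution per sample
print(p_multi.sum(axis=1))         # each row sums to 1
```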
References
1. I.J. Goodfellow et al., "Generative Adversarial Networks", arXiv:1406.2661
2. A. Radford, L. Metz, S. Chintala, "Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks", arXiv:1511.06434
3. X. Mao et al., "Least Squares Generative Adversarial Networks", arXiv:1611.04076
4. M. Arjovsky, S. Chintala, L. Bottou, "Wasserstein GAN", arXiv:1701.07875
5. I. Gulrajani et al., "Improved Training of Wasserstein GANs", arXiv:1704.00028
6. M.G. Bellemare et al., "The Cramer Distance as a Solution to Biased Wasserstein Gradients", arXiv:1705.10743
7. D. Terjék, "Adversarial Lipschitz Regularization", arXiv:1907.05681
8. M. Arjovsky, L. Bottou, "Towards Principled Methods for Training Generative Adversarial Networks", arXiv:1701.04862
9. T. Salimans et al., "Improved Techniques for Training GANs", arXiv:1606.03498
10. M. Mirza, S. Osindero, "Conditional Generative Adversarial Nets", arXiv:1411.1784
11. K. He et al., "Deep Residual Learning for Image Recognition", arXiv:1512.03385
12. A. Rogachev, F. Ratnikov, "GAN with an Auxiliary Regressor for the Fast Simulation of the Electromagnetic Calorimeter Response", arXiv:2207.06329
Credits
Most of the GAN algorithms are an evolution of those provided by the mbarbetti/tf-gen-models repository. The BceGAN model is freely inspired by the TensorFlow tutorial Deep Convolutional Generative Adversarial Network and the Keras tutorial Conditional GAN. The WGAN_ALP model is an adaptation of the one provided by the dterjek/adversarial_lipschitz_regularization repository.