
Transformer-based models to fast-simulate the LHCb ECAL detector


calotron logo



Transformer

The Transformer architecture is freely inspired by Vaswani et al. [arXiv:1706.03762] and Dosovitskiy et al. [arXiv:2010.11929].

calotron transformer architecture
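At the core of the Transformer architectures cited above is scaled dot-product attention, softmax(QKᵀ/√d_k)V. The following is a minimal NumPy sketch of that operation only, for illustration; the function and variable names are hypothetical and not part of the calotron API, which is built on TensorFlow.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Single-head attention: softmax(q @ k.T / sqrt(d_k)) @ v."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                    # (seq_q, seq_k) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ v                                 # weighted sum of values

rng = np.random.default_rng(42)
q = rng.normal(size=(4, 8))   # 4 query tokens, model dimension 8
k = rng.normal(size=(6, 8))   # 6 key tokens
v = rng.normal(size=(6, 8))   # 6 value tokens
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (4, 8): one output vector per query token
```

Each query token ends up as a convex combination of the value vectors, with weights set by query-key similarity; the full models stack many such heads with feed-forward layers in between.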

Discriminator

The Discriminator is implemented following the Deep Sets model proposed by Zaheer et al. [arXiv:1703.06114], and its architecture is freely inspired by the one developed by the ATLAS Collaboration for flavor tagging [ATL-PHYS-PUB-2020-014].

calotron discriminator architecture
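The key property of a Deep Sets model is permutation invariance: the score has the form ρ(Σᵢ φ(xᵢ)), where a shared network φ embeds each set element and the sum pooling erases the ordering. Here is a minimal NumPy sketch of that structure with single linear/ReLU layers standing in for the networks; all names are illustrative, not the calotron implementation.

```python
import numpy as np

def phi(x_set, w_phi):
    """Shared per-element embedding, applied independently to each set member."""
    return np.maximum(x_set @ w_phi, 0.0)  # ReLU(x W), shape (n_elements, latent)

def rho(pooled, w_rho):
    """Post-pooling head mapping the aggregated latent vector to a scalar score."""
    return pooled @ w_rho

def deep_sets_score(x_set, w_phi, w_rho):
    """rho(sum_i phi(x_i)): invariant under any reordering of the set."""
    return rho(phi(x_set, w_phi).sum(axis=0), w_rho)

rng = np.random.default_rng(0)
w_phi = rng.normal(size=(3, 16))
w_rho = rng.normal(size=(16,))
hits = rng.normal(size=(10, 3))   # e.g. 10 reconstructed clusters, 3 features each
s1 = deep_sets_score(hits, w_phi, w_rho)
s2 = deep_sets_score(hits[::-1], w_phi, w_rho)  # same set, reversed order
print(np.isclose(s1, s2))  # True: the score ignores element ordering
```

The sum pooling is what makes this a natural discriminator for variable-length, unordered collections of calorimeter clusters.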

Credits

Transformer implementation freely inspired by the TensorFlow tutorial Neural machine translation with a Transformer and Keras.

