A toolkit for large scale distributed training

Project description

LASER (a toolkit for Large scAle diStributEd tRaining)

With LASER we succeeded in training the DeBERTa 1.5B model without model parallelism. DeBERTa 1.5B is the SOTA model on the GLUE and SuperGLUE leaderboards, and the first model to surpass both the T5 11B model and human performance on the SuperGLUE leaderboard.

TODOs

  • Add documentation and usage examples

git version: 57143200814583410acdd0c5ac0a0f8bab8a1f7e date: 2021-02-04 09:55:12.622124

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files available for this release. See tutorial on generating distribution archives.

Built Distribution

LASER-0.0.5-py3-none-any.whl (22.2 kB)

Uploaded Python 3
