
Megatron Core - a library for efficient and scalable training of transformer-based models

Project description

Megatron-Core

Megatron-Core is an open-source, PyTorch-based library that contains GPU-optimized techniques and cutting-edge system-level optimizations. It abstracts them into composable and modular APIs, giving developers and model researchers full flexibility to train custom transformers at scale on NVIDIA accelerated computing infrastructure. The library is compatible with all NVIDIA Tensor Core GPUs, including FP8 acceleration support for NVIDIA Hopper architectures.

Megatron-Core offers core building blocks such as attention mechanisms, transformer blocks and layers, normalization layers, and embedding techniques. Additional functionality such as activation recomputation and distributed checkpointing is also natively built into the library. The building blocks and functionality are all GPU-optimized and can be combined with advanced parallelization strategies for optimal training speed and stability on NVIDIA accelerated computing infrastructure. Another key component of the Megatron-Core library is its advanced model parallelism techniques (tensor, sequence, pipeline, context, and MoE expert parallelism).
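As a brief illustration, the sketch below composes a small GPT model from these building blocks, loosely following the Megatron-Core quickstart. It is a minimal example, not the definitive API: the exact module paths, constructor arguments, and the tiny hyperparameter values are assumptions that may vary between releases, and it assumes a torch.distributed process group has already been initialized (e.g. via torchrun) on a CUDA-capable machine.

```python
import torch

from megatron.core import parallel_state
from megatron.core.transformer.transformer_config import TransformerConfig
from megatron.core.models.gpt.gpt_model import GPTModel
from megatron.core.models.gpt.gpt_layer_specs import get_gpt_layer_local_spec

# Set up Megatron-Core's model-parallel state. Sizes of 1/1 mean no tensor or
# pipeline parallelism, so this also works on a single GPU.
parallel_state.initialize_model_parallel(
    tensor_model_parallel_size=1,
    pipeline_model_parallel_size=1,
)

# Deliberately tiny, illustrative hyperparameters.
config = TransformerConfig(
    num_layers=2,
    hidden_size=128,
    num_attention_heads=4,
    use_cpu_initialization=True,
    pipeline_dtype=torch.float32,
)

# Compose a GPT model from Megatron-Core's transformer building blocks.
model = GPTModel(
    config=config,
    transformer_layer_spec=get_gpt_layer_local_spec(),
    vocab_size=1024,
    max_sequence_length=512,
)
```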

Megatron-Core can be used with NVIDIA NeMo, an enterprise-grade AI platform. Alternatively, you can explore Megatron-Core with a native PyTorch training loop. Visit the Megatron-Core documentation to learn more.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

megatron_core-0.8.0-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl (1.6 MB)

Uploaded: CPython 3.11, manylinux: glibc 2.24+ x86-64, manylinux: glibc 2.28+ x86-64

megatron_core-0.8.0-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl (1.5 MB)

Uploaded: CPython 3.10, manylinux: glibc 2.24+ x86-64, manylinux: glibc 2.28+ x86-64

File details

Details for the file megatron_core-0.8.0-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl.


File hashes

Hashes for megatron_core-0.8.0-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl
Algorithm    Hash digest
SHA256       2f9201f55044ec95c9e4f0ce2b66eef6ea40665b6883fd72a0bfe59862051ff2
MD5          8c03654e9fdd7c07fc22ea5e7a4690ef
BLAKE2b-256  28f1be4a1ee67735680fd2f2ce361dacca20d22edb08234fa70d885930931ea4

See the PyPI documentation for more details on using hashes.
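For example, a downloaded wheel can be checked against the SHA256 digest above using Python's standard hashlib module. This is a minimal sketch; the local file path is an assumption and should point to wherever the wheel was saved.

```python
import hashlib

# Assumed local path to the downloaded wheel; adjust as needed.
wheel_path = "megatron_core-0.8.0-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl"

# Expected SHA256 digest from the file hashes listed above.
expected = "2f9201f55044ec95c9e4f0ce2b66eef6ea40665b6883fd72a0bfe59862051ff2"

# Hash the wheel's contents and compare with the published digest.
with open(wheel_path, "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

print("hash OK" if actual == expected else f"hash mismatch: {actual}")
```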

File details

Details for the file megatron_core-0.8.0-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl.


File hashes

Hashes for megatron_core-0.8.0-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl
Algorithm    Hash digest
SHA256       3b76fbffb61ced25b5f59ba0e9f77145e218563d0cc81d7386891edd765eb441
MD5          817244dcc722fb35778ee22129508e08
BLAKE2b-256  218fcde694349c8534f3acc777ccaadf8ce2b2a12cf3c5c67110f46fd42e5a68

See the PyPI documentation for more details on using hashes.
