
NVIDIA Inference Xfer Library (NIXL)

NVIDIA Inference Xfer Library (NIXL) is targeted for accelerating point to point communications in AI inference frameworks such as NVIDIA Dynamo, while providing an abstraction over various types of memory (e.g., CPU and GPU) and storage (e.g., file, block and object store) through a modular plug-in architecture.


Documentation and Resources

  • NIXL overview - Core concepts/architecture overview (docs/nixl.md)

  • Python API - Python API usage and examples (docs/python_api.md)

  • Backend guide - Backend/plugin development guide (docs/BackendGuide.md)

  • Telemetry - Observability and telemetry details (docs/telemetry.md)

  • Doxygen guide - API/class diagrams overview (docs/doxygen/nixl_doxygen.md)

  • Doxygen images - Diagram assets (docs/doxygen/)

  • NIXLBench docs - Benchmark usage guide (benchmark/nixlbench/README.md)

  • KVBench docs - KVBench workflows and tutorials (benchmark/kvbench/docs/)

Supported Platforms

NIXL is supported on Linux only. It is tested on Ubuntu 22.04/24.04 and Fedora. macOS and Windows are not currently supported; use a Linux host, container, or VM.
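A quick pre-flight check can catch an unsupported platform before attempting an install; this snippet is our own illustration and not part of NIXL itself:

```python
import platform

# NIXL supports Linux only; warn early on other systems
# (this check is illustrative, not part of the NIXL package).
supported = platform.system() == "Linux"
if supported:
    print("OK: running on Linux")
else:
    print("NIXL requires Linux; use a Linux host, container, or VM.")
```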

Prebuilt Distributions

PyPI Wheel

The NIXL Python API and libraries, including UCX, are available directly from PyPI. On a Linux host, container, or VM with a GPU, NIXL can be installed for CUDA 12 with:

pip install nixl[cu12]

Or for CUDA 13:

pip install nixl[cu13]

For backwards compatibility, pip install nixl automatically installs nixl[cu12], so existing CUDA 12 users keep working without changes to downstream project dependencies.

If both nixl-cu12 and nixl-cu13 are installed at the same time in an environment, nixl-cu13 takes precedence.
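Which variant is active can be checked from Python. The helper below is our own illustration, built only on the standard library; the distribution names nixl-cu12 and nixl-cu13 are those described above:

```python
from importlib import metadata

def installed_nixl_variant():
    """Return the installed NIXL distribution name and version,
    preferring cu13 (mirroring the precedence described above),
    or None if neither variant is installed."""
    for dist in ("nixl-cu13", "nixl-cu12"):
        try:
            return f"{dist} {metadata.version(dist)}"
        except metadata.PackageNotFoundError:
            continue
    return None

print(installed_nixl_variant())
```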

Prerequisites for source build (Linux)

Ubuntu:

$ sudo apt install build-essential cmake pkg-config

Fedora:

$ sudo dnf install gcc-c++ cmake pkg-config

Python

$ pip3 install meson ninja pybind11 tomlkit

UCX

NIXL was tested with UCX version 1.20.x.

GDRCopy is available on GitHub and is necessary for maximum performance, but UCX and NIXL will work without it.

$ git clone https://github.com/openucx/ucx.git
$ cd ucx
$ git checkout v1.20.x
$ ./autogen.sh
$ ./contrib/configure-release-mt       \
    --enable-shared                    \
    --disable-static                   \
    --disable-doxygen-doc              \
    --enable-optimizations             \
    --enable-cma                       \
    --enable-devel-headers             \
    --with-cuda=<cuda install>         \
    --with-verbs                       \
    --with-dm                          \
    --with-gdrcopy=<gdrcopy install>
$ make -j
$ make -j install-strip
$ ldconfig

ETCD (Optional)

NIXL can use ETCD for metadata distribution and coordination between nodes in distributed environments. To use ETCD with NIXL:

ETCD Server and Client

$ sudo apt install etcd etcd-server etcd-client

# Or use Docker
$ docker run -d -p 2379:2379 quay.io/coreos/etcd:v3.5.1

ETCD CPP API

Install it from https://github.com/etcd-cpp-apiv3/etcd-cpp-apiv3:

$ sudo apt install libgrpc-dev libgrpc++-dev libprotobuf-dev protobuf-compiler-grpc
$ sudo apt install libcpprest-dev
$ git clone https://github.com/etcd-cpp-apiv3/etcd-cpp-apiv3.git
$ cd etcd-cpp-apiv3
$ mkdir build && cd build
$ cmake ..
$ make -j$(nproc) && make install

Additional plugins

Some plugins have additional build requirements; see the respective plugin documentation.

Getting started

Build & install

$ meson setup <name_of_build_dir>
$ cd <name_of_build_dir>
$ ninja
$ ninja install

Build Options

Release build (default)

$ meson setup <name_of_build_dir>

Debug build

$ meson setup <name_of_build_dir> --buildtype=debug

NIXL-specific build options

# Example with custom options (explained below); note that shell
# comments cannot follow a line-continuation backslash
$ meson setup <name_of_build_dir> \
    -Dbuild_docs=true \
    -Ducx_path=/path/to/ucx \
    -Dinstall_headers=true \
    -Ddisable_gds_backend=false

Common build options:

  • build_docs: Build Doxygen documentation (default: false)
  • ucx_path: Path to UCX installation (default: system path)
  • install_headers: Install development headers (default: true)
  • disable_gds_backend: Disable GDS backend (default: false)
  • cudapath_inc, cudapath_lib: Custom CUDA paths
  • static_plugins: Comma-separated list of plugins to build statically
  • enable_plugins: Comma-separated list of plugins to build (e.g. -Denable_plugins=UCX,POSIX). Cannot be used with disable_plugins.
  • disable_plugins: Comma-separated list of plugins to exclude (e.g. -Ddisable_plugins=GDS). Cannot be used with enable_plugins.

Environment Variables

There are a few environment variables that can be set to configure the build:

  • NIXL_NO_STUBS_FALLBACK: If unset or set to 0, fall back to building the NIXL stub library when the main library build fails; set it to a nonzero value to disable this fallback.

Building Documentation

If you have Doxygen installed, you can build the documentation:

# Configure with documentation enabled
$ meson setup <name_of_build_dir> -Dbuild_docs=true
$ cd <name_of_build_dir>
$ ninja

# Documentation will be generated in <name_of_build_dir>/html
# After installation (ninja install), documentation will be available in <prefix>/share/doc/nixl/

Python Interface

NIXL provides Python bindings through pybind11. For detailed Python API documentation, see docs/python_api.md.

The preferred way to install the Python bindings is through pip from PyPI:

pip install nixl[cu12]

Or for CUDA 13:

pip install nixl[cu13]
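Once installed, the bindings can be smoke-tested from Python. The availability guard below is our own; the only NIXL call used, the nixl_agent constructor, is the one shown in this README's verification step:

```python
import importlib.util

# Check whether the nixl package is importable before exercising it.
nixl_available = importlib.util.find_spec("nixl") is not None

if nixl_available:
    import nixl
    # Creating an agent instantiates a backend (UCX by default).
    agent = nixl.nixl_agent("agent1")
    print("NIXL agent created")
else:
    print("nixl is not installed; install with: pip install nixl[cu12]")
```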

Installation from source

Prerequisites:

uv is required even if you use another Python virtual environment manager, or a system-wide Python installation without any virtual environment.

Example with uv Python virtual environment:

curl -LsSf https://astral.sh/uv/install.sh | sh
export PATH="$HOME/.local/bin:${PATH}"

uv venv .venv --python 3.12
source .venv/bin/activate
uv pip install tomlkit

Example with python-virtualenv:

curl -LsSf https://astral.sh/uv/install.sh | sh
export PATH="$HOME/.local/bin:${PATH}"

python3 -m venv .venv
source .venv/bin/activate
pip install tomlkit

Example with system-wide Python installation without using a virtual environment:

curl -LsSf https://astral.sh/uv/install.sh | sh
export PATH="$HOME/.local/bin:${PATH}"

pip install tomlkit

Then install PyTorch following the instructions on the PyTorch website: https://pytorch.org/get-started/locally/

After installing the prerequisites, you can build and install the NIXL binaries and the Python bindings from source. You have to:

  1. Build NIXL binaries and install them
  2. Build and install the CUDA platform-specific package (nixl-cu12 or nixl-cu13)
  3. Build and install the nixl meta-package

For CUDA 12:

pip install .
meson setup build
ninja -C build install
pip install build/src/bindings/python/nixl-meta/nixl-*-py3-none-any.whl

For CUDA 13:

pip install .
./contrib/tomlutil.py --wheel-name nixl-cu13 pyproject.toml
meson setup build
ninja -C build install
pip install build/src/bindings/python/nixl-meta/nixl-*-py3-none-any.whl

To check if the installation is successful, you can run the following command:

python3 -c "import nixl; agent = nixl.nixl_agent('agent1')"

which should print:

2026-01-08 13:36:27 NIXL INFO    _api.py:363 Backend UCX was instantiated
2026-01-08 13:36:27 NIXL INFO    _api.py:253 Initialized NIXL agent: agent1

You can also run a complete Python example to test the installation:

python3 examples/python/expanded_two_peers.py --mode=target --use_cuda=true --ip=127.0.0.1 --port=4242 &
sleep 5
python3 examples/python/expanded_two_peers.py --mode=initiator --use_cuda=true --ip=127.0.0.1 --port=4242

For more Python examples, see examples/python/.

Rust Bindings

Build

  • Use the -Drust=true meson option to build the Rust bindings.
  • Use --buildtype=debug for a debug build (the default is release).
  • Or build manually:

    $ cargo build --release

Install

The bindings are installed under nixl-sys in the configured installation prefix. This can be done with ninja from the project build directory:

$ ninja install

Test

# Rust bindings tests
$ cargo test

To use the bindings in your project, add to Cargo.toml:

[dependencies]
nixl-sys = { path = "path/to/nixl/bindings/rust" }

Other build options

See contrib/README.md for more build options.

Building Docker container

To build the Docker container, first clone this repository and make sure your machine can pull Docker images.

Run the following from the root folder of the cloned NIXL repository:

$ ./contrib/build-container.sh

By default, the container is built with Ubuntu 24.04. To build a container for Ubuntu 22.04 use the --os option as follows:

$ ./contrib/build-container.sh --os ubuntu22

To see all the options supported by the build script, use:

$ ./contrib/build-container.sh -h

The container also includes a prebuilt Python wheel in /workspace/dist, should you need it for installation or distribution. The wheel can also be built with a separate script (see below).

Building the python wheel

The contrib folder also includes a script that builds the Python wheel together with the UCX dependencies. Note that UCX and the other NIXL dependencies must already be installed.

$ ./contrib/build-wheel.sh

Running with ETCD

NIXL can use ETCD for metadata exchange between distributed nodes. This is especially useful in containerized or cloud-native environments.

Environment Setup

To use ETCD with NIXL, set the following environment variables:

# Set ETCD endpoints (required) - replace localhost with the hostname of the etcd server
export NIXL_ETCD_ENDPOINTS="http://localhost:2379"

# Set ETCD namespace (optional, defaults to /nixl/agents)
export NIXL_ETCD_NAMESPACE="/nixl/agents"
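The same configuration can be done from Python before any agents are created. This is a minimal sketch using only the standard library; the endpoint URL is a placeholder for your etcd server:

```python
import os

# Point NIXL at the etcd server (required) and pick a key namespace
# (optional; /nixl/agents is the documented default).
os.environ["NIXL_ETCD_ENDPOINTS"] = "http://localhost:2379"
os.environ.setdefault("NIXL_ETCD_NAMESPACE", "/nixl/agents")

print(os.environ["NIXL_ETCD_ENDPOINTS"], os.environ["NIXL_ETCD_NAMESPACE"])
```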

Running the ETCD Example

NIXL includes an example demonstrating metadata exchange and data transfer using ETCD:

# Start an ETCD server if not already running
# For example:
# docker run -d -p 2379:2379 quay.io/coreos/etcd:v3.5.1

# Set the ETCD env variables as above

# Run the example. The two agents in the example will exchange metadata through ETCD
# and perform data transfers
./<nixl_build_path>/examples/nixl_etcd_example

nixlbench Benchmark

For more comprehensive testing, the nixlbench benchmarking tool supports ETCD for worker coordination:

# Build nixlbench (see benchmark/nixlbench/README.md for details)
cd benchmark/nixlbench
meson setup build && cd build && ninja

# Run benchmark with ETCD
./nixlbench --etcd-endpoints http://localhost:2379 --backend UCX --initiator_seg_type VRAM

Contributing

For contribution guidelines, see CONTRIBUTING.md.

Third-Party Components

This project will download and install additional third-party open source software projects. Review the license terms of these open source projects before use.

Download files

No source distribution is available for this release; only built wheels are published.

Built Distributions

All wheels target manylinux (glibc 2.28+):

  • nixl_cu13-0.10.0-cp314-cp314-manylinux_2_28_x86_64.whl (51.2 MB): CPython 3.14, x86-64
  • nixl_cu13-0.10.0-cp314-cp314-manylinux_2_28_aarch64.whl (49.9 MB): CPython 3.14, ARM64
  • nixl_cu13-0.10.0-cp313-cp313-manylinux_2_28_x86_64.whl (51.2 MB): CPython 3.13, x86-64
  • nixl_cu13-0.10.0-cp313-cp313-manylinux_2_28_aarch64.whl (49.9 MB): CPython 3.13, ARM64
  • nixl_cu13-0.10.0-cp312-cp312-manylinux_2_28_x86_64.whl (51.2 MB): CPython 3.12, x86-64
  • nixl_cu13-0.10.0-cp312-cp312-manylinux_2_28_aarch64.whl (49.9 MB): CPython 3.12, ARM64
  • nixl_cu13-0.10.0-cp311-cp311-manylinux_2_28_x86_64.whl (51.2 MB): CPython 3.11, x86-64
  • nixl_cu13-0.10.0-cp311-cp311-manylinux_2_28_aarch64.whl (49.9 MB): CPython 3.11, ARM64
  • nixl_cu13-0.10.0-cp310-cp310-manylinux_2_28_x86_64.whl (51.2 MB): CPython 3.10, x86-64
  • nixl_cu13-0.10.0-cp310-cp310-manylinux_2_28_aarch64.whl (49.9 MB): CPython 3.10, ARM64

