
NIXL Python API


NVIDIA Inference Xfer Library (NIXL)

NVIDIA Inference Xfer Library (NIXL) is targeted for accelerating point to point communications in AI inference frameworks such as NVIDIA Dynamo, while providing an abstraction over various types of memory (e.g., CPU and GPU) and storage (e.g., file, block and object store) through a modular plug-in architecture.


Documentation and Resources

  • NIXL overview - Core concepts/architecture overview (docs/nixl.md)

  • Python API - Python API usage and examples (docs/python_api.md)

  • Backend guide - Backend/plugin development guide (docs/BackendGuide.md)

  • Telemetry - Observability and telemetry details (docs/telemetry.md)

  • Doxygen guide - API/class diagrams overview (docs/doxygen/nixl_doxygen.md)

  • Doxygen images - Diagram assets (docs/doxygen/)

  • NIXLBench docs - Benchmark usage guide (benchmark/nixlbench/README.md)

  • KVBench docs - KVBench workflows and tutorials (benchmark/kvbench/docs/)

Supported Platforms

NIXL is supported only on Linux. It is tested on Ubuntu (22.04/24.04) and Fedora. macOS and Windows are not currently supported; use a Linux host, container, or VM.

Pre-built Distributions

PyPI Wheel

The NIXL Python API and libraries, including UCX, are available directly from PyPI. For example, on a Linux host, container, or VM with a GPU, it can be installed for CUDA 12 with:

pip install nixl[cu12]

For CUDA 13 with:

pip install nixl[cu13]

For backwards compatibility, pip install nixl automatically installs nixl[cu12], so it continues to work seamlessly for CUDA 12 users without requiring changes to downstream project dependencies.

If both nixl-cu12 and nixl-cu13 are installed at the same time in an environment, nixl-cu13 takes precedence.
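Both wheels expose the same nixl import name, so it can be useful to check which platform package is actually installed. A minimal sketch using only the standard library (the helper name installed_nixl_variants is ours; the distribution names nixl-cu12/nixl-cu13 come from the text above):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_nixl_variants():
    """Return {distribution: version} for whichever nixl CUDA packages are installed."""
    found = {}
    for dist in ("nixl-cu13", "nixl-cu12"):  # cu13 listed first: it takes precedence
        try:
            found[dist] = version(dist)
        except PackageNotFoundError:
            pass
    return found

print(installed_nixl_variants())
```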

Prerequisites for source build (Linux)

Ubuntu:

$ sudo apt install build-essential cmake pkg-config

Fedora:

$ sudo dnf install gcc-c++ cmake pkg-config

Python

$ pip3 install meson ninja pybind11 tomlkit

UCX

NIXL was tested with UCX version 1.20.x.

GDRCopy is available on GitHub and is necessary for maximum performance, but UCX and NIXL will work without it.

$ git clone https://github.com/openucx/ucx.git
$ cd ucx
$ git checkout v1.20.x
$ ./autogen.sh
$ ./contrib/configure-release-mt       \
    --enable-shared                    \
    --disable-static                   \
    --disable-doxygen-doc              \
    --enable-optimizations             \
    --enable-cma                       \
    --enable-devel-headers             \
    --with-cuda=<cuda install>         \
    --with-verbs                       \
    --with-dm                          \
    --with-gdrcopy=<gdrcopy install>
$ make -j
$ make -j install-strip
$ ldconfig

ETCD (Optional)

NIXL can use ETCD for metadata distribution and coordination between nodes in distributed environments. To use ETCD with NIXL:

ETCD Server and Client

$ sudo apt install etcd etcd-server etcd-client

# Or use Docker
$ docker run -d -p 2379:2379 quay.io/coreos/etcd:v3.5.1

ETCD CPP API

Installed from https://github.com/etcd-cpp-apiv3/etcd-cpp-apiv3

$ sudo apt install libgrpc-dev libgrpc++-dev libprotobuf-dev protobuf-compiler-grpc
$ sudo apt install libcpprest-dev
$ git clone https://github.com/etcd-cpp-apiv3/etcd-cpp-apiv3.git
$ cd etcd-cpp-apiv3
$ mkdir build && cd build
$ cmake ..
$ make -j$(nproc) && make install

Additional plugins

Some plugins have additional build requirements; see the documentation for each plugin.

Getting started

Build & install

$ meson setup <name_of_build_dir>
$ cd <name_of_build_dir>
$ ninja
$ ninja install

Build Options

Release build (default)

$ meson setup <name_of_build_dir>

Debug build

$ meson setup <name_of_build_dir> --buildtype=debug

NIXL-specific build options

# Example with custom options (described below)
$ meson setup <name_of_build_dir> \
    -Dbuild_docs=true \
    -Ducx_path=/path/to/ucx \
    -Dinstall_headers=true \
    -Ddisable_gds_backend=false

Common build options:

  • build_docs: Build Doxygen documentation (default: false)
  • ucx_path: Path to UCX installation (default: system path)
  • install_headers: Install development headers (default: true)
  • disable_gds_backend: Disable GDS backend (default: false)
  • cudapath_inc, cudapath_lib: Custom CUDA paths
  • static_plugins: Comma-separated list of plugins to build statically
  • enable_plugins: Comma-separated list of plugins to build (e.g. -Denable_plugins=UCX,POSIX). Cannot be used with disable_plugins.
  • disable_plugins: Comma-separated list of plugins to exclude (e.g. -Ddisable_plugins=GDS). Cannot be used with enable_plugins.
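Because enable_plugins and disable_plugins are mutually exclusive, any wrapper that generates these meson flags has to enforce that constraint. An illustrative sketch (plugin_args is a hypothetical helper, not part of the NIXL build system):

```python
def plugin_args(enable=None, disable=None):
    """Build the plugin-selection meson flags, enforcing mutual exclusivity."""
    if enable and disable:
        raise ValueError("enable_plugins and disable_plugins cannot be used together")
    if enable:
        return ["-Denable_plugins=" + ",".join(enable)]
    if disable:
        return ["-Ddisable_plugins=" + ",".join(disable)]
    return []  # neither given: meson builds its default plugin set

print(plugin_args(enable=["UCX", "POSIX"]))  # prints ['-Denable_plugins=UCX,POSIX']
```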

Environment Variables

There are a few environment variables that can be set to configure the build:

  • NIXL_NO_STUBS_FALLBACK: if unset or set to 0, a NIXL stub library is built as a fallback when the main library build fails
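The described semantics (fallback enabled unless the variable is set to something other than 0) can be sketched as follows (stubs_fallback_enabled is our illustrative helper, not NIXL code):

```python
import os

def stubs_fallback_enabled():
    """Fall back to the stub library unless NIXL_NO_STUBS_FALLBACK is set nonzero."""
    return os.environ.get("NIXL_NO_STUBS_FALLBACK", "0") in ("", "0")

os.environ.pop("NIXL_NO_STUBS_FALLBACK", None)
print(stubs_fallback_enabled())  # prints True
```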

Building Documentation

If you have Doxygen installed, you can build the documentation:

# Configure with documentation enabled
$ meson setup <name_of_build_dir> -Dbuild_docs=true
$ cd <name_of_build_dir>
$ ninja

# Documentation will be generated in <name_of_build_dir>/html
# After installation (ninja install), documentation will be available in <prefix>/share/doc/nixl/

Python Interface

NIXL provides Python bindings through pybind11. For detailed Python API documentation, see docs/python_api.md.

The preferred way to install the Python bindings is through pip from PyPI:

pip install nixl[cu12]

Or for CUDA 13 with:

pip install nixl[cu13]

Installation from source

Prerequisites:

uv is always required, even if you use a different Python virtual environment manager or a system-wide Python installation with no virtual environment at all.

Example with uv Python virtual environment:

curl -LsSf https://astral.sh/uv/install.sh | sh
export PATH="$HOME/.local/bin:${PATH}"

uv venv .venv --python 3.12
source .venv/bin/activate
uv pip install tomlkit

Example with python-virtualenv:

curl -LsSf https://astral.sh/uv/install.sh | sh
export PATH="$HOME/.local/bin:${PATH}"

python3 -m venv .venv
source .venv/bin/activate
pip install tomlkit

Example with system-wide Python installation without using a virtual environment:

curl -LsSf https://astral.sh/uv/install.sh | sh
export PATH="$HOME/.local/bin:${PATH}"

pip install tomlkit

Then install PyTorch following the instructions on the PyTorch website: https://pytorch.org/get-started/locally/

After installing the prerequisites, you can build and install the NIXL binaries and the Python bindings from source. The steps are:

  1. Build NIXL binaries and install them
  2. Build and install the CUDA platform-specific package (nixl-cu12 or nixl-cu13)
  3. Build and install the nixl meta-package

For CUDA 12:

pip install .
meson setup build
ninja -C build install
pip install build/src/bindings/python/nixl-meta/nixl-*-py3-none-any.whl

For CUDA 13:

pip install .
./contrib/tomlutil.py --wheel-name nixl-cu13 pyproject.toml
meson setup build
ninja -C build install
pip install build/src/bindings/python/nixl-meta/nixl-*-py3-none-any.whl

To check if the installation is successful, you can run the following command:

python3 -c "import nixl; agent = nixl.nixl_agent('agent1')"

which should print output similar to:

2026-01-08 13:36:27 NIXL INFO    _api.py:363 Backend UCX was instantiated
2026-01-08 13:36:27 NIXL INFO    _api.py:253 Initialized NIXL agent: agent1
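The import check above can be wrapped so that a failed installation reports a message instead of a traceback. A hedged sketch (can_import is our helper; pass "nixl" as the module name to check the installation described above):

```python
import importlib

def can_import(module_name):
    """Try to import a module; return (ok, message) instead of raising."""
    try:
        importlib.import_module(module_name)
        return True, f"{module_name} imported successfully"
    except ImportError as exc:
        return False, f"{module_name} not importable: {exc}"

ok, msg = can_import("nixl")
print(msg)
```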

You can also run a complete Python example to test the installation:

python3 examples/python/expanded_two_peers.py --mode=target --use_cuda=true --ip=127.0.0.1 --port=4242 &
sleep 5
python3 examples/python/expanded_two_peers.py --mode=initiator --use_cuda=true --ip=127.0.0.1 --port=4242

For more Python examples, see examples/python/.

Rust Bindings

Build

  • Use the -Drust=true meson option to build the Rust bindings.
  • Use --buildtype=debug for a debug build (default is release).
  • Or build manually:

    $ cargo build --release

Install

The bindings will be installed under nixl-sys in the configured installation prefix. This can be done using ninja from the project build directory:

$ ninja install

Test

# Rust bindings tests
$ cargo test

Use it in your project by adding the following to Cargo.toml:

[dependencies]
nixl-sys = { path = "path/to/nixl/bindings/rust" }

Other build options

See contrib/README.md for more build options.

Building Docker container

To build the Docker container, first clone the current repository, and make sure you are able to pull Docker images on your machine before attempting the build.

Run the following from the root folder of the cloned NIXL repository:

$ ./contrib/build-container.sh

By default, the container is built with Ubuntu 24.04. To build a container for Ubuntu 22.04 use the --os option as follows:

$ ./contrib/build-container.sh --os ubuntu22

To see all the options supported by the build script, use:

$ ./contrib/build-container.sh -h

The container also includes a prebuilt Python wheel in /workspace/dist, should it be needed for installation or distribution. The wheel can also be built with a separate script (see below).

Building the python wheel

The contrib folder also includes a script that builds the Python wheel together with the UCX dependencies. Note that UCX and the other NIXL dependencies must already be installed.

$ ./contrib/build-wheel.sh

Running with ETCD

NIXL can use ETCD for metadata exchange between distributed nodes. This is especially useful in containerized or cloud-native environments.

Environment Setup

To use ETCD with NIXL, set the following environment variables:

# Set ETCD endpoints (required) - replace localhost with the hostname of the etcd server
export NIXL_ETCD_ENDPOINTS="http://localhost:2379"

# Set ETCD namespace (optional, defaults to /nixl/agents)
export NIXL_ETCD_NAMESPACE="/nixl/agents"
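Read back in Python, the resolution logic for these two variables looks like the following sketch (etcd_config is our helper; only the variable names and the /nixl/agents default come from the text above):

```python
import os

def etcd_config():
    """Resolve NIXL's ETCD settings: endpoints are required, namespace has a default."""
    endpoints = os.environ.get("NIXL_ETCD_ENDPOINTS")
    if not endpoints:
        raise RuntimeError("NIXL_ETCD_ENDPOINTS must be set")
    namespace = os.environ.get("NIXL_ETCD_NAMESPACE", "/nixl/agents")
    return {"endpoints": endpoints.split(","), "namespace": namespace}

os.environ["NIXL_ETCD_ENDPOINTS"] = "http://localhost:2379"
os.environ.pop("NIXL_ETCD_NAMESPACE", None)
print(etcd_config())  # prints {'endpoints': ['http://localhost:2379'], 'namespace': '/nixl/agents'}
```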

Running the ETCD Example

NIXL includes an example demonstrating metadata exchange and data transfer using ETCD:

# Start an ETCD server if not already running
# For example:
# docker run -d -p 2379:2379 quay.io/coreos/etcd:v3.5.1

# Set the ETCD env variables as above

# Run the example. The two agents in the example will exchange metadata through ETCD
# and perform data transfers
./<nixl_build_path>/examples/nixl_etcd_example

nixlbench Benchmark

For more comprehensive testing, the nixlbench benchmarking tool supports ETCD for worker coordination:

# Build nixlbench (see benchmark/nixlbench/README.md for details)
cd benchmark/nixlbench
meson setup build && cd build && ninja

# Run benchmark with ETCD
./nixlbench --etcd-endpoints http://localhost:2379 --backend UCX --initiator_seg_type VRAM

Contributing

For contribution guidelines, see CONTRIBUTING.md.

Third-Party Components

This project will download and install additional third-party open source software projects. Review the license terms of these open source projects before use.
