
NPU bridge for PyTorch


Ascend Extension for PyTorch

Overview

This repository hosts the Ascend Extension for PyTorch, named torch_npu, which adapts Ascend NPUs to PyTorch so that PyTorch developers can use the powerful compute capabilities of Ascend AI Processors.

Ascend is a full-stack AI computing infrastructure for industry applications and services based on Huawei Ascend processors and software. For more information about Ascend, see Ascend Community.

Installation

From Binary

We provide wheel packages so that users can install torch_npu quickly. Before installing torch_npu, complete the installation of CANN according to the Ascend Auxiliary Software table. To obtain the CANN installation package, refer to the CANN Installation guide.

  1. Install PyTorch

    Install PyTorch through pip.

    For Aarch64:

    pip3 install torch==2.12.0 --index-url https://download.pytorch.org/whl/cpu
    

    For x86:

    pip3 install torch==2.12.0+cpu  --index-url https://download.pytorch.org/whl/cpu
    
  2. Install torch-npu dependencies

    Run the following command to install dependencies.

    pip3 install pyyaml
    pip3 install setuptools
    

    If the installation fails, use one of the download links below or visit the official PyTorch website to download the installation package for the corresponding version.

    | OS arch | Python version | Link |
    | ------- | -------------- | ---- |
    | x86 | Python 3.10 | link |
    | x86 | Python 3.11 | link |
    | x86 | Python 3.12 | link |
    | aarch64 | Python 3.10 | link |
    | aarch64 | Python 3.11 | link |
    | aarch64 | Python 3.12 | link |
  3. Install torch-npu

    pip3 install torch-npu==2.12.0rc1
    
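As a quick sanity check after installation, torch-npu should share its X.Y.Z base version with the installed torch (e.g. torch-npu 2.12.0rc1 pairs with torch 2.12.0). A minimal sketch of that comparison; the helper names are ours, not part of torch_npu:

```python
import re

def base_version(version: str) -> str:
    """Strip local tags and pre/post suffixes, keeping only the X.Y.Z core."""
    match = re.match(r"(\d+\.\d+\.\d+)", version)
    if match is None:
        raise ValueError(f"unrecognized version string: {version!r}")
    return match.group(1)

def versions_compatible(torch_version: str, torch_npu_version: str) -> bool:
    """torch and torch-npu should agree on the X.Y.Z base version."""
    return base_version(torch_version) == base_version(torch_npu_version)

print(versions_compatible("2.12.0+cpu", "2.12.0rc1"))  # True
print(versions_compatible("2.11.0", "2.12.0rc1"))      # False
```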

From Source

In some special scenarios, users may need to compile torch-npu themselves. First select a branch from the Ascend Auxiliary Software table and a Python version from the PyTorch and Python Version Matching Table. Compiling inside the docker image, as described in the following steps, is recommended (mount only your working path, not system paths, to reduce security risks); the generated .whl file is placed under ./dist/. If you compile without the docker image, note the gcc version constraint: we recommend gcc 13.3 for both ARM and x86.

  1. Clone torch-npu

    git clone https://gitcode.com/ascend/pytorch.git -b v2.12.0 --depth 1
    
  2. Build Docker Image

    cd pytorch/ci/docker/{arch} # {arch} for X86 or ARM
    docker build -t manylinux-builder:v1 .
    
  3. Enter Docker Container

    docker run -it -v /{code_path}/pytorch:/home/pytorch manylinux-builder:v1 bash
    # {code_path} is the torch_npu source code path
    
  4. Compile torch-npu

    Take Python 3.10 as an example.

    cd /home/pytorch
    bash ci/build.sh --python=3.10
    

    Use --torch=<version> to build against a specific PyTorch version (supported: 2.10.0, 2.11.0, 2.12.0). The installed PyTorch must match the specified version.

    bash ci/build.sh --python=3.10 --torch=2.10.0
    

Tips

If you would like to compile with the new C++ ABI, first run the following command. In this case, the recommended compilation environment matches the community torch package: glibc 2.28, gcc 13.3.

export _GLIBCXX_USE_CXX11_ABI=1

Meanwhile, we support configuring -fabi-version through the following variable; it must be consistent with the community torch package.

export _ABI_VERSION=18
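Putting the two settings together, an ABI-aware source build could be launched as follows. This is a sketch under the assumptions above; the build command is the one from the From Source steps:

```shell
# Opt in to the new C++ ABI and pin the matching -fabi-version,
# then run the regular build script from the source tree.
export _GLIBCXX_USE_CXX11_ABI=1
export _ABI_VERSION=18
bash ci/build.sh --python=3.10
```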

Getting Started

Prerequisites

Initialize CANN environment variable by running the command as shown below.

# Default path, change it if needed.
source /usr/local/Ascend/ascend-toolkit/set_env.sh

Quick Verification

You can quickly experience the Ascend NPU with the following simple example.

import torch
import torch_npu  # no longer needed in torch_npu 2.5.1 and later versions

x = torch.randn(2, 2).npu()
y = torch.randn(2, 2).npu()
z = x.mm(y)

print(z)

User Manual

Refer to API of Ascend Extension for PyTorch for more detailed information.

PyTorch and Python Version Matching Table

| PyTorch Version | Python Version |
| --------------- | -------------- |
| PyTorch 1.11.0 | Python 3.7.x (>=3.7.5), Python 3.8.x, Python 3.9.x, Python 3.10.x |
| PyTorch 2.1.0 | Python 3.8.x, Python 3.9.x, Python 3.10.x, Python 3.11.x |
| PyTorch 2.2.0 | Python 3.8.x, Python 3.9.x, Python 3.10.x |
| PyTorch 2.3.1 | Python 3.8.x, Python 3.9.x, Python 3.10.x, Python 3.11.x |
| PyTorch 2.4.0 | Python 3.8.x, Python 3.9.x, Python 3.10.x, Python 3.11.x |
| PyTorch 2.5.1 | Python 3.9.x, Python 3.10.x, Python 3.11.x |
| PyTorch 2.6.0 | Python 3.9.x, Python 3.10.x, Python 3.11.x |
| PyTorch 2.7.1 | Python 3.9.x, Python 3.10.x, Python 3.11.x |
| PyTorch 2.8.0 | Python 3.9.x, Python 3.10.x, Python 3.11.x |
| PyTorch 2.9.0 | Python 3.10.x, Python 3.11.x, Python 3.12.x |
| PyTorch 2.10.0 | Python 3.10.x, Python 3.11.x, Python 3.12.x |
| PyTorch 2.11.0 | Python 3.10.x, Python 3.11.x, Python 3.12.x |
| PyTorch 2.12.0 | Python 3.10.x, Python 3.11.x, Python 3.12.x |
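The matching table can be expressed as a small lookup for validating an environment before installation. A sketch: the data is transcribed from the recent rows of the table, and the names `SUPPORTED_PYTHON` and `python_supported` are ours:

```python
import sys

# Supported Python (major, minor) versions for recent PyTorch releases,
# transcribed from the matching table above.
SUPPORTED_PYTHON = {
    "2.9.0":  {(3, 10), (3, 11), (3, 12)},
    "2.10.0": {(3, 10), (3, 11), (3, 12)},
    "2.11.0": {(3, 10), (3, 11), (3, 12)},
    "2.12.0": {(3, 10), (3, 11), (3, 12)},
}

def python_supported(pytorch_version: str, python_version) -> bool:
    """Return True if the (major, minor) interpreter version is listed for the release."""
    return tuple(python_version[:2]) in SUPPORTED_PYTHON.get(pytorch_version, set())

# Check the current interpreter against PyTorch 2.12.0.
print(python_supported("2.12.0", sys.version_info))
```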

Ascend Auxiliary Software

PyTorch Extension versions follow the naming convention {PyTorch version}-{Ascend version}, where the former represents the PyTorch version compatible with the PyTorch Extension, and the latter is used to match the CANN version. The detailed matching is as follows:

| CANN Version | Supported PyTorch Version | Supported Extension Version | Github Branch |
| ------------ | ------------------------- | --------------------------- | ------------- |
| CANN 8.5.0 | 2.12.0 | 2.12.0rc1 | v2.12.0 |
| CANN 8.5.0 | 2.11.0 | 2.11.0rc1 | v2.11.0 |
| CANN 8.5.0 | 2.10.0 | 2.10.0rc2 | v2.10.0 |
| CANN 8.5.0 | 2.9.0 | 2.9.0 | v2.9.0-7.3.0 |
| CANN 8.5.0 | 2.8.0 | 2.8.0.post2 | v2.8.0-7.3.0 |
| CANN 8.5.0 | 2.7.1 | 2.7.1.post2 | v2.7.1-7.3.0 |
| CANN 8.5.0 | 2.6.0 | 2.6.0.post5 | v2.6.0-7.3.0 |
| CANN 8.3.RC1 | 2.8.0 | 2.8.0 | v2.8.0-7.2.0 |
| CANN 8.3.RC1 | 2.7.1 | 2.7.1 | v2.7.1-7.2.0 |
| CANN 8.3.RC1 | 2.6.0 | 2.6.0.post3 | v2.6.0-7.2.0 |
| CANN 8.3.RC1 | 2.1.0 | 2.1.0.post17 | v2.1.0-7.2.0 |
| CANN 8.2.RC1 | 2.6.0 | 2.6.0 | v2.6.0-7.1.0 |
| CANN 8.2.RC1 | 2.5.1 | 2.5.1.post1 | v2.5.1-7.1.0 |
| CANN 8.2.RC1 | 2.1.0 | 2.1.0.post13 | v2.1.0-7.1.0 |
| CANN 8.1.RC1 | 2.5.1 | 2.5.1 | v2.5.1-7.0.0 |
| CANN 8.1.RC1 | 2.4.0 | 2.4.0.post4 | v2.4.0-7.0.0 |
| CANN 8.1.RC1 | 2.3.1 | 2.3.1.post6 | v2.3.1-7.0.0 |
| CANN 8.1.RC1 | 2.1.0 | 2.1.0.post12 | v2.1.0-7.0.0 |
| CANN 8.0.0 | 2.4.0 | 2.4.0.post2 | v2.4.0-6.0.0 |
| CANN 8.0.0 | 2.3.1 | 2.3.1.post4 | v2.3.1-6.0.0 |
| CANN 8.0.0 | 2.1.0 | 2.1.0.post10 | v2.1.0-6.0.0 |
| CANN 8.0.RC3 | 2.4.0 | 2.4.0 | v2.4.0-6.0.rc3 |
| CANN 8.0.RC3 | 2.3.1 | 2.3.1.post2 | v2.3.1-6.0.rc3 |
| CANN 8.0.RC3 | 2.1.0 | 2.1.0.post8 | v2.1.0-6.0.rc3 |
| CANN 8.0.RC2 | 2.3.1 | 2.3.1 | v2.3.1-6.0.rc2 |
| CANN 8.0.RC2 | 2.2.0 | 2.2.0.post2 | v2.2.0-6.0.rc2 |
| CANN 8.0.RC2 | 2.1.0 | 2.1.0.post6 | v2.1.0-6.0.rc2 |
| CANN 8.0.RC2 | 1.11.0 | 1.11.0.post14 | v1.11.0-6.0.rc2 |
| CANN 8.0.RC1 | 2.2.0 | 2.2.0 | v2.2.0-6.0.rc1 |
| CANN 8.0.RC1 | 2.1.0 | 2.1.0.post4 | v2.1.0-6.0.rc1 |
| CANN 8.0.RC1 | 1.11.0 | 1.11.0.post11 | v1.11.0-6.0.rc1 |
| CANN 7.0.0 | 2.1.0 | 2.1.0 | v2.1.0-5.0.0 |
| CANN 7.0.0 | 2.0.1 | 2.0.1.post1 | v2.0.1-5.0.0 |
| CANN 7.0.0 | 1.11.0 | 1.11.0.post8 | v1.11.0-5.0.0 |
| CANN 7.0.RC1 | 2.1.0 | 2.1.0.rc1 | v2.1.0-5.0.rc3 |
| CANN 7.0.RC1 | 2.0.1 | 2.0.1 | v2.0.1-5.0.rc3 |
| CANN 7.0.RC1 | 1.11.0 | 1.11.0.post4 | v1.11.0-5.0.rc3 |
| CANN 6.3.RC3.1 | 1.11.0 | 1.11.0.post3 | v1.11.0-5.0.rc2.2 |
| CANN 6.3.RC3 | 1.11.0 | 1.11.0.post2 | v1.11.0-5.0.rc2.1 |
| CANN 6.3.RC2 | 2.0.1 | 2.0.1.rc1 | v2.0.1-5.0.rc2 |
| CANN 6.3.RC2 | 1.11.0 | 1.11.0.post1 | v1.11.0-5.0.rc2 |
| CANN 6.3.RC2 | 1.8.1 | 1.8.1.post2 | v1.8.1-5.0.rc2 |
| CANN 6.3.RC1 | 1.11.0 | 1.11.0 | v1.11.0-5.0.rc1 |
| CANN 6.3.RC1 | 1.8.1 | 1.8.1.post1 | v1.8.1-5.0.rc1 |
| CANN 6.0.1 | 1.5.0 | 1.5.0.post8 | v1.5.0-3.0.0 |
| CANN 6.0.1 | 1.8.1 | 1.8.1 | v1.8.1-3.0.0 |
| CANN 6.0.1 | 1.11.0 | 1.11.0.rc2 (beta) | v1.11.0-3.0.0 |
| CANN 6.0.RC1 | 1.5.0 | 1.5.0.post7 | v1.5.0-3.0.rc3 |
| CANN 6.0.RC1 | 1.8.1 | 1.8.1.rc3 | v1.8.1-3.0.rc3 |
| CANN 6.0.RC1 | 1.11.0 | 1.11.0.rc1 (beta) | v1.11.0-3.0.rc3 |
| CANN 5.1.RC2 | 1.5.0 | 1.5.0.post6 | v1.5.0-3.0.rc2 |
| CANN 5.1.RC2 | 1.8.1 | 1.8.1.rc2 | v1.8.1-3.0.rc2 |
| CANN 5.1.RC1 | 1.5.0 | 1.5.0.post5 | v1.5.0-3.0.rc1 |
| CANN 5.1.RC1 | 1.8.1 | 1.8.1.rc1 | v1.8.1-3.0.rc1 |
| CANN 5.0.4 | 1.5.0 | 1.5.0.post4 | 2.0.4.tr5 |
| CANN 5.0.3 | 1.8.1 | 1.5.0.post3 | 2.0.3.tr5 |
| CANN 5.0.2 | 1.5.0 | 1.5.0.post2 | 2.0.2.tr5 |
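The branch naming convention described above ({PyTorch version}-{Ascend version}) can be decoded mechanically; a sketch, where the function name is ours:

```python
def parse_branch(branch: str) -> tuple:
    """Split a branch name like 'v2.5.1-7.1.0' into (pytorch_version, ascend_version).
    Branches without a suffix, e.g. 'v2.12.0', carry only the PyTorch version."""
    name = branch.lstrip("v")
    pytorch_version, _, ascend_version = name.partition("-")
    return (pytorch_version, ascend_version or None)

print(parse_branch("v2.5.1-7.1.0"))  # ('2.5.1', '7.1.0')
print(parse_branch("v2.12.0"))       # ('2.12.0', None)
```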

Hardware support

The Ascend training devices include the following models, all of which can be used as training environments for PyTorch models.

| Product series | Product model |
| -------------- | ------------- |
| Atlas Training series products | Atlas 800 (model: 9000) |
| Atlas Training series products | Atlas 800 (model: 9010) |
| Atlas Training series products | Atlas 900 PoD (model: 9000) |
| Atlas Training series products | Atlas 300T (model: 9000) |
| Atlas Training series products | Atlas 300T Pro (model: 9000) |
| Atlas A2 Training series products | Atlas 800T A2 |
| Atlas A2 Training series products | Atlas 900 A2 PoD |
| Atlas A2 Training series products | Atlas 200T A2 Box16 |
| Atlas A2 Training series products | Atlas 300T A2 |

The Ascend inference devices include the following models, all of which can be used as inference environments for large models.

| Product series | Product model |
| -------------- | ------------- |
| Atlas 800I A2 Inference product | Atlas 800I A2 |

Pipeline Status

Because upstream and downstream are developed asynchronously, incompatible upstream modifications may make some torch_npu functions unavailable (this affects only the upstream and downstream development branches, not the stable branches). We therefore run a set of daily tasks that detect such issues promptly so that they can normally be fixed within 48 hours, providing users with the latest features and stable quality.

| OS | CANN Version (Docker Image) | Upstream Branch | Downstream Branch | Period | Status |
| -- | --------------------------- | --------------- | ----------------- | ------ | ------ |
| openEuler 24.03 SP2 | CANN 8.5 | main | master | UTC 1200 daily | Ascend NPU |

Suggestions and Communication

Everyone is welcome to contribute to the community. If you have any questions or suggestions, you can submit Github Issues. We will reply to you as soon as possible. Thank you very much.

Branch Maintenance Policies

The version branches of AscendPyTorch have the following maintenance phases:

| Status | Duration | Description |
| ------ | -------- | ----------- |
| Planning | 1-3 months | Plan features. |
| Development | 6-12 months | Develop new features and fix issues, regularly releasing new versions. Different strategies apply to different PyTorch versions: a regular branch has a 6-month development cycle, a long-term support branch a 12-month cycle. |
| Maintained | 1 year / 3.5 years | Regular Release branches are maintained for 1 year, Long Term Support branches for 3.5 years. Fix major issues, do not incorporate new features, and release patch versions based on the impact of fixed bugs. |
| End Of Life (EOL) | N/A | Do not accept any modification to the branch. |
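The maintenance windows above reduce to simple date arithmetic. A sketch: the durations come from the table, while the dictionary and helper names are ours:

```python
from datetime import date, timedelta

# Maintained-phase durations, from the table above.
MAINTENANCE_DAYS = {
    "Regular Release": 365,               # 1 year
    "Long Term Support": int(3.5 * 365),  # 3.5 years
}

def maintenance_end(policy: str, maintained_from: date) -> date:
    """Date when a branch is expected to leave the Maintained phase."""
    return maintained_from + timedelta(days=MAINTENANCE_DAYS[policy])

print(maintenance_end("Regular Release", date(2026, 1, 15)))  # 2027-01-15
```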

PyTorch Maintenance Policies

| PyTorch Version | Maintenance Policies | Status | Launch Date | Subsequent Status | EOL Date |
| --------------- | -------------------- | ------ | ----------- | ----------------- | -------- |
| 2.12.0 | Regular Release | Development | 2026/05/10 | Expected to enter maintenance status from November 23, 2026 | - |
| 2.11.0 | Regular Release | Development | 2026/03/23 | Expected to enter maintenance status from September 23, 2026 | - |
| 2.10.0 | Regular Release | Development | 2026/01/22 | Expected to enter maintenance status from July 22, 2026 | - |
| 2.9.0 | Regular Release | Development | 2026/01/15 | Expected to enter maintenance status from July 15, 2026 | - |
| 2.8.0 | Regular Release | Development | 2025/10/15 | Expected to enter maintenance status from March 15, 2026 | - |
| 2.7.1 | Long Term Support | Development | 2025/10/15 | Expected to enter maintenance status from October 15, 2026 | - |
| 2.6.0 | Regular Release | Development | 2025/07/25 | Expected to enter maintenance status from January 15, 2026 | - |
| 2.5.1 | Regular Release | Maintained | 2024/11/08 | Expected to enter maintenance free status from August 8, 2026 | - |
| 2.4.0 | Regular Release | Maintained | 2024/10/15 | Expected to enter maintenance free status from June 15, 2026 | - |
| 2.3.1 | Regular Release | Maintained | 2024/06/06 | Expected to enter maintenance free status from June 7, 2026 | - |
| 2.2.0 | Regular Release | EOL | 2024/04/01 | - | 2025/10/14 |
| 2.1.0 | Long Term Support | Maintained | 2023/10/15 | Expected to enter maintenance free status from December 30, 2026 | - |
| 2.0.1 | Regular Release | EOL | 2023/7/19 | - | 2024/3/14 |
| 1.11.0 | Long Term Support | EOL | 2023/4/19 | - | 2025/10/25 |
| 1.8.1 | Long Term Support | EOL | 2022/4/10 | - | 2023/4/10 |
| 1.5.0 | Long Term Support | EOL | 2021/7/29 | - | 2022/7/29 |

Reference Documents

For more detailed information on installation guides, model migration, training/inference tutorials, and API lists, please refer to the Ascend Extension for PyTorch on the HiAI Community.

| Document Name | Document Link |
| ------------- | ------------- |
| Installation Guide | link |
| Network Model Migration and Training | link |
| Operator Adaptation | link |
| Ascend Extension for PyTorch (Custom Interfaces) | link |

License

Ascend Extension for PyTorch has a BSD-style license, as found in the LICENSE file.

Download files

No source distribution is available for this release. The following built distributions (wheels) are provided; download the file for your platform.

| Wheel | Size | Python | Platform |
| ----- | ---- | ------ | -------- |
| torch_npu-2.12.0rc1-cp313-cp313-manylinux_2_28_x86_64.whl | 43.8 MB | CPython 3.13 | manylinux (glibc 2.28+) x86-64 |
| torch_npu-2.12.0rc1-cp313-cp313-manylinux_2_28_aarch64.whl | 38.7 MB | CPython 3.13 | manylinux (glibc 2.28+) ARM64 |
| torch_npu-2.12.0rc1-cp312-cp312-manylinux_2_28_x86_64.whl | 43.8 MB | CPython 3.12 | manylinux (glibc 2.28+) x86-64 |
| torch_npu-2.12.0rc1-cp312-cp312-manylinux_2_28_aarch64.whl | 38.7 MB | CPython 3.12 | manylinux (glibc 2.28+) ARM64 |
| torch_npu-2.12.0rc1-cp311-cp311-manylinux_2_28_x86_64.whl | 43.8 MB | CPython 3.11 | manylinux (glibc 2.28+) x86-64 |
| torch_npu-2.12.0rc1-cp311-cp311-manylinux_2_28_aarch64.whl | 38.7 MB | CPython 3.11 | manylinux (glibc 2.28+) ARM64 |
| torch_npu-2.12.0rc1-cp310-cp310-manylinux_2_28_x86_64.whl | 43.7 MB | CPython 3.10 | manylinux (glibc 2.28+) x86-64 |
| torch_npu-2.12.0rc1-cp310-cp310-manylinux_2_28_aarch64.whl | 38.7 MB | CPython 3.10 | manylinux (glibc 2.28+) ARM64 |
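The wheel file names follow the standard wheel format ({distribution}-{version}-{python tag}-{abi tag}-{platform tag}.whl, per PEP 427), so the right file for an interpreter can be picked mechanically. A sketch; the function name is ours:

```python
def parse_wheel_name(filename: str) -> dict:
    """Decode a PEP 427 wheel file name (without build tag) into its components."""
    stem = filename.removesuffix(".whl")
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "name": name,
        "version": version,
        "python": python_tag,
        "abi": abi_tag,
        "platform": platform_tag,
    }

info = parse_wheel_name("torch_npu-2.12.0rc1-cp310-cp310-manylinux_2_28_aarch64.whl")
print(info["python"], info["platform"])  # cp310 manylinux_2_28_aarch64
```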
