
vLLM Ascend Plugin

| About Ascend | Documentation | #sig-ascend | Users Forum | Weekly Meeting |

English | 中文


Latest News 🔥

  • [2025/12] We released the new official version v0.11.0! Please follow the official guide to start using the vLLM Ascend Plugin on Ascend.
  • [2025/09] We released the new official version v0.9.1! Please follow the official guide to start deploying large-scale Expert Parallelism (EP) on Ascend.
  • [2025/08] We hosted the vLLM Beijing Meetup with vLLM and Tencent! Please find the meetup slides here.
  • [2025/06] User stories page is now live! It kicks off with LLaMA-Factory, verl, TRL, and GPUStack to demonstrate how vLLM Ascend helps Ascend users across fine-tuning, evaluation, reinforcement learning (RL), and deployment scenarios.
  • [2025/06] Contributors page is now live! All contributions deserve to be recorded; thanks to all contributors.
  • [2025/05] We released the first official version, v0.7.3! We collaborated with the vLLM community to publish a blog post sharing our practice: Introducing vLLM Hardware Plugin, Best Practice from Ascend NPU.
  • [2025/03] We hosted the vLLM Beijing Meetup with vLLM team! Please find the meetup slides here.
  • [2025/02] vLLM community officially created vllm-project/vllm-ascend repo for running vLLM seamlessly on the Ascend NPU.
  • [2024/12] We are working with the vLLM community to support [RFC]: Hardware pluggable.

Overview

vLLM Ascend (vllm-ascend) is a community-maintained hardware plugin for running vLLM seamlessly on the Ascend NPU.

It is the recommended approach for supporting the Ascend backend within the vLLM community. It adheres to the principles outlined in the [RFC]: Hardware pluggable, providing a hardware-pluggable interface that decouples the integration of the Ascend NPU with vLLM.

By using the vLLM Ascend plugin, popular open-source models, including Transformer-like, Mixture-of-Experts, embedding, and multi-modal LLMs, can run seamlessly on the Ascend NPU.

Prerequisites

  • Hardware: Atlas 800I A2 Inference series, Atlas A2 Training series, Atlas 800I A3 Inference series, Atlas A3 Training series, Atlas 300I Duo (Experimental)
  • OS: Linux
  • Software:
    • Python >= 3.10, < 3.12
    • CANN == 8.3.rc2 (see here for the matching Ascend HDK version)
    • PyTorch == 2.8.0, torch-npu == 2.8.0
    • vLLM (the same version as vllm-ascend)
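The Python constraint above can be checked programmatically before installing. A minimal sketch (the helper name is illustrative, not part of vllm-ascend):

```python
import sys

def meets_python_requirement(version_info=None):
    """Check the vllm-ascend prerequisite: Python >= 3.10 and < 3.12."""
    major, minor = (version_info or sys.version_info)[:2]
    return (3, 10) <= (major, minor) < (3, 12)
```

For example, `meets_python_requirement((3, 11, 2))` passes, while Python 3.12 does not.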

Getting Started

Please use the following recommended versions to get started quickly:

| Version | Release type | Doc |
|---------|--------------|-----|
| v0.13.0rc1 | Latest release candidate | QuickStart and Installation for more details |
| v0.11.0 | Latest stable version | QuickStart and Installation for more details |
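Since vllm-ascend must match the vLLM version it targets (see Prerequisites), one quick sanity check is to strip any release-candidate suffix and compare base versions. This is an illustrative sketch, not an official vllm-ascend API:

```python
import re

def base_version(version):
    """Strip a trailing release-candidate suffix, e.g. '0.13.0rc1' -> '0.13.0'."""
    return re.sub(r"rc\d+$", "", version)

def versions_aligned(vllm_version, plugin_version):
    """True when vLLM and vllm-ascend share the same base version."""
    return base_version(vllm_version) == base_version(plugin_version)
```

Under this scheme, vllm-ascend v0.13.0rc1 aligns with vLLM v0.13.0, while v0.9.1 and v0.11.0 do not align.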

Contributing

See CONTRIBUTING for more details; it is a step-by-step guide to setting up the development environment, building, and testing.

We welcome and value all contributions and collaboration.

Branch

vllm-ascend has a main branch and dev branches.

  • main: the main branch, which corresponds to the vLLM main branch and is continuously monitored for quality through Ascend CI.
  • vX.Y.Z-dev: development branches, each created alongside a new vLLM release. For example, v0.7.3-dev is the dev branch for vLLM v0.7.3.

The maintained branches are listed below:

| Branch | Status | Note |
|--------|--------|------|
| main | Maintained | CI commitment for the vLLM main branch and the vLLM v0.13.0 tag |
| v0.7.1-dev | Unmaintained | Only doc fixes are allowed |
| v0.7.3-dev | Maintained | CI commitment for vLLM v0.7.3; only bug fixes are allowed, and no new release tags will be created |
| v0.9.1-dev | Maintained | CI commitment for vLLM v0.9.1 |
| v0.11.0-dev | Maintained | CI commitment for vLLM v0.11.0 |
| rfc/feature-name | Maintained | Feature branches for collaboration |

Please refer to Versioning policy for more details.

Weekly Meeting

License

Apache License 2.0, as found in the LICENSE file.

Download files

Download the file for your platform.

Source Distribution

  • vllm_ascend-0.13.0.tar.gz (4.5 MB, Source)

Built Distributions

  • vllm_ascend-0.13.0-cp311-cp311-manylinux_2_24_x86_64.whl (22.5 MB, CPython 3.11, manylinux glibc 2.24+, x86-64)
  • vllm_ascend-0.13.0-cp311-cp311-manylinux_2_24_aarch64.whl (22.4 MB, CPython 3.11, manylinux glibc 2.24+, ARM64)
  • vllm_ascend-0.13.0-cp310-cp310-manylinux_2_24_x86_64.whl (22.5 MB, CPython 3.10, manylinux glibc 2.24+, x86-64)
  • vllm_ascend-0.13.0-cp310-cp310-manylinux_2_24_aarch64.whl (22.4 MB, CPython 3.10, manylinux glibc 2.24+, ARM64)

File details

Details for the file vllm_ascend-0.13.0.tar.gz.

File metadata

  • Download URL: vllm_ascend-0.13.0.tar.gz
  • Upload date:
  • Size: 4.5 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for vllm_ascend-0.13.0.tar.gz:

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | ab0e9bc12b9a4042c261901ccad96f4faab6e6d04cf5131f7cc2358aafda3a36 |
| MD5 | 2d5fa851cb4eac5d9f3f8972513d0d8c |
| BLAKE2b-256 | fd516bffc54d5caf0a684c3b9c34d281eb52898e7c367389027bedb134d0e72c |
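To verify a downloaded artifact against its published SHA256 digest, a small stdlib-only sketch (the function names are illustrative):

```python
import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    """Compute the hex SHA256 digest of a file, reading in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_published_hash(path, expected_hex):
    """Compare a file's digest against a published one, case-insensitively."""
    return sha256_of_file(path) == expected_hex.lower()
```

For example, pass the downloaded vllm_ascend-0.13.0.tar.gz and the SHA256 value from the table above; a mismatch indicates a corrupted or tampered download.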

File details

Details for the file vllm_ascend-0.13.0-cp311-cp311-manylinux_2_24_x86_64.whl.

File hashes

Hashes for vllm_ascend-0.13.0-cp311-cp311-manylinux_2_24_x86_64.whl:

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | 3359986af39959d09c35806b39431e3229a47f5f152e3127f4beb41d81fce5b1 |
| MD5 | b2348a159d8a532e2e5aea7f1077eac0 |
| BLAKE2b-256 | f68fb68811aeb441c74e65a9d4c412c2ccdb65e411c50dd9554152bca8e6fb96 |

File details

Details for the file vllm_ascend-0.13.0-cp311-cp311-manylinux_2_24_aarch64.whl.

File hashes

Hashes for vllm_ascend-0.13.0-cp311-cp311-manylinux_2_24_aarch64.whl:

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | b2cac71d96c655af316dbdb2dd52ec5461c69dda5dc9c1835893d00318cd6e31 |
| MD5 | 7071c9257be6382598ca88e694f8a7a5 |
| BLAKE2b-256 | 3f13d4521cb7134eaca9add098f005067a5a45dc81a497344a99d06caad1a92b |

File details

Details for the file vllm_ascend-0.13.0-cp310-cp310-manylinux_2_24_x86_64.whl.

File hashes

Hashes for vllm_ascend-0.13.0-cp310-cp310-manylinux_2_24_x86_64.whl:

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | 0b19f98e2ac91770ad82388eea0a4720cd5b63d60ec1136658e0e095ae94d14d |
| MD5 | 08722afb20ad3ad918097ecfd9aedcaa |
| BLAKE2b-256 | 11997ac9c92c9017c73382c2ce2a5e65f772f6489dbb0e01724305246cf6e63f |

File details

Details for the file vllm_ascend-0.13.0-cp310-cp310-manylinux_2_24_aarch64.whl.

File hashes

Hashes for vllm_ascend-0.13.0-cp310-cp310-manylinux_2_24_aarch64.whl:

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | 60dc3bff1e11cfba502fff520a2d2e5619eba1a3c879afb43730d9e0999bc316 |
| MD5 | 5f3bb485fb9e2b8fa639a9020cfee725 |
| BLAKE2b-256 | 38a3d11bd5cdb760a6ff2758719589e152fdd3c7cafa7ead300f3c42c99e0a9e |
