
A high-throughput and memory-efficient inference and serving engine for LLMs

Project description

nm-vllm

Overview

This repository, nm-vllm-ent, contains the source for the Neural Magic Enterprise Edition of vLLM. The nm-vllm packages built from this repository are supported enterprise distributions of vLLM, shipped as versioned Python wheels and Docker images. These are released as "production level" official releases and "beta level" nightlies.

Official releases are made at the discretion of Neural Magic, but typically track vLLM releases. These wheels are available on the public PyPI index as well as on nm-pypi.

Nightlies are released every night, provided the automation runs are green. These wheels are available on nm-pypi.

Installation

PyPI

The nm-vllm PyPI package includes pre-compiled binaries for CUDA 12.1 kernels. For other PyTorch or CUDA versions, please compile the package from source.

Install it using pip:

pip install nm-vllm --extra-index-url https://pypi.neuralmagic.com/simple

To utilize the weight sparsity features, include the optional sparse dependencies.

pip install nm-vllm[sparse] --extra-index-url https://pypi.neuralmagic.com/simple

You can also build and install nm-vllm from source (this will take ~10 minutes):

git clone https://github.com/neuralmagic/nm-vllm.git
cd nm-vllm
pip install -e .[sparse] --extra-index-url https://pypi.neuralmagic.com/simple
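
Once installed, nm-vllm is used through the standard vLLM Python API. The snippet below is a minimal offline-inference sketch; the model name is only an example, and any vLLM-supported Hugging Face model can be substituted:

from vllm import LLM, SamplingParams

# Example model; substitute any vLLM-supported Hugging Face model.
llm = LLM(model="Qwen/Qwen2-0.5B-Instruct")

# Basic sampling settings for a short completion.
params = SamplingParams(temperature=0.8, max_tokens=64)

outputs = llm.generate(["Hello, my name is"], params)
for output in outputs:
    print(output.outputs[0].text)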

Docker

The nm-vllm container registry includes pre-built Docker images.

Launch the OpenAI-compatible server with:

MODEL_ID=Qwen/Qwen2-0.5B-Instruct
docker run --gpus all --shm-size 2g ghcr.io/neuralmagic/nm-vllm-openai:latest --model $MODEL_ID
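
Once running, the container exposes the standard OpenAI-compatible REST API. The sketch below queries it with the official openai Python client; it assumes the server's port 8000 has been published to the host (for example by adding -p 8000:8000 to the docker run command above):

from openai import OpenAI

# The server does not require an API key unless one is configured,
# so any placeholder value works here.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="Qwen/Qwen2-0.5B-Instruct",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)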

Models

Neural Magic maintains a variety of optimized models on its Hugging Face organization profiles.

Download files


Source Distributions

No source distribution files are available for this release.

Built Distribution

nm_vllm-0.6.3.0-cp38-abi3-manylinux_2_17_x86_64.whl (195.2 MB)

Uploaded for CPython 3.8+ on manylinux (glibc 2.17+), x86-64.

File details

Details for the file nm_vllm-0.6.3.0-cp38-abi3-manylinux_2_17_x86_64.whl.


File hashes

Hashes for nm_vllm-0.6.3.0-cp38-abi3-manylinux_2_17_x86_64.whl:

SHA256: 907f0df3440586dd244dfe4d9745d8cff52d8a7330ff21baad30b9235720a143
MD5: 19f1b8ae0a41af6e78c33d156a9f0753
BLAKE2b-256: 9a1f023c4d50a3c15e9570237f79b7fc5a9b6bbe4707971d0d3f75c299e23046

