A high-throughput and memory-efficient inference and serving engine for LLMs

Project description

nm-vllm

Overview

This repo, nm-vllm-ent, contains all the source for the Neural Magic Enterprise Edition of vLLM. The nm-vllm packages built from this repo are supported enterprise distributions of vLLM, shipped as versioned Python wheels and Docker images. These are released as "production level" official releases and "beta level" nightlies.

Official releases are made at the discretion of Neural Magic, but typically track vLLM releases. These wheels are available on both public PyPI and nm-pypi.

Nightlies are released every night, provided the automated test runs are green. These wheels are available on nm-pypi.

Installation

PyPI

The nm-vllm PyPI package includes pre-compiled binaries for CUDA 12.1 kernels. For other PyTorch or CUDA versions, please compile the package from source.

Install it using pip:

pip install nm-vllm --extra-index-url https://pypi.neuralmagic.com/simple

To utilize the weight sparsity features, include the optional sparse dependencies.

pip install nm-vllm[sparse] --extra-index-url https://pypi.neuralmagic.com/simple
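With the sparse extras installed, weight-sparse models are loaded through vLLM's offline `LLM` API. A minimal sketch of how the loading arguments might be assembled; the `sparsity` keyword and its `"sparse_w16a16"` value are assumptions based on typical nm-vllm usage, so check the nm-vllm documentation for the exact parameter name supported by your build:

```python
# Sketch: building the load arguments for a weight-sparse model in nm-vllm.
# The `sparsity` kwarg name and value are assumptions; verify against the docs.

def sparse_llm_kwargs(model_id: str, sparsity: str = "sparse_w16a16") -> dict:
    """Keyword arguments for vllm.LLM when loading a weight-sparse model."""
    return {"model": model_id, "sparsity": sparsity}

# Usage (requires a CUDA GPU and nm-vllm[sparse] installed):
#   from vllm import LLM, SamplingParams
#   llm = LLM(**sparse_llm_kwargs("<your-sparse-model-id>"))
#   out = llm.generate(["Hello"], SamplingParams(max_tokens=32))
#   print(out[0].outputs[0].text)
```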

You can also build and install nm-vllm from source (this will take ~10 minutes):

git clone https://github.com/neuralmagic/nm-vllm.git
cd nm-vllm
pip install -e .[sparse] --extra-index-url https://pypi.neuralmagic.com/simple

Docker

The nm-vllm container registry includes pre-built Docker images.

Launch the OpenAI-compatible server with:

MODEL_ID=Qwen/Qwen2-0.5B-Instruct
docker run --gpus all --shm-size 2g ghcr.io/neuralmagic/nm-vllm-openai:latest --model $MODEL_ID
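Once the container is up, it can be queried like any OpenAI-compatible endpoint. Note that the command above does not publish a port; you may need to add something like `-p 8000:8000` to reach the server from the host. A minimal client sketch using only the Python standard library, assuming vLLM's default port 8000 and the standard `/v1/chat/completions` path:

```python
# Query an OpenAI-compatible vLLM server using only the standard library.
# Assumes the server is reachable at http://localhost:8000 (vLLM's default port).
import json
import urllib.request


def build_chat_request(model: str, prompt: str, max_tokens: int = 64) -> dict:
    """Payload for the /v1/chat/completions endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


def chat(base_url: str, payload: dict) -> str:
    """POST the payload and return the first choice's message content."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Usage against a running server:
#   payload = build_chat_request("Qwen/Qwen2-0.5B-Instruct", "Say hello.")
#   print(chat("http://localhost:8000", payload))
```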

Models

Neural Magic maintains a variety of optimized models on its Hugging Face organization profiles.


Source Distributions

No source distribution files are available for this release.

Built Distribution

nm_vllm-0.5.1.1-cp38-abi3-manylinux_2_17_x86_64.whl (146.9 MB)

Uploaded for CPython 3.8+ on manylinux (glibc 2.17+), x86-64.
