
Community vLLM provider utilities for Strands Agents (OpenAI-compatible).

Project description

strands-vllm

Community vLLM utilities for the Strands Agents SDK.

vLLM serves an OpenAI-compatible API, so most users can simply point the Strands OpenAIModel at the server's base_url. This package adds small convenience helpers and optional token-ID (token-in/token-out, TITO) friendly defaults.
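
For reference, a local server for the model used in the examples below can be started with vLLM's CLI (assuming vLLM is installed; the model ID and port are simply the ones used throughout this README):

vllm serve AMead10/Llama-3.2-3B-Instruct-AWQ --port 8000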

Credit / reference

This community package is inspired by the structure and example style of horizon-rl/strands-sglang.

Install

pip install strands-vllm

vLLM server notes (tools + token IDs)

  • Tools: for tool calling, the vLLM server must be started with tool calling enabled and a chat template appropriate for your model (e.g., the Llama 3.2 tool template); see the example launch command after this list.
  • Token IDs (TITO): return_token_ids=True requests vLLM token IDs; when the server supports it, vLLM includes prompt_token_ids and streamed token_ids.
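
A launch command along these lines enables tool calling for a Llama 3.2 model; the exact parser name and chat-template path are illustrative and depend on your vLLM version and model:

vllm serve AMead10/Llama-3.2-3B-Instruct-AWQ \
  --port 8000 \
  --enable-auto-tool-choice \
  --tool-call-parser llama3_json \
  --chat-template examples/tool_chat_template_llama3.2_json.jinja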

Usage

Minimal: OpenAIModel pointed at vLLM

from strands import Agent
from strands.models.openai import OpenAIModel

model = OpenAIModel(
    # vLLM ignores the API key, but the OpenAI client requires a non-empty value
    client_args={"api_key": "EMPTY", "base_url": "http://localhost:8000/v1"},
    model_id="AMead10/Llama-3.2-3B-Instruct-AWQ",
)

agent = Agent(model=model)
print(agent("Hi"))

Convenience: VLLMModel

from strands import Agent
from strands_vllm import VLLMModel

model = VLLMModel(
    base_url="http://localhost:8000/v1",
    model_id="AMead10/Llama-3.2-3B-Instruct-AWQ",
    return_token_ids=True,  # ask vLLM to include token IDs in its responses
)

agent = Agent(model=model)
print(agent("Say hello"))

Tip: to print only the final result, without streaming output being printed along the way, pass callback_handler=None:

agent = Agent(model=model, callback_handler=None)
print(agent("Say hello"))

Examples

All examples can be pointed at your server with:

export VLLM_BASE_URL="http://localhost:8000/v1"
export VLLM_MODEL_ID="AMead10/Llama-3.2-3B-Instruct-AWQ"
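
The example scripts can be expected to read these variables roughly as follows (a sketch, not the exact code in examples/):

import os

from strands import Agent
from strands_vllm import VLLMModel

model = VLLMModel(
    base_url=os.environ.get("VLLM_BASE_URL", "http://localhost:8000/v1"),
    model_id=os.environ.get("VLLM_MODEL_ID", "AMead10/Llama-3.2-3B-Instruct-AWQ"),
)
agent = Agent(model=model)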

Tools (strands-agents-tools)

Install the optional tools package and run the example:

pip install strands-agents-tools
python examples/math_agent.py
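
examples/math_agent.py roughly follows the pattern below; this is a sketch, and the calculator import assumes the strands_tools module layout shipped by strands-agents-tools:

from strands import Agent
from strands_tools import calculator
from strands_vllm import VLLMModel

model = VLLMModel(
    base_url="http://localhost:8000/v1",
    model_id="AMead10/Llama-3.2-3B-Instruct-AWQ",
)

# The agent can invoke the calculator tool whenever the model emits a tool call.
agent = Agent(model=model, tools=[calculator])
print(agent("What is 1234 * 5678?"))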

Retokenization drift (educational)

This demo mirrors the idea from strands-sglang and shows why TITO matters: re-encoding decoded text does not always reproduce the original token IDs, i.e. encode(decode(tokens)) != tokens can happen.

pip install "strands-vllm[drift]" strands-agents-tools
python examples/retokenization_drift.py
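
The core of the demonstration reduces to a few lines, shown here as a sketch using a Hugging Face tokenizer (the packaged example does more; the model ID is simply the one used throughout this README):

from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("AMead10/Llama-3.2-3B-Instruct-AWQ")

# Token IDs as they came out of the model during generation...
generated_ids = tok.encode("print('hi')   # note the  irregular   spacing", add_special_tokens=False)

# ...are often stored as decoded text and re-encoded later.
round_tripped = tok.encode(tok.decode(generated_ids), add_special_tokens=False)

# The two sequences are not guaranteed to match token-for-token;
# that mismatch is the drift that token-in/token-out avoids.
print(generated_ids == round_tripped)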

Token-in / token-out (TITO)

If your vLLM server includes token IDs in streaming responses, you can capture them using VLLMTokenRecorder (see examples/basic_agent.py).
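
As a rough illustration only: the wiring below guesses at how the recorder attaches (via the agent's callback handler) and at what it exposes (a token_ids list). Both are assumptions rather than documented API, so treat examples/basic_agent.py as the source of truth.

from strands import Agent
from strands_vllm import VLLMModel, VLLMTokenRecorder

model = VLLMModel(
    base_url="http://localhost:8000/v1",
    model_id="AMead10/Llama-3.2-3B-Instruct-AWQ",
    return_token_ids=True,
)

recorder = VLLMTokenRecorder()  # assumed: collects token IDs from streamed events
agent = Agent(model=model, callback_handler=recorder)  # assumed attachment point

agent("Say hello")
print(recorder.token_ids)  # assumed attribute holding the captured IDs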

Development

Install from source:

git clone <your-fork-url>
cd strands-vllm
pip install -e ".[dev]"

License

Apache-2.0

Download files

Download the file for your platform.

Source Distribution

strands_vllm-0.0.1.dev0.tar.gz (202.0 kB)


Built Distribution


strands_vllm-0.0.1.dev0-py3-none-any.whl (11.3 kB)


File details

Details for the file strands_vllm-0.0.1.dev0.tar.gz.

File metadata

  • Download URL: strands_vllm-0.0.1.dev0.tar.gz
  • Size: 202.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for strands_vllm-0.0.1.dev0.tar.gz:

  • SHA256: 71b9a9860f090828047ad073aa45d7c8c7ff59bbae4f718a714d73655983a7ef
  • MD5: fa4320cc6e72c4c0e1e6a15c273b96dc
  • BLAKE2b-256: e0ce3bada78a64c353554f69d84e928233143c6367417f7e9bbd837a60487f06


File details

Details for the file strands_vllm-0.0.1.dev0-py3-none-any.whl.


File hashes

Hashes for strands_vllm-0.0.1.dev0-py3-none-any.whl:

  • SHA256: b95af691b81ed6abbb0aad8f82efdfb9d7977086020ac9ac245998c512e9b10c
  • MD5: 396e606123c0d4afb12767cf7527234b
  • BLAKE2b-256: e13665f9d23f9da03b7d78b175b5fd524e0e28a8cdeea7cadfcc928a52fa19d9

