vLLM plugin for Spyre hardware support

Project description

SenDNN Inference

| Documentation | Users Forum | #sig-spyre |


IBM Spyre is the first production-grade Artificial Intelligence Unit (AIU) accelerator born out of the IBM Research AIU family, and is part of a long-term strategy of developing novel architectures and full-stack technology solutions for the emerging space of generative AI. Spyre builds on the foundation of IBM's internal AIU research and delivers a scalable, efficient architecture for accelerating AI in enterprise environments.

SenDNN Inference (sendnn-inference) is a vLLM plugin that enables seamless integration of the IBM Spyre accelerator with vLLM. It follows the architecture described in vLLM's Plugin System, making it easy to bring IBM's AI acceleration into existing vLLM workflows.

For more information, check out the following:

Getting Started

Visit our documentation:

Contributing

We welcome and value all contributions and collaborations. Please check out Contributing to SenDNN Inference to learn how to get involved.

Contact

You can reach out for discussion or support in the #sig-spyre channel in the vLLM Slack workspace or by opening an issue.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

sendnn_inference-2.0.0.tar.gz (1.1 MB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

sendnn_inference-2.0.0-py3-none-any.whl (109.5 kB)

Uploaded Python 3

File details

Details for the file sendnn_inference-2.0.0.tar.gz.

File metadata

  • Download URL: sendnn_inference-2.0.0.tar.gz
  • Upload date:
  • Size: 1.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for sendnn_inference-2.0.0.tar.gz
  • SHA256: eec959e478405c7da4caf6cbf34bde33f05ba4805ac169e13c193fa4690950aa
  • MD5: 938d6a6361cbbe1a3240ee9878c46591
  • BLAKE2b-256: 5f310b569555813f7845f069d5549746b81bdd92b087f1e466083ee132beda48

See more details on using hashes here.
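One common use of these digests is verifying a downloaded file before installing it. A minimal sketch using only the standard library (the constant below is the SHA256 digest published above for the sdist; the file path in the usage comment assumes you have downloaded the archive to the current directory):

```python
import hashlib

# Published SHA256 for sendnn_inference-2.0.0.tar.gz (from the hash table above).
EXPECTED_SHA256 = "eec959e478405c7da4caf6cbf34bde33f05ba4805ac169e13c193fa4690950aa"

def sha256_matches(path: str, expected: str) -> bool:
    """Stream the file in chunks and compare its SHA256 hex digest
    against the expected value (case-insensitive)."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected.lower()

# Usage, after downloading the sdist:
# sha256_matches("sendnn_inference-2.0.0.tar.gz", EXPECTED_SHA256)
```

pip can also enforce this automatically via its hash-checking mode (`--require-hashes` with hashes pinned in a requirements file).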

Provenance

The following attestation bundles were made for sendnn_inference-2.0.0.tar.gz:

Publisher: build_and_publish.yaml on torch-spyre/sendnn-inference

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file sendnn_inference-2.0.0-py3-none-any.whl.

File hashes

Hashes for sendnn_inference-2.0.0-py3-none-any.whl
  • SHA256: c7ee7af26bf1a1f0acd63b6b8c473efd14bd82031c4ff9026badb33a184564dd
  • MD5: e9bfa9355a96e96e049bfc6e7e3260f4
  • BLAKE2b-256: d4de8e8509244b3cc4b366a1bcd0773da8ff9e3d346a2260f2c04671cdc299d3

See more details on using hashes here.

Provenance

The following attestation bundles were made for sendnn_inference-2.0.0-py3-none-any.whl:

Publisher: build_and_publish.yaml on torch-spyre/sendnn-inference

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
