ONNX Runtime generate() API

Project description

Run SLMs/LLMs and multimodal models on-device and in the cloud with ONNX Runtime.

Model architectures supported so far (and more coming soon): Gemma, Llama, Mistral, Phi (language and vision).

For more details, see the docs at https://onnxruntime.ai/docs/genai and the repo at https://github.com/microsoft/onnxruntime-genai
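As a quick orientation, a typical generation loop with this package looks roughly like the sketch below. This is a hedged example, not the authoritative usage: the model path is a placeholder, and the exact method names should be checked against the docs linked above for the installed version.

```python
# Sketch of a basic generation loop with onnxruntime-genai.
# "path/to/model" is a placeholder for a directory containing an
# ONNX model exported for the generate() API (e.g. a Phi-3 export).
import onnxruntime_genai as og

model = og.Model("path/to/model")
tokenizer = og.Tokenizer(model)

params = og.GeneratorParams(model)
params.set_search_options(max_length=200)
params.input_ids = tokenizer.encode("What is ONNX Runtime?")

# Token-by-token decoding loop.
generator = og.Generator(model, params)
while not generator.is_done():
    generator.compute_logits()
    generator.generate_next_token()

print(tokenizer.decode(generator.get_sequence(0)))
```

The streaming loop above can also be replaced by a single high-level call where the API offers one; consult the documentation for batching and sampling options.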

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
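Rather than downloading a wheel manually, the package can normally be installed with pip; pip selects the wheel matching your Python version and platform. (The package name below is taken from the wheel filenames on this page; a DirectML-capable Windows environment is assumed.)

```shell
# Install the DirectML build of the generate() API (Windows, x86-64).
pip install onnxruntime-genai-directml==0.3.0
```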

Source Distributions

No source distribution files are available for this release.

Built Distributions

onnxruntime_genai_directml-0.3.0-cp312-cp312-win_amd64.whl (16.1 MB) — CPython 3.12, Windows x86-64

onnxruntime_genai_directml-0.3.0-cp311-cp311-win_amd64.whl (16.1 MB) — CPython 3.11, Windows x86-64

onnxruntime_genai_directml-0.3.0-cp310-cp310-win_amd64.whl (16.1 MB) — CPython 3.10, Windows x86-64

onnxruntime_genai_directml-0.3.0-cp39-cp39-win_amd64.whl (16.1 MB) — CPython 3.9, Windows x86-64

onnxruntime_genai_directml-0.3.0-cp38-cp38-win_amd64.whl (16.1 MB) — CPython 3.8, Windows x86-64
