
TurnkeyML Tools and Models


Welcome to ONNX TurnkeyML


We are on a mission to make it easy to use the most important tools in the ONNX ecosystem. TurnkeyML accomplishes this by providing a no-code CLI, turnkey, as well as a low-code API, both of which provide seamless integration of these tools.

We also provide turnkey-llm, which has LLM-specific tools for prompting, accuracy measurement, and serving on a variety of runtimes (Huggingface, onnxruntime-genai) and hardware (CPU, GPU, and NPU).

Getting Started

Quick Start

The easiest way to get started is:

  1. Install: pip install turnkeyml
  2. Copy a PyTorch example of a model, like the one on this Huggingface BERT model card, into a file named bert.py:

from transformers import BertTokenizer, BertModel

# Load the pretrained BERT tokenizer and model
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Tokenize some text and run it through the model
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors="pt")
output = model(**encoded_input)

  3. Run turnkey -i bert.py discover export-pytorch to make a BERT ONNX file from this bert.py example.

LLMs

For LLM setup instructions, see turnkey-llm.

Demo

Here's turnkey in action: BERT-Base is exported from PyTorch to ONNX using torch.onnx.export, optimized for inference with onnxruntime, and converted to fp16 with onnxmltools:

Basic Demo Video

Breaking down the command turnkey -i bert.py discover export-pytorch optimize-ort convert-fp16:

  1. turnkey -i bert.py feeds bert.py, a minimal PyTorch script that instantiates BERT, into the tool sequence, starting with...
  2. discover is a tool that finds the PyTorch model in a script and passes it to the next tool, which is...
  3. export-pytorch, which takes a PyTorch model and converts it to an ONNX model, then passes it to...
  4. optimize-ort, which uses onnxruntime to optimize the model's compute graph, then passes it to...
  5. convert-fp16, which uses onnxmltools to convert the ONNX file into fp16.
  6. Finally, the result is printed, and we can see that the requested .onnx files have been produced.

All without writing a single line of code or learning how to use any of the underlying ONNX ecosystem tools 🚀

How It Works

The turnkey CLI provides a set of Tools that users can invoke in a Sequence. The first Tool takes the input (-i), performs some action, and passes its state to the next Tool in the Sequence.
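As a mental model, a Sequence can be pictured as a chain of functions that each receive the state produced by the previous one and return an updated state. The sketch below illustrates only the pipeline idea; the function and key names are hypothetical, not TurnkeyML's actual internals:

```python
# Conceptual sketch of a tool Sequence: each tool receives the state
# produced by the previous tool and returns an updated state.
# Hypothetical names for illustration only; this is NOT the TurnkeyML API.

def discover(state):
    # Pretend we found the PyTorch model in the input script
    state["model"] = f"model from {state['input']}"
    return state

def export_pytorch(state):
    # Pretend we exported the discovered model to ONNX
    state["onnx"] = state["model"] + " -> model.onnx"
    return state

def run_sequence(input_file, tools):
    # The first tool takes the input; each tool passes its state onward
    state = {"input": input_file}
    for tool in tools:
        state = tool(state)
    return state

result = run_sequence("bert.py", [discover, export_pytorch])
```

Each real Tool works the same way: it performs its action and hands its state to the next Tool in the Sequence.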

You can read the Sequence out like a sentence. For example, the demo command above was:

> turnkey -i bert.py discover export-pytorch optimize-ort convert-fp16

Which you can read like:

Use turnkey on bert.py to discover the model, export the PyTorch model to ONNX, optimize the ONNX with ORT, and convert the ONNX to fp16.

You can configure each Tool by passing it arguments. For example, export-pytorch --opset 18 would set the opset of the resulting ONNX model to 18.

A full command with an argument looks like:

> turnkey -i bert.py discover export-pytorch --opset 18 optimize-ort convert-fp16

Learn More

The easiest way to learn more about turnkey is to explore the help menu with turnkey -h. To learn about a specific tool, run turnkey <tool name> -h, for example turnkey export-pytorch -h.

We also provide the following resources:

  • Installation guide: how to install from source, set up Slurm, etc.
  • User guide: explains turnkey's concepts, including the syntax for making your own tool sequence.
  • Examples: PyTorch scripts and ONNX files that can be used to try out turnkey concepts.
  • Code organization guide: learn how this repository is structured.
  • Models: PyTorch model scripts that work with turnkey.

Mass Evaluation

turnkey is used in multiple projects where many hundreds of models are being evaluated. For example, the ONNX Model Zoo was created using turnkey.

We provide several helpful tools to facilitate this kind of mass evaluation.

Wildcard Input

turnkey will iterate over multiple inputs if you pass it a wildcard input.

For example, to export ~1000 built-in models to ONNX:

> turnkey models/*/*.py discover export-pytorch

Results Cache

All build results, such as .onnx files, are collected into a cache directory, which you can learn about with turnkey cache -h.

Generating Reports

turnkey collects statistics about each model and build into the corresponding build directory in the cache. Use turnkey report -h to see how those statistics can be exported into a CSV file.

Extensibility

Models

transformers graph_convolutions torch_hub torchvision timm

This repository is home to a diverse corpus of hundreds of models, which are meant to be a convenient input to turnkey -i <model>.py discover. We are actively working on increasing the number of models in our model library. You can see the set of models in each category by clicking on the corresponding badge.

Evaluating a new model is as simple as taking a Python script that instantiates and invokes a PyTorch torch.nn.Module and calling turnkey on it. Read about model contributions here.
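For illustration, such a script might look like the following hypothetical toy model (not one from the model library); a file of this shape should be discoverable with turnkey -i tiny_mlp.py discover:

```python
# tiny_mlp.py -- hypothetical minimal script for turnkey's discover tool.
# It simply instantiates a torch.nn.Module and invokes it once.
import torch

class SmallMLP(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(8, 16),
            torch.nn.ReLU(),
            torch.nn.Linear(16, 2),
        )

    def forward(self, x):
        return self.net(x)

model = SmallMLP()
inputs = torch.randn(1, 8)  # dummy input so the model is actually invoked
output = model(inputs)
```
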

Plugins

The build tool has built-in support for a variety of interoperable Tools. If you need more, the TurnkeyML plugin API lets you add your own installable tools with any functionality you like:

> pip install -e my_custom_plugin
> turnkey -i my_model.py discover export-pytorch my-custom-tool --my-args
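For the pip install -e step to work, my_custom_plugin needs to be an installable Python package. A generic packaging skeleton might look like the sketch below; the name and version are hypothetical, and the actual hook that registers your tool with turnkey is defined by the TurnkeyML plugin API guide, not shown here:

```toml
# my_custom_plugin/pyproject.toml -- hypothetical packaging skeleton for
# an installable plugin; see the plugin API guide for how the tool itself
# is registered with turnkey.
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "my-custom-plugin"
version = "0.1.0"
dependencies = ["turnkeyml"]
```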

All of the built-in Tools are implemented against the plugin API. Check out the example plugins and the plugin API guide to learn more about creating an installable plugin.

Contributing

We are actively seeking collaborators from across the industry. If you would like to contribute to this project, please check out our contribution guide.

Maintainers

This project is sponsored by the ONNX Model Zoo special interest group (SIG). It is maintained by @danielholanda @jeremyfowers @ramkrishna @vgodsoe in equal measure. You can reach us by filing an issue.

License

This project is licensed under the Apache 2.0 License.

Attribution

TurnkeyML used code from other open source projects as a starting point (see NOTICE.md). Thank you Philip Colangelo, Derek Elkins, Jeremy Fowers, Dan Gard, Victoria Godsoe, Mark Heaps, Daniel Holanda, Brian Kurtz, Mariah Larwood, Philip Lassen, Andrew Ling, Adrian Macias, Gary Malik, Sarah Massengill, Ashwin Murthy, Hatice Ozen, Tim Sears, Sean Settle, Krishna Sivakumar, Aviv Weinstein, Xueli Xao, Bill Xing, and Lev Zlotnik for your contributions to that work.
