
ZenML: Write production-ready ML code.

Project description




Build portable, production-ready MLOps pipelines.

A simple yet powerful open-source framework that scales your MLOps stack with your needs.
Explore the docs »

Join our Slack Community and be part of the ZenML family.

Features · Roadmap · Report Bug · Vote New Features · Read Blog · Meet the Team
🎉 Version 0.13.0 is out. Check out the release notes here.


🏁 Table of Contents
  1. Why ZenML?
  2. What is ZenML?
  3. Getting Started
  4. Collaborate with your team
  5. Learn More
  6. Roadmap
  7. Contributing and Community
  8. Meet the Team
  9. Getting Help
  10. License

🤖 Why ZenML?

🤹 Are you an ML engineer or data scientist shipping models to production and juggling a plethora of tools?

🤷‍♂️ Do you struggle with versioning data, code, and models in your projects?

👀 Have you had trouble replicating production pipelines and monitoring models in production?

✅ If you answered yes to any of the above, ZenML is here to help with all that and more...

Everyone loves to train ML models, but few talk about shipping them into production, and even fewer can do it well. At ZenML, we believe the journey from model development to production doesn't need to be long and painful.

The long journey from experimentation to production.

With ZenML, you can concentrate on what you do best - developing ML models - and not worry about infrastructure or deployment tools.

If you come from unstructured notebooks or scripts with lots of manual processes, ZenML will make the path to production easier and faster for you and your team. Using ZenML allows you to own the entire pipeline - from experimentation to production.

This is why we built ZenML. Read more here.

💡 What is ZenML?

ZenML is an extensible, open-source MLOps framework for creating portable, production-ready MLOps pipelines. It's built for Data Scientists, ML Engineers, and MLOps Developers to collaborate as they take projects from development to production.

ZenML offers a simple and flexible syntax, is cloud- and tool-agnostic, and has interfaces/abstractions catered toward ML workflows. With ZenML you'll have all your favorite tools in one place so you can tailor a workflow that caters to your specific needs.

ZenML unifies all your tools in one place.

Read more about all the tools you can readily use in the integrations section. Can't find your tool? You can always write your own integration to use it with ZenML.
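
To make the simple, flexible syntax concrete, here is a minimal sketch of what a pipeline can look like with the functional API in this release. The step names and return values are purely illustrative, and the snippet assumes you have already initialized a repository with zenml init:

from zenml.pipelines import pipeline
from zenml.steps import step


@step
def importer() -> float:
    # Illustrative step that produces some value (e.g. a dataset or a metric).
    return 42.0


@step
def trainer(value: float) -> None:
    # Illustrative step that consumes the output of the previous step.
    print(f"Training with {value}")


@pipeline
def first_pipeline(importer_step, trainer_step):
    # Outputs of one step become the inputs of the next.
    trainer_step(importer_step())


# Wire concrete step instances into the pipeline and run it.
first_pipeline(importer_step=importer(), trainer_step=trainer()).run()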

🤸 Getting Started

💾 Installation

Option 1 - Install ZenML via PyPI:

pip install zenml

Note - ZenML supports Python 3.7, 3.8, and 3.9.

Option 2 - If you’re feeling adventurous, try out the bleeding-edge installation:

pip install git+https://github.com/zenml-io/zenml.git@develop --upgrade

Warning - Fire dragons ahead. Proceed at your own risk!

Option 3 - Install via a Docker image hosted publicly on DockerHub:

docker run -it zenmldocker/zenml /bin/bash

Warning

Known installation issues for M1 Mac users

If you have an M1 Mac machine and encounter an installation error, try setting up brew and pyenv with Rosetta 2 and then install ZenML. The issue arises because some dependencies aren't fully compatible with the vanilla ARM64 architecture. The following links may be helpful (thank you, @Reid Falconer):
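
Whichever installation option you choose, a quick sanity check is to print the installed version from the CLI:

zenml version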

🏇 First run

If you're here for the first time, we recommend running:

zenml go

This spins up a Jupyter notebook that walks you through various functionalities of ZenML at a high level.

By the end, you'll get a glimpse of how to use ZenML to:

  • Train, evaluate, deploy, and embed a model in an inference pipeline.
  • Automatically track and version data, models, and other artifacts.
  • Track model hyperparameters and metrics with experiment tracking tools.
  • Measure and visualize train-test skew, training-serving skew, and data drift.

👨‍🍳 Open Source MLOps Stack Recipes

ZenML boasts a ton of integrations with popular MLOps tools. The ZenML Stack concept ensures that these tools work nicely together, bringing structure and standardization to the MLOps workflow.

However, ZenML assumes that the stack infrastructure for these tools is already provisioned. If you do not have deployed infrastructure, and want to quickly spin up combinations of tools on the cloud, the MLOps stack sister repository contains a series of Terraform-based recipes to provision such stacks. These recipes can be used directly with ZenML:

pip install zenml[stacks]

zenml stack recipe deploy <NAME_OF_STACK_RECIPE> --import

The above command not only provisions the given tools, but also automatically creates a ZenML stack with the configuration of the deployed recipe!
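
If you are unsure which recipe names are available, the stacks extra also provides a command to list them (assuming the zenml[stacks] installation shown above):

zenml stack recipe list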

🍰 ZenBytes

New to MLOps? Get up to speed by visiting the ZenBytes repo.

ZenBytes is a series of short practical MLOps lessons taught using ZenML. It covers many of the core concepts widely used in ZenML and MLOps in general.

📜 ZenFiles

Already comfortable with ZenML and wish to elevate your pipeline into production mode? Check out ZenFiles.

ZenFiles is a collection of production-grade ML use-cases powered by ZenML. They are fully fleshed out, end-to-end projects that showcase ZenML's capabilities. They can also serve as a template from which to start similar projects.

👭 Collaborate with your team

ZenML is built to support teams working together. The underlying infrastructure on which your ML workflows run can be shared, as can the data, assets, and artifacts in your workflow.

In ZenML, a Stack represents a set of configurations for your MLOps tools and infrastructure. You can quickly share your ZenML stack with anyone by exporting the stack:

zenml stack export <STACK_NAME> <FILENAME.yaml>

Similarly, you can import a stack by running:

zenml stack import <STACK_NAME> <FILENAME.yaml>

Learn more about importing and exporting stacks here.
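
As a concrete example, assuming the local default stack that ships with every ZenML installation (the file name is arbitrary), sharing your current setup could look like this:

zenml stack export default default_stack.yaml

Your teammate then runs the corresponding import command on the resulting YAML file.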

The ZenML Profiles offer an easy way to manage and switch between your stacks. All your stacks, components, and other classes of ZenML objects can be stored in a central location and shared across multiple users, teams, and automated systems such as CI/CD processes.

With the ZenServer you can deploy ZenML as a centralized service and connect entire teams and organizations to an easy-to-manage collaboration platform that provides a unified view of the MLOps processes, tools, and technologies that support your entire AI/ML project lifecycle.

Read more about using ZenML for collaboration here.

📖 Learn More

ZenML Resources - Description
🧘‍♀️ ZenML 101 - New to ZenML? Here's everything you need to know!
⚛️ Core Concepts - Some key terms and concepts we use.
🚀 Our latest release - New features, bug fixes.
🗳 Vote for Features - Pick what we work on next!
📓 Docs - Full documentation for creating your own ZenML pipelines.
📒 API Reference - Detailed reference on ZenML's API.
🍰 ZenBytes - A guided and in-depth tutorial on MLOps and ZenML.
🗂️️ ZenFiles - End-to-end projects using ZenML.
👨‍🍳 MLOps Stacks - Terraform-based infrastructure recipes for pre-made ZenML stacks.
⚽️ Examples - Learn best through examples where ZenML is used? We've got you covered.
📬 Blog - Use cases of ZenML and technical deep dives on how we built it.
🔈 Podcast - Conversations with leaders in ML, released every two weeks.
📣 Newsletter - We build ZenML in public. Subscribe to learn how we work.
💬 Join Slack - Need help with your specific use case? Say hi on Slack!
🗺 Roadmap - See what ZenML is working on next.
🙋‍♀️ Contribute - How to contribute to the ZenML project and code base.

🗺 Roadmap

ZenML is being built in public. The roadmap is a regularly updated source of truth for the ZenML community to understand where the product is going in the short, medium, and long term.

ZenML is managed by a core team of developers that are responsible for making key decisions and incorporating feedback from the community. The team oversees feedback via various channels, and you can directly influence the roadmap, for example by voting for the features you want most (see Vote for Features above), reporting bugs, or joining the conversation on Slack.

🙌 Contributing and Community

We would love to develop ZenML together with our community! The best way to get started is to select any issue from the good-first-issue label. If you would like to contribute, please review our Contributing Guide for all relevant details.



👩‍👩‍👧‍👦 Meet the Team


Have a question that's too hard to express on our Slack? Is it just too much effort to say everything on a long GitHub issue? Or are you just curious about what ZenML has been up to in the past week? Well, register now for the ZenML Office (Half) Hour to get your answers and more! It's free and open to everyone.

Every week, part of the ZenML core team will pop in for 30 minutes to interact directly with the community. Sometimes we'll be presenting a feature. Other times we just take questions and have fun. Join us if you are curious about ZenML, or just want to talk shop about MLOps.

We will host the gathering every Wednesday at 8:30 AM PT (5:30 PM CET). Register now through this link, or subscribe to the public events calendar to get notified before every community gathering.

🆘 Getting Help

Your first port of call should be our Slack group. Ask your questions about bugs or specific use cases, and someone from the core team will respond. Or, if you prefer, open an issue on our GitHub repo.

📜 License

ZenML is distributed under the terms of the Apache License Version 2.0. A complete version of the license is available in the LICENSE file in this repository. Any contribution made to this project will be licensed under the Apache License Version 2.0.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution: zenml-0.13.0.tar.gz (557.1 kB)

Built Distribution: zenml-0.13.0-py3-none-any.whl (877.8 kB, Python 3)
