Vertex Pipelines Deployer


Deploy Vertex Pipelines within minutes

This tool is a wrapper around kfp and google-cloud-aiplatform that allows you to check, compile, upload, run, and schedule Vertex Pipelines in a standardized manner.



📚 Table of Contents
  1. Why this tool?
  2. Prerequisites
  3. Installation
    1. From PyPI
    2. From git repo
  4. Usage
    1. Setup
    2. Folder Structure
    3. CLI: Deploying a Pipeline with `deploy`
    4. CLI: Checking Pipelines are valid with `check`
    5. CLI: Other commands
      1. `config`
      2. `create`
      3. `init`
      4. `list`
  5. CLI: Options
  6. Configuration

Full CLI documentation

โ“ Why this tool?

Three use cases:

  1. CI: Check pipeline validity.
  2. Dev mode: Quickly iterate over your pipelines by compiling and running them in multiple environments (test, dev, staging, etc.) without duplicating code or searching for the right kfp/aiplatform snippet.
  3. CD: Deploy your pipelines to Vertex Pipelines in a standardized manner in your CD with Cloud Build or GitHub Actions.

Two main commands:

  • check: Check your pipelines (imports, compile, check configs validity against pipeline definition).
  • deploy: Compile, upload to Artifact Registry, run, and schedule your pipelines.

📋 Prerequisites

  • Unix-like environment (Linux, macOS, WSL, etc.)
  • Python 3.8 to 3.10
  • Google Cloud SDK
  • A GCP project with Vertex Pipelines enabled

📦 Installation

From PyPI

pip install vertex-deployer

From git repo

Stable version:

pip install git+https://github.com/artefactory/vertex-pipelines-deployer.git@main

Develop version:

pip install git+https://github.com/artefactory/vertex-pipelines-deployer.git@develop

If you want to test this package on examples from this repo:

git clone git@github.com:artefactory/vertex-pipelines-deployer.git
poetry install
cd example

🚀 Usage

๐Ÿ› ๏ธ Setup

  1. Set up your GCP environment:
export PROJECT_ID=<gcp_project_id>
gcloud config set project $PROJECT_ID
gcloud auth login
gcloud auth application-default login
  2. You need the following APIs to be enabled:
  • Cloud Build API
  • Artifact Registry API
  • Cloud Storage API
  • Vertex AI API
gcloud services enable \
    cloudbuild.googleapis.com \
    artifactregistry.googleapis.com \
    storage.googleapis.com \
    aiplatform.googleapis.com
  3. Create an artifact registry repository for your base images (Docker format):
export GAR_DOCKER_REPO_ID=<your_gar_repo_id_for_images>
export GAR_LOCATION=<your_gar_location>
gcloud artifacts repositories create ${GAR_DOCKER_REPO_ID} \
    --location=${GAR_LOCATION} \
    --repository-format=docker
  4. Build and upload your base images to the repository. To do so, please follow the Google Cloud Build documentation.

  5. Create an artifact registry repository for your pipelines (KFP format):

export GAR_PIPELINES_REPO_ID=<your_gar_repo_id_for_pipelines>
gcloud artifacts repositories create ${GAR_PIPELINES_REPO_ID} \
    --location=${GAR_LOCATION} \
    --repository-format=kfp
  6. Create a GCS bucket for Vertex Pipelines staging:
export GCP_REGION=<your_gcp_region>
export VERTEX_STAGING_BUCKET_NAME=<your_bucket_name>
gcloud storage buckets create gs://${VERTEX_STAGING_BUCKET_NAME} --location=${GCP_REGION}
  7. Create a service account for Vertex Pipelines:
export VERTEX_SERVICE_ACCOUNT_NAME=foobar
export VERTEX_SERVICE_ACCOUNT="${VERTEX_SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

gcloud iam service-accounts create ${VERTEX_SERVICE_ACCOUNT_NAME}

gcloud projects add-iam-policy-binding ${PROJECT_ID} \
    --member="serviceAccount:${VERTEX_SERVICE_ACCOUNT}" \
    --role="roles/aiplatform.user"

gcloud storage buckets add-iam-policy-binding gs://${VERTEX_STAGING_BUCKET_NAME} \
    --member="serviceAccount:${VERTEX_SERVICE_ACCOUNT}" \
    --role="roles/storage.objectUser"

gcloud artifacts repositories add-iam-policy-binding ${GAR_PIPELINES_REPO_ID} \
   --location=${GAR_LOCATION} \
   --member="serviceAccount:${VERTEX_SERVICE_ACCOUNT}" \
   --role="roles/artifactregistry.admin"

You can use the deployer CLI (see examples below) or import VertexPipelineDeployer in your own code and try it yourself.
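
If you prefer the programmatic route, it could look roughly like the sketch below. Only the class name and module path are taken from this repo's source tree; every constructor argument and method call here is an assumption modeled on the CLI flags, so check deployer/pipeline_deployer.py for the real signatures.

# Sketch only: the argument and method names below are assumptions modeled
# on the CLI flags, not the package's documented API.
from deployer.pipeline_deployer import VertexPipelineDeployer

deployer = VertexPipelineDeployer(
    project_id="my-project",                  # assumed: mirrors PROJECT_ID
    region="europe-west1",                    # assumed: mirrors GCP_REGION
    staging_bucket_name="my-staging-bucket",  # assumed: mirrors VERTEX_STAGING_BUCKET_NAME
    service_account="sa@my-project.iam.gserviceaccount.com",  # assumed
)
deployer.compile()  # assumed: mirrors --compile
deployer.run()      # assumed: mirrors --run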

๐Ÿ“ Folder Structure

You must respect the following folder structure. If you already follow the Vertex Pipelines Starter Kit folder structure, it should be pretty smooth to use this tool:

vertex
├─ configs/
│  └─ {pipeline_name}
│     └─ {config_name}.json
└─ pipelines/
   └─ {pipeline_name}.py

!!! tip "About folder structure" You must have at least these files. If you need to share some config elements between pipelines, you can have a shared folder in configs and import them in your pipeline configs.

If you're following a different folder structure, you can change the default paths in the `pyproject.toml` file.
See [Configuration](#configuration) section for more information.

Pipelines

Your file {pipeline_name}.py must contain a function named {pipeline_name} decorated with kfp.dsl.pipeline. In previous versions, this function had to be named pipeline; it was renamed to {pipeline_name} to avoid confusion with the kfp.dsl.pipeline decorator.

# vertex/pipelines/dummy_pipeline.py
import kfp.dsl

# New name to avoid confusion with the kfp.dsl.pipeline decorator
@kfp.dsl.pipeline()
def dummy_pipeline():
    ...

# Old name
@kfp.dsl.pipeline()
def pipeline():
    ...
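
Pipeline parameters are declared as ordinary function arguments, and the config files described in the next section are validated against this signature. A minimal illustrative variant (the model_name and threshold parameters are made up for the config examples that follow):

# vertex/pipelines/dummy_pipeline.py (illustrative variant with parameters)
import kfp.dsl

@kfp.dsl.pipeline()
def dummy_pipeline(model_name: str, threshold: float = 0.5):
    ...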

Configs

Config files can be .py, .json, or .toml files. They must be located in the configs/{pipeline_name} folder.

!!! question "Why not YAML?" YAML is not supported yet. Feel free to open a PR if you want to add it.

Why multiple formats?

.py files are useful to define complex configs (e.g. a list of dicts), while .json / .toml files are useful to define simple configs (e.g. a string). Supporting several formats adds flexibility and lets you adopt the deployer with almost no migration cost.

How to format them?

  • .py files must be valid python files with two important elements:

    • parameter_values to pass arguments to your pipeline
    • input_artifacts if you want to retrieve and create input artifacts to your pipeline. See Vertex Documentation for more information.
  • .json files must be valid JSON files containing a single dict of key: value pairs representing parameter values.

  • .toml files must follow the same rule. Please note that TOML sections will be flattened, except for inline tables. Section names are joined using a "_" separator, and this is not configurable at the moment. Example:

=== "TOML file" toml [modeling] model_name = "my-model" params = { lambda = 0.1 }

=== "Resulting parameter values" python { "modeling_model_name": "my-model", "modeling_params": { "lambda": 0.1 } } ??? question "Why are sections flattened when using TOML config files?" Vertex Pipelines parameter validation and parameter logging to Vertex Experiments are based on the parameter name. If you do not flatten your sections, you'll only be able to validate section names and that they should be of type dict.

Not very useful.

??? question "Why aren't input_artifacts supported in TOML / JSON config files?" Because it's low on the priority list. Feel free to open a PR if you want to add it.

How to name them?

{config_name}.py, {config_name}.json, or {config_name}.toml. The config_name is free but must be unique for a given pipeline.
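
Putting it together, a .py config for the illustrative dummy_pipeline signature above could look like this sketch (parameter_values and input_artifacts are the two elements the deployer looks for; the values themselves are made up):

# vertex/configs/dummy_pipeline/config_test.py
# Sketch: parameter_values feeds the pipeline's function arguments.
parameter_values = {
    "model_name": "my-model",
    "threshold": 0.7,
}

# Optional: map pipeline input names to existing Vertex artifacts
# (see the Vertex documentation for the expected format).
input_artifacts = {}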

Settings

You will also need the following ENV variables, either exported or in a .env file (see example in example.env):

PROJECT_ID=YOUR_PROJECT_ID  # GCP Project ID
GCP_REGION=europe-west1  # GCP Region

GAR_LOCATION=europe-west1  # Google Artifact Registry Location
GAR_PIPELINES_REPO_ID=YOUR_GAR_KFP_REPO_ID  # Google Artifact Registry Repo ID (KFP format)

VERTEX_STAGING_BUCKET_NAME=YOUR_VERTEX_STAGING_BUCKET_NAME  # GCS Bucket for Vertex Pipelines staging
VERTEX_SERVICE_ACCOUNT=YOUR_VERTEX_SERVICE_ACCOUNT  # Vertex Pipelines Service Account

!!! note "About env files" We're using env files and dotenv to load the environment variables. No default value for --env-file argument is provided to ensure that you don't accidentally deploy to the wrong project. An example.env file is provided in this repo. This also allows you to work with multiple environments thanks to env files (test.env, dev.env, prod.env, etc)

🚀 CLI: Deploying a Pipeline with deploy

Let's say you defined a pipeline in dummy_pipeline.py and a config file named config_test.json. You can deploy your pipeline using the following command:

vertex-deployer deploy dummy_pipeline \
    --compile \
    --upload \
    --run \
    --env-file example.env \
    --tags my-tag \
    --config-filepath vertex/configs/dummy_pipeline/config_test.json \
    --experiment-name my-experiment \
    --enable-caching \
    --skip-validation

✅ CLI: Checking Pipelines are valid with check

To check that your pipelines are valid, you can use the check command. It uses a Pydantic model to:

  • check that your pipeline imports and definition are valid
  • check that your pipeline can be compiled
  • check that all configs related to the pipeline respect the pipeline definition (using a Pydantic model built from the pipeline signature)
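
Conceptually, the config check works like the sketch below: build a Pydantic model from the pipeline's signature, then validate each config against it. This is a simplified illustration, not the deployer's actual code:

import inspect
import pydantic

def dummy_pipeline(model_name: str, threshold: float = 0.5):
    ...  # stands in for your (undecorated) pipeline function

# One Pydantic field per pipeline parameter, with its annotation and default.
fields = {
    name: (p.annotation, ... if p.default is inspect.Parameter.empty else p.default)
    for name, p in inspect.signature(dummy_pipeline).parameters.items()
}
ConfigModel = pydantic.create_model("PipelineConfig", **fields)

# Raises pydantic.ValidationError on a wrong name or type.
ConfigModel(model_name="my-model", threshold=0.7)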

To validate one or multiple pipeline(s):

vertex-deployer check dummy_pipeline <other pipeline name>

To validate all pipelines in the vertex/pipelines folder:

vertex-deployer check --all

๐Ÿ› ๏ธ CLI: Other commands

config

You can check your vertex-deployer configuration options using the config command. Fields set in pyproject.toml will overwrite default values and will be displayed differently:

vertex-deployer config --all

create

You can create all files needed for a pipeline using the create command:

vertex-deployer create my_new_pipeline --config-type py

This will create a my_new_pipeline.py file in the vertex/pipelines folder and a vertex/configs/my_new_pipeline/ folder with multiple config files in it.

init

To initialize the deployer with default settings and folder structure, use the init command:

$ vertex-deployer init
Welcome to Vertex Deployer!
This command will help you getting fired up.
Do you want to configure the deployer? [y/n]: n
Do you want to build default folder structure [y/n]: n
Do you want to create a pipeline? [y/n]: n
All done ✨

list

You can list all pipelines in the vertex/pipelines folder using the list command:

vertex-deployer list --with-configs

๐Ÿญ CLI: Options

vertex-deployer --help

To see package version:

vertex-deployer --version

To adapt log level, use the --log-level option. Default is INFO.

vertex-deployer --log-level DEBUG deploy ...

Configuration

You can configure the deployer using the pyproject.toml file to better fit your needs. Values set there overwrite the defaults. This can be useful if you always use the same options, e.g. always the same --scheduler-timezone:

[tool.vertex-deployer]
vertex_folder_path = "my/path/to/vertex"
log_level = "INFO"

[tool.vertex-deployer.deploy]
scheduler_timezone = "Europe/Paris"
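
These overrides are plain TOML. Reading them back takes a few lines (an illustration using tomli, since the supported Python versions predate the stdlib tomllib; this is not the deployer's internal code):

import tomli  # TOML parser for Python < 3.11; use stdlib tomllib on 3.11+

with open("pyproject.toml", "rb") as f:
    config = tomli.load(f)

# Values under [tool.vertex-deployer] override the deployer's defaults.
print(config["tool"]["vertex-deployer"]["deploy"]["scheduler_timezone"])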

You can display all the configurable parameters with their default values by running:

$ vertex-deployer config --all
'*' means the value was set in config file

* vertex_folder_path=my/path/to/vertex
* log_level=INFO
deploy
  env_file=None
  compile=True
  upload=False
  run=False
  schedule=False
  cron=None
  delete_last_schedule=False
  * scheduler_timezone=Europe/Paris
  tags=['latest']
  config_filepath=None
  config_name=None
  enable_caching=False
  experiment_name=None
check
  all=False
  config_filepath=None
  raise_error=False
list
  with_configs=True
create
  config_type=json

Repository Structure

├─ .github
│  ├─ ISSUE_TEMPLATE/
│  ├─ workflows
│  │  ├─ ci.yaml
│  │  ├─ pr_agent.yaml
│  │  └─ release.yaml
│  ├─ CODEOWNERS
│  └─ PULL_REQUEST_TEMPLATE.md
├─ deployer                                     # Source code
│  ├─ __init__.py
│  ├─ cli.py
│  ├─ constants.py
│  ├─ pipeline_checks.py
│  ├─ pipeline_deployer.py
│  ├─ settings.py
│  └─ utils
│     ├─ config.py
│     ├─ console.py
│     ├─ exceptions.py
│     ├─ logging.py
│     ├─ models.py
│     └─ utils.py
├─ docs/                                        # Documentation folder (mkdocs)
├─ templates/                                   # Semantic Release templates
├─ tests/
├─ example                                      # Example folder with dummy pipeline and config
│   ├─ example.env
│   └─ vertex
│      ├─ components
│      │  └─ dummy.py
│      ├─ configs
│      │  ├─ broken_pipeline
│      │  │  └─ config_test.json
│      │  └─ dummy_pipeline
│      │     ├─ config_test.json
│      │     ├─ config.py
│      │     └─ config.toml
│      ├─ deployment
│      ├─ lib
│      └─ pipelines
│         ├─ broken_pipeline.py
│         └─ dummy_pipeline.py
├─ .gitignore
├─ .pre-commit-config.yaml
├─ catalog-info.yaml                            # Roadie integration configuration
├─ CHANGELOG.md
├─ CONTRIBUTING.md
├─ LICENSE
├─ Makefile
├─ mkdocs.yml                                   # Mkdocs configuration
├─ pyproject.toml
└─ README.md
