
Google Cloud Workflows Emulator

This is both a library and a CLI tool.

Using the CLI

Calling a workflow

  1. Create a workflow config for your project

    # small_config.workflow.yaml
    main:
      params: [ table_name ]
      steps:
        - assign_variables:
            assign:
              - table_parts: ${text.split(table_name, ".")}
              - project_id: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
              - dataset_id: ${table_parts[-2]}
              - table_id: ${table_parts[-1]}
              - config:
                  project_id: ${project_id}
                  dataset_id: ${dataset_id}
                  table_id: ${table_id}
              - query: |
                  SELECT column_name
                  FROM `${project_id}.${dataset_id}.INFORMATION_SCHEMA.COLUMNS`
                  WHERE table_name = '${table_id}'
        - final:
            return: ${query}
    
    ⚠ NOTE
    Remember to use the Cloud Workflows JSON schema in your IDE to get correct syntax highlighting, autocompletion and error checking.
  2. Define your environment variables in the .env file. Alternatively you can pass a custom .env file to the emulator with the --env-file flag.

    # .env
    GOOGLE_CLOUD_PROJECT_ID=numbers-crunching-123
    
  3. To run a single workflow in the CLI:

    workflows-emulator \
      --config test/data/small_config.workflow.yaml \
      run \
      --data='{"var_content": "lowercase text"}'
    

    To start the server emulating the Google Cloud service:

    workflows-emulator --config test/data/small_config.workflow.yaml serve
    

    and then call the server with a POST request:

    curl --request 'POST' \
       --header 'Content-Type: application/json' \
       --data '{"argument": "{\"var_content\": \"hello\"}"}' \
       'http://localhost:8000/v1/projects/my_project/locations/europe-west4/workflows/small_config/executions'
    
  4. The output will be printed to the console

    Log step HELLO
    "HELLO"
    

Using the library for testing

Given this workflow

main:
  params: [ my_var ]
  steps:
     - call_subworkflow:
          call: addition
          args:
             operand_a: ${my_var}
             operand_b: 2
          result: added
     - final:
          return: ${added}

addition:
  params: [ operand_a, operand_b ]
  steps:
    - log:
        call: sys.log
        args:
           text: ${"Adding " + string(operand_a) + " and " + string(operand_b)}
    - process:
        return: ${operand_a + operand_b}

You can write unit tests for it like so:

from workflows_emulator.main import (
   execute_step, load_workflow,
   execute_workflow, execute_subworkflow,
   get_step,
)

def test_load_workflow():
   """Fails if syntax is wrong"""
   load_workflow('path/to/workflow.yaml')

def test_main_workflow():
   """Checks whether it calculates correctly"""
   config = load_workflow('path/to/workflow.yaml')
   params = {'my_var': 3}
   result = execute_workflow(config, params)
   assert result == 5


def test_subworkflow():
   config = load_workflow('path/to/workflow.yaml')
   params = {'operand_a': 1, 'operand_b': 2}
   result = execute_subworkflow(config['addition'], params)
   assert result == 3


def test_step():
   config = load_workflow('path/to/workflow.yaml')
   subworkflow_step_list = config['addition']['steps']
   ok, _index, step_config = get_step(subworkflow_step_list, 'process')
   assert ok
   context = {'operand_a': 1, 'operand_b': 2}
   _context, _next, result = execute_step(step_config, context)
   assert result == 3
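The `get_step` lookup in the last test works because a Workflows step list is a list of single-key mappings, `{step_name: step_config}`, as in the YAML above. A minimal stand-alone sketch of that lookup (our own illustration with a hypothetical `find_step` helper, not the library's implementation):

```python
def find_step(step_list, name):
    """Return (found, index, config) for the step called `name`.

    Each entry in a Workflows step list is a single-key dict
    mapping the step name to its configuration.
    """
    for index, step in enumerate(step_list):
        if name in step:
            return True, index, step[name]
    return False, -1, None

steps = [
    {"log": {"call": "sys.log", "args": {"text": "Adding 1 and 2"}}},
    {"process": {"return": "${operand_a + operand_b}"}},
]
ok, index, config = find_step(steps, "process")
print(ok, index, config["return"])  # True 1 ${operand_a + operand_b}
```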

Reason behind this emulator

How to develop and debug your workflows according to Google Cloud

Running a Workflow goes like this:

WORKFLOW_NAME=my-workflow-name
WORKFLOW_FILE_PATH=my_workflow.yaml

gcloud workflows deploy ${WORKFLOW_NAME} \
  --location=europe-west4 \
  --call-log-level=log-all-calls \
  --source=${WORKFLOW_FILE_PATH}

gcloud workflows run ${WORKFLOW_NAME} \
  --location=europe-west4 \
  --call-log-level=log-all-calls

Update connectors

Workflows provides a set of connectors that make it easier to interact with Google Cloud REST APIs. They are listed in the documentation. If new connectors are added, the local copies can be refreshed by running:

get-discovery-documents

Then open a PR with the newly generated files.

Not implemented (yet)

Some of the standard library modules are not implemented, either because their behavior is difficult to mimic locally or because the work is still in progress:

  • experimental.executions — its use is discouraged in the docs
  • events — callbacks are complex to run locally
