py2sfn-task-tools

Tools for tasks embedded in an AWS Step Functions state machine. This is a helper library for py2sfn.

Features:

  • Offload state data to DynamoDB/S3 instead of storing data in the very constrained state machine input data object
  • Cancel the currently executing workflow

Installation

py2sfn-task-tools requires Python 3.6 or above. It should be installed in a py2sfn task entry point.

pip install py2sfn-task-tools

Guide

Once the py2sfn-task-tools library is installed, a Context should be created and passed to the tasks. Each py2sfn task will then have a context object to work with.
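For orientation, here is a minimal sketch of a task (the class name and method shape are illustrative; the exact entry-point signature comes from your py2sfn project):

class MyTask:
    async def run(self, event, context):
        # The shared context exposes the tools described below, e.g.
        # context.state_data_client and context.stop_execution.
        ...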

Stopping the execution

If you need to stop/cancel/abort the execution from within a task, use the context.stop_execution method in your task's run method. A common use case is checking the value of a feature flag at the beginning of the execution and aborting if it's disabled. For example:

if not some_condition:
    return await context.stop_execution()

You can provide extra detail by passing error and cause keyword arguments to stop_execution. The error is a short string, like a code or enum value, whereas cause is a longer description.
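For example, a sketch where the feature-flag check and the error/cause values are illustrative:

if not flags.is_enabled("my-feature"):  # hypothetical feature-flag check
    return await context.stop_execution(
        error="FEATURE_DISABLED",
        cause="The my-feature flag is off, so this execution has nothing to do.",
    )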

Working with the State Data Client

One of the stated Step Functions best practices is to avoid passing large payloads between states; the input data limit is only 32K characters. To get around this, you can store data from your task code in a DynamoDB table instead, which gives you a 400KB item limit to work with. When you put an item into the table, you receive a pointer to the DynamoDB item, which you can return from your task so it gets included in the input data object. From there, since the pointer is in the data dict, you can reload the stored data in a downstream task. This library's StateDataClient class provides methods for putting and getting items from this DynamoDB table. It's available in your task's run method as context.state_data_client.

The client methods are split between "local" and "global" variants. Local methods operate on items stored within the current project, whereas global methods can operate on items stored from any project. Global methods require a fully-specified partition key (the primary key, which contains the execution ID) and table name to locate the item, whereas local methods only need a simple key because the partition key and table name can be inferred from the project automatically. The put_* methods return a dict with metadata about the location of the item, including the key, partition_key, and table_name. If you return this metadata object from a task, it gets put on the data object and you can call a get_* method later in the state machine.
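For example, a sketch of that round trip (the "report" key and payload are illustrative):

# Upstream task: offload a large payload and return the pointer metadata
# so it lands in the state machine's input data object.
metadata = context.state_data_client.put_item("report", {"rows": rows})
return metadata  # dict with "key", "partition_key", and "table_name"

# Downstream task in the same project: reload the data by key alone.
report = context.state_data_client.get_item("report")  # -> {"rows": [...]}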

Many methods also accept an optional index argument. This argument needs to be provided when getting/putting an item that was originally stored as part of a put_items or put_global_items call. Providing the index is usually only done within a map iteration task.

Below are a few of the more common methods:

put_item/put_items

The put_item method puts an item in the state store. It takes key, data, and index arguments. For example:

context.state_data_client.put_item("characters", {"name": "jerry"})
context.state_data_client.put_item("characters", {"name": "elaine"}, index=24)

Note that the item at the given array index doesn't actually have to exist in the table before you call put_item. However, if it doesn't exist then you may have a fan-out logic bug upstream in your state machine.

The put_items method puts an entire list of items into the state store. Each item will be stored separately under its corresponding array index. For example:

context.state_data_client.put_items("characters", [{"name": "jerry"}, {"name": "elaine"}])

get_item

The get_item method gets the data attribute from an item in the state store. It takes key and index arguments. For example:

context.state_data_client.get_item("characters")  # -> {"name": "jerry"}
context.state_data_client.get_item("characters", index=24)  # -> {"name": "elaine"}

get_item_for_map_iteration/get_global_item_for_map_iteration

The get_item_for_map_iteration method gets the data attribute from an item in the state store using the event object. This method only works when called within a map iterator task. For example, if the put_items example above ran in one task and its return value was used to fan out a map state, we can use get_item_for_map_iteration within our iterator task to fetch each item:

# Iteration 0:
context.state_data_client.get_item_for_map_iteration(event)  # -> {"name": "jerry"}
# Iteration 1:
context.state_data_client.get_item_for_map_iteration(event)  # -> {"name": "elaine"}

This works because the map iterator state machine receives an input data object with the schema:

{
  "items_result_table_name": "<DynamoDB table for the project>",
  "items_result_partition_key": "<execution ID>:characters",
  "items_result_key": "characters",
  "context_index": "<array index>",
  "context_value.$": "1"
}

get_item_for_map_iteration is a helper method that uses that input to locate the right item. The get_global_item_for_map_iteration method has the same signature; call it when the array used to fan out could have come from another project (e.g. the map state is the first state in a state machine triggered by a subscription).
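For example:

# Inside a map iterator task whose fan-out array may have been stored
# by another project:
item = context.state_data_client.get_global_item_for_map_iteration(event)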

Development

To run the functional tests, you need AWS IAM credentials with permissions to:

  • Create/update/delete a DynamoDB table
  • Create/update/delete an S3 bucket

Set the following environment variables:

  • AWS_ACCESS_KEY_ID
  • AWS_SECRET_ACCESS_KEY
  • AWS_DEFAULT_REGION

To run tests:

tox
