
FlowerPower


FlowerPower is a simple workflow framework based on the fantastic Hamilton and the Advanced Python Scheduler (APScheduler).

Installation

pip install "flowerpower" 
# with scheduler
pip install "flowerpower[scheduler]" 
# with mqtt event broker
pip install "flowerpower[scheduler,mqtt]" 
# with redis event broker
pip install "flowerpower[scheduler,redis]" 
# with mongodb data store
pip install "flowerpower[scheduler,mongodb]"
# with ray distributed computing
pip install "flowerpower[scheduler,ray]"
# with dask distributed computing
pip install "flowerpower[scheduler,dask]"

Usage

0) Optional: Dev Services

curl -O https://raw.githubusercontent.com/legout/flowerpower/main/docker/Dockerfile
curl -O https://raw.githubusercontent.com/legout/flowerpower/main/docker/docker-compose.yml

# Hamilton UI, which allows you to track and visualize your pipelines
docker-compose up hamilton_ui -d 
# jupyterlab and code-server
docker-compose up jupytercode -d 
# s3 compatible object storage
docker-compose up minio -d 
# mosquitto mqtt broker if you want to use mqtt as the event broker
docker-compose up mqtt -d 
# valkey (OSS redis) if you want to use redis as the event broker
docker-compose up redis -d 
# mongodb if you want to use mongodb as the data store
docker-compose up mongodb -d 
# postgres db if you want to use postgres as the data store and/or event broker. This db is also used for the Hamilton UI
docker-compose up postgres -d

a) Initialize a new FlowerPower project

flowerpower init new-project
cd new-project

This creates the basic config files conf/pipelines.yml, conf/scheduler.yml and conf/tracker.yml.

b) Add a new pipeline

flowerpower add-pipeline my_flow

A new file pipelines/my_flow.py is created and the relevant entries are added to the config files.

c) Setup the new pipeline

Edit pipelines/my_flow.py and add the pipeline functions.

FlowerPower uses Hamilton, which converts your pipeline functions into nodes and assembles them into a Directed Acyclic Graph (DAG).

It is therefore mandatory to write your pipeline files according to the Hamilton paradigm: each function is named after the output it produces, and its parameters are named after the functions (nodes) it depends on. You can read more about this in the Hamilton documentation chapter Functions, Nodes and DataFlow.
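As an illustration, a minimal pipelines/my_flow.py written in the Hamilton style could look like the sketch below (the function and column names are purely illustrative, not generated by FlowerPower):

import pandas as pd

def raw_data() -> pd.DataFrame:
    # Input node; in a real pipeline this might load data from a file or database.
    return pd.DataFrame({"value": [1, 2, 3]})

def doubled_value(raw_data: pd.DataFrame) -> pd.Series:
    # Depends on raw_data because the parameter name matches that function's name.
    return raw_data["value"] * 2

def summed_value(doubled_value: pd.Series) -> int:
    # Final node that aggregates the doubled values.
    return int(doubled_value.sum())

Hamilton wires these functions into a DAG by matching parameter names to function names, so no explicit orchestration code is needed inside the pipeline file.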

Optionally, edit the config files conf/pipelines.yml, conf/scheduler.yml and conf/tracker.yml.

d) Run or schedule the new pipeline

flowerpower run-pipeline my_flow
# or schedule it with a 30-second interval
flowerpower schedule-pipeline my_flow interval --interval-params seconds=30 --auto-start
