
A simple workflow framework. Hamilton + APScheduler = FlowerPower

Project description

FlowerPower


FlowerPower is a simple workflow framework based on the fantastic Hamilton and the Advanced Python Scheduler (APScheduler).

Installation

pip install "flowerpower" 
# with scheduler
pip install "flowerpower[scheduler]" 
# with mqtt event broker
pip install "flowerpower[scheduler,mqtt]" 
# with redis event broker
pip install "flowerpower[scheduler,redis]" 
# with mongodb data store
pip install "flowerpower[scheduler,mongodb]"
# with ray distributed computing
pip install "flowerpower[scheduler,ray]"
# with dask distributed computing
pip install "flowerpower[scheduler,dask]"

Usage

0) Optional: Dev Services

curl -O https://raw.githubusercontent.com/legout/flowerpower/main/docker/Dockerfile
curl -O https://raw.githubusercontent.com/legout/flowerpower/main/docker/docker-compose.yml

# Hamilton UI, which allows you to track and visualize your pipelines
docker-compose up hamilton_ui -d 
# jupyterlab and code-server
docker-compose up jupytercode -d 
# s3 compatible object storage
docker-compose up minio -d 
# mosquitto mqtt broker if you want to use mqtt as the event broker
docker-compose up mqtt -d 
# valkey (OSS redis) if you want to use redis as the event broker
docker-compose up redis -d 
# mongodb if you want to use mongodb as the data store
docker-compose up mongodb -d 
# postgres db if you want to use postgres as the data store and/or event broker. This db is also used for the Hamilton UI
docker-compose up postgres -d

a) Initialize a new FlowerPower project

flowerpower init new-project
cd new-project

This adds the basic config files conf/pipelines.yml, conf/scheduler.yml and conf/tracker.yml.

b) Add a new pipeline

flowerpower add-pipeline my_flow

A new file pipelines/my_flow.py is created and the relevant entries are added to the config files.

c) Set up the new pipeline

Edit pipelines/my_flow.py and add the pipeline functions.

FlowerPower uses Hamilton, which converts your pipeline functions into nodes and assembles them into a Directed Acyclic Graph (DAG).

It is therefore mandatory to write your pipeline files according to the Hamilton paradigm. You can read more about this in the Hamilton documentation chapter Functions, Nodes and DataFlow.
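
To make this concrete, here is a minimal sketch of what pipelines/my_flow.py could look like. It only illustrates the Hamilton style (each function becomes a node, and each parameter refers to the node or input of the same name); the function and column names are made up for this example.

# pipelines/my_flow.py - a hypothetical example pipeline written in the Hamilton style.
# Each function defines a node; its parameters are resolved from the nodes (or inputs) with the same names.
import pandas as pd

def raw_data(data_path: str) -> pd.DataFrame:
    # Load the raw data from a CSV file; data_path is passed in as an input.
    return pd.read_csv(data_path)

def cleaned_data(raw_data: pd.DataFrame) -> pd.DataFrame:
    # Drop rows with missing values.
    return raw_data.dropna()

def row_count(cleaned_data: pd.DataFrame) -> int:
    # Number of rows that survived cleaning.
    return len(cleaned_data)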

Optionally, edit the config files conf/pipelines.yml, conf/scheduler.yml and conf/tracker.yml.

d) Run or schedule the new pipeline

flowerpower run-pipeline my_flow
# or schedule it with a 30-second interval
flowerpower schedule-pipeline my_flow interval --interval-params seconds=30 --auto-start
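
If you are curious what running such a pipeline amounts to, the plain-Hamilton equivalent is to build a driver from the pipeline module and execute the nodes you need. The following sketch uses Hamilton's public API only, not FlowerPower internals, and assumes the example my_flow module from above; the input value is made up.

# Rough plain-Hamilton equivalent of executing the example pipeline.
from hamilton import driver

import pipelines.my_flow as my_flow

# Build the DAG from the pipeline module and request the nodes we want.
dr = driver.Builder().with_modules(my_flow).build()
results = dr.execute(["row_count"], inputs={"data_path": "data.csv"})
print(results["row_count"])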

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

flowerpower-0.3.6.tar.gz (2.1 MB view details)

Uploaded Source

Built Distribution

flowerpower-0.3.6-py3-none-any.whl (20.8 kB view details)

Uploaded Python 3

File details

Details for the file flowerpower-0.3.6.tar.gz.

File metadata

  • Download URL: flowerpower-0.3.6.tar.gz
  • Upload date:
  • Size: 2.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for flowerpower-0.3.6.tar.gz

  • SHA256: 6da2014c388c05d88c20be17a0f78c97080e555fbfc0ba7ee88f17248776eb61
  • MD5: 8099bc64510096f383f18a8d0f63fa08
  • BLAKE2b-256: 0bf92a0d0804816ea3d82c3e271721a6e42ea68a9186fd7c3681f199dbcae601

See more details on using hashes here.

File details

Details for the file flowerpower-0.3.6-py3-none-any.whl.

File metadata

  • Download URL: flowerpower-0.3.6-py3-none-any.whl
  • Upload date:
  • Size: 20.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for flowerpower-0.3.6-py3-none-any.whl

  • SHA256: ae040d609dd3a5af53a5bf7bc21518c37483bc6bdc2d0e7fcd86aa51f225b914
  • MD5: 8fda5426f0f4466a473010165ac308be
  • BLAKE2b-256: 54a4623b3e46a99c5de00bf53b7c9db28080a8cf50e8e3199976fcdd12b2ebb9

See more details on using hashes here.
