
A simple workflow framework. Hamilton + APScheduler = FlowerPower

Project description

FlowerPower


FlowerPower is a simple workflow framework based on the fantastic Hamilton and the Advanced Python Scheduler (APScheduler).

Installation

pip install "flowerpower" 
# with scheduler
pip install "flowerpower[scheduler]" 
# with mqtt event broker
pip install "flowerpower[scheduler,mqtt]" 
# with redis event broker
pip install "flowerpower[scheduler,redis]" 
# with mongodb data store
pip install "flowerpower[scheduler,mongodb]"
# with ray distributed computing
pip install "flowerpower[scheduler,ray]"
# with dask distributed computing
pip install "flowerpower[scheduler,dask]"

Usage

0) Optional: Dev Services

curl -O https://raw.githubusercontent.com/legout/flowerpower/main/docker/Dockerfile
curl -O https://raw.githubusercontent.com/legout/flowerpower/main/docker/docker-compose.yml

# Hamilton UI, which allows you to track and visualize your pipelines
docker-compose up hamilton_ui -d 
# jupyterlab and code-server
docker-compose up jupytercode -d 
# s3 compatible object storage
docker-compose up minio -d 
# mosquitto MQTT broker if you want to use MQTT as the event broker
docker-compose up mqtt -d 
# valkey (OSS redis) if you want to use redis as the event broker
docker-compose up redis -d 
# mongodb if you want to use mongodb as the data store
docker-compose up mongodb -d 
# postgres if you want to use postgres as the data store and/or event broker. This database is also used by the Hamilton UI
docker-compose up postgres -d

a) Initialize a new FlowerPower project

flowerpower init new-project
cd new-project

This creates the basic config files conf/pipelines.yml, conf/scheduler.yml, and conf/tracker.yml.

b) Add a new pipeline

flowerpower add-pipeline my_flow

A new file pipelines/my_flow.py is created and the relevant entries are added to the config files.

c) Setup the new pipeline

Edit pipelines/my_flow.py and add the pipeline functions.

FlowerPower uses Hamilton, which converts your pipeline functions into nodes and assembles them into a Directed Acyclic Graph (DAG).

It is therefore mandatory to write your pipeline files according to the Hamilton paradigm. You can read more about this in the Hamilton documentation chapter Function, Nodes and DataFlow.
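As an illustration, a minimal Hamilton-style pipelines/my_flow.py could look like the sketch below (the function names and data are made up for this example, not generated by FlowerPower): every function becomes a node, and a parameter name refers to the upstream node (function) of the same name.

# pipelines/my_flow.py -- illustrative sketch only
import pandas as pd

def raw_data() -> pd.DataFrame:
    """Source node: load or generate the input data."""
    return pd.DataFrame({"value": [1, 2, 3, 4]})

def doubled_value(raw_data: pd.DataFrame) -> pd.Series:
    """Depends on the raw_data node via its parameter name."""
    return raw_data["value"] * 2

def value_sum(doubled_value: pd.Series) -> float:
    """Aggregates the doubled values."""
    return float(doubled_value.sum())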

Optionally, edit the config files conf/pipelines.yml, conf/scheduler.yml, and conf/tracker.yml.

d) Run or schedule the new pipeline

flowerpower run-pipeline my_flow
# or schedule it to run every 30 seconds
flowerpower schedule-pipeline my_flow interval --interval-params seconds=30 --auto-start
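
Under the hood, run-pipeline hands the module over to Hamilton for execution. As a rough illustration only (this is plain Hamilton, not FlowerPower's own API; the node names come from the sketch in section c), the same module could be executed directly with a Hamilton driver:

# Illustrative only: executing pipelines/my_flow.py with a plain Hamilton driver.
from hamilton import driver

import pipelines.my_flow as my_flow

dr = driver.Builder().with_modules(my_flow).build()  # build the DAG from the module
result = dr.execute(["value_sum"])                   # request the nodes to compute
print(result)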

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

flowerpower-0.3.4.tar.gz (2.1 MB)

Uploaded Source

Built Distribution

flowerpower-0.3.4-py3-none-any.whl (15.5 kB)

Uploaded Python 3

File details

Details for the file flowerpower-0.3.4.tar.gz.

File metadata

  • Download URL: flowerpower-0.3.4.tar.gz
  • Upload date:
  • Size: 2.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for flowerpower-0.3.4.tar.gz
  • SHA256: 4f39636fd1fc024ea7fdb90705f56bc360dcedaed59623bba9c68b826efb6a64
  • MD5: 04fc9ead100b72eaa317e210332c7d6b
  • BLAKE2b-256: 777ac7864edaf8a887752f288c8cd564dad75651147f7a0a4f2862f8a07f2df1

See more details on using hashes here.
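
For example, a downloaded file can be checked locally against the digest listed above with a few lines of Python (file name and SHA256 value taken from this page):

# Verify the downloaded sdist against the published SHA256 digest.
import hashlib

with open("flowerpower-0.3.4.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print(digest == "4f39636fd1fc024ea7fdb90705f56bc360dcedaed59623bba9c68b826efb6a64")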

File details

Details for the file flowerpower-0.3.4-py3-none-any.whl.

File metadata

  • Download URL: flowerpower-0.3.4-py3-none-any.whl
  • Upload date:
  • Size: 15.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for flowerpower-0.3.4-py3-none-any.whl
  • SHA256: b79a1df3264a502b6be3cdf3ade8749706bfd9928dfb32b0c4f8f5f6cd2d4f33
  • MD5: 756e3611212417e8f999f2ee7104be84
  • BLAKE2b-256: 64addf752496703cecc4f3edbf1de407c1574c4ed71e6f9c9c39940fb6786f74

See more details on using hashes here.
