Airless
Airless is a package that aims to build a serverless and lightweight orchestration platform, creating workflows of multiple tasks being executed on Google Cloud Functions
Why not just use Apache Airflow?
Airflow is the industry standard for job orchestration and workflow management. However, in some cases we believe it may not be the best solution. We would like to highlight three main cases we face that Airflow struggles to handle.
- Serverless
At the beginning of a project we want to avoid dealing with infrastructure, since it demands time and reserving an instance to run Airflow has a fixed cost. Since we didn't have that many jobs, it didn't make sense to keep an Airflow instance up 24/7.
As the project grows, if we also use Airflow's instance to run the tasks themselves, we start facing performance issues in the workflow.
To avoid these problems, we decided to build a 100% serverless platform.
- Parallel processing
The main use case we designed Airless for is data scrapers. The thing about scrapers is that you normally want them to process many tasks in parallel: first you fetch a website and collect all the links on that page, then send each link forward for another task to process, that task does the same, and so on.
Building a workflow that does not know beforehand how many tasks will be executed is hard to do in Airflow.
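A minimal sketch of this fan-out pattern (the `publish` callback below is a stand-in for a Pub/Sub publish call, not Airless's actual API; here it just appends to a list so the example is self-contained):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href values from anchor tags in a fetched HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def fan_out(html, publish):
    """Parse one page and publish one task message per link found.

    Each published message becomes the input of the next task in the
    workflow; the number of downstream tasks is unknown until runtime.
    """
    collector = LinkCollector()
    collector.feed(html)
    for link in collector.links:
        publish({"url": link})
    return len(collector.links)

# Simulated run: messages land in a list instead of a Pub/Sub topic
messages = []
page = '<html><body><a href="/a">A</a><a href="/b">B</a></body></html>'
fan_out(page, messages.append)
```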
- Data sharing between tasks
In order to build the massively parallel workflow described in the previous topic, we need to be able to dynamically create and send data to the next task: the data produced by the first task serves as both the trigger and the input data for the next tasks.
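Concretely, the payload travels inside the Pub/Sub message itself. A minimal sketch (the payload fields are made-up examples; Pub/Sub delivers message data to a background Cloud Function base64-encoded in the event object):

```python
import base64
import json

# Hypothetical next-task input, carried inside the triggering message
payload = {"task": "parse_product_page", "url": "https://example.com/item/42"}

# Pub/Sub hands message data to a Cloud Function base64-encoded
event = {"data": base64.b64encode(json.dumps(payload).encode()).decode()}

# Inside the triggered function, the trigger doubles as the input data
received = json.loads(base64.b64decode(event["data"]).decode())
```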
How it works
Airless builds its workflows on top of Google Cloud Functions, Google Pub/Sub and Google Cloud Scheduler.
1. Everything starts with Cloud Scheduler, a serverless Google Cloud product that publishes a message to a Pub/Sub topic on a cron schedule
2. When a message is published to a Pub/Sub topic, it can trigger a Cloud Function, which is executed with that message as its input
3. This Cloud Function can publish as many messages as it wants to as many Pub/Sub topics as it wants
4. Repeat from step 2
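The loop above can be sketched end to end. This is a simulation only: the in-memory `topics` dict stands in for Pub/Sub, and `list_links` is a hypothetical function name; in production the publishes would go through the google-cloud-pubsub client.

```python
import base64
import json
from collections import defaultdict

# In-memory stand-in for Pub/Sub topics
topics = defaultdict(list)

def publish(topic, payload):
    """Simulate publishing a JSON message to a Pub/Sub topic."""
    data = base64.b64encode(json.dumps(payload).encode()).decode()
    topics[topic].append({"data": data})

def list_links(event, context=None):
    """Cloud Function sketch: triggered by one message, fans out many."""
    payload = json.loads(base64.b64decode(event["data"]).decode())
    for i in range(payload["count"]):
        publish("process-link", {"index": i})

# Step 1: Cloud Scheduler publishes the first message on a cron schedule
publish("list-links", {"count": 3})
# Steps 2-3: the message triggers the function, which publishes onward,
# and each "process-link" message would trigger the next function in turn
list_links(topics["list-links"][0])
```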
Preparation
Environment variables
- ENV
- GCP_PROJECT
- PUBSUB_TOPIC_ERROR
- LOG_LEVEL
- PUBSUB_TOPIC_EMAIL_SEND
- PUBSUB_TOPIC_SLACK_SEND
- BIGQUERY_DATASET_ERROR
- BIGQUERY_TABLE_ERROR
- EMAIL_SENDER_ERROR
- EMAIL_RECIPIENTS_ERROR
- SLACK_CHANNELS_ERROR
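For example, the variables might be set like this before deployment (all values below are placeholders, not defaults the package ships with):

```shell
# Example values only; adjust to your project before deploying
export ENV=prod
export GCP_PROJECT=my-gcp-project
export PUBSUB_TOPIC_ERROR=error
export LOG_LEVEL=INFO
export PUBSUB_TOPIC_EMAIL_SEND=email-send
export PUBSUB_TOPIC_SLACK_SEND=slack-send
export BIGQUERY_DATASET_ERROR=errors
export BIGQUERY_TABLE_ERROR=function_errors
export EMAIL_SENDER_ERROR=alerts@example.com
export EMAIL_RECIPIENTS_ERROR=team@example.com
export SLACK_CHANNELS_ERROR=alerts
```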
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file airless-0.0.73.tar.gz
File metadata
- Download URL: airless-0.0.73.tar.gz
- Upload date:
- Size: 22.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.12.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | 330046c05b81d359344871178e0a9c50ce5e0227ca55717015833ebc70c66217
MD5 | 65b88c6b2546df533710903ca12c9744
BLAKE2b-256 | 54c7a003b516882a54aab4983c9e5dec757803638ff2311f8f902e41acc71bd8
File details
Details for the file airless-0.0.73-py3-none-any.whl
File metadata
- Download URL: airless-0.0.73-py3-none-any.whl
- Upload date:
- Size: 29.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.12.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5d0827430672a5331bb4e756ec741c2b3986b9915133aeb7e8d880f88cd0c562
MD5 | 3f39c2b19d33d5b73be46919560ea83f
BLAKE2b-256 | a1fdeffbcddd5f76b88e68a17677d8963c18b3ed5f83d1893ab7aef56c58bb2b