Light pipeline framework.
pipely is a lightweight pipeline library that can execute any class, or any sequence of classes, in a configurable order. Install it with:

```shell
pip install pipely
```
1. Quick Start
To build a pipeline with executable classes, create a `.yaml` config file in the following format:

```yaml
steps:
  [step_name_1]:
    exec: [relative path to a file]:[class to execute]
  [step_name_2]:
    exec: [relative path to a file]:[class to execute]
  [step_name_3]:
    exec: [relative path to a file]:[class to execute]
    depends_on:
      - [step_name_1]
      - [step_name_2]
```
- step names should be unique;
- `depends_on` defines the execution order and enables pipely to detect independent steps and execute them in parallel;
- the executable classes should have a `__call__` method (see the example below).
Then trigger the pipeline from the CLI:

```shell
python -m pipely from-pipeline <file.yaml> [dict.json]
```
- `<file.yaml>` (required) is the path to a YAML config file, in any form: `../../file.yaml`, `path/to/file.yaml`, or `file.yaml`.
- `[dict.json]` (optional) is the path to a shared-dictionary JSON file, used when values need to be exchanged between classes (more in Section 1.2).
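As a conceptual sketch (not pipely's actual implementation), `depends_on` relations can be resolved into execution "waves": a step is ready once all of its dependencies have finished, and all ready steps may run in parallel. The helper name `execution_waves` below is hypothetical, for illustration only:

```python
# Conceptual sketch (not pipely's internals): resolve `depends_on`
# into execution "waves" whose members can run in parallel.
def execution_waves(steps):
    """steps maps step name -> list of step names it depends on."""
    done = set()
    waves = []
    while len(done) < len(steps):
        # A step is ready once all of its dependencies have finished.
        ready = [name for name, deps in steps.items()
                 if name not in done and all(d in done for d in deps)]
        if not ready:
            raise ValueError("cyclic or unsatisfiable dependencies")
        waves.append(sorted(ready))
        done.update(ready)
    return waves

waves = execution_waves({
    "a1_print": [],
    "a2_print": [],
    "final_print": ["a1_print", "a2_print"],
})
print(waves)  # [['a1_print', 'a2_print'], ['final_print']]
```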
1.1. Example
Let's create a `test.yaml` config file:
```yaml
steps:
  a1_print:
    exec: src/file1.py:firstA
  a2_print:
    exec: src/file1.py:secondA
  final_print:
    exec: src/file2.py:printDone
    depends_on:
      - a1_print
      - a2_print
```
The `depends_on` parameter sets the following order for pipely:

- first execute `a1_print` and `a2_print` in parallel,
- then execute `final_print`.
Let's look at the executable classes. To use pipely, each executable class should have a `__call__` method, as shown below (see `example/src/`):
```python
# example/src/file1.py
class firstA:
    def run(self):
        a = 32
        print(a)

    def __call__(self):  # call method
        self.run()


class secondA:
    def run(self):
        a = 12
        print(a)

    def __call__(self):  # call method
        self.run()
```
```python
# example/src/file2.py
class printDone:
    def run(self):
        print("Done.")

    def __call__(self):  # call method
        self.run()
```
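The `exec` entries have the form `path/to/file.py:ClassName`. As a conceptual sketch (not pipely's internals), such an entry can be loaded and invoked with `importlib`; the helper `run_step` and the module name `step_module` below are hypothetical, and the sketch writes its own example module to a temp directory so it is self-contained:

```python
# Conceptual sketch (not pipely's internals): how an entry like
# `exec: src/file1.py:firstA` can be loaded and its class invoked.
import importlib.util
import pathlib
import tempfile

# Write a tiny module to a temp directory so the sketch is self-contained.
module_path = pathlib.Path(tempfile.mkdtemp()) / "file1.py"
module_path.write_text(
    "class firstA:\n"
    "    def run(self):\n"
    "        print(32)\n"
    "    def __call__(self):\n"
    "        self.run()\n"
)

def run_step(exec_entry):
    # Split "path/to/file.py:ClassName" on the last colon.
    file_part, class_name = exec_entry.rsplit(":", 1)
    spec = importlib.util.spec_from_file_location("step_module", file_part)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    cls = getattr(module, class_name)
    cls()()  # instantiate the class and trigger its __call__ method

run_step(f"{module_path}:firstA")  # prints: 32
```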
To start pipely, run in the CLI:

```shell
python -m pipely from-pipeline test.yaml
```
1.2. Example with a shared dictionary
Let's create a `testContext.yaml` config file:
```yaml
steps:
  a_first:
    exec: src/file1_shared.py:firstA
  a_second:
    exec: src/file1_shared.py:secondA
  a_sum:
    exec: src/file2_shared.py:aSum
    depends_on:
      - a_first
      - a_second
  a_sum_print:
    exec: src/file3_shared.py:aSumPrint
    depends_on:
      - a_sum
```
The `depends_on` parameter sets the following order for pipely:

- execute `a_first` and `a_second` in parallel,
- then execute `a_sum`, which sums up both `a` values from the previous steps,
- finally execute `a_sum_print`, which prints the final result calculated in `a_sum`.
Let's look at the executable classes to understand how values are transferred between them (see the `example/src` folder):
```python
# example/src/file1_shared.py
class firstA:
    def run(self):
        a = 32
        self.result = a

    def __call__(self, context):  # include context
        self.run()
        context["a1"] = self.result  # save into the shared dictionary


class secondA:
    def run(self):
        a = 12
        self.result = a

    def __call__(self, context):  # include context
        self.run()
        context["a2"] = self.result  # save into the shared dictionary
```
Now we can use the previously saved values `a1` and `a2` in another class, as shown below:
```python
# example/src/file2_shared.py
class aSum:
    def run(self, context):  # include context
        a1 = context["a1"]  # extract from the shared dictionary
        a2 = context["a2"]  # extract from the shared dictionary
        self.result = a1 + a2

    def __call__(self, context):  # include context
        self.run(context)  # run the function
        context["aSum"] = self.result  # and save into the shared dictionary
```
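To see the data flow end to end, the shared-context classes can be chained by hand outside pipely, with a plain dict standing in for the shared dictionary (a sketch of the calling convention, not of pipely's scheduler):

```python
# The shared dictionary is just a dict passed to each __call__.
class firstA:
    def run(self):
        self.result = 32

    def __call__(self, context):
        self.run()
        context["a1"] = self.result


class secondA:
    def run(self):
        self.result = 12

    def __call__(self, context):
        self.run()
        context["a2"] = self.result


class aSum:
    def run(self, context):
        self.result = context["a1"] + context["a2"]

    def __call__(self, context):
        self.run(context)
        context["aSum"] = self.result


context = {}
for step in (firstA(), secondA(), aSum()):  # respects depends_on order
    step(context)
print(context["aSum"])  # 44
```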
Then trigger the pipeline from the CLI with the optional `--context-path` argument, which takes the path to a shared dictionary such as `example_context.json`:

```shell
python -m pipely from-pipeline testContext.yaml --context-path example_context.json
```
2. Imperative way
Pipely can also trigger a specific class from a specific `.py` file:

```shell
python -m pipely from-class <path/to/file.py>:<TestClass>
```
Below is an example of a command that triggers the `printDone` class from the `src/file4.py` file:

```shell
python3 -m pipely from-class src/file4.py:printDone
```
If your class needs to operate on a shared dictionary, the `from-class` command accepts an optional `--context-path` argument, which takes the path to a JSON file representing the shared dictionary:

```shell
python -m pipely from-class src/file4.py:printShared --context-path example_context.json
```
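The internals of `--context-path` are not documented here; a natural reading, sketched below under that assumption, is that the JSON file simply seeds the shared dictionary's initial contents. The body given for `printShared` is hypothetical, inferred only from the class name in the command above:

```python
# Sketch, ASSUMING --context-path seeds the shared dictionary from JSON
# (a plausible reading, not confirmed pipely internals).
import json
import pathlib
import tempfile

# Hypothetical seed file for the shared dictionary.
path = pathlib.Path(tempfile.mkdtemp()) / "example_context.json"
path.write_text(json.dumps({"a1": 32, "a2": 12}))

context = json.loads(path.read_text())  # becomes the shared dictionary

class printShared:  # hypothetical body for the class named in the command
    def __call__(self, context):
        print(context)

printShared()(context)  # prints: {'a1': 32, 'a2': 12}
```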