
Pipeline Component System (PCS)

A strange programming framework

The name is inspired by Entity Component System (ECS)

Why?

I often find myself disliking the programs I write, rewriting them to be better, only for them to still end up quite brittle. This is a programming framework to make code cleaner and hopefully more maintainable. Have I succeeded in my goal? I have been using this framework for some time now and it has noticeably improved my development speed and reduced my mental fatigue. So yes, it has helped me achieve my goal!

Introduction

I will first discuss the few simple elements which make up this framework, then connect them together, explaining the choices I made along the way. If you wish to see how this all ties together, look at examples/example.py.

Component

Think of the component as your global database. Every piece of persistent data (static or dynamic) is stored here. It is a dataclass, and it is the only dataclass (unless you want to nest them, of course). The reason for this design choice is that this way we ALWAYS know where the data lives. We do not have to guess which class owns what, unlike OOP soups.

The Component distinguishes between two kinds of data: config (constant / static / defined at the start, then read-only after initialization) and runtime (dynamic data which changes during execution). The config variables can only be of primitive types (a restriction which comes from omegaconf, which this project depends on).

from dataclasses import dataclass

@dataclass
class Config:  # Note: the name is not important
    i: int  # Only primitive types in the config class (whatever OmegaConf is capable of)
    f: float
    s: str
    result: float

@dataclass
class Dynamic:
    di: int  # The dynamic class can also take complex types

data = parse_arguments_cli(Config, Dynamic)  # parse_arguments_cli is provided by this package
data.seal()  # Makes the `Config` part of the component read-only

print(data.i)   # Prints the Config class' `i`
print(data.di)  # Prints the Dynamic class' `di`

We can print this data object, and we can also serialize it with pickle. Type annotations are necessary for the Config dataclass and recommended for the Dynamic class. They are enforced in the Config class, but not in Dynamic.
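As a quick sketch of what printing and pickling look like, using a plain dataclass as a stand-in for the component (the CLI parsing is not reproduced here):

```python
import pickle
from dataclasses import dataclass

@dataclass
class Data:  # stand-in for the component built by parse_arguments_cli
    i: int
    s: str

data = Data(i=1, s="hello")
print(data)  # dataclass repr: Data(i=1, s='hello')

restored = pickle.loads(pickle.dumps(data))  # pickle round-trip
print(restored == data)  # True: dataclasses compare by field values
```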

Systems

Systems are functions whose parameter names match the fields in the component. That's it. An example system may look like the following:

def print_add_system(i: int, f: float):  # Note: the variable names match those in the component exactly
    print("Add System:", i + f)


def result_add_system(i: int, f: float, result: float):
    result = result + i + f
    return {"result": result}


def result_add_system2(i: int, f: float, result: float):
    return {"result": result + i + f}  # Note: the key matches the variable names in the component exactly

Take note of the return value at the end of the last two systems. We will discuss this syntax in the Pipeline section.

Pipeline

A pipeline takes a component and a list of systems, then automatically passes the fields of the component to the systems and writes their results back to the component.

An example pipeline looks like this:

pipeline = Pipeline(
    component,
    [
        print_add_system,
        result_add_system,
        result_add_system2,
    ]
)
pipeline.execute()
pipeline.execute()  # Execute pipeline a second time

When a system returns a dictionary, the keys of the dict are interpreted as the names of the component fields to replace with the corresponding values. So the final two systems in the Systems examples will replace the result field.

Note that this saves us from passing parameters around by hand, as it is done automatically for us, which cleans up the code base tremendously: we get a concise pipeline definition, and when we call Pipeline.execute, the three functions are executed sequentially.
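To make the wiring concrete, here is a toy sketch of the dispatch idea (an illustrative re-implementation, not the library's actual code): inspect each system's parameter names, pull the matching fields off the component, call the system, and write any returned dict back.

```python
import inspect
from dataclasses import dataclass

class ToyPipeline:  # hypothetical stand-in for the real Pipeline class
    def __init__(self, component, systems):
        self.component = component
        self.systems = systems

    def execute(self):
        for system in self.systems:
            # Match parameter names to component fields
            params = inspect.signature(system).parameters
            kwargs = {name: getattr(self.component, name) for name in params}
            result = system(**kwargs)
            if result:  # systems may return {field_name: new_value}
                for key, value in result.items():
                    setattr(self.component, key, value)

@dataclass
class Data:
    i: int
    f: float
    result: float

def result_add_system(i: int, f: float, result: float):
    return {"result": result + i + f}

data = Data(i=1, f=2.0, result=0.0)
ToyPipeline(data, [result_add_system]).execute()
print(data.result)  # 3.0
```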

Other handy tools

parse_arguments_cli will read your argv using argparse and give you a component object ready to use. So you may run your file like so: file.py --args-files="file1.yaml,file2.yaml" --rest a=1 -r b=2. Later files overwrite entries from earlier ones, and --rest/-r always takes precedence over the files, with each --rest taking precedence over the previous one.

  • --args-files can be shortened to -f
  • --rest can be shortened to -r
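The precedence rules described above can be illustrated with a plain dict merge (a hypothetical helper, not the library's implementation): sources are applied in order, so later ones simply win.

```python
def merge_settings(*sources: dict) -> dict:
    """Hypothetical sketch: later sources overwrite earlier ones."""
    merged = {}
    for src in sources:
        merged.update(src)
    return merged

file1 = {"a": 1, "b": 2}  # first file in --args-files
file2 = {"b": 3}          # second file overrides its b
rest1 = {"a": 10}         # --rest overrides the files
rest2 = {"a": 20}         # a later --rest overrides the earlier one
print(merge_settings(file1, file2, rest1, rest2))  # {'a': 20, 'b': 3}
```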

Some pattern ideas

  • Nested pipelines
  • Pipeline in loops

Example

Look at and run example.py for a fuller usage example.

