
A library for building runnable asynchronous trees

Project description

A simple library for building runnable async trees. A tree is a web of interconnected Nodes, each containing code to be run. A node can only start executing once all its parents have finished running.
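The scheduling rule above can be sketched with plain asyncio (this is an illustration of the concept, not the grafo API):

```python
import asyncio

# Illustration: a "child" coroutine starts only after every "parent"
# coroutine has finished, regardless of how long each parent takes.
async def run_after(parents, coroutine):
    await asyncio.gather(*parents)  # wait for all parents to complete
    return await coroutine()

async def main():
    order = []

    async def parent_a():
        await asyncio.sleep(0.01)
        order.append("a")

    async def parent_b():
        order.append("b")

    async def child():
        order.append("child")

    await run_after([parent_a(), parent_b()], child)
    return order
```

Here the child always lands last in `order`, even though `parent_a` is slower than `parent_b`.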

Features

  • The number of workers is managed automatically, although you can parametrize it
  • Trees can have any shape - including multiple roots - and can be dynamically altered during their runtime
  • State can be passed between nodes manually (as lambda functions) or via forwarding
  • Yielding coroutines produce Chunk objects that wrap intermediate results with the node's UUID

Installation

  • pip install grafo to install in your environment
  • pytest to run the tests; add the -s flag to show print output

Use

Basic tree execution

from grafo import Node, TreeExecutor  # import path may vary by version

async def my_coroutine():
    return "result"

root_node = Node(coroutine=my_coroutine, uuid="root")
child_node = Node(coroutine=my_coroutine, uuid="child")

await root_node.connect(child_node)

executor = TreeExecutor(uuid="My Tree", roots=[root_node])
result = await executor.run()

Yielding intermediate results

async def yielding_coroutine():
    for i in range(3):
        yield f"progress {i}"
    yield "completed"

node = Node(coroutine=yielding_coroutine)
executor = TreeExecutor(roots=[node])

async for item in executor.yielding():
    if isinstance(item, Node):
        print(f"Node {item.uuid} completed")
    else:  # Chunk
        print(f"Intermediate: {item.output}")
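The Chunk idea can be sketched in plain Python (the `Chunk` class below is an assumption modeled on the `uuid` and `output` attributes used above, not grafo's implementation):

```python
import asyncio
from dataclasses import dataclass

# Sketch: wrap every value an async generator yields together with the
# UUID of the node that produced it.
@dataclass
class Chunk:
    uuid: str
    output: object

async def collect(uuid, agen):
    """Collect an async generator's yields as Chunk objects."""
    return [Chunk(uuid, value) async for value in agen]

async def demo():
    async def yielding():
        for i in range(3):
            yield f"progress {i}"
        yield "completed"

    return await collect("node-1", yielding())
```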

Evaluating coroutine kwargs during runtime (manual forwarding)

node = Node(
    coroutine=my_coroutine,
    kwargs=dict(
        my_arg=lambda: get_dynamic_value()
    )
)
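The reason a lambda works here is ordinary Python late binding: the callable is invoked when the node runs, not when the kwargs are built. A minimal stand-alone illustration (no grafo involved):

```python
# The lambda defers evaluation: changing state *after* the kwargs are
# built is still visible when the callable is finally invoked.
state = {"value": "old"}

def get_dynamic_value():
    return state["value"]

kwargs = {"my_arg": lambda: get_dynamic_value()}

state["value"] = "new"  # mutated after the kwargs were defined

# Resolving the lambdas at "run time" sees the updated value.
resolved = {k: v() for k, v in kwargs.items()}
```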

Forwarding output between nodes (automatic forwarding)

async def producer():
    return "data"

async def consumer(data: str):
    return f"processed_{data}"

node_a = Node(coroutine=producer, uuid="producer")
node_b = Node(coroutine=consumer, uuid="consumer")

await node_a.connect(node_b, forward="data")
# node_b will receive node_a's output as the 'data' argument
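The forwarding mechanism can be sketched in plain asyncio (a sketch of the idea, not grafo's internals): the parent's return value is injected into the child under the keyword name given at connect time.

```python
import asyncio

# Sketch: run the parent, then pass its output to the child as the
# keyword argument named by `keyword`.
async def forward(parent, child, keyword):
    output = await parent()
    return await child(**{keyword: output})

async def demo():
    async def producer():
        return "data"

    async def consumer(data: str):
        return f"processed_{data}"

    return await forward(producer, consumer, "data")
```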

Type validation with generics

node = Node[str](coroutine=my_string_coroutine)
# The node will validate that the coroutine returns a string
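One way such runtime validation could work (an assumption about the mechanism, not grafo's actual code) is to remember the type argument via `__class_getitem__` and check the coroutine's result with `isinstance`:

```python
import asyncio

# Sketch: a Node-like runner that validates its coroutine's return type.
class ValidatedRunner:
    def __class_getitem__(cls, expected):
        # ValidatedRunner[str] returns a factory bound to `expected`.
        def factory(coroutine):
            runner = cls(coroutine)
            runner.expected = expected
            return runner
        return factory

    def __init__(self, coroutine):
        self.coroutine = coroutine
        self.expected = object  # default: accept anything

    async def run(self):
        result = await self.coroutine()
        if not isinstance(result, self.expected):
            raise TypeError(
                f"expected {self.expected.__name__}, got {type(result).__name__}"
            )
        return result

async def demo():
    async def my_string_coroutine():
        return "hello"

    node = ValidatedRunner[str](my_string_coroutine)
    return await node.run()
```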

Documentation

Full documentation is available at https://paulomtts.github.io/grafo/

To build the documentation locally:

pip install mkdocs mkdocs-material "mkdocstrings[python]"
mkdocs serve

Then visit http://localhost:8000

Developer's Zen

  1. Follow established nomenclature: a Node is a Node.
  2. Syntax sugar is sweet in moderation.
  3. Give the programmer granular control.

