
A library for building runnable asynchronous trees

Project description

A simple library for building runnable async trees. A tree is a web of interconnected Nodes, each containing code to be run. A node can only start executing once all its parents have finished running.
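The scheduling rule above can be illustrated with a minimal asyncio-only sketch (this is not grafo's implementation, just the core idea): each node's task awaits all of its parents' tasks before running its own coroutine.

```python
import asyncio

async def run_node(name, coroutine, parent_tasks, order):
    # A node may only start once all of its parent tasks have finished.
    if parent_tasks:
        await asyncio.gather(*parent_tasks)
    order.append(name)
    return await coroutine()

async def work():
    return "result"

async def main():
    order = []
    # root -> child: the child's task waits on the root's task
    root = asyncio.create_task(run_node("root", work, [], order))
    child = asyncio.create_task(run_node("child", work, [root], order))
    await asyncio.gather(root, child)
    return order

order = asyncio.run(main())
print(order)  # ['root', 'child'] - the child never runs before its parent
```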

Features

  • The number of workers is managed automatically, though you can parametrize it
  • Trees can have any shape - including multiple roots - and can be dynamically altered during their runtime
  • State can be passed between nodes manually (via lambda functions) or by forwarding
  • Yielding coroutines produce Chunk objects that wrap intermediate results with the node's UUID

Installation

  • pip install grafo to install the library in your environment
  • pytest to run the tests; add the -s flag to let tests print to stdout

Use

Basic tree execution

from grafo import Node, TreeExecutor

async def my_coroutine():
    return "result"

root_node = Node(coroutine=my_coroutine, uuid="root")
child_node = Node(coroutine=my_coroutine, uuid="child")

await root_node.connect(child_node)

executor = TreeExecutor(uuid="My Tree", roots=[root_node])
result = await executor.run()

Yielding intermediate results

async def yielding_coroutine():
    for i in range(3):
        yield f"progress {i}"
    yield "completed"

node = Node(coroutine=yielding_coroutine)
executor = TreeExecutor(roots=[node])

async for item in executor.yielding():
    if isinstance(item, Node):
        print(f"Node {item.uuid} completed")
    else:  # Chunk
        print(f"Intermediate: {item.output}")

Evaluating coroutine kwargs at runtime (manual forwarding)

node = Node(
    coroutine=my_coroutine,
    kwargs=dict(
        my_arg=lambda: get_dynamic_value()
    )
)
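What lazy kwargs buy you can be sketched in plain Python (a hedged illustration of the pattern, not grafo's internals; `resolve_kwargs` is a hypothetical helper): callables in the kwargs dict are only evaluated when the coroutine actually runs, so they see state as it is at execution time.

```python
import asyncio

def resolve_kwargs(kwargs):
    # Evaluate zero-argument callables at call time; pass other values through.
    return {k: v() if callable(v) else v for k, v in kwargs.items()}

async def my_coroutine(my_arg):
    return f"got {my_arg}"

state = {"value": 1}
kwargs = {"my_arg": lambda: state["value"]}

async def main():
    state["value"] = 42  # changed after the node was defined, before it runs
    return await my_coroutine(**resolve_kwargs(kwargs))

result = asyncio.run(main())
print(result)  # "got 42" - the lambda picked up the updated value
```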

Forwarding output between nodes (automatic forwarding)

async def producer():
    return "data"

async def consumer(data: str):
    return f"processed_{data}"

node_a = Node(coroutine=producer, uuid="producer")
node_b = Node(coroutine=consumer, uuid="consumer")

await node_a.connect(node_b, forward="data")
# node_b will receive node_a's output as the 'data' argument
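Conceptually, forwarding just maps the parent's return value onto a named parameter of the child. A plain-asyncio sketch of that idea (assumed for illustration, not grafo's code):

```python
import asyncio

async def producer():
    return "data"

async def consumer(data: str):
    return f"processed_{data}"

async def run_edge(parent, child, forward: str):
    # Run the parent, then inject its output as the child's `forward` kwarg.
    output = await parent()
    return await child(**{forward: output})

forwarded = asyncio.run(run_edge(producer, consumer, forward="data"))
print(forwarded)  # "processed_data"
```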

Type validation with generics

node = Node[str](coroutine=my_string_coroutine)
# The node will validate that the coroutine returns a string
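Runtime validation against a generic parameter can be sketched like this in plain Python (an illustration of the pattern with a hypothetical `TypedNode`, not grafo's actual Node):

```python
import asyncio
from typing import Generic, TypeVar, get_args

T = TypeVar("T")

class TypedNode(Generic[T]):
    def __init__(self, coroutine):
        self.coroutine = coroutine

    async def run(self):
        result = await self.coroutine()
        # When built as TypedNode[str](...), CPython records the alias on
        # __orig_class__, which lets us recover `str` here at runtime.
        orig = getattr(self, "__orig_class__", None)
        expected = get_args(orig) if orig else ()
        if expected and not isinstance(result, expected[0]):
            raise TypeError(
                f"expected {expected[0].__name__}, got {type(result).__name__}"
            )
        return result

async def my_string_coroutine():
    return "hello"

async def my_int_coroutine():
    return 7

ok = asyncio.run(TypedNode[str](coroutine=my_string_coroutine).run())  # passes

try:
    asyncio.run(TypedNode[str](coroutine=my_int_coroutine).run())
    raised = False
except TypeError:
    raised = True  # int result rejected by the str-typed node
```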

Documentation

Full documentation is available at https://paulomtts.github.io/grafo/

To build the documentation locally:

pip install mkdocs mkdocs-material "mkdocstrings[python]"
mkdocs serve

Then visit http://localhost:8000

Developer's Zen

  1. Follow established nomenclature: a Node is a Node.
  2. Syntax sugar is sweet in moderation.
  3. Give the programmer granular control.
