
A library for building runnable asynchronous trees

Project description

A simple library for building runnable async trees. A tree is a web of interconnected Nodes, each containing code to be run. A node can only start executing once all its parents have finished running.
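The parent-completion rule can be sketched in plain asyncio. This illustrates only the scheduling idea — it is not grafo's implementation, and all names here are made up for the sketch:

```python
import asyncio

async def run_node(name, parents, done_events, results):
    # Wait for every parent to signal completion before starting.
    await asyncio.gather(*(done_events[p].wait() for p in parents))
    results.append(name)     # "execute" the node
    done_events[name].set()  # signal children that this node is done

async def main():
    # A diamond: root -> (a, b) -> leaf
    nodes = {"root": [], "a": ["root"], "b": ["root"], "leaf": ["a", "b"]}
    done = {name: asyncio.Event() for name in nodes}
    results = []
    await asyncio.gather(*(run_node(n, ps, done, results) for n, ps in nodes.items()))
    return results

order = asyncio.run(main())
print(order)  # "root" is always first and "leaf" always last; a/b may swap
```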

Features

  • The number of workers is managed automatically, though you can parametrize it
  • Trees can have any shape - including multiple roots - and can be altered dynamically at runtime
  • State can be passed between nodes manually (via lambda functions) or through forwarding
  • Yielding coroutines produce Chunk objects that wrap intermediate results with the node's UUID

Installation

  • pip install grafo installs the package into your environment
  • pytest runs the test suite; add the -s flag to show print output

Use

Basic tree execution

from grafo import Node, TreeExecutor  # import path assumed from the package name

async def my_coroutine():
    return "result"

root_node = Node(coroutine=my_coroutine, uuid="root")
child_node = Node(coroutine=my_coroutine, uuid="child")

await root_node.connect(child_node)

executor = TreeExecutor(uuid="My Tree", roots=[root_node])
result = await executor.run()

Yielding intermediate results

async def yielding_coroutine():
    for i in range(3):
        yield f"progress {i}"
    yield "completed"

node = Node(coroutine=yielding_coroutine)
executor = TreeExecutor(roots=[node])

async for item in executor.yielding():
    if isinstance(item, Node):
        print(f"Node {item.uuid} completed")
    else:  # Chunk
        print(f"Intermediate: {item.output}")

Evaluating coroutine kwargs at runtime (manual forwarding)

node = Node(
    coroutine=my_coroutine,
    kwargs=dict(
        my_arg=lambda: get_dynamic_value()
    )
)
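The idea behind callable kwargs — evaluation deferred until the node actually runs — can be illustrated in plain Python. This is a sketch of the pattern, not grafo's code; resolve_kwargs and greet are hypothetical names:

```python
import asyncio

async def greet(name: str):
    return f"hello {name}"

def resolve_kwargs(kwargs):
    # Call any zero-argument callables at execution time, so values are
    # computed when the node runs, not when it is defined.
    return {k: v() if callable(v) else v for k, v in kwargs.items()}

current_user = "alice"
kwargs = {"name": lambda: current_user}  # value captured lazily

current_user = "bob"  # state changes after the node was "defined"
result = asyncio.run(greet(**resolve_kwargs(kwargs)))
print(result)  # hello bob -- the lambda saw the latest value
```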

Forwarding output between nodes (automatic forwarding)

async def producer():
    return "data"

async def consumer(data: str):
    return f"processed_{data}"

node_a = Node(coroutine=producer, uuid="producer")
node_b = Node(coroutine=consumer, uuid="consumer")

await node_a.connect(node_b, forward="data")
# node_b will receive node_a's output as the 'data' argument
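Conceptually, forwarding runs the parent and injects its return value into the child's coroutine under the forwarded argument name. A rough stand-alone illustration of that idea (not grafo's internals; run_edge is a made-up helper):

```python
import asyncio

async def producer():
    return "data"

async def consumer(data: str):
    return f"processed_{data}"

async def run_edge(parent, child, forward):
    # Run the parent, then pass its output as the named keyword argument.
    output = await parent()
    return await child(**{forward: output})

result = asyncio.run(run_edge(producer, consumer, forward="data"))
print(result)  # processed_data
```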

Type validation with generics

node = Node[str](coroutine=my_string_coroutine)
# The node will validate that the coroutine returns a string
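Runtime return-type validation through a subscripted class can be sketched like this. It shows the general pattern via `__class_getitem__`, not grafo's actual Node; TypedNode is a hypothetical stand-in:

```python
import asyncio

class TypedNode:
    _expected = None

    def __class_getitem__(cls, item):
        # TypedNode[str] returns a subclass that remembers the expected type.
        return type(cls.__name__, (cls,), {"_expected": item})

    def __init__(self, coroutine):
        self.coroutine = coroutine

    async def run(self):
        result = await self.coroutine()
        if self._expected is not None and not isinstance(result, self._expected):
            raise TypeError(
                f"expected {self._expected.__name__}, got {type(result).__name__}"
            )
        return result

async def str_coro():
    return "ok"

async def bad_coro():
    return 123

out = asyncio.run(TypedNode[str](coroutine=str_coro).run())
print(out)  # ok

try:
    asyncio.run(TypedNode[str](coroutine=bad_coro).run())
    raised = False
except TypeError as exc:
    raised = True
    print(exc)  # expected str, got int
```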

Documentation

Full documentation is available at https://paulomtts.github.io/grafo/

To build the documentation locally:

pip install mkdocs mkdocs-material "mkdocstrings[python]"
mkdocs serve

Then visit http://localhost:8000

Developer's Zen

  1. Follow established nomenclature: a Node is a Node.
  2. Syntax sugar is sweet in moderation.
  3. Give the programmer granular control.

Project details


Download files

Download the file for your platform.

Source Distribution

grafo-0.3.4.tar.gz (19.1 kB)

Uploaded Source

Built Distribution


grafo-0.3.4-py3-none-any.whl (12.7 kB)

Uploaded Python 3

File details

Details for the file grafo-0.3.4.tar.gz.

File metadata

  • Download URL: grafo-0.3.4.tar.gz
  • Upload date:
  • Size: 19.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.9.25

File hashes

Hashes for grafo-0.3.4.tar.gz

  • SHA256: 7a2f24fcf67d44a561f0528afddf4222cdf644dbb1f51628cdc905921a9a3687
  • MD5: 6d9ff8c2b5659db9ccebe05ad2add9fb
  • BLAKE2b-256: 4efb4f5169b122540269e582160bb0f38aa0c168919757306678c37e195a3ba1


File details

Details for the file grafo-0.3.4-py3-none-any.whl.

File metadata

  • Download URL: grafo-0.3.4-py3-none-any.whl
  • Upload date:
  • Size: 12.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.9.25

File hashes

Hashes for grafo-0.3.4-py3-none-any.whl

  • SHA256: ad1b960de81a71fafabeeae863888a4fb336876208b341b91303c5467eb1b1e8
  • MD5: 4083f728a618b929533b6391fd5c3c06
  • BLAKE2b-256: 115d0e1ac85b176d06bd343419545f32155faca9aaef662d68e94033dee258d5

