# Graphon
Graphon is a Python graph execution engine for agentic AI workflows.
The repository is still evolving, but it already contains a working execution engine, built-in workflow nodes, model runtime abstractions, integration protocols, and a runnable end-to-end example.
## Highlights

- Queue-based `GraphEngine` orchestration with event-driven execution
- Graph parsing, validation, and fluent graph building
- Shared runtime state, variable pool, and workflow execution domain models
- Built-in node implementations for common workflow patterns
- Pluggable model runtime interfaces, including a local `SlimRuntime`
- HTTP, file, tool, and human-input integration protocols
- Extensible engine layers and external command channels
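To make "queue-based, event-driven execution" concrete, here is a minimal, self-contained sketch of the general pattern. It is illustrative only: the function, event shape, and node names below are stand-ins, not Graphon's actual API.

```python
from collections import deque

def run_graph(graph, handlers, start="start"):
    """Walk `graph` breadth-first from `start`, yielding an event per node step.

    Conceptual sketch of a queue-based engine: nodes are pulled from a queue,
    executed, and their successors enqueued, while events stream to the caller.
    """
    queue = deque([start])
    seen = set()
    while queue:
        node = queue.popleft()
        if node in seen:
            continue
        seen.add(node)
        yield {"type": "node_started", "node": node}
        handlers[node]()  # execute the node's work
        yield {"type": "node_finished", "node": node}
        queue.extend(graph.get(node, []))

# A tiny start -> llm -> output graph, mirroring the bundled example's shape.
graph = {"start": ["llm"], "llm": ["output"], "output": []}
outputs = []
handlers = {
    "start": lambda: outputs.append("seeded inputs"),
    "llm": lambda: outputs.append("model call"),
    "output": lambda: outputs.append("final answer"),
}
events = list(run_graph(graph, handlers))
```

Because execution is expressed as a generator of events, callers can react to progress (logging, streaming, cancellation) without the engine knowing about any of those concerns.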
Repository modules currently cover node types such as `start`, `end`, `answer`,
`llm`, `if-else`, `code`, `template-transform`, `question-classifier`,
`http-request`, `tool`, `variable-aggregator`, `variable-assigner`, `loop`,
`iteration`, `parameter-extractor`, `document-extractor`, `list-operator`, and
`human-input`.
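As a rough intuition for two of these node types, the sketch below shows the *kind* of work a `template-transform` and an `if-else` node perform against a shared variable pool. The helper functions and the dict-based pool are stand-ins, not Graphon's actual node or runtime classes.

```python
# Illustrative stand-ins for two built-in node behaviors; the real
# implementations live under src/graphon/nodes and src/graphon/runtime.

variable_pool = {"user": "Ada", "score": 87}

def template_transform(template: str, pool: dict) -> str:
    """Render a template against the variable pool, like a template-transform node."""
    return template.format(**pool)

def if_else(condition, pool: dict) -> str:
    """Select a branch based on a condition over the pool, like an if-else node."""
    return "true_branch" if condition(pool) else "false_branch"

greeting = template_transform("Hello {user}, your score is {score}.", variable_pool)
branch = if_else(lambda p: p["score"] >= 50, variable_pool)
```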
## Quick Start
Graphon is currently easiest to evaluate from a source checkout.
### Requirements

- Python 3.12 or 3.13
- `uv`
- `make`

Python 3.14 is currently unsupported because `unstructured`, which backs part
of the document extraction stack, declares `Requires-Python: <3.14`.
### Set up the repository

```shell
make dev
source .venv/bin/activate
make test
```

`make dev` installs the project, syncs development dependencies, and sets up
`prek` Git hooks.
### Run the Example Workflow

The repository includes a minimal runnable example at
`examples/graphon_openai_slim`. It builds and executes this workflow:
`start -> llm -> output`.

To run it:

```shell
make dev
source .venv/bin/activate
cd examples/graphon_openai_slim
cp .env.example .env
# Fill in the required values in .env before the next step.
python3 workflow.py "Explain Graphon in one short sentence."
```
The example currently expects:

- an `OPENAI_API_KEY`
- a `SLIM_PLUGIN_ID`
- a local `dify-plugin-daemon-slim` setup or equivalent Slim runtime

For the exact environment variables and runtime notes, see
`examples/graphon_openai_slim/README.md`.
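Since the example fails without those variables, it can help to check for them before launching. The helper below is a hedged sketch, not part of the example's actual code; `workflow.py` remains responsible for loading `.env`.

```python
import os

# Required by the bundled example, per its README; fail fast if unset.
REQUIRED = ("OPENAI_API_KEY", "SLIM_PLUGIN_ID")

def missing_vars(env=os.environ) -> list:
    """Return the required variables that are absent or empty in `env`."""
    return [name for name in REQUIRED if not env.get(name)]

# With an empty environment, both variables are reported missing:
print(missing_vars({}))  # ['OPENAI_API_KEY', 'SLIM_PLUGIN_ID']
```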
## How Graphon Fits Together

At a high level, Graphon usage looks like this:

- Build or load a graph and instantiate nodes into a `Graph`.
- Prepare `GraphRuntimeState` and seed the `VariablePool`.
- Configure model, file, HTTP, tool, or human-input adapters as needed.
- Run `GraphEngine` and consume emitted graph events.
- Read final outputs from runtime state.
The bundled example follows exactly that path. The execution loop centers on
`GraphEngine.run()`:

```python
engine = GraphEngine(
    workflow_id="example-start-llm-output",
    graph=graph,
    graph_runtime_state=graph_runtime_state,
    command_channel=InMemoryChannel(),
)
for event in engine.run():
    ...
```

See `examples/graphon_openai_slim/workflow.py` for the full example, including
`SlimRuntime`, `SlimPreparedLLM`, graph construction, input seeding, and
streamed output handling.
## Project Layout

- `src/graphon/graph`: graph structures, parsing, validation, and builders
- `src/graphon/graph_engine`: orchestration, workers, command channels, and layers
- `src/graphon/runtime`: runtime state, read-only wrappers, and variable pool
- `src/graphon/nodes`: built-in workflow node implementations
- `src/graphon/model_runtime`: provider/model abstractions and Slim runtime
- `src/graphon/graph_events`: event models emitted during execution
- `src/graphon/http`: HTTP client abstractions and default implementation
- `src/graphon/file`: workflow file models and file runtime helpers
- `src/graphon/protocols`: public protocol re-exports for integrations
- `examples/`: runnable examples
- `tests/`: unit and integration-style coverage
## Internal Docs

- `CONTRIBUTING.md`: contributor workflow, CI, and commit/PR rules
- `examples/graphon_openai_slim/README.md`: runnable example setup
- `src/graphon/model_runtime/README.md`: model runtime overview
- `src/graphon/graph_engine/layers/README.md`: engine layer extension points
- `src/graphon/graph_engine/command_channels/README.md`: local and distributed command channels
## Development

Contributor setup, tooling details, CLA notes, and commit/PR conventions live
in `CONTRIBUTING.md`.

CI currently validates commit messages, pull request titles, formatting, lint,
and tests on Python 3.12 and 3.13. Python 3.14 is excluded because
`unstructured` does not yet support it.
## License

Apache-2.0. See `LICENSE`.