
Graphon

Graphon is a Python graph execution engine for agentic AI workflows.

The repository is still evolving, but it already contains a working execution engine, built-in workflow nodes, model runtime abstractions, integration protocols, and a runnable end-to-end example.

Highlights

  • Queue-based GraphEngine orchestration with event-driven execution
  • Graph parsing, validation, and fluent graph building
  • Shared runtime state, variable pool, and workflow execution domain models
  • Built-in node implementations for common workflow patterns
  • Pluggable model runtime interfaces, including a local SlimRuntime
  • HTTP, file, tool, and human-input integration protocols
  • Extensible engine layers and external command channels
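To illustrate the queue-based, event-driven execution model described above, here is a minimal plain-Python sketch. It is a stand-in, not Graphon's actual engine: the `NodeEvent` class, `run_graph` function, and edge-dict representation are all hypothetical simplifications.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class NodeEvent:
    node_id: str
    status: str  # "started" or "succeeded"

def run_graph(edges, start_node):
    """Walk the graph breadth-first over a ready queue,
    yielding an event stream as each node executes."""
    queue = deque([start_node])
    seen = set()
    while queue:
        node = queue.popleft()
        if node in seen:
            continue
        seen.add(node)
        yield NodeEvent(node, "started")
        # A real engine would run the node's work here.
        yield NodeEvent(node, "succeeded")
        queue.extend(edges.get(node, []))

edges = {"start": ["llm"], "llm": ["output"]}
events = list(run_graph(edges, "start"))
```

The real engine layers workers, command channels, and richer event types on top of this basic queue-and-events shape.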

Repository modules currently cover node types such as start, end, answer, llm, if-else, code, template-transform, question-classifier, http-request, tool, variable-aggregator, variable-assigner, loop, iteration, parameter-extractor, document-extractor, list-operator, and human-input.
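As a rough sketch of how such a node-type catalog can be organized (this is an illustrative registry pattern, not Graphon's actual node classes, which live in src/graphon/nodes):

```python
# Hypothetical registry mapping node-type strings to handler functions.
NODE_HANDLERS = {}

def node(node_type):
    """Decorator registering a handler under its node-type name."""
    def register(fn):
        NODE_HANDLERS[node_type] = fn
        return fn
    return register

@node("template-transform")
def template_transform(config, variables):
    # Render a template string against the current variables.
    return config["template"].format(**variables)

@node("if-else")
def if_else(config, variables):
    # Pick a branch based on the truthiness of one variable.
    return config["then"] if variables.get(config["key"]) else config["else"]

result = NODE_HANDLERS["template-transform"](
    {"template": "Hello, {name}!"}, {"name": "Graphon"}
)
```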

Quick Start

Graphon is currently easiest to evaluate from a source checkout.

Requirements

  • Python 3.12 or 3.13
  • uv
  • make

Python 3.14 is currently unsupported because unstructured, which backs part of the document-extraction stack, declares Requires-Python: <3.14.

Set up the repository

make dev
source .venv/bin/activate
make test

make dev installs the project, syncs development dependencies, and sets up prek Git hooks.

Run the Example Workflow

The repository includes a minimal runnable example at examples/graphon_openai_slim.

It builds and executes this workflow:

start -> llm -> output

To run it:

make dev
source .venv/bin/activate
cd examples/graphon_openai_slim
cp .env.example .env
python3 workflow.py "Explain Graphon in one short sentence."

Before running the example, fill in the required values in .env.

The example currently expects:

  • an OPENAI_API_KEY
  • a SLIM_PLUGIN_ID
  • a local dify-plugin-daemon-slim setup or equivalent Slim runtime
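A filled-in .env might look roughly like the sketch below; the placeholder values are illustrative, and the authoritative variable list is the one in the example's README.

```
OPENAI_API_KEY=sk-...       # your OpenAI API key
SLIM_PLUGIN_ID=<plugin-id>  # ID of the Slim plugin to invoke
```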

For the exact environment variables and runtime notes, see examples/graphon_openai_slim/README.md.

How Graphon Fits Together

At a high level, Graphon usage looks like this:

  1. Build or load a graph and instantiate nodes into a Graph.
  2. Prepare GraphRuntimeState and seed the VariablePool.
  3. Configure model, file, HTTP, tool, or human-input adapters as needed.
  4. Run GraphEngine and consume emitted graph events.
  5. Read final outputs from runtime state.
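The five steps can be sketched with plain-Python stand-ins (a dict-backed variable pool and bare functions instead of Graphon's real Graph, GraphRuntimeState, and node classes, so every name here is a hypothetical simplification):

```python
def run(graph, state):
    """Step 4: run each node in order, storing its output in the
    variable pool and yielding an event per node."""
    for node in graph:
        output = node(state["variables"])
        state["variables"][node.__name__] = output
        yield {"node": node.__name__, "output": output}

def start(variables):                     # step 1: nodes of the graph
    return variables["query"]

def llm(variables):
    return f"echo: {variables['start']}"  # placeholder for a model call

def output(variables):
    return variables["llm"]

state = {"variables": {"query": "hi"}}    # step 2: seed the variable pool
events = list(run([start, llm, output], state))
final = state["variables"]["output"]      # step 5: read final outputs
```

Step 3 (configuring adapters) has no analogue in this toy version; in Graphon it is where model, file, HTTP, tool, and human-input integrations are wired in.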

The bundled example follows exactly that path. The execution loop centers on GraphEngine.run():

engine = GraphEngine(
    workflow_id="example-start-llm-output",
    graph=graph,
    graph_runtime_state=graph_runtime_state,
    command_channel=InMemoryChannel(),
)

for event in engine.run():
    ...

See examples/graphon_openai_slim/workflow.py for the full example, including SlimRuntime, SlimPreparedLLM, graph construction, input seeding, and streamed output handling.

Project Layout

  • src/graphon/graph: graph structures, parsing, validation, and builders
  • src/graphon/graph_engine: orchestration, workers, command channels, and layers
  • src/graphon/runtime: runtime state, read-only wrappers, and variable pool
  • src/graphon/nodes: built-in workflow node implementations
  • src/graphon/model_runtime: provider/model abstractions and Slim runtime
  • src/graphon/graph_events: event models emitted during execution
  • src/graphon/http: HTTP client abstractions and default implementation
  • src/graphon/file: workflow file models and file runtime helpers
  • src/graphon/protocols: public protocol re-exports for integrations
  • examples/: runnable examples
  • tests/: unit and integration-style coverage


Development

Contributor setup, tooling details, CLA notes, and commit/PR conventions live in CONTRIBUTING.md.

CI validates commit messages, pull request titles, formatting, lint, and tests on Python 3.12 and 3.13; Python 3.14 is excluded until unstructured supports it.

License

Apache-2.0. See LICENSE.
