NeoGraph

Write Python. Get a production LLM graph. A declarative graph compiler that turns @node-decorated functions into LangGraph.
Docs & guides: neograph.pro — full documentation site with tutorials, API reference, and side-by-side LangGraph comparisons.
```shell
# uv (recommended)
uv add neograph

# pip
pip install neograph
```
Define your LLM pipeline as Python functions. The framework infers the topology, validates types at assembly time, and compiles to LangGraph with checkpointing, observability, and tool orchestration. No DSL. No YAML. No add_node / add_edge.
A function is a node. A parameter name is an edge. An if is a branch.
Functions are nodes
```python
import sys

from neograph import node, construct_from_module, compile, run

@node(output=Claims, prompt='rw/decompose', model='reason')
def decompose(topic: RawText) -> Claims: ...

@node(output=Classified, prompt='rw/classify', model='fast')
def classify(decompose: Claims) -> Classified: ...

@node(output=Report)
def report(classify: Classified) -> Report:
    return Report(summary=f"{len(classify.items)} claims processed")

pipeline = construct_from_module(sys.modules[__name__])
graph = compile(pipeline)
result = run(graph, input={'node_id': 'doc-001'})
```
In `classify(decompose: Claims)`, the parameter name IS the dependency. Rename a function, and everything downstream breaks at import time. Fan-in is just more parameters: `def report(claims, scores, verified)`.

Mode is inferred: `prompt=` plus `model=` means an LLM call; neither means the function body runs.
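The name-based wiring described above can be illustrated with the standard library alone. This is a minimal sketch of the idea under stated assumptions, not NeoGraph's actual implementation: match each function's parameter names against the set of known node names to recover the edges.

```python
import inspect

def infer_edges(funcs):
    """Hypothetical sketch: a parameter named after another node is an edge."""
    names = {f.__name__ for f in funcs}
    edges = []
    for f in funcs:
        for param in inspect.signature(f).parameters:
            if param in names:  # parameter name matches a node name
                edges.append((param, f.__name__))
    return edges

# Stub node functions mirroring the example above.
def decompose(topic): ...
def classify(decompose): ...
def report(classify): ...

print(infer_edges([decompose, classify, report]))
# → [('decompose', 'classify'), ('classify', 'report')]
```

Because the lookup keys are function names, renaming `decompose` immediately severs the `classify` edge, which is why such mistakes surface at assembly time rather than at run time.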
if is a branch
```python
from neograph import ForwardConstruct, Node, compile

class Analysis(ForwardConstruct):
    check = Node(output=CheckResult, prompt='check', model='fast')
    deep = Node(output=Result, prompt='deep-analysis', model='reason')
    shallow = Node(output=Result, prompt='quick-scan', model='fast')

    def forward(self, topic):
        checked = self.check(topic)
        if checked.confidence > 0.8:
            return self.shallow(checked)
        else:
            return self.deep(checked)

graph = compile(Analysis())
```
The if compiles to a conditional edge. for compiles to fan-out. Python is the graph language. Your type checker sees everything. Your debugger works.
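One way to see why plain Python control flow is enough as a graph language: a `forward()` body can be exercised directly with recording stubs, and the branch taken is exactly the conditional edge the compiler would emit. A self-contained sketch with stand-in classes (hypothetical names, not the real compiler):

```python
class StubNode:
    """Stand-in node: counts calls and returns a canned result."""
    def __init__(self, name, result):
        self.name, self.result, self.calls = name, result, 0
    def __call__(self, *args):
        self.calls += 1
        return self.result

class Checked:
    def __init__(self, confidence):
        self.confidence = confidence

def forward(check, shallow, deep, topic):
    checked = check(topic)
    if checked.confidence > 0.8:   # this if is the conditional edge
        return shallow(checked)
    return deep(checked)

shallow, deep = StubNode('shallow', 'S'), StubNode('deep', 'D')
assert forward(StubNode('check', Checked(0.9)), shallow, deep, 't') == 'S'
assert forward(StubNode('check', Checked(0.5)), shallow, deep, 't') == 'D'
```

The same trick is what makes "your debugger works" literal: step into `forward()` and you are stepping along graph edges.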
Everything else is a keyword
```python
# Fan-out over a collection
@node(output=MatchResult, map_over='clusters.groups', map_key='label')
def verify(cluster: ClusterGroup) -> MatchResult: ...

# N-way ensemble with merge
@node(output=Claims, prompt='decompose', model='reason',
      ensemble_n=3, merge_fn='merge_claims')
def decompose() -> Claims: ...

# Human-in-the-loop interrupt
@node(output=ValidationResult,
      interrupt_when=lambda s: {'issues': s.validate.issues} if not s.validate.passed else None)
def validate(claims: Claims) -> ValidationResult: ...

# Non-node parameters: runtime input, config, constants
from typing import Annotated
from neograph import FromInput, FromConfig

@node(output=Report)
def summarize(
    claims: Claims,                                    # upstream node
    topic: Annotated[str, FromInput],                  # from run(input={...})
    rate_limiter: Annotated[RateLimiter, FromConfig],  # from config
    max_items: int = 10,                               # constant
) -> Report: ...
```
Catches mistakes before you run
```
ConstructError: Node 'verify' declares inputs=ClusterGroup but no upstream
produces a compatible value.
  upstream producers:
    • node 'cluster': Clusters
  hint: did you forget to fan out? try .map(lambda s: s.cluster.groups, key='...')
  at my_pipeline.py:42
```
Types are validated at assembly time — when you define the pipeline, not when you execute it.
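The assembly-time check can be sketched with `typing.get_type_hints`: for every inferred edge, the consumer's parameter annotation must match the producer's return annotation. A hypothetical illustration of the principle (the function name `check_edge` and its error text are invented for this sketch):

```python
from typing import get_type_hints

class Claims: ...
class Classified: ...

def decompose(topic) -> Claims: ...
def classify(decompose: Claims) -> Classified: ...

def check_edge(producer, consumer, param):
    """Raise at assembly time if the producer's output can't feed the consumer."""
    produced = get_type_hints(producer).get('return')
    expected = get_type_hints(consumer)[param]
    if produced is not expected:
        raise TypeError(
            f"node '{consumer.__name__}' expects {expected.__name__}, "
            f"but '{producer.__name__}' produces {produced.__name__}")

check_edge(decompose, classify, 'decompose')  # passes: Claims matches Claims
```

No node body has executed at this point; the check runs purely over signatures, which is what lets a bad pipeline fail before any tokens are spent.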
Scales to real systems
- Organize by module. Each pipeline is a Python module; import nodes across modules, and `construct_from_module` finds them all.
- Isolate with sub-constructs. Typed I/O boundaries for sub-pipelines: `Construct("enrich", input=Claims, output=ScoredClaims, nodes=[...])`.
- Observe everything. Structured logs on every node; pass trace providers and shared resources via `Annotated[T, FromConfig]`.
- Test at every level. `node.run_isolated()` for unit tests, `compile()` + `run()` for integration, direct `forward()` calls for debugging.
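At the lowest testing level, a pure node needs nothing from the framework at all, because it is an ordinary function. A minimal sketch using stand-in dataclasses (your pipeline defines the real `Classified`/`Report` types):

```python
from dataclasses import dataclass

@dataclass
class Classified:
    items: list

@dataclass
class Report:
    summary: str

# The node body from the first example, minus the decorator.
def report(classify: Classified) -> Report:
    return Report(summary=f"{len(classify.items)} claims processed")

result = report(Classified(items=['a', 'b', 'c']))
assert result.summary == "3 claims processed"
```

Anything that needs the compiled topology moves up a level to `compile()` + `run()`; the unit layer stays framework-free.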
LLMs can build the graph too
For runtime construction — an LLM emitting a pipeline via tool calls, a config system defining workflows — use the programmatic API with the | pipe syntax:
```python
from neograph import Node, Construct, Oracle, Each, compile, run

decompose = Node("decompose", mode="produce", output=Claims,
                 prompt="rw/decompose", model="reason") | Oracle(n=3, merge_fn="merge")

verify = Node("verify", mode="gather", output=MatchResult,
              prompt="verify", model="fast") | Each(over="decompose.items", key="label")

pipeline = Construct("dynamic", nodes=[decompose, verify])
graph = compile(pipeline)
```
Three surfaces — @node, ForwardConstruct, Node | Modifier — one compiler.
Documentation
Full documentation is at neograph.pro:
- Quick Start — install, configure, build a pipeline, run it
- The @node API — functions as nodes, modifier kwargs, FromInput/FromConfig, organizing pipelines
- ForwardConstruct — class-based pipelines with Python if/for/try
- Runtime Construction — LLM-driven pipeline assembly, programmatic API
- vs LangGraph — side-by-side for five common patterns
- API Reference
Examples
See examples/ for runnable pipelines and examples/vs_langgraph/ for side-by-side comparisons. Each example is narrated on neograph.pro as a walkthrough.
License
Code: MIT
Documentation content © 2025-2026 Constantine Mirin, mirin.pro. Licensed under CC BY-ND 4.0.