


langchain-middleware-stack


Declarative middleware ordering for LangChain Deep Agents using stable slug-based DAG resolution.


Docs: Deep Agents middleware · Community issue draft

After you enable GitHub Pages from the /docs folder, add your site URL to docs/index.html (repository link) and to pyproject.toml under [project.urls].

Why

In LangChain Deep Agents, middleware is a first-class control layer over model calls, tools, and state. The framework still composes it as a positional middleware=[...] list on create_agent. In that model, ordering is semantics: the first entry is the outermost wrapper (for example around wrap_model_call), so reordering changes retries, timeouts, logging, and policy in non-obvious ways.

The underlying issue is not middleware itself — it is that composition is positional instead of declarative. That breaks down in production because:

  • Fragility — Inserting or reordering entries changes behavior; constraints like “after logging, before retry” are not declared on the middleware type.
  • Poor composability — Separate teams or packages cannot merge contributions without one owner of the final list and implicit coordination.
  • Hidden coupling — Dependencies are expressed as indices, not as explicit, reviewable constraints.
  • No validated guarantees — Ordering invariants and dependency relationships are not enforced before runtime.
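To make the fragility concrete, here is a dependency-free toy (no LangChain; all names are illustrative): the same two wrappers composed in the two possible orders produce different logging behavior around a flaky handler.

```python
# Toy sketch: the first list entry becomes the outermost wrapper,
# so swapping entries silently changes what gets logged.
def with_logging(handler, log):
    def wrapped(x):
        log.append("call")
        return handler(x)
    return wrapped

def with_retry(handler, attempts=2):
    def wrapped(x):
        for i in range(attempts):
            try:
                return handler(x)
            except ValueError:
                if i == attempts - 1:
                    raise
    return wrapped

def compose(middlewares, handler):
    # mirror middleware=[...]: first entry ends up outermost
    for mw in reversed(middlewares):
        handler = mw(handler)
    return handler

calls = {"n": 0}
def flaky(x):
    calls["n"] += 1
    if calls["n"] == 1:
        raise ValueError("transient")
    return x

log_a, log_b = [], []
# logging outside retry: one log entry covers all retry attempts
calls["n"] = 0
compose([lambda h: with_logging(h, log_a), with_retry], flaky)(1)
# retry outside logging: every individual attempt is logged
calls["n"] = 0
compose([with_retry, lambda h: with_logging(h, log_b)], flaky)(1)
print(len(log_a), len(log_b))  # 1 2
```

Nothing in the list itself declares which of the two orders was intended; that intent lives only in the author's head.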

**langchain-middleware-stack** addresses this with constraint-based composition (DAG + topological sort, stable Kahn tie-break). You declare intent with four primitives:

| Primitive | Role |
| --- | --- |
| `slug` | Stable, unique identity for each middleware |
| `after` / `before` | Self-declared ordering constraints |
| `MiddlewareStack` | Topological resolver (Kahn's algorithm, stable tie-break) |
| `wires` | Cross-middleware attribute injection at resolve time |
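The resolution idea can be approximated offline. A minimal sketch of a topological sort with an insertion-order tie-break (this is not the package's actual internals, and `before` constraints are omitted for brevity; they can be inverted into `after` edges):

```python
# Sketch: resolve slugs so every `after` constraint is satisfied,
# breaking ties by original insertion order (deterministic output).
def resolve(slugs, after):
    # slugs: list in insertion order; after: slug -> tuple of slugs it runs after
    indegree = {s: len(after.get(s, ())) for s in slugs}
    remaining = list(slugs)
    ordered = []
    while remaining:
        # stable tie-break: pick the first ready node in insertion order
        ready = next((s for s in remaining if indegree[s] == 0), None)
        if ready is None:
            raise ValueError(f"cycle among: {remaining}")
        ordered.append(ready)
        remaining.remove(ready)
        for s in remaining:
            if ready in after.get(s, ()):
                indegree[s] -= 1
    return ordered

print(resolve(["retry", "logging", "tracing"],
              {"retry": ("logging",), "tracing": ("logging",)}))
# ['logging', 'retry', 'tracing']
```

The stable tie-break matters in practice: constraints rarely pin down a total order, and a deterministic fallback keeps resolved stacks reproducible across runs.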

Installation

pip install langchain-middleware-stack

Zero runtime dependencies. Python ≥ 3.9.

Demo notebook

[notebooks/deep-agents-middleware.ipynb](notebooks/deep-agents-middleware.ipynb) walks through baseline vs improved, both using a real [ChatOpenAI](https://python.langchain.com/docs/integrations/chat/openai/) model (OPENAI_API_KEY required for those cells):

  • Baseline: [create_agent](https://reference.langchain.com/python/langchain/agents/create_agent) with a manually ordered middleware=[...] list.
  • Improved: the same middleware types added to a MiddlewareStack in scrambled order; resolve() produces the LangChain list (outermost first), which then feeds create_agent.

Appendices at the end: an offline wrap(handler) toy stack and optional LangChain FakeListChatModel — not the main teaching path.

make notebook   # from a dev setup with `make setup`

Quick start

from langchain_middleware_stack import MiddlewareStack
from langchain_middleware_stack.middleware import LoggingMiddleware, RetryMiddleware

stack = MiddlewareStack()
stack.add([RetryMiddleware(max_retries=3), LoggingMiddleware()])
# or: stack.add(RetryMiddleware(...)).add(LoggingMiddleware())
ordered = stack.resolve()
# -> [LoggingMiddleware, RetryMiddleware]
# RetryMiddleware declares after=("logging",), so resolve() places logging
# first (outermost) regardless of add order: every retry attempt is logged

Writing your own middleware

from typing import ClassVar
from langchain_middleware_stack import BaseMiddleware

# `tracer` is a stand-in for your tracing client (e.g. an OpenTelemetry tracer)
class TracingMiddleware(BaseMiddleware):
    slug: ClassVar[str] = "tracing"
    after: ClassVar[tuple[str, ...]] = ("logging",)  # run after logging

    def wrap(self, handler, *args, **kwargs):
        with tracer.start_span(handler.__name__):
            return handler(*args, **kwargs)

    async def awrap(self, handler, *args, **kwargs):
        with tracer.start_span(handler.__name__):
            return await handler(*args, **kwargs)

BaseMiddleware is a mixin: combine it with LangChain's AgentMiddleware when you pass middleware into create_agent. Implement the agent hooks you need (typically wrap_model_call) and declare tools (often an empty tuple ()) on the class.

from typing import ClassVar

from langchain.agents.middleware import AgentMiddleware
from langchain_middleware_stack import BaseMiddleware

class MyMiddleware(AgentMiddleware, BaseMiddleware):
    slug: ClassVar[str] = "my-middleware"
    tools: ClassVar[tuple] = ()

    def wrap_model_call(self, request, handler):
        # intercept model calls; delegate with handler(request)
        return handler(request)

The notebook uses this pattern end-to-end for the baseline and improved scenarios.

Cross-middleware wiring

wires injects an attribute from a resolved upstream middleware into your middleware at stack build time:

class ConsumerMiddleware(BaseMiddleware):
    slug: ClassVar[str] = "consumer"
    after: ClassVar[tuple[str, ...]] = ("provider",)
    wires: ClassVar[dict[str, tuple[str, str]]] = {
        "_shared_fn": ("provider", "exported_fn")
    }
    # _shared_fn is injected from provider.exported_fn after resolve()
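The mechanism is simple enough to sketch without the package (this is illustrative, not the package's code): after resolution, each declared wire copies an attribute from the provider instance onto the consumer.

```python
# Toy sketch of resolve-time wiring: look up each wire's source middleware
# by slug and copy the named attribute onto the consumer instance.
def apply_wires(resolved):
    by_slug = {m.slug: m for m in resolved}
    for mw in resolved:
        for attr, (src_slug, src_attr) in getattr(mw, "wires", {}).items():
            if src_slug not in by_slug:
                raise KeyError(f"wiring source {src_slug!r} not in stack")
            setattr(mw, attr, getattr(by_slug[src_slug], src_attr))

class Provider:
    slug = "provider"
    exported_fn = staticmethod(lambda x: x * 2)

class Consumer:
    slug = "consumer"
    wires = {"_shared_fn": ("provider", "exported_fn")}

p, c = Provider(), Consumer()
apply_wires([p, c])
print(c._shared_fn(3))  # 6
```

Because the consumer also declares after=("provider",), the provider is guaranteed to exist in the resolved stack before the wire is applied.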

Error reference

| Exception | Raised when |
| --- | --- |
| `MiddlewareResolutionError` | Base class for all stack build errors |
| `MiddlewareCycleError` | The dependency graph contains a cycle |
| `MiddlewareDuplicateSlugError` | Two middleware share the same slug |
| `MiddlewareWiringError` | Cross-middleware wiring fails |
| `RetryExhaustedError` | `RetryMiddleware` runs out of attempts |
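For intuition on the cycle case, a tiny illustration (toy code, not the package's): two slugs that each declare `after` the other leave no node with zero unmet dependencies, which is exactly the condition a resolver turns into MiddlewareCycleError.

```python
# "a" must run after "b" and "b" after "a": no node is ever ready.
after = {"a": ("b",), "b": ("a",)}
slugs = ["a", "b"]
indegree = {s: len(after[s]) for s in slugs}
ready = [s for s in slugs if indegree[s] == 0]
print(ready)  # [] -> a resolver raises a cycle error at this point
```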

LangChain upstream and community PRs

Relevant work in langchain-ai/langchain (not an exhaustive list):

| PR | Status | Topic |
| --- | --- | --- |
| #32828 | merged | `AgentMiddleware` and `middleware=` on `create_agent` |
| #34514 | open | Declarative depends-on between middleware and topological ordering |

This package is a standalone resolver you can use with the current harness; how much of it overlaps with #34514, if that PR merges, is an integration detail for later. A maintainer-facing draft lives in docs/github-issue-langchain-community.md (fill it in before opening a tracking issue).

License

Apache-2.0

Author

João Gabriel Lima · LinkedIn · joaogabriellima.eng@gmail.com · jambu.ai
