
Flowra


Flow infrastructure for building stateful, persistent LLM agents with tool use, parallel execution, and crash recovery. Requires Python 3.12+.

Features

  • State machine agents — define agents as Agent[Spec, Result] classes with @step methods, a single entry point, and typed spec/result contracts
  • Persistent state — Scalar[T] and AppendOnlyList[T] with incremental dirty-tracking and pluggable storage (in-memory, file-based, or custom)
  • Tool integration — @tool decorator for local functions, MCP server support, dependency injection into tool handlers, and agents as tools for LLM-driven delegation
  • LLM abstraction — provider-agnostic LLMProvider interface with immutable message types and real-time streaming (ships AnthropicVertexProvider, GoogleVertexProvider, OpenAIProvider, OpenAIResponsesProvider)
  • Agents as tools — @agent_tool decorator exposes an agent as a tool the LLM can call autonomously; the sub-agent runs its own system prompt and tool loop
  • Cooperative interrupts — InterruptToken for graceful cancellation across the entire execution tree
  • Pre-built agents — ChatAgent (multi-turn chat with session history) and ToolLoopAgent (single-turn LLM tool loop with hooks and caching)

Installation

# Base package (no LLM providers)
pip install flowra

# With specific providers
pip install flowra[anthropic]
pip install flowra[openai]
pip install flowra[google]

# All providers
pip install flowra[all]

Quick start

import asyncio

from flowra.agent import AgentRuntime
from flowra.lib import LLMConfig
from flowra.lib.chat import ChatAgent, ChatConfig, ChatResult, ChatSpec
from flowra.llm import LLMProvider, SystemMessage, TextBlock
from flowra.llm.providers.anthropic_vertex import AnthropicVertexProvider
from flowra.tools import ToolRegistry


async def main() -> None:
    async with (
        AnthropicVertexProvider() as provider,
        await ToolRegistry.create([]) as registry,
    ):
        config = ChatConfig(
            llm_config=LLMConfig(model="claude-sonnet-4-5@20250929"),
            system=[SystemMessage(blocks=[TextBlock(text="You are a helpful assistant.")])],
        )

        runtime = AgentRuntime(
            agents={"chat": ChatAgent},
            services={LLMProvider: provider, ToolRegistry: registry, ChatConfig: config},
        )

        while True:
            user_input = input("You: ")
            if not user_input:
                break

            result = await runtime.run(agent=ChatAgent, spec=ChatSpec(user_message=user_input))

            if isinstance(result, ChatResult) and result.response:
                print(f"Assistant: {result.response}")


asyncio.run(main())

Package structure

flowra/
├── llm/        # LLM abstraction (messages, blocks, provider interface)
├── tools/      # Tool definition, registration, execution
├── agent/      # Agent framework + execution engine + persistence
└── lib/        # Pre-built agents (ChatAgent, ToolLoopAgent, hooks, caching)

Documentation

  • Getting Started — from installation to a working chatbot with tools in 5 minutes
  • Working with LLMs — providers, streaming, structured output, caching, extended thinking
  • Tools — tool groups, MCP servers, service injection
  • Agents — custom agents, state machines, control flow, parallel execution
  • Patterns — multi-agent patterns: router, pipeline, race, fan-out
  • Observability — hooks, spans, MLflow and OTel integrations
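The race and fan-out patterns mentioned above boil down to standard asyncio task coordination. The following is a Flowra-free sketch of the two shapes; the agent coroutine here is a stand-in for an agent run, not a Flowra API:

```python
import asyncio


async def agent(name: str, delay: float) -> str:
    # Stand-in for an agent run; a real agent would call an LLM here.
    await asyncio.sleep(delay)
    return f"{name} done"


async def race(*coros):
    """Return the first finished result and cancel the rest."""
    tasks = [asyncio.ensure_future(c) for c in coros]
    done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
    for task in pending:
        task.cancel()  # cooperative cancellation of the losers
    return done.pop().result()


async def fan_out(*coros):
    """Run all agents concurrently and collect every result, in order."""
    return await asyncio.gather(*coros)


async def main() -> None:
    winner = await race(agent("fast", 0.01), agent("slow", 0.5))
    everyone = await fan_out(agent("a", 0.01), agent("b", 0.01))
    print(winner, everyone)


if __name__ == "__main__":
    asyncio.run(main())
```

A router is the degenerate case (pick one coroutine and await it), and a pipeline is sequential awaits where each result feeds the next spec.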

Development

make deps      # install dependencies (uv sync)
make check     # lint + test
make chat      # run interactive console chat example
