
Flowra


Flow infrastructure for building stateful, persistent LLM agents with tool use, parallel execution, and crash recovery. Requires Python 3.12+.

Features

  • State machine agents — define agents as Agent[Spec, Result] classes with @step methods, a single entry point, and typed spec/result contracts
  • Persistent state — Scalar[T] and AppendOnlyList[T] with incremental dirty-tracking and pluggable storage (in-memory, file-based, or custom)
  • Tool integration — @tool decorator for local functions, MCP server support, dependency injection into tool handlers, agents as tools for LLM-driven delegation
  • LLM abstraction — provider-agnostic LLMProvider interface with immutable message types and real-time streaming (ships AnthropicVertexProvider, GoogleVertexProvider, OpenAIProvider)
  • Agents as tools — @agent_tool decorator exposes an agent as a tool the LLM can call autonomously; the sub-agent runs its own system prompt and tool loop
  • Cooperative interrupts — InterruptToken for graceful cancellation across the entire execution tree
  • Pre-built agents — ChatAgent (multi-turn chat with session history) and ToolLoopAgent (single-turn LLM tool loop with hooks and caching)
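The incremental dirty-tracking behind the persistent-state primitives can be sketched in a few lines. This is a conceptual illustration only, assuming nothing about flowra's internals; the class, method, and storage names here are hypothetical, not flowra's actual Scalar[T] API:

```python
from typing import Generic, TypeVar

T = TypeVar("T")


class TrackedScalar(Generic[T]):
    """Illustrative single-value cell that remembers whether it changed
    since the last persistence flush (hypothetical, not flowra's API)."""

    def __init__(self, value: T) -> None:
        self._value = value
        self._dirty = True  # a brand-new cell has never been persisted

    @property
    def value(self) -> T:
        return self._value

    @value.setter
    def value(self, new: T) -> None:
        if new != self._value:
            self._value = new
            self._dirty = True  # only a real change marks the cell dirty

    def flush(self, store: dict) -> None:
        """Write to storage only when dirty, then clear the flag."""
        if self._dirty:
            store["scalar"] = self._value
            self._dirty = False


counter = TrackedScalar(0)
store: dict = {}
counter.flush(store)   # first flush persists the initial value
counter.value = 0      # no-op write: the cell stays clean
counter.value = 1      # real change: marked dirty, next flush writes again
```

The point of the dirty flag is that a checkpoint after every agent step only touches storage for state that actually changed, which is what makes frequent checkpointing (and thus crash recovery) cheap.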

Installation

# Base package (no LLM providers)
pip install flowra

# With specific providers (quotes keep shells like zsh from expanding the brackets)
pip install "flowra[anthropic]"
pip install "flowra[openai]"
pip install "flowra[google]"

# All providers
pip install "flowra[all]"

Quick start

import asyncio

from flowra.agent import AgentRuntime
from flowra.lib import LLMConfig
from flowra.lib.chat import ChatAgent, ChatConfig, ChatResult, ChatSpec
from flowra.llm import LLMProvider, SystemMessage, TextBlock
from flowra.llm.providers.anthropic_vertex import AnthropicVertexProvider
from flowra.tools import ToolRegistry


async def main() -> None:
    async with (
        AnthropicVertexProvider() as provider,
        await ToolRegistry.create([]) as registry,
    ):
        config = ChatConfig(
            llm_config=LLMConfig(model="claude-sonnet-4-5@20250929"),
            system=[SystemMessage(blocks=[TextBlock(text="You are a helpful assistant.")])],
        )

        runtime = AgentRuntime(
            agents={"chat": ChatAgent},
            services={LLMProvider: provider, ToolRegistry: registry, ChatConfig: config},
        )

        while True:
            user_input = input("You: ")
            if not user_input:
                break

            result = await runtime.run(agent=ChatAgent, spec=ChatSpec(user_message=user_input))

            if isinstance(result, ChatResult) and result.response:
                print(f"Assistant: {result.response}")


asyncio.run(main())
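The quick start passes an empty tool list to ToolRegistry.create. The general mechanism a @tool-style decorator relies on, deriving a callable tool plus a parameter schema from a plain function signature, can be sketched generically. This is a standalone illustration of the pattern, not flowra's actual @tool or ToolRegistry; every name below is hypothetical:

```python
import inspect
from typing import Any, Callable

# Illustrative global registry: tool name -> function, params, docstring
TOOLS: dict[str, dict[str, Any]] = {}


def tool(fn: Callable[..., Any]) -> Callable[..., Any]:
    """Register a plain function as a callable tool, recording its
    parameter annotations (illustrative, not flowra's @tool)."""
    sig = inspect.signature(fn)
    TOOLS[fn.__name__] = {
        "fn": fn,
        "params": {name: p.annotation for name, p in sig.parameters.items()},
        "doc": inspect.getdoc(fn) or "",
    }
    return fn


@tool
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b


def call_tool(name: str, **kwargs: Any) -> Any:
    """Dispatch a tool call by name, as an LLM tool loop would."""
    return TOOLS[name]["fn"](**kwargs)
```

Here `call_tool("add", a=2, b=3)` dispatches to the registered function; a real tool loop would build the kwargs from the LLM's tool-call arguments and feed the return value back as a tool result message.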

Package structure

flowra/
├── llm/        # LLM abstraction (messages, blocks, provider interface)
├── tools/      # Tool definition, registration, execution
├── agent/      # Agent framework + execution engine + persistence
└── lib/        # Pre-built agents (ChatAgent, ToolLoopAgent, hooks, caching)

Documentation

  • Getting Started — from installation to a working chatbot with tools in 5 minutes
  • Working with LLMs — providers, streaming, structured output, caching, extended thinking
  • Tools — tool groups, MCP servers, service injection
  • Agents — custom agents, state machines, control flow, parallel execution
  • Patterns — multi-agent patterns: router, pipeline, race, fan-out
  • Observability — hooks, spans, MLflow and OTel integrations
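The router/pipeline/race/fan-out patterns mentioned above reduce to standard asyncio composition over agent runs. A generic sketch of fan-out and race, using plain asyncio with a stand-in coroutine rather than flowra's runtime API (a real version would await runtime.run(...) calls):

```python
import asyncio


async def agent(name: str, delay: float) -> str:
    """Stand-in for an agent run (hypothetical; a real call would be
    something like `await runtime.run(agent=..., spec=...)`)."""
    await asyncio.sleep(delay)
    return f"{name} done"


async def fan_out() -> list[str]:
    """Fan-out: run several agents concurrently, collect all results
    in order."""
    return await asyncio.gather(agent("a", 0.01), agent("b", 0.02))


async def race() -> str:
    """Race: return the first agent to finish and cancel the rest."""
    tasks = [
        asyncio.create_task(agent("fast", 0.01)),
        asyncio.create_task(agent("slow", 0.05)),
    ]
    done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
    for task in pending:
        task.cancel()  # cooperative cancellation of the losers
    return done.pop().result()
```

A router is then just an `if`/`match` choosing which agent to run, and a pipeline is sequential awaits feeding one agent's result into the next spec.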

Development

make deps      # install dependencies (uv sync)
make check     # lint + test
make chat      # run interactive console chat example
