
An event-driven, async-first, step-based way to control the execution flow of AI applications like Agents.

Project description

LlamaIndex Workflows

LlamaIndex Workflows is a framework for orchestrating and chaining together complex systems of steps and events.

What can you build with Workflows?

Workflows shine when you need to orchestrate complex, multi-step processes that involve AI models, APIs, and decision-making. Here are some examples of what you can build:

  • AI Agents - Create intelligent systems that can reason, make decisions, and take actions across multiple steps
  • Document Processing Pipelines - Build systems that ingest, analyze, summarize, and route documents through various processing stages
  • Multi-Model AI Applications - Coordinate between different AI models (LLMs, vision models, etc.) to solve complex tasks
  • Research Assistants - Develop workflows that can search, analyze, synthesize information, and provide comprehensive answers
  • Content Generation Systems - Create pipelines that generate, review, edit, and publish content with human-in-the-loop approval
  • Customer Support Automation - Build intelligent routing systems that can understand, categorize, and respond to customer inquiries

The async-first, event-driven architecture makes it easy to build workflows that route between different capabilities, run steps in parallel, loop over complex sequences, and maintain state across multiple steps: all the features you need to make your AI applications production-ready.

Key Features

  • async-first - workflows are built around Python's async functionality: steps are async functions that process incoming events from an asyncio queue and emit new events to other queues. This also means workflows fit naturally into async environments like FastAPI apps and Jupyter notebooks.
  • event-driven - workflows consist of steps and events. Organizing your code around events and steps makes it easier to reason about and test.
  • state management - each run of a workflow is self-contained, meaning you can launch a workflow, save information within it, serialize its state, and resume it later.
  • observability - workflows are automatically instrumented for observability, meaning you can use tools like Arize Phoenix and OpenTelemetry right out of the box.
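To make the async-first design above concrete, here is a toy sketch (illustrative only, not the library's actual internals) of the pattern the first bullet describes: each step is an async function that consumes events from an inbox queue and emits new events downstream.

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Event:
    msg: str

async def upper_step(inbox: asyncio.Queue, outbox: asyncio.Queue) -> None:
    # Consume one event, transform it, and emit a new event downstream.
    ev = await inbox.get()
    await outbox.put(Event(msg=ev.msg.upper()))

async def exclaim_step(inbox: asyncio.Queue, results: list) -> None:
    # Terminal step: consume an event and record the final result.
    ev = await inbox.get()
    results.append(ev.msg + "!")

async def run_pipeline(text: str) -> str:
    q1: asyncio.Queue = asyncio.Queue()
    q2: asyncio.Queue = asyncio.Queue()
    results: list = []
    await q1.put(Event(msg=text))
    # Both steps are scheduled concurrently; each wakes when its queue
    # receives an event, which is what makes fan-out and parallelism cheap.
    await asyncio.gather(upper_step(q1, q2), exclaim_step(q2, results))
    return results[0]

print(asyncio.run(run_pipeline("hello")))  # HELLO!
```

Because steps only communicate through queues, adding a new step or running several steps concurrently is a matter of wiring queues, not restructuring control flow.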

Quick Start

Install the package:

pip install llama-index-workflows

And create your first workflow:

import asyncio
from pydantic import BaseModel, Field
from workflows import Context, Workflow, step
from workflows.events import Event, StartEvent, StopEvent

class MyEvent(Event):
    msg: list[str]

class RunState(BaseModel):
    num_runs: int = Field(default=0)

class MyWorkflow(Workflow):
    @step
    async def start(self, ctx: Context[RunState], ev: StartEvent) -> MyEvent:
        async with ctx.store.edit_state() as state:
            state.num_runs += 1

            return MyEvent(msg=[ev.input_msg] * state.num_runs)

    @step
    async def process(self, ctx: Context[RunState], ev: MyEvent) -> StopEvent:
        data_length = len("".join(ev.msg))
        new_msg = f"Processed {len(ev.msg)} times, data length: {data_length}"
        return StopEvent(result=new_msg)

async def main():
    workflow = MyWorkflow()

    # [optional] provide a context object to the workflow
    ctx = Context(workflow)
    result = await workflow.run(input_msg="Hello, world!", ctx=ctx)
    print("Workflow result:", result)

    # re-running with the same context will retain the state
    result = await workflow.run(input_msg="Hello, world!", ctx=ctx)
    print("Workflow result:", result)


if __name__ == "__main__":
    asyncio.run(main())

In the example above:

  • Steps that accept a StartEvent will be run first.
  • Steps that return a StopEvent will end the workflow.
  • Intermediate events are user-defined and can be used to pass information between steps.
  • The Context object is also used to share information between steps.
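The routing rules above can be modeled in a few lines. This is a simplified, hypothetical model (not the real runtime): the runner dispatches each event to the step whose input type matches it, starting from a StartEvent and stopping when a step returns a StopEvent.

```python
class StartEvent:
    def __init__(self, input_msg: str):
        self.input_msg = input_msg

class MyEvent:
    def __init__(self, msg: list):
        self.msg = msg

class StopEvent:
    def __init__(self, result: str):
        self.result = result

def start(ev: StartEvent) -> MyEvent:
    # First step: fan the input out into an intermediate event.
    return MyEvent(msg=[ev.input_msg] * 2)

def process(ev: MyEvent) -> StopEvent:
    # Final step: returning a StopEvent ends the run.
    return StopEvent(result=f"Processed {len(ev.msg)} times")

STEPS = {StartEvent: start, MyEvent: process}  # event type -> handling step

def run(input_msg: str) -> str:
    ev = STEPS[StartEvent](StartEvent(input_msg))
    while not isinstance(ev, StopEvent):
        ev = STEPS[type(ev)](ev)  # route each event to its matching step
    return ev.result

print(run("Hello"))  # Processed 2 times
```

In the real library the event-to-step mapping is inferred from the type annotations on each `@step` method, which is why the annotations in the quick-start example matter.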

Visit the complete documentation for more examples using llama-index!

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llama_index_workflows-2.15.0rc0.tar.gz (78.3 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

llama_index_workflows-2.15.0rc0-py3-none-any.whl (100.0 kB)

Uploaded Python 3

File details

Details for the file llama_index_workflows-2.15.0rc0.tar.gz.

File metadata

  • Download URL: llama_index_workflows-2.15.0rc0.tar.gz
  • Size: 78.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.2 {"installer":{"name":"uv","version":"0.10.2","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llama_index_workflows-2.15.0rc0.tar.gz
Algorithm Hash digest
SHA256 7dd5ce9e10027b603ed8facf421eecb109c134ffed39c47edfbbada7bd4b6705
MD5 5032d83bc8eb948f7d14cbfcaac5716f
BLAKE2b-256 ba7266121cdcec76627ace4a0492a9258335f2ee46a206e04b7a2ef6563af1af

See more details on using hashes here.

File details

Details for the file llama_index_workflows-2.15.0rc0-py3-none-any.whl.

File metadata

  • Download URL: llama_index_workflows-2.15.0rc0-py3-none-any.whl
  • Size: 100.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.2 {"installer":{"name":"uv","version":"0.10.2","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llama_index_workflows-2.15.0rc0-py3-none-any.whl
Algorithm Hash digest
SHA256 0d1092eac0d350daebd2abb5d5aaec0ecd9f6b95288dbc72df3d41869baf08f1
MD5 389af276025f4164fc0f46fd4c80009c
BLAKE2b-256 76ae7cb16d4e3e58cb3c929e9ba188834d4471e2ca0309458ee7bbcc5611f9b8

See more details on using hashes here.
