
LlamaAgents Server

HTTP server for deploying LlamaIndex Workflows as web services. Built on Starlette and Uvicorn.

Installation

pip install llama-agents-server

Quick Start

Create a server file (e.g., my_server.py):

import asyncio
from workflows import Workflow, step
from workflows.context import Context
from workflows.events import Event, StartEvent, StopEvent
from llama_agents.server import WorkflowServer

class StreamEvent(Event):
    sequence: int

class GreetingWorkflow(Workflow):
    @step
    async def greet(self, ctx: Context, ev: StartEvent) -> StopEvent:
        # Emit intermediate events to the server's event stream
        for i in range(3):
            ctx.write_event_to_stream(StreamEvent(sequence=i))
        name = ev.get("name", "World")
        return StopEvent(result=f"Hello, {name}!")

server = WorkflowServer()
server.add_workflow("greet", GreetingWorkflow())

if __name__ == "__main__":
    asyncio.run(server.serve("0.0.0.0", 8080))

Or run it with the CLI:

llama-agents-server my_server.py

Features

  • REST API for running, streaming, and managing workflows
  • Debugger UI automatically mounted at / for visualizing and debugging workflows
  • Event streaming via newline-delimited JSON or Server-Sent Events
  • Human-in-the-loop support for interactive workflows
  • Persistence with built-in SQLite store (or bring your own via AbstractWorkflowStore)
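The newline-delimited JSON stream mentioned above can be consumed with nothing but the standard library. The event shape below (a `sequence` field, mirroring the Quick Start's `StreamEvent`) and the wire format are assumptions for illustration; see the deployment guide for the actual endpoints and payloads.

```python
import json
from typing import Iterable, Iterator

def iter_ndjson(lines: Iterable[str]) -> Iterator[dict]:
    """Yield one decoded event per non-empty line of an NDJSON stream."""
    for line in lines:
        line = line.strip()
        if line:
            yield json.loads(line)

# Simulated stream body, mimicking the StreamEvent(sequence=i) events
# from the Quick Start (the exact wire format is an assumption):
body = '{"sequence": 0}\n{"sequence": 1}\n{"sequence": 2}\n'
events = list(iter_ndjson(body.splitlines()))
print([e["sequence"] for e in events])  # [0, 1, 2]
```

The same loop works over an HTTP response iterated line by line, since NDJSON delimits events purely by newlines.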

Client

Use llama-agents-client to interact with deployed servers programmatically.
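For contexts where the client library isn't available, a plain HTTP call also works. The `/workflows/{name}/run` route below is hypothetical, used only to illustrate the request/response shape; check the llama-agents-client documentation for the real API.

```python
import json
import urllib.request
import urllib.error

def run_workflow(base_url: str, name: str, payload: dict):
    """POST a start payload to a (hypothetical) workflow run endpoint
    and decode the JSON reply. Returns None if the server is unreachable."""
    req = urllib.request.Request(
        f"{base_url}/workflows/{name}/run",  # hypothetical route
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return json.load(resp)
    except (urllib.error.URLError, OSError):
        return None  # server not running or not reachable

# With no server listening this returns None rather than raising:
print(run_workflow("http://127.0.0.1:8080", "greet", {"name": "Ada"}))
```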

Documentation

See the full deployment guide for API details, persistence configuration, and more.

Project details


Download files


Source Distribution

llama_agents_server-0.2.0rc1.tar.gz (37.8 kB)

Built Distribution


llama_agents_server-0.2.0rc1-py3-none-any.whl (54.7 kB)

File details

Details for the file llama_agents_server-0.2.0rc1.tar.gz.

File metadata

  • Download URL: llama_agents_server-0.2.0rc1.tar.gz
  • Upload date:
  • Size: 37.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.2 {"installer":{"name":"uv","version":"0.10.2","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llama_agents_server-0.2.0rc1.tar.gz
  • SHA256: 2bc35ca9b1793a4296a06e9a1c9634d5cd3b2344e24c34ac11c8524c13bc9938
  • MD5: cfdb725e2f7ab13d1fa59c162a6b7ca7
  • BLAKE2b-256: 755a13192d3480b302c41e4d684ab9041d08e92cb667cf886baf1d5f54fa56fe

See more details on using hashes here.
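The digests above can be checked locally after downloading. A minimal sketch using only `hashlib`, with the filename and expected SHA-256 taken from the listing above:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "2bc35ca9b1793a4296a06e9a1c9634d5cd3b2344e24c34ac11c8524c13bc9938"
# After downloading the sdist into the current directory:
# assert sha256_of("llama_agents_server-0.2.0rc1.tar.gz") == expected
```

Reading in chunks keeps memory flat regardless of archive size; a mismatch means the download is corrupt or tampered with and should not be installed.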

File details

Details for the file llama_agents_server-0.2.0rc1-py3-none-any.whl.

File metadata

  • Download URL: llama_agents_server-0.2.0rc1-py3-none-any.whl
  • Upload date:
  • Size: 54.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.2 {"installer":{"name":"uv","version":"0.10.2","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llama_agents_server-0.2.0rc1-py3-none-any.whl
  • SHA256: 6473641e9d36436449b34ac738bad1d63a0b20bae3a8bd5a4d6f942bb4a09551
  • MD5: cdf7717c6b814d778346188aef6216ba
  • BLAKE2b-256: fedae5ba4d228e60f01a2163b2bc85bd6e909ca00f100fc79cfa72e11f2ab78e

