LlamaAgents Server
HTTP server for deploying LlamaIndex Workflows as web services. Built on Starlette and Uvicorn.
Installation
```
pip install llama-agents-server
```
Quick Start
Create a server file (e.g., my_server.py):
```python
import asyncio

from workflows import Workflow, step
from workflows.context import Context
from workflows.events import Event, StartEvent, StopEvent

from llama_agents.server import WorkflowServer


class StreamEvent(Event):
    sequence: int


class GreetingWorkflow(Workflow):
    @step
    async def greet(self, ctx: Context, ev: StartEvent) -> StopEvent:
        # Emit a few intermediate events to the stream before returning
        for i in range(3):
            ctx.write_event_to_stream(StreamEvent(sequence=i))
        name = ev.get("name", "World")
        return StopEvent(result=f"Hello, {name}!")


server = WorkflowServer()
server.add_workflow("greet", GreetingWorkflow())

if __name__ == "__main__":
    asyncio.run(server.serve("0.0.0.0", 8080))
```
Or run it with the CLI:
```
llama-agents-server my_server.py
```
Features
- REST API for running, streaming, and managing workflows
- Debugger UI automatically mounted at `/` for visualizing and debugging workflows
- Event streaming via newline-delimited JSON or Server-Sent Events
- Human-in-the-loop support for interactive workflows
- Persistence with built-in SQLite store (or bring your own via `AbstractWorkflowStore`)
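The two streaming formats above are straightforward to consume client-side. As a sketch independent of this package's actual endpoints (no server URL or route is assumed here), newline-delimited JSON simply means each streamed event arrives as one JSON object per line:

```python
import json
from typing import Iterator


def iter_ndjson_events(lines: Iterator[str]) -> Iterator[dict]:
    """Parse a newline-delimited JSON stream: one JSON object per non-empty line."""
    for line in lines:
        line = line.strip()
        if line:  # skip keep-alive blank lines
            yield json.loads(line)


# Simulated stream resembling the StreamEvent payloads from the Quick Start
raw = '{"sequence": 0}\n{"sequence": 1}\n\n{"sequence": 2}\n'
events = list(iter_ndjson_events(iter(raw.splitlines())))
print(events)  # [{'sequence': 0}, {'sequence': 1}, {'sequence': 2}]
```

The same loop works over any line iterator, e.g. an HTTP response body read line by line.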
Client
Use llama-agents-client to interact with deployed servers programmatically.
Documentation
See the full deployment guide for API details, persistence configuration, and more.
File details
Details for the file llama_agents_server-0.2.2.tar.gz.
File metadata
- Download URL: llama_agents_server-0.2.2.tar.gz
- Upload date:
- Size: 45.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.10.9 {"installer":{"name":"uv","version":"0.10.9","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | f99c5509362a84f7e963ae2a8fc7d19ec10f63d326893c75f0041e8643feb666 |
| MD5 | 1de03fb3a602baf92d6e19ff9924c655 |
| BLAKE2b-256 | 955ece28708c5b81e35e767ca3de303e158d815098aab85afe0fd0fcf230cc6e |
File details
Details for the file llama_agents_server-0.2.2-py3-none-any.whl.
File metadata
- Download URL: llama_agents_server-0.2.2-py3-none-any.whl
- Upload date:
- Size: 65.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.10.9 {"installer":{"name":"uv","version":"0.10.9","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | b7933541d92805dd076e5d89a5297b23d4b5d93fe29786e79fcaf95fc545fe7d |
| MD5 | e4bad48dd522a6deca90a82495e002eb |
| BLAKE2b-256 | 2d6a6ddef73a404d85807ab19e98cd3e522ec52a967cf0964878af8b38019552 |