HTTP client for connecting to and interacting with LlamaIndex workflow servers

Project description

LlamaAgents Client

Async HTTP client for interacting with deployed llama-agents-server instances.

Installation

pip install llama-agents-client

Quick Start

import asyncio
from llama_agents.client import WorkflowClient

async def main():
    client = WorkflowClient(base_url="http://localhost:8080")

    # Run a workflow asynchronously
    handler = await client.run_workflow_nowait("my_workflow")

    # Stream events as they are produced
    async for event in client.get_workflow_events(handler.handler_id):
        print(f"Event: {event.type} -> {event.value}")

    # Get the final result
    result = await client.get_handler(handler.handler_id)
    print(f"Result: {result.result} (status: {result.status})")

asyncio.run(main())
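The streaming step above is a plain `async for` over an async generator. A stdlib-only sketch of that consumption pattern, with a stub generator standing in for `client.get_workflow_events` (the `Event` class and `fake_event_stream` here are illustrative stand-ins, not part of the client):

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Event:
    # Stand-in for the client's event objects (type/value fields as in Quick Start)
    type: str
    value: str

async def fake_event_stream():
    # Stub for client.get_workflow_events(handler_id): yields events
    # as the (simulated) workflow produces them, then closes the stream.
    for i in range(3):
        await asyncio.sleep(0)  # yield control, as a real network stream would
        yield Event(type="step", value=f"result-{i}")

async def collect_events(stream):
    """Drain an event stream into a list, mirroring the Quick Start loop."""
    events = []
    async for event in stream:
        events.append(event)
    return events

events = asyncio.run(collect_events(fake_event_stream()))
print([e.value for e in events])  # ['result-0', 'result-1', 'result-2']
```

The loop terminates when the server closes the stream, at which point the handler result can be fetched as shown in Quick Start.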

Features

  • Run workflows synchronously or asynchronously
  • Stream events in real-time as a workflow executes
  • Human-in-the-loop support via send_event for injecting events into running workflows
  • Bring your own httpx.AsyncClient for custom auth, headers, or transport
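The human-in-the-loop feature amounts to injecting an event into a workflow that is paused waiting for input. A minimal stdlib sketch of that idea using an `asyncio.Queue` as the inbox (this illustrates the pattern only; it is not the client's actual `send_event` API):

```python
import asyncio

async def workflow(inbox: asyncio.Queue):
    """Toy workflow that pauses until a human-supplied event arrives."""
    reply = await inbox.get()  # blocks until an event is injected
    return f"resumed with: {reply}"

async def main():
    inbox = asyncio.Queue()
    task = asyncio.create_task(workflow(inbox))
    # Elsewhere (in the real client, via send_event on a running handler),
    # a human response is injected and the workflow resumes:
    await inbox.put("approved")
    return await task

result = asyncio.run(main())
print(result)  # resumed with: approved
```

With the real client, the inbox is the running workflow on the server and `send_event` delivers the event over HTTP.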

Documentation

See the full deployment guide for detailed usage and API reference.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llama_agents_client-0.2.0rc1.tar.gz (8.0 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

llama_agents_client-0.2.0rc1-py3-none-any.whl (10.2 kB view details)

Uploaded Python 3

File details

Details for the file llama_agents_client-0.2.0rc1.tar.gz.

File metadata

  • Download URL: llama_agents_client-0.2.0rc1.tar.gz
  • Upload date:
  • Size: 8.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.5 {"installer":{"name":"uv","version":"0.10.5","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llama_agents_client-0.2.0rc1.tar.gz
Algorithm Hash digest
SHA256 766ba4ed03f06df97c4a9fa18d974e24936f9bd84f5073561a8bc5820386ea18
MD5 c2fa9987773b82cc9c6913668e2a851c
BLAKE2b-256 201af2efe43a41387886859588e428785a5a3d5509f8f629e7753214560a7ff2

See more details on using hashes here.
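A downloaded file can be checked against the published SHA256 digest above using only the standard library. This is a generic verification sketch, not part of the client package:

```python
import hashlib

def sha256_of_file(path, chunk_size=1 << 16):
    """Compute the SHA-256 hex digest of a file, streaming in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest published on this page before installing:
expected = "766ba4ed03f06df97c4a9fa18d974e24936f9bd84f5073561a8bc5820386ea18"
# actual = sha256_of_file("llama_agents_client-0.2.0rc1.tar.gz")
# assert actual == expected, "hash mismatch: do not install this file"
```

Note that `pip install --require-hashes` with a pinned requirements file performs the same check automatically.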

File details

Details for the file llama_agents_client-0.2.0rc1-py3-none-any.whl.

File metadata

  • Download URL: llama_agents_client-0.2.0rc1-py3-none-any.whl
  • Upload date:
  • Size: 10.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.5 {"installer":{"name":"uv","version":"0.10.5","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llama_agents_client-0.2.0rc1-py3-none-any.whl
Algorithm Hash digest
SHA256 72577eb196fe9f1c1e48b41c7c98fa2d504f10e9826fc397fe59bf8f5f8ca5f1
MD5 1b712ebc05167d71e7bcc795c86d3fe3
BLAKE2b-256 a44d951fba9af965d5f40c95374194555d8ed43c052bec9cbc2127d222b5b4a4

See more details on using hashes here.
