
HTTP client for connecting to and interacting with LlamaIndex workflow servers

Project description

LlamaAgents Client

Async HTTP client for interacting with deployed llama-agents-server instances.

Installation

pip install llama-agents-client

Quick Start

import asyncio
from llama_agents.client import WorkflowClient

async def main():
    client = WorkflowClient(base_url="http://localhost:8080")

    # Start a workflow without waiting for it to finish
    handler = await client.run_workflow_nowait("my_workflow")

    # Stream events as they are produced
    async for event in client.get_workflow_events(handler.handler_id):
        print(f"Event: {event.type} -> {event.value}")

    # Get the final result
    result = await client.get_handler(handler.handler_id)
    print(f"Result: {result.result} (status: {result.status})")

asyncio.run(main())

Features

  • Run workflows synchronously or asynchronously
  • Stream events in real-time as a workflow executes
  • Human-in-the-loop support via send_event for injecting events into running workflows
  • Bring your own httpx.AsyncClient for custom auth, headers, or transport

Documentation

See the full deployment guide for detailed usage and API reference.

Download files

Download the file for your platform.

Source Distribution

llama_agents_client-0.2.0rc0.tar.gz (6.8 kB)


Built Distribution


llama_agents_client-0.2.0rc0-py3-none-any.whl (8.9 kB)


File details

Details for the file llama_agents_client-0.2.0rc0.tar.gz.

File metadata

  • Download URL: llama_agents_client-0.2.0rc0.tar.gz
  • Upload date:
  • Size: 6.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.2 (uv publish; Ubuntu 24.04 "noble"; CI: yes)

File hashes

Hashes for llama_agents_client-0.2.0rc0.tar.gz
  • SHA256: 1a6a2f595a0ab16b8de7fc73d68894fa6912f5d11e15db072e69078ed5e0db29
  • MD5: 1827df11c6b576b2722379b2951dce82
  • BLAKE2b-256: eabca191e911c8db1ec1ae450ac600a80b69ed78f1ddfeab3c629770109dba83


File details

Details for the file llama_agents_client-0.2.0rc0-py3-none-any.whl.

File metadata

  • Download URL: llama_agents_client-0.2.0rc0-py3-none-any.whl
  • Upload date:
  • Size: 8.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.2 (uv publish; Ubuntu 24.04 "noble"; CI: yes)

File hashes

Hashes for llama_agents_client-0.2.0rc0-py3-none-any.whl
  • SHA256: 4e881e16473f78d307004645f6e170323e69a1d981cb82763344f223434a7e03
  • MD5: 26954a6712508563de840278c4e275b5
  • BLAKE2b-256: a0b1ae0b836472af7233302d30ef12d136f12fe3ba815f9361498da6fb28aecb

