Project description

llama-cloud-fake

A respx-backed fake of the LlamaCloud v2 HTTP API for tests.

Enter FakeLlamaCloudServer as a context manager and it intercepts outbound calls to api.cloud.llamaindex.ai, serving deterministic canned responses for the namespaces your workflow actually exercises (files, extract, parse, pipelines, classify, agent_data, configurations, sheets, split). The real llama_cloud client drives the fake unchanged: point it at the fake via an environment variable and run your workflow end-to-end without hitting the real API.

Install

pip install llama-cloud-fake

This is a test-time dependency; add it to your dev group (e.g. [dependency-groups].dev in pyproject.toml).
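For example, with PEP 735 dependency groups (the pins and the extra test packages below are illustrative; pytest-asyncio matches the async test in the quick start):

```toml
# pyproject.toml
[dependency-groups]
dev = [
    "llama-cloud-fake>=0.1.0",
    "pytest",
    "pytest-asyncio",
]
```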

Quick start

import pytest
from llama_cloud import AsyncLlamaCloud
from llama_cloud_fake import FakeLlamaCloudServer
from pydantic import BaseModel

class Receipt(BaseModel):
    merchant: str
    total: float

@pytest.fixture(autouse=True)
def _fake_env(monkeypatch):
    monkeypatch.setenv("LLAMA_CLOUD_API_KEY", "unit-test-key")
    monkeypatch.setenv("LLAMA_CLOUD_BASE_URL", FakeLlamaCloudServer.DEFAULT_BASE_URL)

@pytest.mark.asyncio
async def test_extract(tmp_path):
    sample = tmp_path / "receipt.txt"
    sample.write_text("Merchant: Lunar Bistro\nTotal: 123.45")

    with FakeLlamaCloudServer():
        client = AsyncLlamaCloud(api_key="unit-test-key")
        file = await client.files.create(file=sample, purpose="extract")
        job = await client.extract.run(
            file_input=file.id,
            configuration={
                "data_schema": Receipt.model_json_schema(),
                "tier": "cost_effective",
            },
        )
        assert job.status == "COMPLETED"
        assert "merchant" in job.extract_result

Responses are derived deterministically from the request (file hash, schema hash), so repeated calls return the same payloads. Unmatched requests fail respx's assertions, so an unexpected call surfaces as a test error rather than a silent network request.
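The fake's exact derivation is internal, but the idea can be sketched with the standard library: hash the request inputs and seed the payload from the digests. The function name and payload fields below are illustrative, not the package's actual internals.

```python
import hashlib
import json


def derive_extract_payload(file_bytes: bytes, data_schema: dict) -> dict:
    """Illustrative sketch: build a stable fake payload from request inputs.

    The same file bytes and schema always produce the same digests, so the
    derived payload (and therefore the fake's response) never changes
    between test runs.
    """
    file_hash = hashlib.sha256(file_bytes).hexdigest()
    # Serialize the schema canonically so key order doesn't change the hash.
    schema_hash = hashlib.sha256(
        json.dumps(data_schema, sort_keys=True).encode()
    ).hexdigest()
    return {
        "job_id": f"job-{file_hash[:12]}-{schema_hash[:12]}",
        "status": "COMPLETED",
    }


# Repeated calls with identical inputs yield identical payloads.
a = derive_extract_payload(b"Merchant: Lunar Bistro", {"type": "object"})
b = derive_extract_payload(b"Merchant: Lunar Bistro", {"type": "object"})
assert a == b
```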

Overriding responses

Each namespace exposes stub_* methods that inject a canned result for specific requests. Later stubs win; requests that match no stub fall through to the deterministic default.

from llama_cloud_fake import (
    FakeLlamaCloudServer,
    FileMatcher,
    RequestMatcher,
    SchemaMatcher,
)

with FakeLlamaCloudServer() as fake:
    fake.extract.stub_run(
        RequestMatcher(
            file=FileMatcher(filename="receipt.txt"),
            schema=SchemaMatcher(model=Receipt),
        ),
        data={"merchant": "Lunar Bistro", "total": 123.45},
        status="COMPLETED",
    )
    # ...run your workflow; the extract.run call above now returns the stub.

RequestMatcher also accepts agent_id, project_id, organization_id, and a custom predicate: Callable[[httpx.Request], bool].

Preloading files

Skip the client.files.create round-trip in tests that just need a file id:

with FakeLlamaCloudServer() as fake:
    file_id = fake.files.preload(path="tests/fixtures/invoice.pdf")
    # pass file_id straight into extract / parse / classify calls

Source

The source for this package isn't hosted in a public repository. It's MIT-licensed, and the wheel on PyPI ships the full source; unpack it and copy whatever you find useful.

Project details


Download files

Download the file for your platform.

Source Distribution

llama_cloud_fake-0.1.0.tar.gz (21.5 kB)

Uploaded Source

Built Distribution


llama_cloud_fake-0.1.0-py3-none-any.whl (29.5 kB)

Uploaded Python 3

File details

Details for the file llama_cloud_fake-0.1.0.tar.gz.

File metadata

  • Download URL: llama_cloud_fake-0.1.0.tar.gz
  • Upload date:
  • Size: 21.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.11.7 {"installer":{"name":"uv","version":"0.11.7","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llama_cloud_fake-0.1.0.tar.gz
  • SHA256: 5cd7725d8838aaa71ac25c2950c2e5efaed45a420a5dc76e03e28b96f0e5e82f
  • MD5: 7042c0712e5ac1f4cd05407625de9095
  • BLAKE2b-256: 104e1ec43b3e400d009149f5cadcffb60083a6a8809d6987eeadfd1b77395f91


File details

Details for the file llama_cloud_fake-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: llama_cloud_fake-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 29.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.11.7 {"installer":{"name":"uv","version":"0.11.7","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llama_cloud_fake-0.1.0-py3-none-any.whl
  • SHA256: 90458d618904b1381d184f31d32cb2b92d20bf62fb988b0985c40eff673502b3
  • MD5: 565bb886a64c5e654ee966db5910b661
  • BLAKE2b-256: 3682ffabae1d0981b4b8114f35eee12d1ebf5ed3a8b394554a2c5cb39116ca0d

