
Llama Cloud Python SDK


The official Python SDK for LlamaParse - the enterprise platform for agentic OCR and document processing.

With this SDK you can build workflows that combine the platform's core capabilities:

  • Parse - Agentic OCR and parsing for 130+ formats
  • Extract - Structured data extraction with custom schemas
  • Classify - Document categorization with natural-language rules
  • Agents - Deploy document agents as APIs
  • Index - Document ingestion and embedding for RAG

Documentation

Installation

pip install llama_cloud

Quick Start

import os
from llama_cloud import LlamaCloud

client = LlamaCloud(
    api_key=os.environ.get("LLAMA_CLOUD_API_KEY"),  # This is the default and can be omitted
)

# Parse a document
job = client.parsing.create(
    tier="agentic",
    version="latest",
    file_id="your-file-id",
)

print(job.id)

File Uploads

from pathlib import Path
from llama_cloud import LlamaCloud

client = LlamaCloud()

# Upload using a Path
client.files.create(
    file=Path("/path/to/document.pdf"),
    purpose="parse",
)

# Or using bytes with a tuple of (filename, contents, media_type)
client.files.create(
    file=("document.txt", b"content", "text/plain"),
    purpose="parse",
)

Async Usage

import asyncio
from llama_cloud import AsyncLlamaCloud

client = AsyncLlamaCloud()


async def main():
    job = await client.parsing.create(
        tier="agentic",
        version="latest",
        file_id="your-file-id",
    )
    print(job.id)


asyncio.run(main())
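Because each method on the async client is awaitable, several documents can be parsed concurrently with asyncio.gather. The sketch below is illustrative only: parse is a stand-in coroutine, where real code would instead call await client.parsing.create(...).

```python
import asyncio

# Stand-in for an SDK call such as `await client.parsing.create(...)`;
# it only simulates the awaitable shape of the real method.
async def parse(file_id: str) -> str:
    await asyncio.sleep(0)  # simulate I/O
    return f"job-for-{file_id}"

async def main():
    # Fan out the awaitables and wait for all of them at once.
    job_ids = await asyncio.gather(*(parse(f) for f in ["a", "b", "c"]))
    print(job_ids)  # ['job-for-a', 'job-for-b', 'job-for-c']

asyncio.run(main())
```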

MCP Server

Use the Llama Cloud MCP Server to enable AI assistants to interact with the API:


Error Handling

When the API returns a non-success status code, an APIError subclass is raised:

import llama_cloud
from llama_cloud import LlamaCloud

client = LlamaCloud()

try:
    client.pipelines.list(project_id="my-project-id")
except llama_cloud.APIError as e:
    print(e.status_code)  # 400
    print(e.__class__.__name__)  # BadRequestError

Status Code | Error Type
----------- | ------------------------
400         | BadRequestError
401         | AuthenticationError
403         | PermissionDeniedError
404         | NotFoundError
422         | UnprocessableEntityError
429         | RateLimitError
>=500       | InternalServerError
N/A         | APIConnectionError
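The mapping above can be restated in plain Python. The sketch below is illustrative only: it mirrors the table, not the SDK's internal dispatch, and the APIError fallback for codes outside the table is an assumption.

```python
# Illustrative restatement of the status-code table; not the SDK's code.
ERROR_TYPES = {
    400: "BadRequestError",
    401: "AuthenticationError",
    403: "PermissionDeniedError",
    404: "NotFoundError",
    422: "UnprocessableEntityError",
    429: "RateLimitError",
}

def error_name(status_code: int) -> str:
    if status_code in ERROR_TYPES:
        return ERROR_TYPES[status_code]
    if status_code >= 500:
        return "InternalServerError"
    return "APIError"  # fallback for unlisted codes (an assumption)

print(error_name(429))  # RateLimitError
print(error_name(503))  # InternalServerError
```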

Retries and Timeouts

The SDK automatically retries requests 2 times on connection errors, timeouts, rate limits, and 5xx errors. Requests timeout after 1 minute by default. Functions that combine multiple API calls (e.g. client.parsing.parse()) will have larger timeouts by default to account for the multiple requests and polling.

client = LlamaCloud(
    max_retries=0,  # Disable retries (default: 2)
    timeout=30.0,  # 30 second timeout (default: 1 minute)
)
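As a plain-Python illustration of this behavior (not the SDK's implementation), the sketch below retries a request up to 2 times on retryable status codes. The exponential backoff schedule is an assumption; the SDK does not document its exact delays.

```python
import time

# Illustrative only: retry on rate limits and 5xx responses up to
# `max_retries` times, mirroring the documented default of 2 retries.
RETRYABLE = {429} | set(range(500, 600))

def request_with_retries(send, max_retries=2, base_delay=0.05):
    for attempt in range(max_retries + 1):
        status = send()
        if status not in RETRYABLE or attempt == max_retries:
            return status
        time.sleep(base_delay * 2 ** attempt)  # backoff schedule is an assumption

# Fake transport: two retryable failures, then success.
responses = iter([503, 429, 200])
print(request_with_retries(lambda: next(responses)))  # 200
```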

Pagination

List methods support auto-pagination with for loops:

for run in client.extraction.runs.list(
    extraction_agent_id="agent-id",
    limit=20,
):
    print(run)

Or fetch one page at a time:

page = client.extraction.runs.list(extraction_agent_id="agent-id", limit=20)
for run in page.items:
    print(run)

while page.has_next_page():
    page = page.get_next_page()
    for run in page.items:
        print(run)
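Both access patterns work because each page knows how to fetch its successor. A minimal plain-Python sketch of the idea, where Page, fetch_page, and iter_all are hypothetical names, not the SDK's:

```python
# Illustrative only: cursor-based pages supporting both page-at-a-time
# access and auto-iteration over every item.
class Page:
    def __init__(self, items, next_cursor=None):
        self.items = items
        self._next_cursor = next_cursor

    def has_next_page(self):
        return self._next_cursor is not None

    def get_next_page(self):
        return fetch_page(self._next_cursor)

# A tiny in-memory "API": cursor -> page of results.
PAGES = {None: Page([1, 2], "cursor-1"), "cursor-1": Page([3, 4], None)}

def fetch_page(cursor=None):
    return PAGES[cursor]

def iter_all():
    # Auto-pagination: walk pages until none remain, yielding every item.
    page = fetch_page()
    while True:
        yield from page.items
        if not page.has_next_page():
            break
        page = page.get_next_page()

print(list(iter_all()))  # [1, 2, 3, 4]
```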

Logging

Configure logging via the LLAMA_CLOUD_LOG environment variable or the log option:

client = LlamaCloud(
    log="debug",  # "debug" | "info" | "warn" | "error" | "off"
)
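Equivalently, the level can be set in the environment before the process starts:

```shell
# Same effect as passing log="debug" to the client constructor:
export LLAMA_CLOUD_LOG=debug
```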

Requirements

  • Python 3.9+

Contributing

See CONTRIBUTING.md.
