
Llama Cloud Python SDK

The official Python SDK for LlamaParse - the enterprise platform for agentic OCR and document processing.

With this SDK, you can build workflows across the platform's features:

  • Parse - Agentic OCR and parsing for 130+ formats
  • Extract - Structured data extraction with custom schemas
  • Classify - Document categorization with natural-language rules
  • Agents - Deploy document agents as APIs
  • Index - Document ingestion and embedding for RAG

Installation

pip install llama_cloud

Quick Start

import os
from llama_cloud import LlamaCloud

client = LlamaCloud(
    api_key=os.environ.get("LLAMA_CLOUD_API_KEY"),  # This is the default and can be omitted
)

# Parse a document
job = client.parsing.create(
    tier="agentic",
    version="latest",
    file_id="your-file-id",
)

print(job.id)

File Uploads

from pathlib import Path
from llama_cloud import LlamaCloud

client = LlamaCloud()

# Upload using a Path
client.files.create(
    file=Path("/path/to/document.pdf"),
    purpose="parse",
)

# Or using bytes with a tuple of (filename, contents, media_type)
client.files.create(
    file=("document.txt", b"content", "text/plain"),
    purpose="parse",
)
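When uploading raw bytes, the media type for the `(filename, contents, media_type)` tuple can be guessed from the filename with the standard library. A small, SDK-independent sketch (`file_tuple` is a hypothetical helper, not part of the SDK):

```python
import mimetypes
from pathlib import Path

def file_tuple(path: str) -> tuple:
    """Build a (filename, contents, media_type) tuple for upload."""
    p = Path(path)
    media_type, _ = mimetypes.guess_type(p.name)
    # Fall back to a generic binary type when the extension is unknown.
    return (p.name, p.read_bytes(), media_type or "application/octet-stream")

# e.g. mimetypes.guess_type("document.pdf") -> ("application/pdf", None)
```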

Async Usage

import asyncio
from llama_cloud import AsyncLlamaCloud

client = AsyncLlamaCloud()


async def main():
    job = await client.parsing.create(
        tier="agentic",
        version="latest",
        file_id="your-file-id",
    )
    print(job.id)


asyncio.run(main())

MCP Server

Use the Llama Cloud MCP Server to enable AI assistants to interact with the API.

Error Handling

When the API returns a non-success status code, an APIError subclass is raised:

import llama_cloud
from llama_cloud import LlamaCloud

client = LlamaCloud()

try:
    client.pipelines.list(project_id="my-project-id")
except llama_cloud.APIError as e:
    print(e.status_code)  # 400
    print(e.__class__.__name__)  # BadRequestError

Status Code   Error Type
400           BadRequestError
401           AuthenticationError
403           PermissionDeniedError
404           NotFoundError
422           UnprocessableEntityError
429           RateLimitError
>=500         InternalServerError
N/A           APIConnectionError
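Conceptually, this is a status-code lookup. A hypothetical reconstruction of that mapping (the class names mirror the table, not the SDK's actual internals):

```python
# Hypothetical sketch of a status-code -> error-class mapping.
class APIError(Exception):
    def __init__(self, status_code=None):
        self.status_code = status_code

class BadRequestError(APIError): ...
class AuthenticationError(APIError): ...
class PermissionDeniedError(APIError): ...
class NotFoundError(APIError): ...
class UnprocessableEntityError(APIError): ...
class RateLimitError(APIError): ...
class InternalServerError(APIError): ...

_ERRORS = {
    400: BadRequestError,
    401: AuthenticationError,
    403: PermissionDeniedError,
    404: NotFoundError,
    422: UnprocessableEntityError,
    429: RateLimitError,
}

def error_for(status_code: int) -> APIError:
    """Pick the error class for a non-success HTTP status code."""
    if status_code >= 500:
        return InternalServerError(status_code)
    return _ERRORS.get(status_code, APIError)(status_code)
```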

Retries and Timeouts

The SDK automatically retries requests 2 times on connection errors, timeouts, rate limits, and 5xx errors. Requests timeout after 1 minute by default. Functions that combine multiple API calls (e.g. client.parsing.parse()) will have larger timeouts by default to account for the multiple requests and polling.

client = LlamaCloud(
    max_retries=0,  # Disable retries (default: 2)
    timeout=30.0,  # 30 second timeout (default: 1 minute)
)
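The retry behavior described above follows the usual exponential-backoff pattern. A generic sketch of that logic (illustrative only, not the SDK's implementation):

```python
import random
import time

def with_retries(request, max_retries: int = 2, base_delay: float = 0.5):
    """Call `request`, retrying on connection errors with exponential backoff."""
    for attempt in range(max_retries + 1):
        try:
            return request()
        except ConnectionError:
            if attempt == max_retries:
                raise
            # Backoff doubles each attempt (0.5s, 1s, ...) plus a little jitter.
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, 0.1))

# Usage: with_retries(lambda: client.pipelines.list(project_id="my-project-id"))
```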

Pagination

List methods support auto-pagination with for loops:

for run in client.extraction.runs.list(
    extraction_agent_id="agent-id",
    limit=20,
):
    print(run)

Or fetch one page at a time:

page = client.extraction.runs.list(extraction_agent_id="agent-id", limit=20)
for run in page.items:
    print(run)

while page.has_next_page():
    page = page.get_next_page()
    for run in page.items:
        print(run)
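Auto-pagination is just the page-at-a-time loop wrapped in a generator. A generic sketch with a stub page type (`Page` and `iter_all` are hypothetical stand-ins, not the SDK's classes):

```python
from dataclasses import dataclass

@dataclass
class Page:
    """Minimal stand-in for an SDK page object."""
    items: list
    next: "Page | None" = None

    def has_next_page(self) -> bool:
        return self.next is not None

    def get_next_page(self) -> "Page":
        return self.next

def iter_all(page: Page):
    """Yield items from every page, fetching the next page as needed."""
    while True:
        yield from page.items
        if not page.has_next_page():
            return
        page = page.get_next_page()

# Two linked pages stream through as one sequence.
pages = Page([1, 2], Page([3]))
print(list(iter_all(pages)))  # [1, 2, 3]
```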

Logging

Configure logging via the LLAMA_CLOUD_LOG environment variable or the log option:

client = LlamaCloud(
    log="debug",  # "debug" | "info" | "warn" | "error" | "off"
)

Requirements

  • Python 3.9+

Contributing

See CONTRIBUTING.md.
