
Python SDK for the CloudUploader file-upload platform

CloudUploader Python SDK

A production-ready Python SDK for the CloudUploader file upload platform. Upload files to S3, Cloudflare R2, MinIO, Azure Blob, or GCS using presigned URLs — with parallel multipart uploads, automatic retries, and real-time progress tracking.

Quick Start

pip install clouduploader-py

from cloud_uploader import CloudUploader

uploader = CloudUploader(api_key="ck_live_xxx")
result = uploader.upload_file("video.mp4")
print(result.storage_path)
# → r2://my-bucket/ab/cd/1713080000000-a1b2c3-video.mp4

Installation (from source)

cd sdk/pythonSDK
pip install -e .

# With dev dependencies (for running tests):
pip install -e ".[dev]"

Features

| Feature | Details |
| --- | --- |
| Simple API | Two lines to upload any file |
| Multipart uploads | Automatic chunking for large files |
| Parallel uploads | Configurable thread pool (default 5 threads) |
| Retry with backoff | Exponential backoff for transient failures |
| Progress tracking | Real-time callback with bytes uploaded/total |
| Multiple backends | r2, s3, minio, azure, gcs |
| Download | Download files by ID via presigned URLs |
| Type hints | Full type annotations, Python 3.9+ |

Configuration

uploader = CloudUploader(
    api_key="ck_live_xxx",          # Required
    base_url="https://api.myapp.com",  # Default: http://localhost:8080
    timeout=30,                      # HTTP timeout (seconds)
    max_retries=3,                   # Retry attempts for transient errors
    max_parallel_uploads=5,          # Thread pool size for multipart
    chunk_size_override=None,        # Override backend chunk size (bytes)
    storage="r2",                    # Default storage backend
    debug=False,                     # Enable debug logging
)

Upload with Progress

def progress(uploaded: int, total: int) -> None:
    pct = uploaded / total * 100
    print(f"\r{pct:.1f}%", end="", flush=True)

result = uploader.upload_file("large_video.mp4", progress_callback=progress)
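The callback receives only `(uploaded, total)`, so anything fancier is up to you. A sketch of a callback factory that adds a text bar and a rough transfer rate (the bar rendering is illustrative, not part of the SDK):

```python
import time

def make_progress(width: int = 30):
    """Return a (uploaded, total) callback that renders a text progress bar
    with an approximate transfer rate. Purely client-side sugar."""
    start = time.monotonic()

    def progress(uploaded: int, total: int) -> str:
        frac = uploaded / total if total else 1.0
        filled = int(frac * width)
        bar = "#" * filled + "-" * (width - filled)
        elapsed = time.monotonic() - start
        rate = uploaded / elapsed if elapsed > 0 else 0.0
        line = f"[{bar}] {frac * 100:5.1f}% ({rate / 1e6:.1f} MB/s)"
        print(f"\r{line}", end="", flush=True)
        return line

    return progress
```

Usage: `uploader.upload_file("large_video.mp4", progress_callback=make_progress())`.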

Upload to Specific Backend

result = uploader.upload_file("data.csv", storage="s3")

Upload a Folder

# Recursively upload all files in a directory (skips hidden files by default)
result = uploader.upload_folder("./path/to/assets")

print(f"Succeeded: {result.succeeded}/{result.total_files}")
if result.failures:
    print(f"Failed files: {len(result.failures)}")

# You can also use a glob pattern to filter specific files
result = uploader.upload_folder("./path/to/assets", file_filter="*.png")
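To preview which files a folder upload would pick up, you can reproduce the selection rule locally. This is a sketch of the rule described above (recursive, glob filter, hidden files skipped); the SDK's exact matching logic may differ.

```python
from pathlib import Path

def visible_files(root: str, pattern: str = "*") -> list[Path]:
    """List files under root matching a glob pattern, skipping anything
    whose path contains a hidden file or directory. A local approximation
    of upload_folder's documented default behavior."""
    base = Path(root)
    return sorted(
        p
        for p in base.rglob(pattern)
        if p.is_file()
        and not any(part.startswith(".") for part in p.relative_to(base).parts)
    )
```

Running `visible_files("./path/to/assets", "*.png")` shows the candidate set before any bytes leave the machine.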

Download a File

path = uploader.download_file(file_id="file_123", output_path="./downloads/file.jpg")

Error Handling

from cloud_uploader import (
    CloudUploaderError,
    AuthenticationError,
    UploadInitError,
    UploadFailedError,
)

try:
    result = uploader.upload_file("file.pdf")
except AuthenticationError:
    print("Invalid API key")
except UploadInitError as e:
    print(f"Backend rejected upload: {e.error_code}")
except UploadFailedError as e:
    print(f"Failed parts: {e.failed_parts}")
except CloudUploaderError as e:
    print(f"Error: {e.message} (HTTP {e.status_code})")
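Because `UploadInitError` signals a backend-side rejection, one pattern is to retry the same file on a different backend. A hedged sketch (`upload_fn` stands in for `uploader.upload_file`; real code would catch `UploadInitError`/`UploadFailedError` rather than bare `Exception`, and re-raise `AuthenticationError` immediately since switching backends cannot fix a bad key):

```python
def upload_with_fallback(upload_fn, path, backends=("r2", "s3")):
    """Try each backend in turn and return the first successful result.

    upload_fn stands in for uploader.upload_file. Catching bare Exception
    here keeps the sketch self-contained; use the SDK's exception types
    in real code.
    """
    last_err = None
    for backend in backends:
        try:
            return upload_fn(path, storage=backend)
        except Exception as err:
            last_err = err
    raise last_err
```

The backend order (`"r2"` then `"s3"`) is an arbitrary example, not an SDK default.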

Check Upload Status

status = uploader.get_upload_status("up_abc123")
print(status)
# {'success': True, 'upload_id': '...', 'status': 'completed', ...}
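Since the status dict carries a `'status'` field, a simple poller can wait for a terminal state. A sketch: `'completed'` comes from the example above, while `'failed'` as the other terminal value is an assumption about the API.

```python
import time

def wait_for_upload(get_status, upload_id, timeout=300.0, interval=2.0,
                    sleep=time.sleep):
    """Poll get_status (e.g. uploader.get_upload_status) until the upload
    reaches a terminal state, or raise TimeoutError at the deadline."""
    deadline = time.monotonic() + timeout
    while True:
        status = get_status(upload_id)
        if status.get("status") in ("completed", "failed"):
            return status
        if time.monotonic() >= deadline:
            raise TimeoutError(f"upload {upload_id} still pending after {timeout}s")
        sleep(interval)
```

Call it as `wait_for_upload(uploader.get_upload_status, "up_abc123")`.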

Abort an Upload

uploader.abort_upload("up_abc123")
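If you drive a long upload from your own code, you can pair it with abort so a crash does not leave a dangling multipart session. A sketch only: `abort_fn` stands in for `uploader.abort_upload`, and obtaining the upload id mid-flight depends on the SDK surface.

```python
from contextlib import contextmanager

@contextmanager
def abort_on_failure(abort_fn, upload_id):
    """Abort a pending upload if the wrapped block raises, then re-raise.
    abort_fn stands in for uploader.abort_upload."""
    try:
        yield
    except BaseException:
        abort_fn(upload_id)
        raise
```

Used as `with abort_on_failure(uploader.abort_upload, "up_abc123"): ...` around work that depends on the upload.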

Architecture

cloud_uploader/
├── __init__.py        # Public API re-exports
├── client.py          # CloudUploader — main user-facing class
├── uploader.py        # UploadOrchestrator — direct vs multipart routing
├── multipart.py       # Parallel multipart engine (ThreadPoolExecutor)
├── http_client.py     # HTTP transport with retry + auth
├── utils.py           # MIME types, file validation, formatting
└── exceptions.py      # Exception hierarchy

To test with your running backend:

cd sdk/pythonSDK
source .venv/bin/activate
CLOUD_UPLOADER_API_KEY=your_key python examples/basic_upload.py path/to/file --progress

Usage is exactly as specified:

from cloud_uploader import CloudUploader

uploader = CloudUploader(api_key="ck_live_xxx")
result = uploader.upload_file("video.mp4")

License

MIT


