Python SDK for the CloudUploader file-upload platform

CloudUploader Python SDK

PyPI version · License: MIT · GitHub: clouduploader/clouduploader-py

A production-ready Python SDK for the CloudUploader file upload platform. Upload files to S3, Cloudflare R2, MinIO, Azure Blob, or GCS using presigned URLs — with parallel multipart uploads, automatic retries, and real-time progress tracking.

Quick Start

pip install clouduploader-py
from cloud_uploader import CloudUploader

uploader = CloudUploader(api_key="ck_live_xxx")
result = uploader.upload_file("video.mp4")
print(result.storage_path)
# → r2://my-bucket/ab/cd/1713080000000-a1b2c3-video.mp4

Installation (from source)

For development or building from source:

git clone https://github.com/CloudUploader/clouduploader-py.git
cd clouduploader-py
pip install -e .

# With dev dependencies (for running tests):
pip install -e ".[dev]"

Features

| Feature            | Details                                       |
|--------------------|-----------------------------------------------|
| Simple API         | Two lines to upload any file                  |
| Multipart uploads  | Automatic chunking for large files            |
| Parallel uploads   | Configurable thread pool (default 5 threads)  |
| Retry with backoff | Exponential backoff for transient failures    |
| Progress tracking  | Real-time callback with bytes uploaded/total  |
| Multiple backends  | r2, s3, minio, azure, gcs                     |
| Download           | Download files by ID via presigned URLs       |
| Type hints         | Full type annotations, Python 3.9+            |
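The "retry with backoff" row can be sketched as exponential backoff with full jitter. The helper below is illustrative only, not the SDK's internal implementation; in particular, which exceptions count as transient is an assumption:

```python
import random
import time

def backoff_delays(max_retries: int = 3, base: float = 0.5, cap: float = 8.0):
    """Yield up to max_retries delays: min(cap, base * 2**attempt) with full jitter."""
    for attempt in range(max_retries):
        yield min(cap, base * (2 ** attempt)) * random.random()

def with_retries(func, max_retries: int = 3, base: float = 0.5):
    """Call func, retrying transient failures with exponential backoff."""
    last_exc = None
    for delay in backoff_delays(max_retries, base):
        try:
            return func()
        except ConnectionError as exc:  # assumed transient error type
            last_exc = exc
            time.sleep(delay)
    raise last_exc
```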

Configuration

uploader = CloudUploader(
    api_key="ck_live_xxx",          # Required
    base_url="https://api.myapp.com",  # Default: http://localhost:8080
    timeout=30,                      # HTTP timeout (seconds)
    max_retries=3,                   # Retry attempts for transient errors
    max_parallel_uploads=5,          # Thread pool size for multipart
    chunk_size_override=None,        # Override backend chunk size (bytes)
    storage="r2",                    # Default storage backend
    debug=False,                     # Enable debug logging
)

Upload with Progress

def progress(uploaded: int, total: int) -> None:
    pct = uploaded / total * 100
    print(f"\r{pct:.1f}%", end="", flush=True)

result = uploader.upload_file("large_video.mp4", progress_callback=progress)
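Because the callback receives cumulative bytes, you can derive throughput and ETA yourself. A hypothetical helper (the name `format_progress` and its output format are not part of the SDK):

```python
def format_progress(uploaded: int, total: int, elapsed: float) -> str:
    """Render percent complete, throughput, and ETA from cumulative bytes.

    `elapsed` is seconds since the upload started (e.g. via time.monotonic()).
    """
    rate = uploaded / elapsed if elapsed > 0 else 0.0              # bytes/s
    eta = (total - uploaded) / rate if rate > 0 else float("inf")  # seconds
    pct = uploaded / total * 100
    return f"{pct:5.1f}%  {rate / 1e6:6.2f} MB/s  ETA {eta:6.1f}s"
```

Capture a start time in a closure and call this from your `progress_callback` to print a richer status line.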

Upload to Specific Backend

result = uploader.upload_file("data.csv", storage="s3")

Upload a Folder

# Recursively upload all files in a directory (skips hidden files by default)
result = uploader.upload_folder("./path/to/assets")

print(f"Succeeded: {result.succeeded}/{result.total_files}")
if result.failures:
    print(f"Failed files: {len(result.failures)}")

# You can also use a glob pattern to filter specific files
result = uploader.upload_folder("./path/to/assets", file_filter="*.png")

Download a File

path = uploader.download_file(file_id="file_123", output_path="./downloads/file.jpg")

Error Handling

from cloud_uploader import (
    CloudUploaderError,
    AuthenticationError,
    UploadInitError,
    UploadFailedError,
)

try:
    result = uploader.upload_file("file.pdf")
except AuthenticationError:
    print("Invalid API key")
except UploadInitError as e:
    print(f"Backend rejected upload: {e.error_code}")
except UploadFailedError as e:
    print(f"Failed parts: {e.failed_parts}")
except CloudUploaderError as e:
    print(f"Error: {e.message} (HTTP {e.status_code})")

Check Upload Status

status = uploader.get_upload_status("up_abc123")
print(status)
# {'success': True, 'upload_id': '...', 'status': 'completed', ...}
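A simple polling loop can be built on top of this call. The sketch below assumes the terminal status values beyond 'completed' (namely 'failed' and 'aborted'); check the status dicts your backend actually returns:

```python
import time

def wait_until_complete(get_status, upload_id: str,
                        poll_interval: float = 1.0, timeout: float = 300.0) -> dict:
    """Poll until the upload reaches a terminal state or the timeout expires.

    `get_status` is a callable like uploader.get_upload_status; passing it in
    keeps this helper testable without a live backend.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status(upload_id)
        if status.get("status") in ("completed", "failed", "aborted"):
            return status
        time.sleep(poll_interval)
    raise TimeoutError(f"upload {upload_id!r} did not finish within {timeout}s")
```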

Abort an Upload

uploader.abort_upload("up_abc123")

Architecture

cloud_uploader/
├── __init__.py        # Public API re-exports
├── client.py          # CloudUploader — main user-facing class
├── uploader.py        # UploadOrchestrator — direct vs multipart routing
├── multipart.py       # Parallel multipart engine (ThreadPoolExecutor)
├── http_client.py     # HTTP transport with retry + auth
├── utils.py           # MIME types, file validation, formatting
└── exceptions.py      # Exception hierarchy
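As a rough sketch of the routing decision made in uploader.py (the 16 MiB direct-upload cutoff and 8 MiB part size below are illustrative assumptions, not the SDK's real values):

```python
DIRECT_UPLOAD_LIMIT = 16 * 1024 * 1024   # assumed cutoff for single-request uploads
DEFAULT_CHUNK_SIZE = 8 * 1024 * 1024     # assumed multipart part size

def plan_upload(file_size: int, chunk_size: int = DEFAULT_CHUNK_SIZE):
    """Return ('direct', 1) for small files, else ('multipart', n_parts)."""
    if file_size <= DIRECT_UPLOAD_LIMIT:
        return ("direct", 1)
    n_parts = -(-file_size // chunk_size)  # ceiling division
    return ("multipart", n_parts)
```

Small files go up in one presigned-URL PUT; larger files are split into parts that the multipart engine uploads in parallel.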

Development & Testing

Run Tests

pip install -e ".[dev]"
pytest

Test with Your Backend

source .venv/bin/activate
CLOUD_UPLOADER_API_KEY=your_key python examples/basic_upload.py path/to/file --progress

Contributing

We welcome contributions! Please see our GitHub repository for:

  • Issue tracking
  • Pull request guidelines
  • Development setup

Support

For issues, questions, or feedback, please open an issue on the GitHub repository.

License

MIT License - See LICENSE file for details
