
CloudUploader Python SDK

PyPI version License: MIT ![GitHub](https://img.shields.io/badge/github-clouduploader%2Fclouduploader--py-blue)

A production-ready Python SDK for the CloudUploader file upload platform. Upload files to S3, Cloudflare R2, MinIO, Azure Blob, or GCS using presigned URLs — with parallel multipart uploads, automatic retries, and real-time progress tracking.

Quick Start

pip install clouduploader-py

from cloud_uploader import CloudUploader

uploader = CloudUploader(api_key="ck_live_xxx")
result = uploader.upload_file("video.mp4")
print(result.storage_path)
# → r2://my-bucket/ab/cd/1713080000000-a1b2c3-video.mp4
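The returned `storage_path` follows a `scheme://bucket/key` layout. As an illustration (the helper below is our own, not part of the SDK), you can split it with the standard library:

```python
from urllib.parse import urlparse

def parse_storage_path(path: str) -> dict:
    # Hypothetical helper: split a storage path like
    # "r2://my-bucket/ab/cd/...-video.mp4" into backend, bucket, and key.
    parts = urlparse(path)
    return {"backend": parts.scheme, "bucket": parts.netloc, "key": parts.path.lstrip("/")}

print(parse_storage_path("r2://my-bucket/ab/cd/1713080000000-a1b2c3-video.mp4"))
# → {'backend': 'r2', 'bucket': 'my-bucket', 'key': 'ab/cd/1713080000000-a1b2c3-video.mp4'}
```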

Installation (from source)

For development or building from source:

git clone https://github.com/CloudUploader/clouduploader-py.git
cd clouduploader-py
pip install -e .

# With dev dependencies (for running tests):
pip install -e ".[dev]"

Features

| Feature | Details |
| --- | --- |
| Simple API | Two lines to upload any file |
| Multipart uploads | Automatic chunking for large files |
| Parallel uploads | Configurable thread pool (default 5 threads) |
| Retry with backoff | Exponential backoff for transient failures |
| Progress tracking | Real-time callback with bytes uploaded/total |
| Multiple backends | r2, s3, minio, azure, gcs |
| Download | Download files by ID via presigned URLs |
| Type hints | Full type annotations, Python 3.9+ |
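To get a feel for multipart chunking, here is a back-of-the-envelope sketch (`plan_parts` is illustrative, not an SDK function) of how many parts a file splits into for a given chunk size:

```python
import math

def plan_parts(file_size: int, chunk_size: int) -> int:
    # Illustrative only: number of chunks a multipart upload would need.
    return max(1, math.ceil(file_size / chunk_size))

# A 1.5 GB file with 100 MiB chunks:
print(plan_parts(1_500_000_000, 100 * 1024 * 1024))  # → 15
```

With the default thread pool of 5, those parts would be uploaded up to 5 at a time.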

Configuration

uploader = CloudUploader(
    api_key="ck_live_xxx",          # Required
    base_url="https://api.myapp.com",  # Default: http://localhost:8080
    timeout=30,                      # HTTP timeout (seconds)
    max_retries=3,                   # Retry attempts for transient errors
    max_parallel_uploads=5,          # Thread pool size for multipart
    chunk_size_override=None,        # Override backend chunk size (bytes)
    storage="r2",                    # Default storage backend
    debug=False,                     # Enable debug logging
)
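The `max_retries` setting pairs with exponential backoff. The exact delays are an implementation detail of the SDK; a typical schedule looks like the sketch below (the `base` and `cap` values are assumptions for illustration):

```python
def backoff_delay(attempt: int, base: float = 0.5, cap: float = 30.0) -> float:
    # Delay before retry N: base * 2^N, capped (illustrative values).
    return min(cap, base * (2 ** attempt))

print([backoff_delay(n) for n in range(4)])  # → [0.5, 1.0, 2.0, 4.0]
```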

Upload with Progress

def progress(uploaded: int, total: int) -> None:
    pct = uploaded / total * 100
    print(f"\r{pct:.1f}%", end="", flush=True)

result = uploader.upload_file("large_video.mp4", progress_callback=progress)
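Since `progress_callback` receives plain byte counts, you can render them however you like. A minimal text progress bar (the `format_bar` helper is our own, not part of the SDK):

```python
def format_bar(uploaded: int, total: int, width: int = 20) -> str:
    # Render bytes-uploaded/total as a fixed-width ASCII bar.
    filled = int(width * uploaded / total)
    return "[" + "#" * filled + "-" * (width - filled) + f"] {uploaded / total:.1%}"

def progress(uploaded: int, total: int) -> None:
    print(f"\r{format_bar(uploaded, total)}", end="", flush=True)

print(format_bar(52_428_800, 104_857_600))  # → [##########----------] 50.0%
```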

Upload to Specific Backend

result = uploader.upload_file("data.csv", storage="s3")

Upload a Folder

# Recursively upload all files in a directory (skips hidden files by default)
result = uploader.upload_folder("./path/to/assets")

print(f"Succeeded: {result.succeeded}/{result.total_files}")
if result.failures:
    print(f"Failed files: {len(result.failures)}")

# You can also use a glob pattern to filter specific files
result = uploader.upload_folder("./path/to/assets", file_filter="*.png")
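For intuition about which files `upload_folder` picks up, here is a rough stand-in (not the SDK's actual implementation) for "recursive, glob-filtered, hidden files skipped":

```python
from pathlib import Path

def collect_files(root: str, file_filter: str = "*") -> list:
    # Approximation of upload_folder's selection: recurse under root,
    # apply the glob, and skip any dot-prefixed file or directory.
    root_path = Path(root)
    return sorted(
        p for p in root_path.rglob(file_filter)
        if p.is_file()
        and not any(part.startswith(".") for part in p.relative_to(root_path).parts)
    )
```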

Download a File

path = uploader.download_file(file_id="file_123", output_path="./downloads/file.jpg")

Error Handling

from cloud_uploader import (
    CloudUploaderError,
    AuthenticationError,
    UploadInitError,
    UploadFailedError,
)

try:
    result = uploader.upload_file("file.pdf")
except AuthenticationError:
    print("Invalid API key")
except UploadInitError as e:
    print(f"Backend rejected upload: {e.error_code}")
except UploadFailedError as e:
    print(f"Failed parts: {e.failed_parts}")
except CloudUploaderError as e:
    print(f"Error: {e.message} (HTTP {e.status_code})")
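The SDK already retries transient HTTP failures internally; if you also want whole-upload retries at the application level, a generic wrapper like this works (`with_retries` is our own sketch, not an SDK API):

```python
import time

def with_retries(fn, retriable_exceptions, max_retries: int = 3, base_delay: float = 0.5):
    # Call fn(); on a retriable exception, back off exponentially and try again.
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except retriable_exceptions:
            if attempt == max_retries:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Usage (with the exceptions imported above):
# result = with_retries(lambda: uploader.upload_file("file.pdf"), (UploadFailedError,))
```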

Check Upload Status

status = uploader.get_upload_status("up_abc123")
print(status)
# {'success': True, 'upload_id': '...', 'status': 'completed', ...}
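Because `get_upload_status` returns a plain dict, a simple polling loop is enough to wait for a terminal state. The helper below is illustrative (the set of terminal status strings is an assumption):

```python
import time

def wait_for_upload(get_status, interval: float = 2.0, timeout: float = 300.0) -> dict:
    # Poll a status-returning callable until the upload reaches a terminal state.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status.get("status") in ("completed", "failed", "aborted"):
            return status
        time.sleep(interval)
    raise TimeoutError("upload did not reach a terminal state in time")

# Usage:
# final = wait_for_upload(lambda: uploader.get_upload_status("up_abc123"))
```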

Abort an Upload

uploader.abort_upload("up_abc123")

Architecture

cloud_uploader/
├── __init__.py        # Public API re-exports
├── client.py          # CloudUploader — main user-facing class
├── uploader.py        # UploadOrchestrator — direct vs multipart routing
├── multipart.py       # Parallel multipart engine (ThreadPoolExecutor)
├── http_client.py     # HTTP transport with retry + auth
├── utils.py           # MIME types, file validation, formatting
└── exceptions.py      # Exception hierarchy

Development & Testing

Run Tests

pip install -e ".[dev]"
pytest

Test with Your Backend

source .venv/bin/activate
CLOUD_UPLOADER_API_KEY=your_key python examples/basic_upload.py path/to/file --progress

Contributing

We welcome contributions! Please see our GitHub repository for:

  • Issue tracking
  • Pull request guidelines
  • Development setup

Support

For issues, questions, or feedback, please open an issue on the GitHub repository.

License

MIT License. See the LICENSE file for details.
