# CloudUploader Python SDK

A production-ready Python SDK for the CloudUploader file upload platform. Upload files to S3, Cloudflare R2, MinIO, Azure Blob, or GCS using presigned URLs — with parallel multipart uploads, automatic retries, and real-time progress tracking.
## Quick Start

```bash
pip install clouduploader-py
```

```python
from cloud_uploader import CloudUploader

uploader = CloudUploader(api_key="ck_live_xxx")
result = uploader.upload_file("video.mp4")
print(result.storage_path)
# → r2://my-bucket/ab/cd/1713080000000-a1b2c3-video.mp4
```
## Installation (from source)

For development or building from source:

```bash
git clone https://github.com/CloudUploader/clouduploader-py.git
cd clouduploader-py
pip install -e .

# With dev dependencies (for running tests):
pip install -e ".[dev]"
```
## Features

| Feature | Details |
|---|---|
| Simple API | Two lines to upload any file |
| Multipart uploads | Automatic chunking for large files |
| Parallel uploads | Configurable thread pool (default 5 threads) |
| Retry with backoff | Exponential backoff for transient failures |
| Progress tracking | Real-time callback with bytes uploaded/total |
| Multiple backends | r2, s3, minio, azure, gcs |
| Download | Download files by ID via presigned URLs |
| Type hints | Full type annotations, Python 3.9+ |
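The retry schedule itself isn't spelled out above. As a rough sketch of the idea (the base delay and cap here are illustrative assumptions, not the SDK's actual values), exponential backoff doubles the wait before each successive attempt:

```python
def backoff_delays(max_retries: int = 3, base: float = 0.5, cap: float = 8.0) -> list[float]:
    """Delay (seconds) before each retry: base, 2*base, 4*base, ... capped at `cap`."""
    return [min(cap, base * (2 ** attempt)) for attempt in range(max_retries)]

print(backoff_delays())
# → [0.5, 1.0, 2.0]
```

Production retry loops usually add random jitter on top of a schedule like this so concurrent clients don't retry in lockstep.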
## Configuration

```python
uploader = CloudUploader(
    api_key="ck_live_xxx",             # Required
    base_url="https://api.myapp.com",  # Default: http://localhost:8080
    timeout=30,                        # HTTP timeout (seconds)
    max_retries=3,                     # Retry attempts for transient errors
    max_parallel_uploads=5,            # Thread pool size for multipart
    chunk_size_override=None,          # Override backend chunk size (bytes)
    storage="r2",                      # Default storage backend
    debug=False,                       # Enable debug logging
)
```
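If you'd rather not hard-code credentials, the same kwargs can be assembled from the environment. `CLOUD_UPLOADER_API_KEY` matches the variable used in the Development & Testing section; the helper itself and the other variable names are assumptions, not SDK API:

```python
import os

def uploader_kwargs_from_env() -> dict:
    """Hypothetical helper: build CloudUploader(**kwargs) from environment variables."""
    return {
        "api_key": os.environ["CLOUD_UPLOADER_API_KEY"],  # required; KeyError if unset
        "base_url": os.environ.get("CLOUD_UPLOADER_BASE_URL", "http://localhost:8080"),
        "debug": os.environ.get("CLOUD_UPLOADER_DEBUG", "0") == "1",
    }

os.environ.setdefault("CLOUD_UPLOADER_API_KEY", "ck_test_xxx")  # for demonstration only
print(uploader_kwargs_from_env()["base_url"])
```

You would then construct the client with `CloudUploader(**uploader_kwargs_from_env())`.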
## Upload with Progress

```python
def progress(uploaded: int, total: int) -> None:
    pct = uploaded / total * 100
    print(f"\r{pct:.1f}%", end="", flush=True)

result = uploader.upload_file("large_video.mp4", progress_callback=progress)
```
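Because the callback receives raw byte counts, it composes with any renderer you like. A tiny text-bar renderer, for example (this helper is hypothetical, not part of the SDK):

```python
def render_bar(uploaded: int, total: int, width: int = 20) -> str:
    """Render a '[####----] 50.0%'-style bar from byte counts."""
    filled = int(width * uploaded / total)
    return "[" + "#" * filled + "-" * (width - filled) + f"] {uploaded / total:6.1%}"

print(render_bar(512, 1024))
# → [##########----------]  50.0%
```

Passing `lambda u, t: print("\r" + render_bar(u, t), end="", flush=True)` as `progress_callback` gives an in-place progress bar.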
## Upload to Specific Backend

```python
result = uploader.upload_file("data.csv", storage="s3")
```
## Upload a Folder

```python
# Recursively upload all files in a directory (skips hidden files by default)
result = uploader.upload_folder("./path/to/assets")
print(f"Succeeded: {result.succeeded}/{result.total_files}")
if result.failures:
    print(f"Failed files: {len(result.failures)}")

# You can also use a glob pattern to filter specific files
result = uploader.upload_folder("./path/to/assets", file_filter="*.png")
```
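The exact glob semantics aren't documented above; one reasonable reading (an assumption: `fnmatch`-style matching against the file name, with hidden files skipped first) behaves like this:

```python
from fnmatch import fnmatch

def matches(path: str, pattern: str) -> bool:
    """Sketch of the per-file check upload_folder might apply (assumed semantics)."""
    name = path.rsplit("/", 1)[-1]
    return not name.startswith(".") and fnmatch(name, pattern)

files = ["logo.png", "notes.txt", ".DS_Store", "icons/app.png"]
print([f for f in files if matches(f, "*.png")])
# → ['logo.png', 'icons/app.png']
```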
## Download a File

```python
path = uploader.download_file(file_id="file_123", output_path="./downloads/file.jpg")
```
## Error Handling

```python
from cloud_uploader import (
    CloudUploaderError,
    AuthenticationError,
    UploadInitError,
    UploadFailedError,
)

try:
    result = uploader.upload_file("file.pdf")
except AuthenticationError:
    print("Invalid API key")
except UploadInitError as e:
    print(f"Backend rejected upload: {e.error_code}")
except UploadFailedError as e:
    print(f"Failed parts: {e.failed_parts}")
except CloudUploaderError as e:
    print(f"Error: {e.message} (HTTP {e.status_code})")
```
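Some of these exceptions are worth retrying (a transient `UploadFailedError`), others are not (`AuthenticationError`). A generic wrapper along these lines (the helper is not part of the SDK) keeps that policy in one place:

```python
import time

def with_retries(fn, retryable: tuple, max_retries: int = 3, base: float = 0.01):
    """Retry fn() on the given exception types, with exponential backoff between attempts."""
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except retryable:
            if attempt == max_retries:
                raise
            time.sleep(base * (2 ** attempt))

# Demo with a stand-in flaky operation; ConnectionError plays the transient role here.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(with_retries(flaky, (ConnectionError,)))
# → ok
```

With the real SDK you would pass `(UploadFailedError,)` as the retryable tuple and wrap the `uploader.upload_file(...)` call in the closure.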
## Check Upload Status

```python
status = uploader.get_upload_status("up_abc123")
print(status)
# {'success': True, 'upload_id': '...', 'status': 'completed', ...}
```
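For long multipart uploads you may want to poll until the upload reaches a terminal state. A sketch of such a poller (the set of terminal state names is an assumption based on the example output above):

```python
import time

def wait_for_completion(get_status, poll_interval: float = 0.01, timeout: float = 5.0) -> dict:
    """Call get_status() repeatedly until the upload reaches a terminal state, or time out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status["status"] in ("completed", "failed", "aborted"):
            return status
        time.sleep(poll_interval)
    raise TimeoutError("upload did not reach a terminal state in time")

# Simulated status sequence standing in for: lambda: uploader.get_upload_status("up_abc123")
states = iter([{"status": "uploading"}, {"status": "uploading"}, {"status": "completed"}])
print(wait_for_completion(lambda: next(states))["status"])
# → completed
```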
## Abort an Upload

```python
uploader.abort_upload("up_abc123")
```
## Architecture

```text
cloud_uploader/
├── __init__.py      # Public API re-exports
├── client.py        # CloudUploader — main user-facing class
├── uploader.py      # UploadOrchestrator — direct vs multipart routing
├── multipart.py     # Parallel multipart engine (ThreadPoolExecutor)
├── http_client.py   # HTTP transport with retry + auth
├── utils.py         # MIME types, file validation, formatting
└── exceptions.py    # Exception hierarchy
```
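The direct-vs-multipart routing in `uploader.py` presumably hinges on file size relative to the backend chunk size. An illustrative sketch of that decision (the 8 MiB chunk size is an assumption, not the SDK's real threshold):

```python
def plan_upload(size_bytes: int, chunk_size: int = 8 * 1024 * 1024) -> dict:
    """Route small files to a single direct PUT, larger ones to chunked multipart."""
    if size_bytes <= chunk_size:
        return {"mode": "direct", "parts": 1}
    return {"mode": "multipart", "parts": -(-size_bytes // chunk_size)}  # ceil division

print(plan_upload(3 * 1024 * 1024))    # → {'mode': 'direct', 'parts': 1}
print(plan_upload(100 * 1024 * 1024))  # → {'mode': 'multipart', 'parts': 13}
```

Each multipart part can then be handed to the thread pool (`max_parallel_uploads` workers) and PUT against its own presigned URL.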
## Development & Testing

### Run Tests

```bash
pip install -e ".[dev]"
pytest
```

### Test with Your Backend

```bash
source .venv/bin/activate
CLOUD_UPLOADER_API_KEY=your_key python examples/basic_upload.py path/to/file --progress
```
## Contributing

We welcome contributions! Please see our GitHub repository for:
- Issue tracking
- Pull request guidelines
- Development setup
## Support

For issues, questions, or feedback:
- 📧 Email: support@clouduploader.io
- 🐛 Issues: GitHub Issues
- 📚 Docs: CloudUploader Documentation
## License

MIT License. See the LICENSE file for details.
## Download files
### clouduploader_py-0.1.2.tar.gz (source distribution)

- Size: 20.2 kB
- Uploaded via: twine/6.2.0 CPython/3.11.15
- Uploaded using Trusted Publishing? No

| Algorithm | Hash digest |
|---|---|
| SHA256 | `7717f029e4173f081cdbef9ac1abb83ffd2dcd27d2c692932bbcf6b10cd37928` |
| MD5 | `dca79e81f1ee239ca1b43194089f7bae` |
| BLAKE2b-256 | `ac61e230e70e3045ab9db00ef2f2025dabd3830032a3b4a4d667b941e588b229` |
### clouduploader_py-0.1.2-py3-none-any.whl (built distribution, Python 3)

- Size: 17.5 kB
- Uploaded via: twine/6.2.0 CPython/3.11.15
- Uploaded using Trusted Publishing? No

| Algorithm | Hash digest |
|---|---|
| SHA256 | `afdc88c349334c1fe4eb6e1611d7ccbacf30fc02c992e811f39425e10d643ccf` |
| MD5 | `7450d885bce4cb5f9e8c667022025aff` |
| BLAKE2b-256 | `60b1f2125d844e672afd5418499a2d283cc23a28ca3c4b47ea8adb415cfb253a` |