A versatile Python package for uploading files to multiple storage backends including local, S3, Azure ADLS, and Blob storage. Supports ZIP file extraction and multiple file uploads.
File Uploader

A Python package for seamlessly uploading files to various storage backends (Local, S3, Azure Blob, ADLS) with job tracking, error logging, and comprehensive file management capabilities.

Features

  • Multiple storage backend support:
    • Local filesystem
    • AWS S3
    • Azure Blob Storage
    • Azure Data Lake Storage (ADLS)
  • Robust job tracking with SQLite database
  • Detailed error logging and exception handling
  • Intelligent folder structure with date-based organization
  • Advanced ZIP file support with safety checks:
    • File size validation (100MB per file limit)
    • Total uncompressed size limit (200MB)
    • Zip file integrity verification
  • Custom date handling for file organization
  • Comprehensive error handling and custom exceptions
  • Support for file metadata tracking
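The ZIP safety checks listed above can be sketched with the standard-library `zipfile` module. This is a minimal illustration, not the package's actual implementation; only the 100MB / 200MB limit values come from the documentation:

```python
import zipfile

MAX_FILE_SIZE = 100 * 1024 * 1024   # documented 100MB per-file limit
MAX_TOTAL_SIZE = 200 * 1024 * 1024  # documented 200MB total uncompressed limit

def validate_zip(path):
    """Raise ValueError if the archive fails any of the safety checks."""
    with zipfile.ZipFile(path) as zf:
        # Integrity check: testzip() returns the first corrupt member, or None.
        if zf.testzip() is not None:
            raise ValueError("corrupt member in archive")
        total = 0
        for info in zf.infolist():
            if info.file_size > MAX_FILE_SIZE:
                raise ValueError(f"{info.filename} exceeds the per-file limit")
            total += info.file_size
        if total > MAX_TOTAL_SIZE:
            raise ValueError("total uncompressed size exceeds the limit")
```

Checking uncompressed sizes from `infolist()` (rather than the archive's on-disk size) is what guards against zip bombs, where a small archive expands to a huge payload.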

Installation

pip install pyspfileuploader

Configuration

The package uses environment variables for configuration. Create a .env file with the required variables based on your chosen storage type:

Local Storage

STORAGE_TYPE=local
BASE_PATH=/path/to/storage

AWS S3

STORAGE_TYPE=s3
AWS_BUCKET_NAME=your-bucket
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
REGION_NAME=us-east-1  # optional

Azure Blob Storage

STORAGE_TYPE=blob
BLOB_CONNECTION_STRING=your-connection-string
BLOB_CONTAINER_NAME=your-container

Azure Data Lake Storage

STORAGE_TYPE=adls
ADLS_ACCOUNT_NAME=your-account
ADLS_FILE_SYSTEM_NAME=your-filesystem
ADLS_CREDENTIAL=your-credential
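Across the four backends the pattern is the same: `STORAGE_TYPE` selects the backend, and that backend's variables must be present. A hedged sketch of such a check (the variable names come from the tables above; the helper itself is hypothetical, not part of the package's API):

```python
import os

# Required variables per backend, as documented above.
REQUIRED = {
    "local": ["BASE_PATH"],
    "s3": ["AWS_BUCKET_NAME", "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY"],
    "blob": ["BLOB_CONNECTION_STRING", "BLOB_CONTAINER_NAME"],
    "adls": ["ADLS_ACCOUNT_NAME", "ADLS_FILE_SYSTEM_NAME", "ADLS_CREDENTIAL"],
}

def check_config(env=os.environ):
    """Return the configured storage type, raising if required variables are missing."""
    storage_type = env.get("STORAGE_TYPE", "local")
    missing = [var for var in REQUIRED.get(storage_type, []) if var not in env]
    if missing:
        raise EnvironmentError(f"missing variables for {storage_type}: {missing}")
    return storage_type
```

Running such a check once at startup surfaces a misconfigured `.env` immediately, rather than midway through an upload.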

Usage

Command Line

pyspfileuploader /path/to/file [job_id] [storage_type] [created_date]

Python API

from file_uploader import upload_file, display_job_details

# Simple upload with automatic job ID generation
job_id = upload_file("/path/to/file")

# Upload with custom job ID and storage type
job_id = upload_file("/path/to/file", job_id="CUSTOM_JOB_ID_001", storage_type="s3")

# Upload with specific created date
job_id = upload_file("/path/to/file", created_date="14-07-2025")

# Upload ZIP file (will be automatically extracted and validated)
job_id = upload_file("/path/to/archive.zip")

# Check the status of an upload (output format defaults to JSON;
# pass output_format='dataframe' for a tabular view)
display_job_details(job_id=job_id, output_format='dataframe')
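The `created_date` argument above uses DD-MM-YYYY. How the package maps that date into its date-based folder structure is not specified here, so the layout below is purely illustrative, just to show the kind of prefix such a date could produce:

```python
from datetime import datetime

def date_folder(created_date=None):
    """Map a DD-MM-YYYY string (default: today) to a year/month/day prefix.

    Illustrative only -- the package's real folder layout may differ.
    """
    d = datetime.strptime(created_date, "%d-%m-%Y") if created_date else datetime.now()
    return d.strftime("%Y/%m/%d")
```

A year/month/day prefix like this keeps listings manageable and lets lifecycle rules on S3 or Azure target whole date ranges.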

Storage Backend Selection

The storage backend is determined by the STORAGE_TYPE environment variable. If not specified, it defaults to "local" storage. You can also override the storage type per upload using the storage_type parameter.
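The precedence described above (per-upload parameter, then environment variable, then the `"local"` default) boils down to a one-liner; this sketch names the helper hypothetically, since the package resolves this internally:

```python
import os

def resolve_storage_type(override=None):
    """Per-upload override wins; otherwise STORAGE_TYPE; otherwise 'local'."""
    return override or os.environ.get("STORAGE_TYPE", "local")
```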

File Size Limits

  • Individual file size limit: 100MB
  • Total uncompressed size limit for ZIP files: 200MB

Job Tracking

Each upload operation is assigned a unique job ID, which can be used to:

  • Track the upload status
  • Retrieve file metadata
  • Access uploaded file locations
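The package stores this tracking data in SQLite; its schema is not documented here, so the table below is a hypothetical minimal version, only meant to show how a job ID supports the lookups listed above:

```python
import sqlite3
import uuid

# Hypothetical minimal schema -- the package's real table will differ.
SCHEMA = """CREATE TABLE IF NOT EXISTS jobs (
    job_id TEXT PRIMARY KEY,
    file_path TEXT,
    storage_type TEXT,
    status TEXT
)"""

def record_job(conn, file_path, storage_type):
    """Insert a job row and return its generated job ID."""
    job_id = uuid.uuid4().hex
    conn.execute(SCHEMA)
    conn.execute(
        "INSERT INTO jobs VALUES (?, ?, ?, ?)",
        (job_id, file_path, storage_type, "uploaded"),
    )
    return job_id

def job_status(conn, job_id):
    """Return the status for a job ID, or None if the ID is unknown."""
    row = conn.execute(
        "SELECT status FROM jobs WHERE job_id = ?", (job_id,)
    ).fetchone()
    return row[0] if row else None
```

Keying every row on the job ID is what makes status checks, metadata retrieval, and location lookups a single indexed query.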

Development

  1. Clone the repository
  2. Install poetry: pip install poetry
  3. Install dependencies: poetry install
  4. Run tests: poetry run pytest

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Run tests
  5. Submit a pull request

License

MIT License
