A versatile Python package for uploading files to multiple storage backends, including the local filesystem, AWS S3, Azure Blob Storage, and Azure Data Lake Storage (ADLS). Supports ZIP file extraction and uploading multiple files in a single job.

File Uploader

A Python package for seamlessly uploading files to various storage backends (Local, S3, Azure Blob, ADLS) with job tracking, error logging, and comprehensive file management capabilities.

Features

  • Multiple storage backend support:
    • Local filesystem
    • AWS S3
    • Azure Blob Storage
    • Azure Data Lake Storage (ADLS)
  • Robust job tracking with SQLite database
  • Detailed error logging and exception handling
  • Intelligent folder structure with date-based organization
  • Advanced ZIP file support with safety checks:
    • File size validation (100MB per file limit)
    • Total uncompressed size limit (200MB)
    • Zip file integrity verification
  • Custom date handling for file organization
  • Comprehensive error handling and custom exceptions
  • Support for file metadata tracking

Installation

pip install pyspfileuploader

Configuration

The package uses environment variables for configuration. Create a .env file with the required variables based on your chosen storage type:

Local Storage

STORAGE_TYPE=local
BASE_PATH=/path/to/storage

AWS S3

STORAGE_TYPE=s3
AWS_BUCKET_NAME=your-bucket
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
REGION_NAME=us-east-1  # optional

Azure Blob Storage

STORAGE_TYPE=blob
BLOB_CONNECTION_STRING=your-connection-string
BLOB_CONTAINER_NAME=your-container

Azure Data Lake Storage

STORAGE_TYPE=adls
ADLS_ACCOUNT_NAME=your-account
ADLS_FILE_SYSTEM_NAME=your-filesystem
ADLS_CREDENTIAL=your-credential
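The package reads these settings from the process environment, so a `.env` file is just one way to supply them; setting them from Python before the uploader runs is equivalent. A sketch for the S3 case (all values are placeholders):

```python
import os

# Equivalent to the S3 .env example above; every value here is a placeholder.
os.environ["STORAGE_TYPE"] = "s3"
os.environ["AWS_BUCKET_NAME"] = "your-bucket"
os.environ["AWS_ACCESS_KEY_ID"] = "your-access-key"
os.environ["AWS_SECRET_ACCESS_KEY"] = "your-secret-key"
os.environ.setdefault("REGION_NAME", "us-east-1")  # optional, as noted above

# The backend is then visible to any config reader in the same process.
storage_type = os.environ.get("STORAGE_TYPE", "local")
```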

Usage

Command Line

file-uploader /path/to/file [job_id] [storage_type] [created_date]

The bracketed arguments are optional positionals, supplied in the order shown.

Python API

from file_uploader.uploader import upload_file, display_job_details

# Simple upload with automatic job ID generation
job_id = upload_file("/path/to/file")

# Upload with custom job ID and storage type
job_id = upload_file("/path/to/file", job_id="CUSTOM_JOB_ID_001", storage_type="s3")

# Upload with a specific created date (DD-MM-YYYY)
job_id = upload_file("/path/to/file", created_date="14-07-2025")

# Upload ZIP file (will be automatically extracted and validated)
job_id = upload_file("/path/to/archive.zip")

# Get status for a given upload; output defaults to JSON,
# but a dataframe can be requested instead
display_job_details(job_id=job_id, output_format='dataframe')

Storage Backend Selection

The storage backend is determined by the STORAGE_TYPE environment variable. If not specified, it defaults to "local" storage. You can also override the storage type per upload using the storage_type parameter.
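The selection logic described here amounts to a simple precedence rule: explicit per-upload argument, then `STORAGE_TYPE`, then `"local"`. A minimal sketch of that rule (illustrative, not the package's actual internals):

```python
import os

def resolve_storage_type(storage_type=None):
    """Explicit argument wins, then the STORAGE_TYPE env var, then 'local'."""
    return storage_type or os.environ.get("STORAGE_TYPE", "local")

os.environ.pop("STORAGE_TYPE", None)
default_backend = resolve_storage_type()        # "local": nothing configured
os.environ["STORAGE_TYPE"] = "s3"
env_backend = resolve_storage_type()            # "s3": environment default
override_backend = resolve_storage_type("blob") # "blob": per-upload override
```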

File Size Limits

  • Individual file size limit: 100MB
  • Total uncompressed size limit for ZIP files: 200MB
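The ZIP safety checks these limits describe can be sketched with the standard-library `zipfile` module. The helper name and structure below are illustrative, not the package's actual implementation; only the two limits come from the documentation above:

```python
import io
import zipfile

MAX_FILE_SIZE = 100 * 1024 * 1024    # 100MB per extracted file
MAX_TOTAL_SIZE = 200 * 1024 * 1024   # 200MB total uncompressed

def zip_within_limits(file_like):
    """Return True if the archive is intact and inside both size limits."""
    with zipfile.ZipFile(file_like) as zf:
        if zf.testzip() is not None:  # name of the first corrupt member, or None
            return False
        sizes = [info.file_size for info in zf.infolist()]
        return all(s <= MAX_FILE_SIZE for s in sizes) and sum(sizes) <= MAX_TOTAL_SIZE

# Exercise the check with a tiny in-memory archive.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("a.txt", "hello")
buf.seek(0)
ok = zip_within_limits(buf)
```

Note that `ZipInfo.file_size` is the *uncompressed* size, which is what matters for extraction limits: a small archive can expand to far more than its on-disk size.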

Job Tracking

Each upload operation is assigned a unique job ID, which can be used to:

  • Track the upload status
  • Retrieve file metadata
  • Access uploaded file locations
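Since jobs are tracked in SQLite, a record along these lines illustrates the idea. The schema and column names here are hypothetical, chosen only to show how a job ID keys status, metadata, and location lookups:

```python
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE jobs (
           job_id TEXT PRIMARY KEY,   -- unique ID assigned per upload
           file_path TEXT,            -- original source path
           storage_type TEXT,         -- backend used for this upload
           status TEXT,               -- upload status
           uploaded_location TEXT     -- where the file landed
       )"""
)

# Insert a record for one upload and look it up by job ID.
job_id = f"JOB_{uuid.uuid4().hex[:8]}"
conn.execute(
    "INSERT INTO jobs VALUES (?, ?, ?, ?, ?)",
    (job_id, "/path/to/file", "local", "completed", "/storage/2025/07/14/file"),
)
status, location = conn.execute(
    "SELECT status, uploaded_location FROM jobs WHERE job_id = ?", (job_id,)
).fetchone()
```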

Development

  1. Clone the repository
  2. Install poetry: pip install poetry
  3. Install dependencies: poetry install
  4. Run tests: poetry run pytest

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Run tests
  5. Submit a pull request

License

MIT License
