A versatile Python package for uploading files to multiple storage backends including local, S3, Azure ADLS, and Blob storage. Supports ZIP file extraction and multiple file uploads.

Project description

File Uploader

A Python package for seamlessly uploading files to various storage backends (Local, S3, Azure Blob, ADLS) with job tracking, error logging, and comprehensive file management capabilities.

Features

  • Multiple storage backend support:
    • Local filesystem
    • AWS S3
    • Azure Blob Storage
    • Azure Data Lake Storage (ADLS)
  • Robust job tracking backed by an SQLite database
  • Detailed error logging and exception handling
  • Intelligent folder structure with date-based organization
  • Advanced ZIP file support with safety checks:
    • File size validation (100MB per file limit)
    • Total uncompressed size limit (200MB)
    • Zip file integrity verification
  • Custom date handling for file organization
  • Comprehensive error handling and custom exceptions
  • Support for file metadata tracking
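The ZIP safety checks listed above can be approximated with the standard library alone. The helper below is an illustrative sketch of those documented limits, not the package's internal API:

```python
import zipfile

MAX_FILE_SIZE = 100 * 1024 * 1024   # documented 100MB per-file limit
MAX_TOTAL_SIZE = 200 * 1024 * 1024  # documented 200MB total uncompressed limit

def check_zip_safety(path_or_file):
    """Illustrative pre-checks mirroring the package's documented ZIP limits."""
    with zipfile.ZipFile(path_or_file) as zf:
        # Integrity check: testzip() returns the first corrupt member, or None
        if zf.testzip() is not None:
            raise ValueError("archive contains a corrupt member")
        infos = zf.infolist()
        if any(i.file_size > MAX_FILE_SIZE for i in infos):
            raise ValueError("a member exceeds the 100MB per-file limit")
        if sum(i.file_size for i in infos) > MAX_TOTAL_SIZE:
            raise ValueError("total uncompressed size exceeds the 200MB limit")
    return True
```

The same checks run automatically when a `.zip` path is passed to `upload_file`.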

Installation

pip install pyspfileuploader

Configuration

The package uses environment variables for configuration. Create a .env file with the required variables based on your chosen storage type:

Local Storage

STORAGE_TYPE=local
BASE_PATH=/path/to/storage

AWS S3

STORAGE_TYPE=s3
AWS_BUCKET_NAME=your-bucket
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
REGION_NAME=us-east-1  # optional

Azure Blob Storage

STORAGE_TYPE=blob
BLOB_CONNECTION_STRING=your-connection-string
BLOB_CONTAINER_NAME=your-container

Azure Data Lake Storage

STORAGE_TYPE=adls
ADLS_ACCOUNT_NAME=your-account
ADLS_FILE_SYSTEM_NAME=your-filesystem
ADLS_CREDENTIAL=your-credential
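Before uploading, it can be useful to confirm that all variables for the chosen backend are actually set. The variable names below come from the tables above; the `missing_config` helper itself is an illustrative sketch, not part of the package:

```python
import os

# Required variables per storage type, taken from the configuration tables above
REQUIRED_VARS = {
    "local": ["BASE_PATH"],
    "s3": ["AWS_BUCKET_NAME", "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY"],
    "blob": ["BLOB_CONNECTION_STRING", "BLOB_CONTAINER_NAME"],
    "adls": ["ADLS_ACCOUNT_NAME", "ADLS_FILE_SYSTEM_NAME", "ADLS_CREDENTIAL"],
}

def missing_config(storage_type):
    """Return the names of required variables that are not set."""
    return [v for v in REQUIRED_VARS[storage_type] if not os.environ.get(v)]
```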

Usage

Command Line

pyspfileuploader /path/to/file [job_id] [storage_type] [created_date]

Python API

from file_uploader import upload_file, display_job_details

# Simple upload with automatic job ID generation
job_id = upload_file("/path/to/file")

# Upload with custom job ID and storage type
job_id = upload_file("/path/to/file", job_id="CUSTOM_JOB_ID_001", storage_type="s3")

# Upload with specific created date
job_id = upload_file("/path/to/file", created_date="14-07-2025")

# Upload ZIP file (will be automatically extracted and validated)
job_id = upload_file("/path/to/archive.zip")

# Check the status of a given upload; the default output format is JSON,
# but a dataframe can be requested instead
display_job_details(job_id=job_id, output_format='dataframe')

Storage Backend Selection

The storage backend is determined by the STORAGE_TYPE environment variable. If not specified, it defaults to "local" storage. You can also override the storage type per upload using the storage_type parameter.
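That resolution order can be pictured as follows; `resolve_storage_type` is an illustrative helper, not part of the package's API:

```python
import os

def resolve_storage_type(storage_type=None):
    """Per-call parameter wins; otherwise fall back to the environment,
    then to "local" as the documented default."""
    return storage_type or os.environ.get("STORAGE_TYPE", "local")
```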

File Size Limits

  • Individual file size limit: 100MB
  • Total uncompressed size limit for ZIP files: 200MB
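The individual limit can be checked client-side before calling `upload_file`; the helper below is a sketch of that check, not the package's own validation:

```python
import os

MAX_FILE_SIZE = 100 * 1024 * 1024  # documented 100MB individual file limit

def within_size_limit(path):
    """True if the file is at or under the documented 100MB limit."""
    return os.path.getsize(path) <= MAX_FILE_SIZE
```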

Job Tracking

Each upload operation is assigned a unique job ID, which can be used to:

  • Track the upload status
  • Retrieve file metadata
  • Access uploaded file locations
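Conceptually, that tracking amounts to keyed lookups in the SQLite database. The schema below is hypothetical and for illustration only; the package's actual table layout is not documented here:

```python
import sqlite3
import uuid

def demo_job_tracking():
    """Sketch of job-ID-keyed tracking against a hypothetical schema."""
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE jobs (job_id TEXT PRIMARY KEY, file_path TEXT, status TEXT)"
    )
    job_id = str(uuid.uuid4())  # unique ID per upload operation
    conn.execute(
        "INSERT INTO jobs VALUES (?, ?, ?)", (job_id, "/path/to/file", "uploaded")
    )
    status = conn.execute(
        "SELECT status FROM jobs WHERE job_id = ?", (job_id,)
    ).fetchone()[0]
    conn.close()
    return status
```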

Development

  1. Clone the repository
  2. Install poetry: pip install poetry
  3. Install dependencies: poetry install
  4. Run tests: poetry run pytest

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Run tests
  5. Submit a pull request

License

MIT License

Download files

Download the file for your platform.

Source Distribution

pyspfileuploader-1.0.5.tar.gz (14.4 kB)

Built Distribution

pyspfileuploader-1.0.5-py3-none-any.whl (20.4 kB)

File details

Details for the file pyspfileuploader-1.0.5.tar.gz.

File metadata

  • Download URL: pyspfileuploader-1.0.5.tar.gz
  • Upload date:
  • Size: 14.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.10.6 Windows/10

File hashes

Hashes for pyspfileuploader-1.0.5.tar.gz:

  • SHA256: e1cd233cddeeb6affcd8aa5b100860fdd466706fdf2f20198124ddd59fc12688
  • MD5: ca1477b3cc85c310d3df619a1e77f8d3
  • BLAKE2b-256: 851e4eb8efd8e6ae303e21f2749872ec1e295ecfa59eb01521eb67b34122f222


File details

Details for the file pyspfileuploader-1.0.5-py3-none-any.whl.

File metadata

  • Download URL: pyspfileuploader-1.0.5-py3-none-any.whl
  • Upload date:
  • Size: 20.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.10.6 Windows/10

File hashes

Hashes for pyspfileuploader-1.0.5-py3-none-any.whl:

  • SHA256: 7ef4cd1ed2f73ca61fc225d4a852ee511c9b3eb82f967506970f812006687b39
  • MD5: ae53e9b98285dbbae5710e133536ef71
  • BLAKE2b-256: 266fc992eb106e101fc57bcceb092c271fef68ab03d58afbc8ad040de258a121

