
dsr-files


File handling library for creating, saving, and loading various file types (CSV, JSON, JOBLIB, Excel, PDF, PARQUET, YAML).

Version 3.1.0: introduces configuration-driven parameter filtering using a centralized YAML registry, optimizes internal utility performance with LRU caching, and strengthens circular dependency resolution for a more robust library architecture.

Features

  • CSV: Read and write CSV files with pandas.
  • JSON: Save and load JSON data with recursive sanitization; now supports .jsonl (JSON Lines) for large datasets.
  • JOBLIB: Serialize Python objects and ML models with joblib.
  • Excel: Save and load Excel workbooks; supports .xlsx, .xls, .xlsm, and .xlsb formats.
  • PDF: Generate interactive, indexed audit reports with Matplotlib and ReportLab.
  • PARQUET: High-performance columnar storage; now supports .pq as a valid logical extension.
  • YAML: Save and load YAML files with recursive logic and strict key validation to prevent duplicate entries in configuration files.
  • FileType Utilities: The FileType enum now includes is_valid_extension() for performing logical consistency checks between file names and formats without requiring filesystem access. This is ideal for pre-validating configuration files in ML pipelines.
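The extension check described above can be pictured with a minimal standalone sketch (hypothetical enum members and values, not the library's actual FileType implementation):

```python
from enum import Enum
from pathlib import PurePath

class FileTypeSketch(Enum):
    # Illustrative subset: each format maps to its accepted extensions
    CSV = (".csv",)
    JSON = (".json", ".jsonl")
    PARQUET = (".parquet", ".pq")

    def is_valid_extension(self, file_name: str) -> bool:
        # Pure string comparison on the suffix: no filesystem access needed
        return PurePath(file_name).suffix.lower() in self.value

FileTypeSketch.PARQUET.is_valid_extension("data.pq")   # True
FileTypeSketch.JSON.is_valid_extension("data.yaml")    # False
```

Because the check never touches the filesystem, it can validate file names in configuration long before any I/O happens.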

Installation

pip install dsr-files

Requirements

  • Python: >= 3.10
  • PyYAML: >= 6.0.2
  • Pandas: Required for CSV and Excel operations
  • Joblib: Required for object serialization
  • dsr-utils: >= 1.6.0
  • cloudpathlib: Required for AnyPath and CloudPath support

Optional Dependencies

For Excel support:

pip install dsr-files[excel]

For PDF support:

pip install dsr-files[pdf]

For full cloud support (S3, GCS, Azure):

pip install cloudpathlib[all]

Development Installation

pip install -e ".[dev,excel,pdf]"

Developer Transparency

Note on Parameter Registry: The list of valid parameters for each format can be found in dsr_files/resources/params.yaml. This file serves as the "ground truth" for all safe_call filtering operations.
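Purely as an illustration of what a per-format registry can look like, such a file might be shaped along these lines (hypothetical keys and parameter names, not the real contents of params.yaml):

```yaml
# Hypothetical sketch of a per-format parameter registry
csv:
  save:
    - sep
    - index
    - float_format
parquet:
  pyarrow:
    - compression
    - row_group_size
```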

Usage

Universal Parameter Filtering

All handlers now support safe_call=True. This leverages dsr-utils to filter out incompatible keyword arguments that would otherwise raise a TypeError in underlying engines such as pyarrow or fastparquet.

Any parameters that are incompatible with the specific engine are returned in a rejected dictionary for debugging and audit logging.

Rather than relying solely on reflection, the library consults a "ground truth" registry for engine-specific safety.
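Conceptually, safe_call splits the supplied keyword arguments against a per-engine allow-list. A minimal standalone sketch of that split (hypothetical helper, not the library's code):

```python
def filter_kwargs(allowed: set[str], **kwargs) -> tuple[dict, dict]:
    """Split kwargs into (accepted, rejected) against an allow-list."""
    accepted = {k: v for k, v in kwargs.items() if k in allowed}
    rejected = {k: v for k, v in kwargs.items() if k not in allowed}
    return accepted, rejected

# An allow-list of the kind a registry like params.yaml might provide
accepted, rejected = filter_kwargs({"compression", "index"},
                                   compression="snappy", bad_flag=True)
# accepted == {"compression": "snappy"}; rejected == {"bad_flag": True}
```

Only the accepted parameters would reach the underlying engine; the rejected dictionary is what the handlers hand back for auditing.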

CSV Operations

from dsr_files import save_csv, load_csv, create_csv
import pandas as pd
from pathlib import Path

# Create from dictionary
data = {"name": ["Alice", "Bob"], "age": [30, 25]}
df = create_csv(data)

# Save to CSV
full_path, rejected = save_csv(df, Path("."), "data")

# Using safe_call
full_path, rejected = save_csv(df, Path("."), "data", safe_call=True, float_format="%.2f")

# Load from CSV
df, rejected = load_csv(Path("data.csv"))

JSON Operations

from dsr_files import save_json, load_json
from pathlib import Path

data = {"key": "value", "number": 42}

# Save to JSON
full_path, rejected = save_json(data, Path("."), "data")

# Load from JSON
data, rejected = load_json(Path("data.json"))
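The recursive sanitization mentioned in the feature list can be pictured with a simple standalone sketch (illustrative only, not the library's sanitizer): walk the structure and coerce common non-JSON-serializable values into safe forms.

```python
import json
from datetime import datetime
from pathlib import Path

def sanitize(obj):
    # Recursively coerce values that json.dumps cannot handle
    if isinstance(obj, dict):
        return {str(k): sanitize(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple, set)):
        return [sanitize(v) for v in obj]
    if isinstance(obj, (Path, datetime)):
        return str(obj)
    return obj

payload = {"path": Path("reports"), "tags": {"audit"}, "n": 42}
json.dumps(sanitize(payload))  # serializes without a TypeError
```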

JOBLIB Operations

from dsr_files import save_joblib, load_joblib
from pathlib import Path

# Save any Python object
model = {"weights": [1, 2, 3], "config": {}}
full_path, rejected = save_joblib(model, Path("."), "model")

# Load from JOBLIB
model, rejected = load_joblib(Path("model.joblib"))

Excel Operations

from dsr_files import save_excel, load_excel, ExcelSheetConfig
from pathlib import Path
import pandas as pd

sales = pd.DataFrame({"region": ["NA", "EU"], "revenue": [120, 95]})
costs = pd.DataFrame({"region": ["NA", "EU"], "cost": [80, 70]})

# Save multi-sheet workbook
full_path, rejected = save_excel(
 [
  ExcelSheetConfig(data=sales, sheet_name="Sales"),
  ExcelSheetConfig(data=costs, sheet_name="Costs"),
 ],
 Path("."),
 "report",
)

# Load first sheet
df, rejected = load_excel(Path("report.xlsx"))

PDF Operations (Interactive Reports)

from dsr_files import PDFDocument, PageConfiguration, PageSize, PageOrientation, PageColors
from pathlib import Path

# Configure document style
config = PageConfiguration(
    page_size=PageSize.LETTER,
    orientation=PageOrientation.PORTRAIT,
    colors=PageColors(page_num="#000000", title="#444444"),
    margins=(0.07, 0.93, 0.90, 0.10)
)

doc = PDFDocument("Audit Report", config)
page = doc.create_new_page("Summary")
# ... Add Matplotlib content to page.fig ...

doc.render_table_of_contents()
full_path, rejected = doc.save(Path("."), "audit_report")

PARQUET Operations

from dsr_files import save_parquet, load_parquet
import pandas as pd
from pathlib import Path

df = pd.DataFrame({"A": [1, 2, 3], "B": ["x", "y", "z"]})

# Save to Parquet
full_path, rejected = save_parquet(df, Path("."), "data", engine="pyarrow")

# Load from Parquet
df, rejected = load_parquet(Path("data.parquet"))

YAML Operations

from dsr_files import save_yaml, load_yaml
from pathlib import Path

data = {"project": "dsr-orchestrator", "steps": ["ingest", "analyze"]}

# Save to YAML
full_path, rejected = save_yaml(data, Path("config.yaml"))

# Load from YAML using the new UniqueKeyLoader
# This will raise a ConstructorError if duplicate keys are detected,
# protecting your project settings from conflicting edits.
data, rejected = load_yaml(Path("config.yaml"))
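The duplicate-key guard follows a well-known PyYAML recipe; here is a standalone sketch of such a loader (illustrative, and not necessarily identical to the library's UniqueKeyLoader):

```python
import yaml

class StrictLoader(yaml.SafeLoader):
    """SafeLoader variant that rejects duplicate mapping keys."""
    def construct_mapping(self, node, deep=False):
        seen = set()
        for key_node, _ in node.value:
            key = self.construct_object(key_node, deep=deep)
            if key in seen:
                raise yaml.constructor.ConstructorError(
                    None, None, f"duplicate key: {key!r}", key_node.start_mark)
            seen.add(key)
        return super().construct_mapping(node, deep)

yaml.load("a: 1\nb: 2\n", Loader=StrictLoader)  # {'a': 1, 'b': 2}
# yaml.load("a: 1\na: 2\n", Loader=StrictLoader) would raise ConstructorError
```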

Cloud-Native Pathing

dsr-files now supports both local and cloud filesystems (S3, GCS, Azure) out of the box using cloudpathlib. You can pass raw URI strings, pathlib.Path objects, or CloudPath objects directly to any handler.

from dsr_files import save_csv

# Local path
full_path, rejected = save_csv(df, "./data", "local_audit") 

# Cloud path (requires cloudpathlib[s3])
full_path, rejected = save_csv(df, "s3://my-bucket/audits", "remote_audit")

Testing

pytest tests/
pytest tests/ --cov=src/dsr_files

License

MIT
