dsr-files
File handling library for creating, saving, and loading various file types (CSV, JSON, JOBLIB, PDF, PARQUET).
Version 3.1.1: Standardized handler path typing around a shared PathLike alias for local, cloud, and string inputs, and updated package version reporting to use installed distribution metadata with a safe fallback.
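The release note mentions version reporting from installed distribution metadata with a safe fallback; a minimal stdlib sketch of that pattern (not the library's actual code, and the fallback string is illustrative):

```python
from importlib.metadata import PackageNotFoundError, version

def get_version(dist_name: str = "dsr-files") -> str:
    """Report the installed distribution's version, with a safe fallback
    for source checkouts where the package is not installed."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return "0.0.0+unknown"
```

This keeps the version in one place (the package metadata) instead of a hard-coded `__version__` string that can drift from the published release.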
Features
- CSV: Read and write CSV files with pandas.
- JSON: Save and load JSON data with recursive sanitization; now supports .jsonl (JSON Lines) for large datasets.
- JOBLIB: Serialize Python objects and ML models with joblib.
- Excel: Save and load Excel workbooks; supports .xlsx, .xls, .xlsm, and .xlsb formats.
- PDF: Generate interactive, indexed audit reports with Matplotlib and ReportLab.
- PARQUET: High-performance columnar storage; now supports .pq as a valid logical extension.
- YAML: Save and load YAML files with recursive logic and strict key validation to prevent duplicate entries in configuration files.
- FileType Utilities: The FileType enum now includes is_valid_extension() for performing logical consistency checks between file names and formats without requiring filesystem access. This is ideal for pre-validating configuration files in ML pipelines.
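The extension check can be pictured with a small re-implementation of the idea (the names mirror the feature list above, but this is an illustrative sketch, not the library's source):

```python
from enum import Enum
from pathlib import PurePath

class FileType(Enum):
    # each member maps a logical format to its accepted extensions
    CSV = (".csv",)
    JSON = (".json", ".jsonl")
    PARQUET = (".parquet", ".pq")

    def is_valid_extension(self, file_name: str) -> bool:
        """Check name/format consistency without touching the filesystem."""
        return PurePath(file_name).suffix.lower() in self.value

FileType.PARQUET.is_valid_extension("metrics.pq")    # True
FileType.CSV.is_valid_extension("metrics.parquet")   # False
```

Because only the string is inspected, the check works the same for paths that do not exist yet, e.g. when validating a pipeline config before any files are written.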
Installation
pip install dsr-files
Requirements
- Python: >= 3.10
- PyYAML: >= 6.0.2
- Pandas: Required for CSV and Excel operations
- Joblib: Required for object serialization
- dsr-utils: >= 1.6.0
- cloudpathlib: Required for AnyPath and CloudPath support
Optional Dependencies
For Excel support:
pip install dsr-files[excel]
For PDF support:
pip install dsr-files[pdf]
For full cloud support (S3, GCS, Azure):
pip install cloudpathlib[all]
Development Installation
pip install -e ".[dev,excel,pdf]"
Developer Transparency
Note on Parameter Registry: The list of valid parameters for each format can be found in dsr_files/resources/params.yaml. This file serves as the "ground truth" for all safe_call filtering operations.
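The exact contents of params.yaml ship inside the package; a hypothetical excerpt showing the general shape such a registry could take (keys and values here are illustrative, not the real file):

```yaml
# illustrative registry shape: format -> engine -> operation -> allowed kwargs
parquet:
  pyarrow:
    save: [compression, index, partition_cols]
  fastparquet:
    save: [compression, index]
csv:
  pandas:
    save: [sep, float_format, index]
```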
Usage
Universal Parameter Filtering
All handlers now support safe_call=True. This leverages dsr-utils to filter out incompatible keyword arguments that would otherwise cause TypeErrors in underlying engines like pyarrow or fastparquet.
Any parameters that are not compatible with the specific engine are returned in a rejected dictionary for debugging and audit logging.
The library no longer relies solely on reflection; instead, it consults a "ground truth" registry for engine-specific safety.
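The accepted/rejected split described above can be sketched with a plain set standing in for the registry (illustrative only; the real filtering lives in dsr-utils):

```python
def filter_params(kwargs: dict, allowed: set[str]) -> tuple[dict, dict]:
    """Split keyword arguments into those the engine accepts and a
    'rejected' dict kept for debugging and audit logging."""
    accepted = {k: v for k, v in kwargs.items() if k in allowed}
    rejected = {k: v for k, v in kwargs.items() if k not in allowed}
    return accepted, rejected

accepted, rejected = filter_params(
    {"float_format": "%.2f", "not_a_real_kwarg": 1},
    allowed={"sep", "float_format", "index"},
)
# accepted == {"float_format": "%.2f"}; rejected == {"not_a_real_kwarg": 1}
```

Only the accepted dict is forwarded to the underlying engine, so an unsupported keyword surfaces in the rejected dict instead of raising a TypeError deep inside pyarrow or fastparquet.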
CSV Operations
from dsr_files import save_csv, load_csv, create_csv
import pandas as pd
from pathlib import Path
# Create from dictionary
data = {"name": ["Alice", "Bob"], "age": [30, 25]}
df = create_csv(data)
# Save to CSV
full_path, rejected = save_csv(df, Path("."), "data")
# Using safe_call
full_path, rejected = save_csv(df, Path("."), "data", safe_call=True, float_format="%.2f")
# Load from CSV
df, rejected = load_csv(Path("data.csv"))
JSON Operations
from dsr_files import save_json, load_json
from pathlib import Path
data = {"key": "value", "number": 42}
# Save to JSON
full_path, rejected = save_json(data, Path("."), "data")
# Load from JSON
data, rejected = load_json(Path("data.json"))
JOBLIB Operations
from dsr_files import save_joblib, load_joblib
from pathlib import Path
# Save any Python object
model = {"weights": [1, 2, 3], "config": {}}
full_path, rejected = save_joblib(model, Path("."), "model")
# Load from JOBLIB
model, rejected = load_joblib(Path("model.joblib"))
Excel Operations
from dsr_files import save_excel, load_excel, ExcelSheetConfig
from pathlib import Path
import pandas as pd
sales = pd.DataFrame({"region": ["NA", "EU"], "revenue": [120, 95]})
costs = pd.DataFrame({"region": ["NA", "EU"], "cost": [80, 70]})
# Save multi-sheet workbook
full_path, rejected = save_excel(
[
ExcelSheetConfig(data=sales, sheet_name="Sales"),
ExcelSheetConfig(data=costs, sheet_name="Costs"),
],
Path("."),
"report",
)
# Load first sheet
df, rejected = load_excel(Path("report.xlsx"))
PDF Operations (Interactive Reports)
from dsr_files import PDFDocument, PageConfiguration, PageSize, PageOrientation, PageColors
from pathlib import Path
# Configure document style
config = PageConfiguration(
page_size=PageSize.LETTER,
orientation=PageOrientation.PORTRAIT,
colors=PageColors(page_num="#000000", title="#444444"),
margins=(0.07, 0.93, 0.90, 0.10)
)
doc = PDFDocument("Audit Report", config)
page = doc.create_new_page("Summary")
# ... Add Matplotlib content to page.fig ...
doc.render_table_of_contents()
full_path, rejected = doc.save(Path("."), "audit_report")
PARQUET Operations
from dsr_files import save_parquet, load_parquet
import pandas as pd
from pathlib import Path
df = pd.DataFrame({"A": [1, 2, 3], "B": ["x", "y", "z"]})
# Save to Parquet
full_path, rejected = save_parquet(df, Path("."), "data", engine="pyarrow")
# Load from Parquet
df, rejected = load_parquet(Path("data.parquet"))
YAML Operations
from dsr_files import save_yaml, load_yaml
from pathlib import Path
data = {"project": "dsr-orchestrator", "steps": ["ingest", "analyze"]}
# Save to YAML
full_path, rejected = save_yaml(data, Path("."), "config")
# Load from YAML using the new UniqueKeyLoader
# This will raise a ConstructorError if duplicate keys are detected,
# protecting your project settings from conflicting edits.
data, rejected = load_yaml(Path("config.yaml"))
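The duplicate-key protection can be pictured as a SafeLoader subclass along these lines (a sketch of the technique, not necessarily the library's implementation; requires PyYAML):

```python
import yaml

class UniqueKeyLoader(yaml.SafeLoader):
    """SafeLoader variant that rejects duplicate mapping keys."""

    def construct_mapping(self, node, deep=False):
        seen = set()
        for key_node, _value_node in node.value:
            key = self.construct_object(key_node, deep=deep)
            if key in seen:
                raise yaml.constructor.ConstructorError(
                    None, None, f"duplicate key found: {key!r}", key_node.start_mark
                )
            seen.add(key)
        return super().construct_mapping(node, deep=deep)

yaml.load("a: 1\nb: 2", Loader=UniqueKeyLoader)   # {'a': 1, 'b': 2}
# yaml.load("a: 1\na: 2", Loader=UniqueKeyLoader) would raise ConstructorError
```

Stock PyYAML silently keeps the last value for a repeated key, which is exactly the failure mode this guards against in hand-edited configuration files.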
Cloud-Native Pathing
dsr-files now supports both local and cloud filesystems (S3, GCS, Azure) out of the box using cloudpathlib. You can pass raw URI strings, pathlib.Path objects, or CloudPath objects directly to any handler.
from dsr_files import save_csv
# Local path
full_path, rejected = save_csv(df, "./data", "local_audit")
# Cloud path (requires cloudpathlib[s3])
full_path, rejected = save_csv(df, "s3://my-bucket/audits", "remote_audit")
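Dispatching between local and cloud backends ultimately comes down to inspecting the URI scheme; a stdlib-only sketch of that decision (the scheme set is illustrative, and cloudpathlib performs the real resolution):

```python
from urllib.parse import urlsplit

CLOUD_SCHEMES = {"s3", "gs", "az"}  # illustrative set of cloud URI schemes

def is_cloud_uri(path) -> bool:
    """Rough check: inputs with a known cloud scheme are CloudPath
    candidates; everything else is treated as a local pathlib.Path."""
    return urlsplit(str(path)).scheme.lower() in CLOUD_SCHEMES

is_cloud_uri("s3://my-bucket/audits")  # True
is_cloud_uri("./data")                 # False
```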
Testing
pytest tests/
pytest tests/ --cov=src/dsr_files
License
MIT
File details
Details for the file dsr_files-3.1.1.tar.gz.
File metadata
- Download URL: dsr_files-3.1.1.tar.gz
- Upload date:
- Size: 32.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | b440c6220a91cb5d591b1d298d7c588e95ccfa26caa488f2299674cf6927f9da |
| MD5 | 7f2edc530fbf207a71e2fe0a90d39ac2 |
| BLAKE2b-256 | 4cbef9d4e7ed9e7ee42624c0114ea4cc464112fdfabdae8c35467b02dae1b924 |
Provenance
The following attestation bundles were made for dsr_files-3.1.1.tar.gz:
Publisher: python-publish.yml on scottroberts140/dsr-files
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: dsr_files-3.1.1.tar.gz
- Subject digest: b440c6220a91cb5d591b1d298d7c588e95ccfa26caa488f2299674cf6927f9da
- Sigstore transparency entry: 1353956098
- Sigstore integration time:
- Permalink: scottroberts140/dsr-files@c913f763fb2b7f30223bfd3d0125d28fe4b068c0
- Branch / Tag: refs/tags/v3.1.1
- Owner: https://github.com/scottroberts140
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@c913f763fb2b7f30223bfd3d0125d28fe4b068c0
- Trigger Event: release
File details
Details for the file dsr_files-3.1.1-py3-none-any.whl.
File metadata
- Download URL: dsr_files-3.1.1-py3-none-any.whl
- Upload date:
- Size: 30.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 52f3e8346cd8657b733392fd437a6b2d3ddcfd0d85fb5015abcf9af448c49225 |
| MD5 | 4338d3de96642b0b8a10e71225d070b3 |
| BLAKE2b-256 | 4199016849057330d406a1c42f705aa15e5f5388e3ff7162d54aa50f8fc1ada0 |
Provenance
The following attestation bundles were made for dsr_files-3.1.1-py3-none-any.whl:
Publisher: python-publish.yml on scottroberts140/dsr-files
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: dsr_files-3.1.1-py3-none-any.whl
- Subject digest: 52f3e8346cd8657b733392fd437a6b2d3ddcfd0d85fb5015abcf9af448c49225
- Sigstore transparency entry: 1353956179
- Sigstore integration time:
- Permalink: scottroberts140/dsr-files@c913f763fb2b7f30223bfd3d0125d28fe4b068c0
- Branch / Tag: refs/tags/v3.1.1
- Owner: https://github.com/scottroberts140
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@c913f763fb2b7f30223bfd3d0125d28fe4b068c0
- Trigger Event: release