
dsr-files


File handling library for creating, saving, and loading various file types (CSV, JSON, JOBLIB, Excel, PDF, PARQUET, YAML).

Version 2.1.0: Added YAML support.

Features

  • CSV: Read and write CSV files with pandas
  • JSON: Save and load JSON data with recursive sanitization for NumPy/Pandas types
  • JOBLIB: Serialize Python objects and ML models with joblib
  • Excel: Save and load Excel workbooks (single or multi-sheet)
  • PDF: Generate interactive, indexed audit reports with Matplotlib and ReportLab
  • PARQUET: High-performance columnar storage using PyArrow or FastParquet
  • YAML: Save and load YAML files with recursive handling of nested dictionaries

Installation

pip install dsr-files

Optional Dependencies

For Excel support:

pip install dsr-files[excel]

For PDF support:

pip install dsr-files[pdf]

Development Installation

pip install -e ".[dev,excel,pdf]"

Usage

CSV Operations

from dsr_files import save_csv, load_csv, create_csv
import pandas as pd
from pathlib import Path

# Create from dictionary
data = {"name": ["Alice", "Bob"], "age": [30, 25]}
df = create_csv(data)

# Save to CSV
save_csv(df, Path("."), "data")

# Load from CSV
df = load_csv(Path("data.csv"))
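The save_csv/load_csv helpers take a directory plus a file stem rather than a full path. A rough stdlib-and-pandas sketch of that convention (an assumption about the wrappers' behavior; save_csv_sketch and load_csv_sketch are hypothetical names, and the real functions may differ in options such as index handling):

```python
import tempfile
from pathlib import Path

import pandas as pd

def save_csv_sketch(df: pd.DataFrame, directory: Path, stem: str) -> Path:
    """Write df to <directory>/<stem>.csv, dropping the pandas index."""
    path = directory / f"{stem}.csv"
    df.to_csv(path, index=False)
    return path

def load_csv_sketch(path: Path) -> pd.DataFrame:
    """Read a CSV written by save_csv_sketch back into a DataFrame."""
    return pd.read_csv(path)

with tempfile.TemporaryDirectory() as tmp:
    df = pd.DataFrame({"name": ["Alice", "Bob"], "age": [30, 25]})
    path = save_csv_sketch(df, Path(tmp), "data")
    restored = load_csv_sketch(path)
    # Roundtrip preserves the frame when dtypes are CSV-friendly
    assert restored.equals(df)
```

Writing with index=False keeps the roundtrip lossless here; with the default index=True, the index would come back as an extra column.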

JSON Operations

from dsr_files import save_json, load_json
from pathlib import Path

data = {"key": "value", "number": 42}

# Save to JSON
save_json(data, Path("."), "data")

# Load from JSON
data = load_json(Path("data.json"))
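The feature list says save_json applies recursive sanitization for NumPy/Pandas types. A plausible, self-contained sketch of that kind of walk (sanitize and FakeScalar are hypothetical; the library's actual rules and internal names may differ): coerce anything exposing .tolist()/.item() into plain Python values so stdlib json can serialize it.

```python
import json

def sanitize(obj):
    """Recursively convert NumPy/Pandas-like values to plain Python types."""
    if isinstance(obj, dict):
        return {str(k): sanitize(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return [sanitize(v) for v in obj]
    if hasattr(obj, "tolist"):   # array-like, e.g. np.ndarray
        return sanitize(obj.tolist())
    if hasattr(obj, "item"):     # scalar-like, e.g. np.int64
        return obj.item()
    return obj

class FakeScalar:
    """Stand-in for a NumPy scalar, to keep this sketch dependency-free."""
    def __init__(self, v):
        self.v = v
    def item(self):
        return self.v

data = {"count": FakeScalar(42), "nested": {"vals": [FakeScalar(1), 2]}}
clean = sanitize(data)
json.dumps(clean)  # serializes without a TypeError
```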

JOBLIB Operations

from dsr_files import save_joblib, load_joblib
from pathlib import Path

# Save any Python object
model = {"weights": [1, 2, 3], "config": {}}
save_joblib(model, Path("."), "model")

# Load from JOBLIB
model = load_joblib(Path("model.joblib"))
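joblib's dump/load are broadly pickle-compatible, so the same directory-plus-stem shape can be sketched with the stdlib pickle module (an analogue for illustration only; save_obj/load_obj are hypothetical names, and the real helpers presumably call joblib, which also compresses NumPy arrays efficiently):

```python
import pickle
import tempfile
from pathlib import Path

def save_obj(obj, directory: Path, stem: str) -> Path:
    """Serialize obj to <directory>/<stem>.pkl and return the path."""
    path = directory / f"{stem}.pkl"
    with path.open("wb") as fh:
        pickle.dump(obj, fh)
    return path

def load_obj(path: Path):
    """Deserialize an object previously written by save_obj."""
    with path.open("rb") as fh:
        return pickle.load(fh)

with tempfile.TemporaryDirectory() as tmp:
    model = {"weights": [1, 2, 3], "config": {}}
    restored = load_obj(save_obj(model, Path(tmp), "model"))
assert restored == model
```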

Excel Operations

from dsr_files import save_excel, load_excel, ExcelSheetConfig
from pathlib import Path
import pandas as pd

sales = pd.DataFrame({"region": ["NA", "EU"], "revenue": [120, 95]})
costs = pd.DataFrame({"region": ["NA", "EU"], "cost": [80, 70]})

# Save multi-sheet workbook
save_excel(
    [
        ExcelSheetConfig(data=sales, sheet_name="Sales"),
        ExcelSheetConfig(data=costs, sheet_name="Costs"),
    ],
    Path("."),
    "report",
)

# Load first sheet
df = load_excel(Path("report.xlsx"))

PDF Operations (Interactive Reports)

from dsr_files import PDFDocument, PageConfiguration, PageSize, PageOrientation, PageColors
from pathlib import Path

# Configure document style
config = PageConfiguration(
    page_size=PageSize.LETTER,
    orientation=PageOrientation.PORTRAIT,
    colors=PageColors(page_num="#000000", title="#444444"),
    margins=(0.07, 0.93, 0.90, 0.10)
)

doc = PDFDocument("Audit Report", config)
page = doc.create_new_page("Summary")
# ... Add Matplotlib content to page.fig ...

doc.render_table_of_contents()
doc.save(Path("."), "audit_report")

PARQUET Operations

from dsr_files import save_parquet, load_parquet
import pandas as pd
from pathlib import Path

df = pd.DataFrame({"A": [1, 2, 3], "B": ["x", "y", "z"]})

# Save to Parquet
save_parquet(df, Path("."), "data", engine="pyarrow")

# Load from Parquet
df = load_parquet(Path("data.parquet"))
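The engine="pyarrow" argument mirrors pandas' to_parquet engine parameter, which accepts "pyarrow" or "fastparquet". A hedged sketch of choosing an engine by availability (pick_parquet_engine is a hypothetical helper for illustration, not part of dsr-files):

```python
from importlib.util import find_spec

def pick_parquet_engine(available=lambda name: find_spec(name) is not None) -> str:
    """Return the first installed Parquet engine, preferring pyarrow."""
    for engine in ("pyarrow", "fastparquet"):
        if available(engine):
            return engine
    raise ImportError("Install pyarrow or fastparquet for Parquet support")

# Deterministic check with an injected availability predicate:
assert pick_parquet_engine(lambda name: name == "fastparquet") == "fastparquet"
```

Injecting the availability predicate keeps the preference order testable without depending on which engines happen to be installed.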

YAML Operations

from dsr_files import save_yaml, load_yaml
from pathlib import Path

data = {"project": "dsr-orchestrator", "steps": ["ingest", "analyze"]}

# Save to YAML
save_yaml(data, Path("config.yaml"))

# Load from YAML
data = load_yaml(Path("config.yaml"))

Testing

pytest tests/
pytest tests/ --cov=src/dsr_files

License

MIT
