
Project description

What is this?

A JSON-based format for describing paths in your project.

Why is this?

My ETL/data-analysis scripts are littered with code like this:

import os

DATA_DIR = "data"
CLEAN_DIR = os.path.join(DATA_DIR, "clean")
RAW_DIR = os.path.join(DATA_DIR, "raw")
TARGET_HTML = os.path.join(RAW_DIR, "something.html")
OUTPUT_FILE = os.path.join(CLEAN_DIR, "something.csv")

with open(TARGET_HTML) as fp:
    csv = process(fp)

with open(OUTPUT_FILE, "w") as fp:
    fp.write(csv)
It’s fine for a single file, but when you have a whole ETL pipeline tucked into a Makefile, the duplication leads to fragility and violates DRY. It’s a REALLY common pattern in file-based processing. This package and format let you create a paths.json file like this:

    "DATA_DIR": ["data"],
    "CLEAN_DIR": ["$DATA_DIR", "clean"],
    "RAW_DIR": ["$DATA_DIR", "raw"],
    "SOMETHING_HTML": ["$RAW_DIR", "something.html"],
    "SOMETHING_CSV": ["$RAW_DIR", "something.csv"]
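
To make the semantics concrete, here is a minimal sketch of how such a file could be resolved: each value is a list of path segments, and a "$NAME" segment expands to a previously defined key. This is an illustration of the format, not the package's actual resolver, and it assumes keys are defined before they are referenced.

```python
import os

def resolve_paths(spec):
    """Resolve a paths.json-style dict into concrete path strings.

    Each value is a list of segments; a segment starting with "$"
    expands to an already-resolved key (assumed to be defined earlier).
    """
    resolved = {}
    for key, segments in spec.items():
        parts = []
        for seg in segments:
            if seg.startswith("$"):
                parts.append(resolved[seg[1:]])  # expand $NAME reference
            else:
                parts.append(seg)
        resolved[key] = os.path.join(*parts)
    return resolved

spec = {
    "DATA_DIR": ["data"],
    "CLEAN_DIR": ["$DATA_DIR", "clean"],
    "RAW_DIR": ["$DATA_DIR", "raw"],
    "SOMETHING_HTML": ["$RAW_DIR", "something.html"],
    "SOMETHING_CSV": ["$CLEAN_DIR", "something.csv"],
}

paths = resolve_paths(spec)
print(paths["SOMETHING_HTML"])  # data/raw/something.html (on POSIX)
```

Because references resolve to earlier keys, renaming a directory means changing one entry instead of every path built on top of it.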

Then, from your Python script:

from pathsjson.automagic import PATHS

with open(PATHS['SOMETHING_HTML']) as fp:
    csv = process(fp)

with open(PATHS['SOMETHING_CSV'], 'w') as fp:
    fp.write(csv)

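
Since the same duplication shows up in Makefiles, one way to share a single set of definitions is to emit Make-style variable assignments from the resolved paths and `include` the result. This is a sketch of the idea, not a feature of the package; a plain dict with hypothetical values stands in for the PATHS object here.

```python
# Generate Make variable assignments from resolved paths so a Makefile
# can `include paths.mk` instead of redefining each path by hand.
# `paths` stands in for pathsjson's PATHS mapping (hypothetical values).
paths = {
    "SOMETHING_HTML": "data/raw/something.html",
    "SOMETHING_CSV": "data/clean/something.csv",
}

lines = ["{} := {}".format(name, path) for name, path in sorted(paths.items())]
print("\n".join(lines))
```

Redirect the output to a file (e.g. `python gen_paths.py > paths.mk`) and both your scripts and your Makefile read from the same paths.json.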

pip install pathsjson

More details

Read the docs: here.

Download files


Files for pathsjson, version 0.0.2:

pathsjson-0.0.2.tar.gz (6.7 kB, source distribution)
