
project-toolkit

A modular Python toolkit for managing project paths, cloud services (Google Cloud, Cloudflare R2), HashiCorp Vault, and configuration — powered by pydantic-settings.

Features

  • ⚙️ Auto-Initialized Settings — Just import and use, no setup needed
  • 🔑 Dynamic Env Access — Read any env var or .env value with settings.env("KEY")
  • 🔐 Vault Integration — Read secrets like vault kv get / vault kv get -field=
  • 📁 Project Path Management — Create and navigate project directory structures
  • ☁️ Cloudflare R2 — Upload/download files with boto3 and requests clients
  • 📊 Google Services — BigQuery, Drive, and Sheets integrations
  • 🌐 WAN IP Tool — Fetch, track, and log public IPv4/IPv6 address changes via HTTP/2 and HTTP/3
  • 🔔 Notifications — Framework-agnostic Discord alerts for Prefect and Airflow

Installation

# Core only
pip install project-toolkit

# With specific extras
pip install "project-toolkit[google]"
pip install "project-toolkit[cloudflare]"
pip install "project-toolkit[vault]"
pip install "project-toolkit[wan-ip]"
pip install "project-toolkit[notifications]"

# Everything
pip install "project-toolkit[all]"

API Usage Guide

1. Auto-Initialized Settings (No Setup Needed)

Settings auto-initialize on first use — no boilerplate required:

from project_toolkit import settings

# Access any module setting directly
print(settings.cloudflare.r2_access_token)
print(settings.google.google_service_account_json)
print(settings.vault.is_configured)

Environment detection is automatic:

  • If AIRFLOW_HOME is set → loads config/.env.airflow
  • Otherwise → loads config/.env.dev

The config/ directory is located by searching from your current working directory first, then walking up the directory tree. This means it works whether you run from the project root, a subdirectory, or even a Jupyter notebook inside the project.


2. Dynamic Environment Variable Access

Read any env var or .env value — no need to pre-define it in code:

from project_toolkit import settings

# Read API keys (from env vars or .env file)
openai_key = settings.env("OPENAI_API_KEY")
gemini_key = settings.env("GEMINI_API_KEY")
claude_key = settings.env("CLAUDE_API_KEY")

# With a default fallback
db_host = settings.env("DATABASE_HOST", "localhost")
debug = settings.env("DEBUG_MODE", "false")

Priority: Environment variable > .env file value > default

Your config/.env.dev can contain any key:

OPENAI_API_KEY=sk-xxxxx
GEMINI_API_KEY=AIza-xxxxx
CLAUDE_API_KEY=sk-ant-xxxxx
DATABASE_HOST=db.example.com

3. HashiCorp Vault

Read secrets from Vault — same env vars as the vault CLI.

Setup — Add to config/.env.dev:

VAULT_ADDR=http://192.168.12.2:8200
VAULT_TOKEN=hvs.your-vault-token

Read all fields (like vault kv get secret/my-credentials):

from project_toolkit import settings

secret = settings.vault.read_secret("secret/my-credentials")
print(secret)
# {'username': 'admin', 'password': 'secret-password-123'}

Read a single field (like vault kv get -field=OPENAI_API_KEY secret/api):

from project_toolkit import settings

# Single field reads
api_key = settings.vault.read_field("secret/api", "OPENAI_API_KEY")
password = settings.vault.read_field("secret/my-credentials", "password")

# With fallback default
db_pass = settings.vault.read_field("secret/db", "password", "default-pw")

Use the hvac client directly for advanced operations:

from project_toolkit import settings

client = settings.vault.get_vault_client()
if client:
    # Write a secret
    client.secrets.kv.v2.create_or_update_secret(
        path="my-new-secret",
        secret={"api_key": "abc123"},
    )

    # List secrets
    secrets = client.secrets.kv.v2.list_secrets(path="", mount_point="secret")
    print(secrets["data"]["keys"])

4. Explicit .env File Path (Fallback)

If auto-detection cannot find your config/.env.dev — for example, when running from a Jupyter notebook, a different working directory, or a remote environment — you can pass the path explicitly:

from project_toolkit import settings

# Point to your .env file explicitly
settings.configure(env_file="/path/to/your/config/.env.dev")

# Now use settings as normal
print(settings.cloudflare.r2_access_token)
print(settings.env("OPENAI_API_KEY"))

Or via get_settings() directly:

from project_toolkit import get_settings

get_settings.cache_clear()
settings = get_settings(env_file="/path/to/your/config/.env.dev")

5. Manual Settings Override

For testing or custom configurations, construct AppSettings directly:

from project_toolkit.settings.base import AppSettings
from project_toolkit.settings.cloudflare import CloudflareSettings
from project_toolkit.settings.google import GoogleSettings
from project_toolkit.settings.path import PathSettings
from project_toolkit.settings.vault import VaultSettings
from project_toolkit.settings.wan_ip import WanIpSettings
from project_toolkit.ip_tool import WanIP, IPChangeDetector

custom_settings = AppSettings(
    path_config=PathSettings(
        data_dir="/custom/data",
        project_name="my-project",
    ),
    cloudflare=CloudflareSettings(
        r2_access_token="custom-token",
        r2_account_id="custom-id",
    ),
    google=GoogleSettings(),
    vault=VaultSettings(),
    wan_ip=WanIpSettings(),
)

# Use the custom settings
print(custom_settings.cloudflare.r2_access_token)  # "custom-token"
print(custom_settings.env("R2_ACCESS_TOKEN"))       # reads from env/dotenv

To reset the cached auto-initialized singleton:

from project_toolkit import settings, get_settings

# Reset cached settings (e.g., after changing env vars)
get_settings.cache_clear()
settings._reset()

# Next access will re-initialize
print(settings.cloudflare.r2_access_token)

6. Project Path Management

Manage project directory structures with automatic creation:

from project_toolkit import DataPathConfig

# Direct usage
dpc = DataPathConfig(
    project_name="my-project",
    subproject="etl",
    data_dir="/data",
)
print(dpc.data_dir())        # /data
print(dpc.project_dir())     # /data/my-project
print(dpc.sub_project_dir()) # /data/my-project/etl

# Get a timestamped file name and full file path
filename = dpc.get_project_today_file_name("output", "csv")
filepath = dpc.sub_project_dir() / filename  # /data/my-project/etl/output_20260209.csv
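The filename in the comment follows a `<stem>_<YYYYMMDD>.<ext>` pattern. A standalone helper mirroring that apparent behavior looks like this (a hypothetical sketch for illustration, not the library's `get_project_today_file_name` itself):

```python
from datetime import date

def today_file_name(stem: str, ext: str) -> str:
    """Build '<stem>_<YYYYMMDD>.<ext>' using today's date."""
    return f"{stem}_{date.today():%Y%m%d}.{ext}"
```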

From auto-initialized settings (recommended):

from project_toolkit import settings, DataPathConfig

dpc = DataPathConfig.from_settings(
    settings.path_config,
    project_name="my-project",
)

7. Cloudflare R2

from project_toolkit import settings
from project_toolkit.cloudflare.boto3_client import CloudflareBoto3Client

# From auto-initialized settings
client = CloudflareBoto3Client.from_settings(settings.cloudflare)

# Upload a file (use a context manager so the file handle is closed)
with open("/path/to/file.csv", "rb") as f:
    client.upload_file("my-bucket", "file.csv", f.read())

# List buckets
buckets = client.list_buckets()

8. Google Sheets

from project_toolkit import settings
from project_toolkit.google.sheet import GoogleSheetsManager

sheets = GoogleSheetsManager.from_settings(settings.google)

# Read a sheet as a DataFrame
df = sheets.sheet_df(sheet_id="your-sheet-id")
print(df.head())

9. Google BigQuery

from project_toolkit import settings
from project_toolkit.google.bigquery import BigQueryManager

bq = BigQueryManager.from_settings(settings.google)

# Run a query
df = bq.query_table("SELECT * FROM my_dataset.my_table LIMIT 10")

10. Google Drive

from project_toolkit import settings
from project_toolkit.google.drive import GoogleDriveManager

drive = GoogleDriveManager.from_settings(settings.google)

# List files in a folder
files = drive.list_files(folder_id="your-folder-id")

11. WAN IP Tool (Fetch & Track)

The ip_tool provides a convenient wan_ip object for immediate fetching, or classes for programmatic change detection.

Quick Access (Shorthand)

from project_toolkit.ip_tool import wan_ip

# Property-style (attribute) access
print(wan_ip.ipv4)     # "104.28.236.177"
print(wan_ip.ipv6)     # "2a09:bac5..."
print(wan_ip.h3ipv4)   # HTTP/3 result

# Tab-separated report string
print(wan_ip.report()) # "H2v4: 1.1.1.1  H3v4: 1.1.1.2  v6: 2a09..."

Programmatic Change Detection

Use the IPChangeDetector to maintain a persistent log (history.txt) and a current snapshot (current.json).

from project_toolkit.ip_tool import IPChangeDetector, WanIP

# Robust class-based usage
wan_ip_inst = WanIP(timeout=10)
detector = IPChangeDetector(
    project_name="my-app",
    subproject="office",
    wan_ip=wan_ip_inst
)

# Compare current IPs vs last saved and record if changed
changed, results = detector.check_and_record()

# Get the formatted line that was appended to history
print(detector.get_history_line())

12. Notifications (Discord)

Rich Discord notifications with built-in adapters for Prefect and Airflow.

Each adapter supports zero-config instantiation — just set the matching env var (DISCORD_PREFECT_WEBHOOK_URL or DISCORD_AIRFLOW_WEBHOOK_URL) in your environment or config/.env.dev and the adapter creates a DiscordNotifier automatically.

Prefect Integration

from prefect import flow
from project_toolkit.notifications.adapters.prefect import PrefectAdapter

# Zero-config — reads DISCORD_PREFECT_WEBHOOK_URL from env / .env
hooks = PrefectAdapter()

@flow(
    on_failure=[hooks.on_failure],
    on_completion=[hooks.on_success],
    on_crashed=[hooks.on_crashed],
)
def my_flow():
    ...

Or pass an explicit notifier:

from project_toolkit.notifications.discord import DiscordNotifier
from project_toolkit.notifications.adapters.prefect import PrefectAdapter

notifier = DiscordNotifier(webhook_url="https://discord.com/api/webhooks/...")
hooks = PrefectAdapter(notifier)

Airflow Integration

from airflow import DAG
from project_toolkit.notifications.adapters.airflow import AirflowAdapter

# Zero-config — reads DISCORD_AIRFLOW_WEBHOOK_URL from env / .env
hooks = AirflowAdapter()

with DAG(
    dag_id="my_dag",
    on_failure_callback=hooks.on_failure,
    on_success_callback=hooks.on_success,
) as dag:
    ...

Or pass an explicit notifier:

from project_toolkit.notifications.discord import DiscordNotifier
from project_toolkit.notifications.adapters.airflow import AirflowAdapter

notifier = DiscordNotifier(webhook_url="https://discord.com/api/webhooks/...")
hooks = AirflowAdapter(notifier)

Manual Context

from project_toolkit.notifications.discord import DiscordNotifier
from project_toolkit.notifications.context import NotificationContext

notifier = DiscordNotifier(webhook_url="https://discord.com/api/webhooks/...")
notifier.notify(NotificationContext(
    status="failure",
    flow_name="custom_task",
    run_name="manual_run_1",
    error_message="Something went wrong",
))

Configuration

Environment Variables

Variable                    | Module        | Description
--------------------------- | ------------- | ---------------------------------
R2_ACCESS_TOKEN             | Cloudflare    | Cloudflare R2 API token
R2_ACCOUNT_ID               | Cloudflare    | Cloudflare account ID
R2_DOMAIN                   | Cloudflare    | R2 custom domain URL
GOOGLE_SERVICE_ACCOUNT_JSON | Google        | Path to service account JSON
TEST_GOOGLE_SHEET_ID        | Google        | Test sheet ID (dev only)
VAULT_ADDR                  | Vault         | Vault server address (e.g., http://192.168.12.2:8200)
VAULT_TOKEN                 | Vault         | Vault authentication token
WAN_IP_LOG_DIR              | WAN IP        | Directory for IP change logs
DISCORD_PREFECT_WEBHOOK_URL | Notifications | Discord webhook for Prefect flows
DISCORD_AIRFLOW_WEBHOOK_URL | Notifications | Discord webhook for Airflow DAGs
AIRFLOW_HOME                | Core          | Auto-detected for env switching

Any key in your .env file or environment can be read dynamically via settings.env("KEY") — no need to define it in Python code.

.env Files

Place config files in the config/ directory:

config/
├── .env.dev        # Local development (loaded when no AIRFLOW_HOME)
└── .env.airflow    # Airflow production (loaded when AIRFLOW_HOME is set)

.env File Detection Order

The toolkit searches for config/ in this order:

  1. <cwd>/config/ — current working directory
  2. Walk up from cwd (up to 5 parent levels)
  3. Walk up from the package install location (works for editable installs)

If auto-detection fails, use settings.configure(env_file=...) to specify the path explicitly.

Priority Order

Environment variables > .env files > Vault secrets > defaults

Troubleshooting: .env File Not Found

If settings load with None values, the config/.env.dev file was not found. This commonly happens when:

  • Running from a Jupyter notebook whose working directory differs from the project root.
  • Running a script from a different directory than the project root.
  • The package was installed via pip install (not editable mode) and there's no config/ folder in the current working directory.

Fix: Call configure() before accessing settings:

from project_toolkit import settings

settings.configure(env_file="/absolute/path/to/config/.env.dev")
print(settings.env("OPENAI_API_KEY"))  # ✓ works

Development

# Install with all dependencies
pip install -e ".[all]"

# Run tests
pytest

# Run specific tests
pytest tests/test_settings.py -s -vv
pytest tests/test_vault.py -s -vv
pytest tests/test_auto_settings.py -s -vv

Project Structure

project_toolkit/
├── __init__.py           # Package exports + auto-initialized settings singleton
├── project_path.py       # Project path management (DataPathConfig)
├── settings/
│   ├── __init__.py       # Settings exports
│   ├── base.py           # AppSettings + get_settings() + env()
│   ├── path.py           # PathSettings
│   ├── cloudflare.py     # CloudflareSettings
│   ├── google.py         # GoogleSettings
│   ├── vault.py          # VaultSettings + read_secret() + read_field()
│   └── wan_ip.py         # WanIpSettings
├── cloudflare/
│   ├── boto3_client.py   # R2 via boto3
│   └── requests_client.py # R2 via requests
├── google/
│   ├── bigquery.py       # BigQuery operations
│   ├── drive.py          # Google Drive operations
│   └── sheet.py          # Google Sheets operations
├── ip_tool/
│   ├── __init__.py       # Singleton 'wan_ip' proxy exports
│   ├── wan_ip.py         # WanIP core fetcher (H2, H3, v4, v6)
│   └── detector.py       # IP change detection + persistence log logic
└── notifications/
    ├── context.py        # Shared NotificationContext dataclass
    ├── discord.py        # Core Discord notifier
    └── adapters/
        ├── prefect.py    # Prefect (flow, flow_run, state) adapter
        └── airflow.py    # Airflow (context dict) adapter

License

MIT

Download files

Source Distribution

project_toolkit-1.0.7.tar.gz (49.1 kB)

Built Distribution

project_toolkit-1.0.7-py3-none-any.whl (41.3 kB)

File details for project_toolkit-1.0.7.tar.gz:

  • Size: 49.1 kB
  • Uploaded using Trusted Publishing: No
  • Uploaded via: twine/6.2.0 CPython/3.14.3

Hashes:

SHA256: 06b6855f04fa9d2996d1956b6a847b4fde1e71bbb72875363bab334fd9840e60
MD5: 2d88c032218f3692425eea46575d0f77
BLAKE2b-256: 04d12c247c51126be35ce5c8f0badfcae65707b90a9cfb8674ad47c1fcd1c106

File details for project_toolkit-1.0.7-py3-none-any.whl:

Hashes:

SHA256: eb88bff35e38fe9e4b02a6738256ec578bc51966e608f3d11bb827153c6534d1
MD5: 7242fa0b1351a6545134b0f4aca47062
BLAKE2b-256: 8e52c2e69dcb4bbd434ae5965ab3d6ccf2e2547ccb64098f9edc795bacd5168c
