Standalone PostgreSQL database setup and SQL runner with phase and directory modes

PG Runner: A Robust PostgreSQL Script Runner

Requires Python 3.13+ · License: MIT

PG Runner is a standalone, production-ready Python CLI tool for reliably executing SQL scripts against a PostgreSQL database. It's designed to handle the complexities of real-world SQL files, making database setups and migrations safe, repeatable, and easy to debug.

The core of this tool is a sophisticated SQL statement parser that correctly handles semicolons inside comments, string literals, and PostgreSQL-specific dollar-quoted strings, preventing common script-running failures.

Key Features

  • 🛡️ Robust SQL Parser: Intelligently splits SQL files into individual statements, correctly handling:
    • Single-quoted strings ('...') with escapes ('').
    • Double-quoted identifiers ("...") with escapes ("").
    • PostgreSQL dollar-quoted strings ($$...$$ and $tag$...$tag$).
    • Line (--) and block (/* ... */) comments.
  • ⚙️ Transactional Safety: Executes each SQL file within a single transaction by default. If any statement fails, the entire batch is rolled back, ensuring your database remains in a consistent state.
  • 🚀 Two Execution Modes:
    1. phases mode: For structured projects, executes specific, named SQL files (schema.sql, indexes.sql, etc.) in a defined order.
    2. dir mode: A general-purpose mode that finds and executes all .sql files in a directory alphabetically.
  • ⚡ Non-Transactional DDL: Supports autocommit mode for operations that cannot run inside a transaction, such as CREATE INDEX CONCURRENTLY.
  • 🔌 Flexible Configuration: Configure your database connection via a single --db-url, discrete CLI flags (--host, --user, etc.), or standard POSTGRES_* environment variables.
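To make the parser feature concrete, here is a simplified sketch of the kind of quote- and comment-aware splitting described above. This is an illustration of the technique only, not pg-runner's actual implementation, and it skips edge cases (e.g. nested block comments) that a production parser must handle:

```python
import re

def split_sql(text: str) -> list[str]:
    """Split a SQL script on semicolons that sit outside strings,
    comments, and dollar-quoted blocks (simplified sketch)."""
    statements, buf, i, n = [], [], 0, len(text)
    while i < n:
        if text.startswith("--", i):                 # line comment: consume to EOL
            j = text.find("\n", i)
            j = n if j == -1 else j
        elif text.startswith("/*", i):               # block comment: consume to */
            j = text.find("*/", i + 2)
            j = n if j == -1 else j + 2
        elif text[i] in "'\"":                       # 'string' or "identifier"
            q, j = text[i], i + 1
            while j < n:
                if text[j] == q and not text.startswith(q * 2, j):
                    j += 1                           # closing quote found
                    break
                j += 2 if text.startswith(q * 2, j) else 1   # '' / "" escapes
        elif text[i] == "$":                         # $$...$$ or $tag$...$tag$
            m = re.match(r"\$\w*\$", text[i:])
            if m:
                tag = m.group(0)
                j = text.find(tag, i + len(tag))
                j = n if j == -1 else j + len(tag)
            else:
                j = i + 1
        elif text[i] == ";":                         # top-level statement terminator
            stmt = "".join(buf).strip()
            if stmt:
                statements.append(stmt)
            buf, i = [], i + 1
            continue
        else:
            j = i + 1
        buf.append(text[i:j])                        # keep the consumed span verbatim
        i = j
    tail = "".join(buf).strip()
    if tail:
        statements.append(tail)
    return statements
```

With this approach, `split_sql("SELECT 'a;b'; SELECT $$x;y$$;")` yields two statements rather than four, because the semicolons inside the quoted spans are never treated as terminators.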

Installation

Install it using pip or pipx. This will make the pg-runner command available in your shell.

pip install pg-runner
# Or
pipx install pg-runner

Usage

The tool is invoked via the pg-runner command, which has two main subcommands: phases and dir.

$ pg-runner --help
usage: pg-runner [-h] [--log-level LOG_LEVEL] [--no-emoji] --db-url DB_URL | --host HOST | ... {phases,dir,...} ...

Standalone PostgreSQL database setup and SQL runner

options:
  -h, --help            show this help message and exit
  --log-level LOG_LEVEL
                        Logging level (e.g., DEBUG, INFO, WARNING, ERROR, CRITICAL). Default: INFO
  --no-emoji            Disable emoji output for CI environments

database connection options:
  --db-url DB_URL       SQLAlchemy DB URL (overrides other connection options). Can also be provided via DATABASE_URL or DB_URL
  --host HOST           DB host (default: POSTGRES_HOST or localhost)
  --port PORT           DB port (default: POSTGRES_PORT or 5432)
  --database DATABASE   Database name (default: POSTGRES_DB/POSTGRES_DATABASE or postgres)
  --user USER           DB user (default: POSTGRES_USER or postgres)
  --password PASSWORD   DB password (default: POSTGRES_PASSWORD or empty)

subcommands:
  {phases,dir,schema,indexes,views,price_analysis_views}
    phases              Run project-style phases (schema, indexes, views, price_analysis_views)
    dir                 Run all .sql files in a directory in alphabetical order
    schema              Run only the schema phase
    indexes             Run only the indexes phase
    views               Run only the views phase
    price_analysis_views
                        Run only the price_analysis_views phase
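The help text above implies a flag → `POSTGRES_*` environment variable → built-in default precedence for each connection setting. A hypothetical sketch of that resolution order (the function and argument names here are illustrative, not pg-runner's internals):

```python
import os

def resolve_conn(args: dict) -> dict:
    """Resolve connection settings: CLI flag, then env var(s), then default."""
    def pick(flag: str, env_vars: list[str], default: str) -> str:
        if args.get(flag):                       # explicit flag wins
            return args[flag]
        for var in env_vars:                     # then POSTGRES_* env vars
            if os.environ.get(var):
                return os.environ[var]
        return default                           # finally the documented default

    return {
        "host": pick("host", ["POSTGRES_HOST"], "localhost"),
        "port": pick("port", ["POSTGRES_PORT"], "5432"),
        "database": pick("database", ["POSTGRES_DB", "POSTGRES_DATABASE"], "postgres"),
        "user": pick("user", ["POSTGRES_USER"], "postgres"),
        "password": pick("password", ["POSTGRES_PASSWORD"], ""),
    }
```

Per the help text, a `--db-url` (or `DATABASE_URL`/`DB_URL`) would override all of the above.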

Example 1: phases mode

This mode is ideal for projects with a defined structure.

Directory Structure:

my_project/
└── database/
    ├── schema.sql
    ├── indexes.sql
    └── views.sql

Command:

# Set environment variables for the connection
export POSTGRES_USER=myuser
export POSTGRES_PASSWORD=secret
export POSTGRES_DB=mydb

# Run all phases in the default order
pg-runner phases --db-dir ./database

# Run only the 'schema' phase using the convenience command
pg-runner schema --db-dir ./database

Example 2: dir mode

This mode is for running a collection of SQL scripts in a generic directory.

Directory Structure:

migration_scripts/
├── 01_create_users.sql
├── 02_create_products.sql
└── 03_populate_data.sql

Command:

# Run all .sql files in the directory using a DB URL
pg-runner dir \
  --sql-dir ./migration_scripts \
  --db-url "postgresql://myuser:secret@localhost/mydb"

To run with non-transactional execution (e.g., for concurrent index creation), use --autocommit:

pg-runner dir --sql-dir ./indexes --autocommit
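The per-file transaction semantics described above can be sketched against a generic DB-API-style connection. This is a hypothetical illustration of the commit-once-or-roll-back-everything pattern, not pg-runner's actual code:

```python
def run_statements(conn, statements, autocommit=False):
    """Execute a file's statements as one transaction, or individually
    under autocommit (sketch of the behavior described above)."""
    if autocommit:
        conn.autocommit = True       # required for e.g. CREATE INDEX CONCURRENTLY
    cur = conn.cursor()
    try:
        for stmt in statements:
            cur.execute(stmt)
        if not autocommit:
            conn.commit()            # whole file succeeded: commit once
    except Exception:
        if not autocommit:
            conn.rollback()          # any failure: undo the entire batch
        raise
```

Under autocommit there is no batch to roll back, which is exactly why concurrent index builds (which refuse to run inside a transaction block) need this mode.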

License

This project is licensed under the MIT License. See the LICENSE file for details.

Project details

Release files (version 0.1.2)

Source distribution: pg_runner-0.1.2.tar.gz (10.4 kB)

  • Uploaded via: poetry/2.1.3 CPython/3.13.7 Darwin/25.0.0 (Trusted Publishing: no)
  • SHA256: 339fbe21ceadf3e6c744cb3a0ccf4e3ff1463cf06fa9e90027640eda0be1d47a
  • MD5: 5908707102651e6cc1fdce0d01cc3fbc
  • BLAKE2b-256: 3b26d372d2b418851a090b61259e81c8a6156522f9b07fb90d6504b7586cee24

Built distribution: pg_runner-0.1.2-py3-none-any.whl (10.1 kB, Python 3)

  • Uploaded via: poetry/2.1.3 CPython/3.13.7 Darwin/25.0.0 (Trusted Publishing: no)
  • SHA256: c16f00c4f332ecf4eb1289fe3a516c941503f450a816cc541ed3eff4ffa2f7a6
  • MD5: 52b5aeea047aebfd1fa61ace2a3a3b8a
  • BLAKE2b-256: 229a798578664dbd205abb8693ddc58dd56cf581dccd96152a93462843ecfe5c
