# pgdata

Python PostgreSQL-PostGIS-SQLAlchemy shortcuts.


pgdata is a collection of convenience functions for working with PostgreSQL:

- provides a dictionary/JSON-like shortcut interface to database objects without dealing directly with an ORM or cursor (see dataset)

```python
>>> import pgdata
>>> db = pgdata.connect()
>>> db.tables
>>> db["inventory"].columns
['type', 'supplier', 'cost']
```

- provides a shortcut to `ogr2ogr` for quickly getting geographic data in and out of your database with sensible defaults and without resorting to shell scripting

```python
>>> import pgdata
>>> db = pgdata.connect()
>>> db.ogr2pg('airports.shp')
>>> db.execute('do stuff')
>>> db.pg2ogr('SELECT * FROM airports_project.result', 'GPKG', 'output.gpkg')
```
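The `ogr2pg`/`pg2ogr` shortcuts wrap GDAL's `ogr2ogr` command line tool. As a rough sketch of the kind of command a shapefile load corresponds to (these are standard `ogr2ogr` flags, but the exact defaults pgdata passes are an assumption, and `ogr2ogr_load_command` is a hypothetical helper, not part of pgdata):

```python
# Sketch: build the ogr2ogr argument list that a shapefile-to-PostGIS load
# roughly corresponds to. -f/-nln/-lco are standard ogr2ogr options; the
# exact flags pgdata uses internally may differ.
def ogr2ogr_load_command(db_conn, shapefile, schema, table):
    """Return an ogr2ogr argument list for loading a shapefile to PostGIS."""
    return [
        "ogr2ogr",
        "-f", "PostgreSQL",            # output format: PostgreSQL/PostGIS
        "PG:" + db_conn,               # connection string
        shapefile,                     # input data source
        "-nln", schema + "." + table,  # new layer (table) name
        "-lco", "GEOMETRY_NAME=geom",  # name the geometry column 'geom'
    ]

cmd = ogr2ogr_load_command(
    "host=localhost dbname=mydb", "airports.shp", "airports_project", "airports"
)
```

Passing the resulting list to `` would run the load; pgdata saves you from assembling this by hand.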

Much is copied directly from dataset, and further inspiration was taken from pgwrap. See also records and many others.

## Requirements

- PostgreSQL
- PostGIS
- GDAL (optional, for `pg2ogr` and `ogr2pg`)
- ESRI File Geodatabase API (optional, for using `pg2ogr` with the `FileGDB` option)

## Installation

```
pip install pgdata
```

## Configuration

Create an environment variable `DATABASE_URL` and set it to the SQLAlchemy database URL for your database:

macOS/Linux etc:

`export DATABASE_URL=postgresql://postgres:postgres@localhost:5432/mydb`

Windows:

`SET DATABASE_URL="postgresql://postgres:postgres@localhost:5432/mydb"`
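SQLAlchemy-style database URLs follow ordinary URL syntax, so the standard library can sanity-check the value before you connect (a quick illustration, independent of pgdata):

```python
from urllib.parse import urlparse

# Parse the components of a SQLAlchemy-style URL with the standard library.
# (Swap in os.environ["DATABASE_URL"] to check your real setting.)
url = "postgresql://postgres:postgres@localhost:5432/mydb"
parts = urlparse(url)
user, host, port = parts.username, parts.hostname, parts.port
dbname = parts.path.lstrip("/")  # path is "/mydb"
```

If `host` or `dbname` comes back empty, the URL is malformed and `pgdata.connect()` will not find your database.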

## Usage

```python
>>> import pgdata
>>> db = pgdata.connect(schema='myschema')
>>> db.tables
>>> db["inventory"].columns
['type', 'supplier', 'cost']
>>> data = db.query("SELECT * FROM inventory WHERE type = %s", ('spam',)).fetchall()
>>> for row in data:
...     print (row['type'], row['supplier'], row['cost'])
('spam', 'spamcorp', 100)
>>> for row in db["inventory"].find(type='spam'):
...     print (row['type'], row['supplier'], row['cost'])
('spam', 'spamcorp', 100)
```
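A `find(type='spam')` style shortcut is essentially sugar for building a parameterized `WHERE` clause from keyword arguments. A toy sketch of that translation (`build_find_query` is a hypothetical illustration of the pattern, not pgdata's actual internals):

```python
# Toy sketch: translate keyword filters into (sql, params), the way a
# dataset-style find() shortcut does. Illustrative only - not pgdata's code.
def build_find_query(table, **filters):
    """Return (sql, params) for SELECT * FROM table WHERE k = %s AND ..."""
    keys = sorted(filters)
    sql = "SELECT * FROM {}".format(table)
    if keys:
        sql += " WHERE " + " AND ".join("{} = %s".format(k) for k in keys)
    params = tuple(filters[k] for k in keys)
    return sql, params

sql, params = build_find_query("inventory", type="spam")
# sql    -> "SELECT * FROM inventory WHERE type = %s"
# params -> ('spam',)
```

Keeping the values in `params` rather than interpolating them into the string is what makes the query safe from SQL injection.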

## Paired with `bcdata`

Try some basic spatial analysis - how many airports are in the CRD?

```python
import os

import bcdata
import pgdata

# define data to download
airports = 'bc-airports'
regdist = 'regional-districts-legally-defined-administrative-areas-of-bc'

# connect to default database, as defined by $DATABASE_URL
db = pgdata.connect()

# download and load to postgres
for url in [airports, regdist]:
    # what are the official schema and table names of the data source?
    info = bcdata.info(url)
    schema, table = (info['schema'], info['table'])

    # grab default email address for DataBC downloads
    email = os.environ['BCDATA_EMAIL']

    # download the data and use pgdata's ogr2pg shortcut to load to postgres
    # Note that we assume that the name of the layer in downloaded .gdb
    # is 'schema_table'
    dl =, email)
    db.ogr2pg(dl, in_layer=schema + '_' + table, out_layer=table, schema=schema)

# define the query
sql = """SELECT COUNT(*)
         FROM whse_imagery_and_base_maps.gsr_airports_svw a
         INNER JOIN whse_legal_admin_boundaries.abms_regional_districts_sp rd
         ON ST_Intersects(a.geom, rd.geom)
         WHERE rd.admin_area_name = 'Capital Regional District'
      """

# execute and print results
print(db.query(sql).fetchone()[0])
```

## bc2pg

The module includes `bc2pg`, a command line tool that operates as a shortcut for the `bcdata` operations above:

```
$ bc2pg --help
Usage: bc2pg [OPTIONS] DATASET

  Mirror a DataBC Catalogue dataset in postgres

Options:
  --email TEXT   Email address. Default: $BCDATA_EMAIL
  --db_url TEXT  Target database. Default: $DATABASE_URL
  --help         Show this message and exit.
```

To download bc-airports and load to postgres table `whse_imagery_and_base_maps.gsr_airports_svw`:

`$ bc2pg bc-airports`
