
A library to perform SQL string parameterisation and execution.


stringql

Supports string query parameterisation and execution, building on the psycopg2.sql module.

How to use

import stringql

schema = "a_schema"
libpq_string = "dbname=a_db user=a_user password=a_secret"
engine = stringql.start_engine(libpq_string=libpq_string)
conn = engine.connect(schema=schema)  # schema is created if it doesn't exist.

q = "select name from {table} where {filter_col} = %s"
filter_val = ("smith",)

curs = engine.do_query(conn, 
                       mode='r', 
                       query=q, 
                       data=filter_val, 
                       table="people", 
                       filter_col="family_name")

for name in curs:  # in "r" mode, do_query returns an iterable cursor object
    print(name)  # prints tuples, like: ("john",) ...
curs.close()  # always close your cursor

The connect method

You can connect to the Postgres instance using either the libpq string or the dsn keyword arguments:

  • libpq_string: MyDb("dbname=test user=postgres password=secret")
  • dsn kwargs: MyDb(dbname="test", user="postgres", password="secret")
  • schema: defaults to None, in which case you connect to the public schema. Otherwise the schema is created (if missing) and search_path is set to it.
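
Put together, the two styles look like this. The credentials are placeholders, and starting the engine needs the stringql package and a running Postgres, so those lines are shown commented; the dsn-kwargs form assumes start_engine forwards keyword arguments the same way MyDb accepts them, as the list above suggests:

```python
# dsn keyword arguments and their libpq-string equivalent
dsn = {"dbname": "test", "user": "postgres", "password": "secret"}
libpq_string = " ".join(f"{k}={v}" for k, v in dsn.items())

# Option 1: connect with the libpq string (needs a live Postgres)
# engine = stringql.start_engine(libpq_string=libpq_string)

# Option 2: connect with dsn kwargs (assumed equivalent, see above)
# engine = stringql.start_engine(**dsn)

# schema="reporting" would be created if missing and set as search_path;
# schema=None connects to the public schema.
# conn = engine.connect(schema="reporting")
```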

The parameterize_query method

The parameterize_query() function builds a Composable object representing a snippet of a SQL statement. It all happens behind the scenes when you call the .do_query() method.

  • query: the query string; see the psycopg2.sql module documentation for basic rules on how to form query strings.
  • data: defaults to None; pass a dictionary to format an insert statement.
  • kwargs: the fields to pass to the string format method.

Returns the parameterised query where needed, or the original query otherwise.

Should you want to just parameterise your query and use the psycopg2 cursor's .execute() method, do the following:

from stringql.pg_engine import parameterize_query

q = "select {cols} from {table} where {filter} = %s"
cols = ["name", "dob"]
table = "people"
filter = "family_name"
paramed_q = parameterize_query(query=q, cols=cols, table=table, filter=filter)

# you need a connection to render the Composed object as a string
print(paramed_q.as_string(conn))

# prints to console:
'select "name", "dob" from "people" where "family_name" = %s'

# or execute it yourself with your cursor object.
data = ("smith", )
curs.execute(paramed_q, data)

You can also call parameterize_query() with just the "data" argument to prepare a dictionary insert statement.

  • Include "{fields}" and "{placeholders}" in the query string so that parameterize_query() can replace them with the parameterised dictionary keys and placeholders.
  • Use the optional "drop_keys" keyword argument to drop dictionary keys you don't want inserted.

insert_stmt = "insert into {table} ({fields}) values ({placeholders})"

data = {"name": "gianny", "dob": "23/01/1900", "ignore": [1, 2, 3]}
table = "people"

paramed_q = parameterize_query(insert_stmt, data, drop_keys=["ignore"])

print(paramed_q.as_string(conn))  

# will print to console:
'insert into "people" ("name", "dob") values (%(name)s, %(dob)s)'

curs.execute(paramed_q, data)

The do_query method

The class MyDb would not be complete without support for execution. Once you have started your engine and established a connection, you probably want to run some queries - and you can, with the .do_query() method.

  • conn: connection object returned by .connect() method.
  • mode: "r"(ead) for SELECT, "w"(rite) for INSERT, "wr"(write and read) for INSERT RETURNING.
  • query: string query to be parameterised and executed.
  • data: collection or dictionary containing data for placeholders (if tuple) or fields and placeholders (if dict).
  • kwargs: keyword arguments to be parameterised and used to form the query string.

Returns an iterable psycopg2 cursor object in "r" or "wr" mode, otherwise None.
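
The "wr" mode is the one to reach for with INSERT ... RETURNING statements. A minimal sketch: the people table, the returned id column, and the fetched value are assumptions, and executing it needs the live conn from above, so those lines are shown commented:

```python
# "wr": write, then read the RETURNING rows back from the cursor
insert_returning = ("insert into {table} ({fields}) "
                    "values ({placeholders}) returning id")
row = {"name": "gianny", "dob": "23/01/1900"}

# with a live engine and connection:
# curs = engine.do_query(conn, "wr", insert_returning, row, table="people")
# new_id = curs.fetchone()[0]  # id of the inserted row
# curs.close()
```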

Some examples

# select, filtering with a tuple of values

get_french_ppl = "select {cols} from {table} where {nat} = %s or {place} = %s"
cols = ["name", "family_name", "dob", "marital_status"]
data = ("french", "france")
curs = engine.do_query(conn, "r", get_french_ppl, data, table="people",
                       cols=cols, nat="nationality", place="place_of_birth")
# curs is now an iterable cursor with your data in it.

# insert from tuple

insert_from_t = "insert into {table} ({cols}) values (%s, %s)"
data_t = (1, 2)
engine.do_query(conn, "w", insert_from_t, data_t, table="test", cols=['a', 'b'])

# insert from a dict dropping some k:v pairs. 

insert_from_d = "insert into {table} ({fields}) values ({placeholders})"
data_d = {"name": "gianny", "dob": "23/01/1900", "ignore": [1, 2, 3]}
engine.do_query(conn, "w", insert_from_d, data_d, table="test", drop_keys=["ignore"])

do a batch insert

For example, build a multivalue insert statement from a list of dictionaries.

from more_itertools import flatten

draft_stmt = "insert into {{table}} ({{placeholders}}) values {multivals}"
d1 = {"a":1, "b":2}
d2 = {"a":3, "b":4}
d_coll = [d1, d2]

def make_multivals(collection):
    # one "(%s, ..., %s)" group per dictionary in the collection
    height = len(collection)
    length = len(collection[0])
    val_line = "(" + ("%s, " * length).rstrip(", ") + "),"
    return (val_line * height).rstrip(",") + ";"

def make_stmt(draft, d_coll):
    m = make_multivals(collection=d_coll)
    return draft.format(multivals=m)

data = tuple(flatten(d.values() for d in d_coll))  # (1, 2, 3, 4)
q = make_stmt(draft=draft_stmt, d_coll=d_coll)
cols = list(d_coll[0])  # column names: ["a", "b"]

# start your engine and your connection
engine.do_query(conn, "w", q, data, table="a_table", placeholders=cols)
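
As an alternative to hand-building the multivalue string, psycopg2 itself ships a batch helper, psycopg2.extras.execute_values, which you can use directly on a cursor. A sketch under the same assumptions as above (a_table and its columns are made up, and the execute call needs a live cursor, so those lines are shown commented):

```python
# rows ordered by a fixed column list, taken from the first dictionary
d_coll = [{"a": 1, "b": 2}, {"a": 3, "b": 4}]
cols = list(d_coll[0])
rows = [tuple(d[c] for c in cols) for d in d_coll]

# with a live psycopg2 cursor:
# from psycopg2.extras import execute_values
# execute_values(curs, "insert into a_table (a, b) values %s", rows)
```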

curry do_query to always refer to the same table, with the same connection

import stringql
from functools import partial

data = {"name": "gianny", "dob": "13/01/2009", "ignore": []}

conn_string = "dbname=postgres user=postgres password=secreeeeetuuuh"
engine = stringql.start_engine(libpq_string=conn_string)
conn = engine.connect(schema="test")
create = ("create table if not exists {table} ("
          "id serial, "
          "name varchar, "
          "dob date)")

insert = "insert into {table} ({fields}) values({placeholders})"
select = "select {col} from {table}"

peopleTable_writer = partial(
    engine.do_query, conn=conn, mode="w", table="people")

peopleTable_reader = partial(
    engine.do_query, conn=conn, mode="r", table="people")

if __name__ == "__main__":
    peopleTable_writer(query=create)
    peopleTable_writer(query=insert, data=data, drop_keys=["ignore"])
    curs = peopleTable_reader(query=select, col="name")

    with curs:
        for record in curs:
            print(record)
