pgbenchmark

A Python package to benchmark query performance on a PostgreSQL database. It measures the execution time of a query over multiple runs and reports detailed per-run and summary metrics.


Installation

pip install pgbenchmark

Example

import psycopg2
from pgbenchmark import Benchmark

conn = psycopg2.connect(
    dbname="postgres",
    user="postgres",
    password="  << Your Password >> ",
    host="localhost",
    port="5432"
)

benchmark = Benchmark(db_connection=conn, number_of_runs=1000)
benchmark.set_sql("SELECT 1;")

for result in benchmark:
    # {'run': X, 'sent_at': <DATETIME WITH MS>, 'duration': '0.000064'}
    pass

""" View Summary """
print(benchmark.get_execution_results())

# {'runs': 1000,
#      'min_time': '0.000576',
#      'max_time': '0.014741',
#      'avg_time': '0.0007',
#      'median_time': '0.000642',
#      'percentiles': {'p25': '0.000612',
#                      'p50': '0.000642',
#                      'p75': '0.000696',
#                      'p99': '0.001331'}
#      }

You can also pass an SQL file instead of a query string:

benchmark.set_sql("./test.sql")
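Accepting both forms through one method only requires telling a file path apart from inline SQL. A minimal sketch of that idea (the `resolve_sql` helper below is hypothetical, not part of pgbenchmark's API):

```python
from pathlib import Path


def resolve_sql(source: str) -> str:
    """Return SQL text, reading it from disk when given a .sql file path."""
    if source.strip().endswith(".sql"):
        return Path(source).read_text()
    return source
```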

Interactive | No-Code Mode

Simply run in your terminal:

pgbenchmark

You'll see the output:

[ http://127.0.0.1:8000 ] Click to open pgbenchmark Interface


Configuration is on the right; the rest is intuitive.

The Pause and Resume buttons do not work yet :(

Example with Parallel execution

⚠️ Please be careful: if you are running on Linux, pgbenchmark will load your cores at 100%! ⚠️

from pgbenchmark import ParallelBenchmark  # <<-------- NEW IMPORT

conn_params = {
    "dbname": "postgres",
    "user": "postgres",
    "password": "",
    "host": "localhost",
    "port": "5432"
}

n_procs = 20  # Number of processes (roughly one per core)
n_runs_per_proc = 1_000

parallel_bench_pg = ParallelBenchmark(
    num_processes=n_procs,
    number_of_runs=n_runs_per_proc,
    db_connection_info=conn_params
)

parallel_bench_pg.set_sql("SELECT * FROM information_schema.tables;")  # Same as before

""" Unfortunately, as of now, you can't get execution results on the fly. """

parallel_bench_pg.run()  # RUN THE BENCHMARK 

results_pg = parallel_bench_pg.get_execution_results()
print(results_pg)
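The fan-out/aggregate pattern behind ParallelBenchmark can be sketched with the standard library. The illustration below uses threads and a dummy workload instead of processes and real queries, purely to show how per-worker durations roll up into one result set and an overall throughput figure; it is not pgbenchmark's implementation:

```python
import time
from concurrent.futures import ThreadPoolExecutor


def run_batch(n_runs):
    """Time a stand-in workload n_runs times and return the durations."""
    durations = []
    for _ in range(n_runs):
        t0 = time.perf_counter()
        sum(range(100))  # placeholder for executing the SQL query
        durations.append(time.perf_counter() - t0)
    return durations


n_workers, runs_per_worker = 4, 50

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=n_workers) as pool:
    batches = pool.map(run_batch, [runs_per_worker] * n_workers)
    all_durations = [d for batch in batches for d in batch]
elapsed = time.perf_counter() - start

throughput = len(all_durations) / elapsed  # runs per second, overall
```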

Example with Template Engine

Since version 0.1.0, pgbenchmark supports a simple template engine for queries.

import random
import string

from pgbenchmark import ParallelBenchmark

conn_params = {
    "dbname": "postgres",
    "user": "postgres",
    "password": "",
    "host": "localhost",
    "port": "5432"
}

n_procs = 20
n_runs_per_proc = 10


# Generator function for a random product price
def generate_random_price():
    # random.uniform (not randint) so prices actually have cents to round
    return round(random.uniform(10, 1000), 2)


# Generator Function for Random Product Name (String)
def generate_random_string(length=10):
    characters = string.ascii_letters + string.digits
    return ''.join(random.choice(characters) for _ in range(length))


parallel_bench_pg = ParallelBenchmark(
    num_processes=n_procs,
    number_of_runs=n_runs_per_proc,
    db_connection_info=conn_params
)

# Define the SQL Query Template
query = """
            INSERT INTO products (name, price, stock_quantity) VALUES ('{{product_name}}', {{price_value}}, 10);
        """

# ===============================
# Note that, as in Jinja2, template variables are defined within the query:
#   {{product_name}}
#   {{price_value}}
# ===============================

parallel_bench_pg.set_sql(query)

# Set formatters
parallel_bench_pg.set_sql_formatter(for_placeholder="price_value", generator=generate_random_price)
parallel_bench_pg.set_sql_formatter(for_placeholder="product_name", generator=generate_random_string)

# Run Benchmark
if __name__ == '__main__':
    # Run the Parallel Benchmark
    parallel_bench_pg.run()

    results_pg = parallel_bench_pg.get_execution_results()

    throughput = results_pg["throughput_runs_per_sec"]
    avg_time = results_pg["avg_time"]

    print("\n=============================================================================")
    print("                           Benchmark Results                             ")
    print("=============================================================================")
    print(f"Throughput (runs/sec): {throughput}")
    print(f"Average Execution Time (sec): {avg_time}")
