
A Python package to benchmark and compare query performance on a PostgreSQL database

Project description

pgbenchmark


Python package to benchmark query performance on a PostgreSQL database. It allows you to measure the execution time of queries over multiple runs, providing detailed metrics about each run's performance.



Installation

pip install pgbenchmark

Example

import psycopg2
from pgbenchmark import Benchmark

conn = psycopg2.connect(
    "<< YOUR CONNECTION >>"
)

benchmark = Benchmark(db_connection=conn, number_of_runs=1000)
benchmark.set_sql("./test.sql")

for result in benchmark:
    # {'run': X, 'sent_at': <DATETIME WITH MS>, 'duration': '0.000064'}
    pass

""" View Summary """
print(benchmark.get_execution_results())
# {'runs': 1000, 'min_time': '0.00005', 'max_time': '0.000287', 'avg_time': '0.000072'}
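The per-run dicts yielded by the loop can also be aggregated yourself. Purely as an illustration (the sample dicts below are hypothetical, mirroring the format shown above), a self-contained sketch:

```python
from datetime import datetime

# Hypothetical per-run results in the shape pgbenchmark yields.
runs = [
    {"run": 1, "sent_at": datetime(2024, 1, 1, 12, 0, 0), "duration": "0.000064"},
    {"run": 2, "sent_at": datetime(2024, 1, 1, 12, 0, 1), "duration": "0.000050"},
    {"run": 3, "sent_at": datetime(2024, 1, 1, 12, 0, 2), "duration": "0.000287"},
]

# Durations arrive as strings; convert before aggregating.
durations = [float(r["duration"]) for r in runs]
summary = {
    "runs": len(durations),
    "min_time": min(durations),
    "max_time": max(durations),
    "avg_time": sum(durations) / len(durations),
}
print(summary)
```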

You can also pass raw SQL as a string instead of a file path:

benchmark.set_sql("SELECT 1;")

It also works with a SQLAlchemy engine connection:

from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://.......")
conn = engine.connect()

# Set up benchmark class
benchmark = Benchmark(db_connection=conn, number_of_runs=5)

Example with Parallel or Threaded execution

⚠️ Be careful: on Linux, ParallelBenchmark will drive the selected CPU cores to 100% utilization! ⚠️

from pgbenchmark import ParallelBenchmark  # <<-------- NEW IMPORT

conn_params = {
    "dbname": "postgres",
    "user": "postgres",
    "password": "",
    "host": "localhost",
    "port": "5432"
}

n_procs = 20  # Number of worker processes (roughly one per CPU core)
n_runs_per_proc = 1_000

parallel_bench_pg = ParallelBenchmark(
    num_processes=n_procs,
    number_of_runs=n_runs_per_proc,
    db_connection_info=conn_params
)

parallel_bench_pg.set_sql("SELECT * from information_schema.tables;")  # Same as before

""" Unfortunately, as of now, you can't get execution results on the fly. """

parallel_bench_pg.run()  # RUN THE BENCHMARK 

results_pg = parallel_bench_pg.get_execution_results()
print(results_pg)
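The example above hard-codes 20 processes. Purely as a sketch (not part of pgbenchmark's API), you can size the process count from the machine's core count with the standard library, leaving one core free for the OS:

```python
import os

# os.cpu_count() can return None on exotic platforms; fall back to 1.
available = os.cpu_count() or 1

# Leave one core free so the machine stays responsive during the run.
n_procs = max(1, available - 1)
print(n_procs)
```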

Example with Template Engine

Since version 0.1.0, pgbenchmark ships a simple template engine for queries.

import random
import string

from pgbenchmark import ParallelBenchmark

conn_params = {
    "dbname": "postgres",
    "user": "postgres",
    "password": "asdASD123",
    "host": "localhost",
    "port": "5432"
}

n_procs = 20
n_runs_per_proc = 10


# Generator Function for Random Product Price
def generate_random_price():
    return round(random.uniform(10, 1000), 2)


# Generator Function for Random Product Name (String)
def generate_random_string(length=10):
    characters = string.ascii_letters + string.digits
    return ''.join(random.choice(characters) for _ in range(length))


parallel_bench_pg = ParallelBenchmark(
    num_processes=n_procs,
    number_of_runs=n_runs_per_proc,
    db_connection_info=conn_params
)

# Define the SQL Query Template
query = """
    INSERT INTO products (name, price, stock_quantity)
    VALUES ('{{product_name}}', {{price_value}}, 10);
"""

# ===============================
# As in Jinja2, template variables are written inside the query as:
#   {{product_name}}
#   {{price_value}}
# ===============================

parallel_bench_pg.set_sql(query)

# Set formatters
parallel_bench_pg.set_sql_formatter(for_placeholder="price_value", generator=generate_random_price)
parallel_bench_pg.set_sql_formatter(for_placeholder="product_name", generator=generate_random_string)


# Run Benchmark
if __name__ == '__main__':
    # Run the Parallel Benchmark
    parallel_bench_pg.run()

    results_pg = parallel_bench_pg.get_execution_results()

    throughput = results_pg["throughput_runs_per_sec"]
    avg_time = results_pg["avg_time"]

    print("\n=============================================================================")
    print("                           Benchmark Results                             ")
    print("=============================================================================")
    print(f"Throughput (runs/sec): {throughput}")
    print(f"Average Execution Time (sec): {avg_time}")
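pgbenchmark renders a fresh query for each run by invoking the registered generator functions. Purely as an illustration of the idea (this is not the library's actual implementation), the `{{placeholder}}` substitution can be sketched with a regex:

```python
import re

def render(template: str, generators: dict) -> str:
    """Replace each {{name}} with the output of its generator function."""
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(generators[m.group(1)]()),
        template,
    )

query = "INSERT INTO products (name, price) VALUES ('{{product_name}}', {{price_value}});"
rendered = render(query, {
    "product_name": lambda: "widget",
    "price_value": lambda: 19.99,
})
print(rendered)
# INSERT INTO products (name, price) VALUES ('widget', 19.99);
```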
