Benchmarking Python functions
Project description
PyBe - benchmark your Python functions
Benchmark any (Python) function, store the results (as CSV or Excel), and read and visualize them with only a few lines of code!
Structure of a benchmark
The general structure of a benchmark script is as follows:
- you have some algorithm
- you want to test the algorithm by varying over a set of inputs
- you apply quantities of interest (i.e. some performance metric) to the output of the algorithm
This can be implemented as a Python function:
```python
def benchmark_function(input):
    result = algorithm(input)
    return {
        "name_performance_metric_1": performance_metric_1(result),
        "name_performance_metric_2": performance_metric_2(result),
        # ... one entry per performance metric
    }
```
In order to benchmark your algorithm, you simply call the above function on every input. This, together with storing your results, is taken care of by the Benchmark class in pybe.benchmark.
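Conceptually, a benchmark run amounts to no more than the following loop. This is only an illustrative sketch of the idea, not pybe's actual implementation, and assumes benchmark_function and a list inputs are defined as above:

```python
# Illustrative sketch only: the Benchmark class automates this loop
# and additionally stores the collected rows (e.g. as a CSV file).
rows = []
for single_input in inputs:
    outputs = benchmark_function(single_input)  # dict: metric name -> value
    rows.append({"Input": single_input, **outputs})
```

Let's look at a concrete example.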
Example: Optimization algorithm
Let's say you have an optimization algorithm implemented in Python which takes as inputs
- a function to be optimized and
- the number of runs.
You want to evaluate the optimizer on a certain test function and benchmark how well it performs for a given number of runs. For this you have a performance metric which can be called on the output of the optimization and returns a real number (float).
Then, your benchmark function looks as follows:
```python
def benchmark_function(number_of_runs):
    result_optimizer = optimizer(test_function, number_of_runs)
    return {"name_performance_metric": performance_metric(result_optimizer)}
```
Let's say you want to benchmark your optimization algorithm for 10, 100, and 1000 runs. You can do this with pybe's Benchmark class:
```python
from pybe.benchmark import Benchmark

benchmark = Benchmark()
benchmark(function=benchmark_function,
          inputs=[10, 100, 1000],
          name="name_of_my_optimization_algorithm")
```
Drag the resulting name_of_my_optimization_algorithm.csv into the Dashboard and that's it!
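If you want to try this end to end, a self-contained sketch might look as follows. The optimizer (a simple random search), the test function, and the performance metric below are only illustrative placeholders standing in for your own code; only the Benchmark call is pybe:

```python
import random

from pybe.benchmark import Benchmark


def test_function(x: float) -> float:
    # Placeholder objective: minimize a simple quadratic
    return (x - 3) ** 2


def optimizer(function, number_of_runs: int) -> float:
    # Placeholder optimizer: random search over [-10, 10]
    candidates = [random.uniform(-10, 10) for _ in range(number_of_runs)]
    return min(candidates, key=function)


def performance_metric(best_x: float) -> float:
    # Placeholder metric: objective value of the returned solution
    return test_function(best_x)


def benchmark_function(number_of_runs: int):
    result_optimizer = optimizer(test_function, number_of_runs)
    return {"name_performance_metric": performance_metric(result_optimizer)}


benchmark = Benchmark()
benchmark(function=benchmark_function,
          inputs=[10, 100, 1000],
          name="name_of_my_optimization_algorithm",
          store=True)  # store the results, as in the Getting started example
```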
Installation
The official release is available on PyPI:
```shell
pip install pybe
```
Alternatively, you can clone this repository and install it from source:

```shell
git clone https://github.com/nicolaipalm/pybe
cd pybe
pip install .
```
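To verify the installation, a quick import check is enough; this only uses the Benchmark class shown throughout this README:

```python
# Sanity check: the Benchmark class should be importable after installation
from pybe.benchmark import Benchmark

benchmark = Benchmark()
print("pybe is installed:", benchmark)
```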
Getting started
In order to benchmark a Python function, you only need to implement the function and specify a few parameters of the benchmark:
```python
from pybe.benchmark import Benchmark
from pybe.wrappers import timer
import time

benchmark = Benchmark()  # initialize pybe's benchmark class


@timer  # additionally track the time needed in each iteration
def test_function(i: int):
    time.sleep(0.1)
    return {"name_of_output": i}  # specify the output in a dictionary


# benchmark test_function on inputs [1, 2, 3] and evaluate each input 10 times
benchmark(test_function,
          name="test_benchmark",  # set the name of the benchmark
          inputs=[1, 2, 3],
          store=True,  # store the benchmark results
          number_runs=10)
```
Look at the benchmark.csv file in your directory!
You can also view the results directly in Python or write them to an Excel or CSV file:
```python
print(benchmark.inputs, benchmark.name_outputs)  # print inputs and names of outputs
print(benchmark.result)  # print results as stored in benchmark.csv
benchmark.to_excel(name="my_results")  # write results as Excel
benchmark.to_csv(name="my_results")  # write results as CSV
```
You can read any stored benchmark result by initializing the Benchmark class with the path to the benchmark's .yaml file:
```python
benchmark = Benchmark(benchmark_file_path)
```
Structure of the benchmark CSV
The structure of the resulting CSV is intended to be intuitive:
- each row represents one call of the benchmarked function, with
- one column for the input
- one column with the name of the benchmark
- one column for each output
For example:
- the function has two outputs: time and value
- is benchmarked at inputs 10 and 100
- has the name hello
- is evaluated once for each input
Then, the resulting csv/Excel has the following structure:
| | value | time | Input | Name |
|---|---|---|---|---|
| 0 | 0.1 | 1 | 10 | hello |
| 1 | 0.05 | 20 | 100 | hello |
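Because the result is a plain CSV with this layout, you can post-process it with any standard tool. For example, a small sketch using pandas; the file name benchmark.csv is taken from the Getting started section, and the Input and time columns follow the example table above (adjust the names to your own benchmark):

```python
import pandas as pd

# Load the stored benchmark results (file name as in the Getting started example)
df = pd.read_csv("benchmark.csv")

# One row per call: aggregate an output column (here "time") over the
# repeated runs for every input
print(df.groupby("Input")["time"].mean())
```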
Dashboard
Download files
File details
Details for the file pybe-1.0.0.tar.gz.
File metadata
- Download URL: pybe-1.0.0.tar.gz
- Upload date:
- Size: 6.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.16
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 80220eb1f394b421fd5ad23cc80e4de4a5f32054ae8a27000416150e1d5c2a70 |
| MD5 | 0360fece702c0b5e99f6aead84af8a55 |
| BLAKE2b-256 | f897b7195ec0b085ed56a68a82c2628b4b34de9815e9f5cdfc136c4ee00aa417 |
File details
Details for the file pybe-1.0.0-py3-none-any.whl.
File metadata
- Download URL: pybe-1.0.0-py3-none-any.whl
- Upload date:
- Size: 6.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.16
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | b2ed98f3cf1da4ef568e9fda07c820bc7027c943217a8142dca13ab5857832fa |
| MD5 | 3cf3b455e91b5b4e5b41aa95a3f8342b |
| BLAKE2b-256 | 8a7e96371810fcbbcb73d451d0a37231a26f768a604f99beda56eb00638d0bbe |