
Estimate Asymptotic Runtime Complexity from Bytecode Executions

Project description

The Python Performance Analysis Library (py-pal) is a profiling tool for the Python programming language. With py-pal you can empirically approximate the time complexity (big-O notation) of Python functions. The analysis is based on the sizes of the function's arguments and the number of bytecode instructions (opcodes) executed.
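The idea behind such an empirical estimate can be sketched independently of py-pal: record an operation count for several growing input sizes, then fit each candidate complexity class by least squares and keep the best match. The helper below is a self-contained illustration of this technique, not py-pal's API.

```python
import math

def fit_complexity(sizes, counts):
    """Pick the complexity class whose shape best matches the measurements.

    For each candidate f, find the least-squares scale c minimizing
    sum((counts - c*f(n))^2) and compare the residuals.
    """
    candidates = {
        "O(1)": lambda n: 1.0,
        "O(n)": lambda n: float(n),
        "O(n log n)": lambda n: n * math.log2(n),
        "O(n^2)": lambda n: float(n * n),
    }
    best_name, best_residual = None, float("inf")
    for name, f in candidates.items():
        xs = [f(n) for n in sizes]
        c = sum(x * y for x, y in zip(xs, counts)) / sum(x * x for x in xs)
        residual = sum((y - c * x) ** 2 for x, y in zip(xs, counts))
        if residual < best_residual:
            best_name, best_residual = name, residual
    return best_name

# "Operation counts" for a quadratic algorithm: roughly n^2 steps.
sizes = [10, 50, 100, 500, 1000]
counts = [n * n for n in sizes]
print(fit_complexity(sizes, counts))  # O(n^2)
```

py-pal applies the same principle, but uses executed opcode counts as the operation measure instead of a hand-supplied step count.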

To the docs.

Installation

Requirements

This project requires CPython and a C compiler. Install CPython >= 3.7, then install py-pal via pip by running:

pip install py-pal

or

python -m pip install py-pal

Command line usage of the py-pal module

python -m py_pal <target-module/file>

or

py-pal <target-module/file>

There are multiple aliases for the same command: py-pal, py_pal and pypal. If py-pal is executed this way, all functions called in the code are captured and analyzed. The output is a pandas DataFrame.

See the help message:

py-pal -h

Programmatic usage of the py-pal module

To profile a single function and get a complexity estimate, use profile_function:

from py_pal.core import profile_function
from py_pal.data_collection.opcode_metric import OpcodeMetric
from py_pal.datagen import gen_random_growing_lists
# bubble_sort comes from the third-party 'algorithms' package; any function works
from algorithms.sort import bubble_sort

profile_function(OpcodeMetric(), gen_random_growing_lists(), bubble_sort)

The profile decorator:

from py_pal.analysis.estimator import AllArgumentEstimator
from py_pal.core import profile, DecoratorStore

@profile
def test():
    pass

# Must be called at some point
test()

estimator = AllArgumentEstimator(DecoratorStore.get_call_stats(), DecoratorStore.get_opcode_stats())
res = estimator.export()

By using the profile decorator, only the annotated Python functions are profiled. It acts like a whitelist filter.
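The mechanism of such a whitelist-style decorator can be sketched in plain Python. The Store class and profile function below are illustrative stand-ins, not py-pal's actual DecoratorStore or profile:

```python
import functools

class Store:
    """Minimal stand-in for a DecoratorStore-like collector (illustrative only)."""
    calls = []

def profile(func):
    """Record each call of the decorated function; undecorated functions are ignored."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        Store.calls.append((func.__name__, len(args)))
        return func(*args, **kwargs)
    return wrapper

@profile
def annotated(x):
    return x * 2

def not_annotated(x):
    return x + 1

annotated(3)
not_annotated(3)
print(Store.calls)  # [('annotated', 1)]
```

Only the decorated function shows up in the collected statistics, which is exactly the whitelist behavior described above.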

Another possibility is to use the context-manager protocol:

from py_pal.analysis.estimator import AllArgumentEstimator
from py_pal.data_collection.tracer import Tracer

with Tracer() as t:
    pass  # your code goes here

estimator = AllArgumentEstimator(t.get_call_stats(), t.get_opcode_stats())
res = estimator.export()

# Do something with the resulting DataFrame
print(res)

The most verbose way to use the py-pal API:

from py_pal.analysis.estimator import AllArgumentEstimator
from py_pal.data_collection.tracer import Tracer


t = Tracer()
t.trace()

# Your function
pass

t.stop()
estimator = AllArgumentEstimator(t.get_call_stats(), t.get_opcode_stats())
res = estimator.export()

# Do something with the resulting DataFrame
print(res)

All examples instantiate a tracer object that is responsible for collecting the data. After execution, the collected data is passed to the analysis module. Finally, an estimate of the asymptotic runtime of the functions contained in the code is obtained.
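The data collection step can be illustrated with CPython's own tracing hooks: since Python 3.7, a frame can emit per-opcode trace events when frame.f_trace_opcodes is set. The sketch below counts executed opcodes for a single call; it demonstrates the mechanism only and is not how py-pal's C-accelerated Tracer is implemented.

```python
import sys

def count_opcodes(func, *args):
    """Count bytecode instructions executed while running func (CPython >= 3.7)."""
    counts = {"n": 0}

    def tracer(frame, event, arg):
        frame.f_trace_opcodes = True  # ask CPython for per-opcode events
        if event == "opcode":
            counts["n"] += 1
        return tracer

    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)  # always remove the trace hook
    return counts["n"]

def linear(n):
    total = 0
    for i in range(n):
        total += i
    return total

# More input -> proportionally more executed opcodes.
print(count_opcodes(linear, 100) < count_opcodes(linear, 200))  # True
```

Collecting such counts for growing inputs yields the (input size, opcode count) data points that the analysis step then fits against candidate complexity classes.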

Modes

In the current version py-pal offers only the profiling mode. Although py_pal.datagen offers some functions for generating inputs, py-pal must be combined with appropriate test cases to realize a performance testing mode. An automatic detection and generation of appropriate test inputs does not exist at the moment.
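For illustration, a generator of growing random inputs, in the spirit of py_pal.datagen.gen_random_growing_lists but with an assumed signature, might look like this:

```python
import random

def gen_random_growing_lists(start=10, stop=1000, steps=5):
    """Yield random integer lists of linearly growing length (illustrative stand-in)."""
    for size in (start + i * (stop - start) // (steps - 1) for i in range(steps)):
        yield [random.randint(0, size) for _ in range(size)]

sizes = [len(lst) for lst in gen_random_growing_lists()]
print(sizes)  # [10, 257, 505, 752, 1000]
```

Feeding such a series of inputs to the function under test produces the growing workloads that the complexity estimation needs.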

Limitations

The profiling approach implemented by the py-pal modules does not distinguish between different threads executing a Python function. Profiling a Python script that uses threads is therefore problematic: the bytecode counting strategy increases the counters of all Python functions on the current call stack, no matter which thread is executing them. The resulting data points will thus not accurately reflect what happened during the profiled execution of the script.

Licensing Notes

This work integrates some code from the big_O project. More specifically, most code in py_pal.analysis.complexity, py_pal.datagen and py_pal.analysis.estimator.Estimator.infer_complexity is adapted from big_O.

Changelog

What’s New in Py-PAL 1.1.0

  • Improved Data Collection: The heuristic for determining the size of function arguments has been improved.
  • More tests
  • More documentation
  • More argument generation functions in py_pal.datagen
  • Replaced the command line option --debug with --log-level for more configurable log output

Refactoring

Project structure changes, overall CLI interface is unchanged. API changes:

  • py_pal.tracer moved to py_pal.data_collection.tracer
  • py_pal.complexity and py_pal.estimator moved to the py_pal.analysis package.
  • py_pal.analysis.estimator.Estimator now takes call and opcode stats as arguments.

Py-PAL 1.0.0

  • More thorough testing from different combinations of requirements and Python versions.
  • Bug fixes

Py-PAL 0.2.1

Refactoring

The estimator module was refactored which introduces a slight change to the API. Classes inheriting from Estimator now only specify how to transform the collected data with respect to the arguments of the function.

Instead of ComplexityEstimator you should use the AllArgumentEstimator class. Additionally, there is the experimental SeparateArgumentEstimator.

Py-PAL 0.1.6

More accurate Data Collection

The Tracer is enhanced by measuring built-in function calls with AdvancedOpcodeMetric.

Opcodes resembling a function call, e.g. CALL_FUNCTION, are filtered for built-in function calls. If the called function is found in the complexity mapping, a synthetic opcode weight is assigned. A built-in function call is evaluated using its argument size and a predefined runtime complexity, e.g. O(n log n) for sort().

  • The feature is enabled by default
  • The calculation adds a performance overhead and can be disabled by providing an OpcodeMetric instance to the Tracer
  • The AdvancedOpcodeMetric instance assigned to the Tracer provides statistics about how many builtin function calls were observed and how many were found in the complexity map
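A complexity mapping of this kind can be sketched as a table from builtin name to a cost function of the argument size. The mapping and weights below are purely illustrative, not py-pal's actual data:

```python
import math

# Illustrative complexity map: builtin name -> cost as a function of argument size.
# (Hypothetical values; py-pal's real mapping lives behind AdvancedOpcodeMetric.)
COMPLEXITY_MAP = {
    "sorted": lambda n: n * math.log2(n) if n > 1 else 1,  # O(n log n)
    "len": lambda n: 1,                                    # O(1)
    "max": lambda n: n,                                    # O(n)
}

def synthetic_weight(builtin_name, arg_size):
    """Return the synthetic opcode weight for a builtin call, or 1 if unknown."""
    cost = COMPLEXITY_MAP.get(builtin_name)
    return int(cost(arg_size)) if cost else 1

print(synthetic_weight("sorted", 1024))   # 10240 (n * log2(n) for n = 1024)
print(synthetic_weight("unknown", 1024))  # 1
```

Charging a call to sorted() with a weight proportional to n log n, rather than a single opcode, keeps the counters consistent with the work the builtin actually performs in C.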

Bugfixes

  • Cleaning data after normalization introduced wrong data points

Download files

Download the file for your platform.

Files for py-pal, version 1.1.2

Filename                                          Size      File type  Python version
py_pal-1.1.2-cp37-cp37m-manylinux1_x86_64.whl     1.1 MB    Wheel      cp37
py_pal-1.1.2-cp37-cp37m-manylinux2010_x86_64.whl  1.1 MB    Wheel      cp37
py_pal-1.1.2-cp37-cp37m-manylinux2014_x86_64.whl  1.1 MB    Wheel      cp37
py_pal-1.1.2-cp38-cp38-manylinux1_x86_64.whl      1.2 MB    Wheel      cp38
py_pal-1.1.2-cp38-cp38-manylinux2010_x86_64.whl   1.2 MB    Wheel      cp38
py_pal-1.1.2-cp38-cp38-manylinux2014_x86_64.whl   1.2 MB    Wheel      cp38
py_pal-1.1.2-cp39-cp39-manylinux1_x86_64.whl      1.2 MB    Wheel      cp39
py_pal-1.1.2-cp39-cp39-manylinux2010_x86_64.whl   1.2 MB    Wheel      cp39
py_pal-1.1.2-cp39-cp39-manylinux2014_x86_64.whl   1.1 MB    Wheel      cp39
py-pal-1.1.2.tar.gz                               254.5 kB  Source     None
