
Project description

auto_profiler

A realtime timer for profiling a Python function or snippet.

Features

  • Filters out profiling of external libraries.
  • Filters out functions whose run time falls below a configurable threshold.
  • Limits the profiling depth, so you can easily find the time-consuming functions.
  • Handles loops and repeated calls to the same function.
  • Handles recursive function calls.
  • Can be disabled globally with Profiler.GlobalDisable=True to save time (see the sketch below).
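
A minimal sketch of the global switch, assuming (as the feature list says) that setting Profiler.GlobalDisable before the decorated code runs turns every @Profiler() into a pass-through:

from auto_profiler import Profiler

# Assumption: this class-level flag is checked before any timing is injected,
# so decorated functions run at full speed and no report is printed.
Profiler.GlobalDisable = True


@Profiler()   # effectively a no-op while the flag is set
def main():
    ...       # production code runs without profiling overhead


main()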

Installation

Release version:

$ pip install auto_profiler

Development version:

$ pip install -e git+https://github.com/modaresimr/auto_profiler.git#egg=auto_profiler

Install in Jupyter

$ pip install ipytree
$ jupyter nbextension enable --py --sys-prefix ipytree

Quick start

Jupyter Notebook

Auto profiling

Most of the time we want to measure the execution time of an entry function and all of its subfunctions. Injecting the timing points by hand is tedious, so we let Profiler do it for us automatically:

import time
import random

from auto_profiler import Profiler, Tree


def f1():
    mysleep(0.6 + random.random())


def mysleep(t):
    time.sleep(t)


def fact(i):
    f1()
    if i == 1:
        return 1
    return i * fact(i - 1)


@Profiler()
def main():
    for i in range(5):   # repeated calls are aggregated in the report
        f1()

    fact(3)              # recursive calls keep their own subtree


if __name__ == '__main__':
    main()

Example Output

In Jupyter

(example.gif: animated screenshot of the interactive tree output rendered with ipytree in Jupyter)


Time   [Hits * PerHit] Function name [Called from] [function location]
-----------------------------------------------------------------------
8.974s [1 * 8.974]  main  [auto-profiler/profiler.py:267]  [/test/t2.py:30]
├── 5.954s [5 * 1.191]  f1  [/test/t2.py:34]  [/test/t2.py:14]
│   └── 5.954s [5 * 1.191]  mysleep  [/test/t2.py:15]  [/test/t2.py:17]
│       └── 5.954s [5 * 1.191]  <time.sleep>
│
│
│   # The rest is for the example recursive function call fact
└── 3.020s [1 * 3.020]  fact  [/test/t2.py:36]  [/test/t2.py:20]
    ├── 0.849s [1 * 0.849]  f1  [/test/t2.py:21]  [/test/t2.py:14]
    │   └── 0.849s [1 * 0.849]  mysleep  [/test/t2.py:15]  [/test/t2.py:17]
    │       └── 0.849s [1 * 0.849]  <time.sleep>
    └── 2.171s [1 * 2.171]  fact  [/test/t2.py:24]  [/test/t2.py:20]
        ├── 1.552s [1 * 1.552]  f1  [/test/t2.py:21]  [/test/t2.py:14]
        │   └── 1.552s [1 * 1.552]  mysleep  [/test/t2.py:15]  [/test/t2.py:17]
        └── 0.619s [1 * 0.619]  fact  [/test/t2.py:24]  [/test/t2.py:20]
            └── 0.619s [1 * 0.619]  f1  [/test/t2.py:21]  [/test/t2.py:14]

Manual profiling

Sometimes we only want to measure the execution time of a few snippets or functions. In that case we can inject the timing points into our code manually with Timer:

# manual_example.py

import time

from auto_profiler import Timer, Tree


def main():
    # Each Timer records one named span; parent_name links it into the tree.
    t = Timer('sleep1', parent_name='main').start()
    time.sleep(1)
    t.stop()

    t = Timer('sleep2', parent_name='main').start()
    time.sleep(1.5)
    t.stop()

    # Timer.root holds the collected timings; Tree renders them hierarchically.
    print(Tree(Timer.root))


if __name__ == '__main__':
    main()

Run the example code:

$ python manual_example.py

and it will show you the profiling result:

2.503s  main
├── 1.001s  sleep1
└── 1.501s  sleep2
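
If the timed block can raise, the matching stop() call may be skipped and the timer left open. A small sketch using only the start()/stop() API shown above, wrapping the risky region in try/finally so the span is always closed:

import time

from auto_profiler import Timer, Tree

t = Timer('risky_step', parent_name='main').start()
try:
    time.sleep(0.5)   # stands in for code that might raise
finally:
    t.stop()          # runs even when an exception propagates

print(Tree(Timer.root))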

Advanced setup

Profiler can limit the reported call depth and hand the collected timings to a callback; Tree can hide entries faster than a given threshold (in seconds). Building on the quick-start example:

def show(p):
    print('Time   [Hits * PerHit] Function name [Called from] [Function Location]\n'
          '-----------------------------------------------------------------------')
    print(Tree(p.root, threshold=0.5))  # hide entries faster than 0.5 s


@Profiler(depth=4, on_disable=show)
def main():
    for i in range(5):
        f1()

    fact(3)
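
The on_disable callback does not have to print to stdout. A sketch that writes the report to a file instead, assuming Tree renders to the same text via str() as it does when printed:

def save_report(p):
    # the same text tree that print(Tree(...)) would show, written to disk
    with open('profile_report.txt', 'w') as fh:
        fh.write(str(Tree(p.root, threshold=0.5)))


@Profiler(depth=4, on_disable=save_report)
def main():
    for i in range(5):
        f1()

    fact(3)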

Supported frameworks

You can profile ordinary Python code, but as a web developer you will most often profile web service code.

Currently supported web frameworks:

Examples

For profiling web service code (involving web requests), check out examples.
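
As a rough illustration (not taken from the project's examples), the plain @Profiler() decorator can wrap a view function like any other Python function. A minimal Flask sketch, with the app, route, and load_data helper invented for the example:

import time

from flask import Flask
from auto_profiler import Profiler

app = Flask(__name__)


def load_data():
    # hypothetical slow helper standing in for real work
    time.sleep(0.3)
    return 'ok'


@app.route('/report')
@Profiler()   # the timing tree is reported when the view returns
def report():
    return load_data()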

License

MIT


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

auto_profiler-2.0.tar.gz (11.8 kB)

Uploaded Source

Built Distribution

auto_profiler-2.0-py3-none-any.whl (12.7 kB)

Uploaded Python 3

File details

Details for the file auto_profiler-2.0.tar.gz.

File metadata

  • Download URL: auto_profiler-2.0.tar.gz
  • Upload date:
  • Size: 11.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.14

File hashes

Hashes for auto_profiler-2.0.tar.gz

  • SHA256: 985bf3cb28b3fb130bae94fc4de71f3e9da987eac65ffc9c2bdc9018650d3323
  • MD5: b6914e376b9874bd833bc97c68688f3b
  • BLAKE2b-256: 1d66aeed1b46280650e6c380e5b3b5c1e2de795687892f5654690938cd949c93

See more details on using hashes here.

File details

Details for the file auto_profiler-2.0-py3-none-any.whl.

File metadata

  • Download URL: auto_profiler-2.0-py3-none-any.whl
  • Upload date:
  • Size: 12.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.14

File hashes

Hashes for auto_profiler-2.0-py3-none-any.whl

  • SHA256: 131305f8190a7ec476cfaa7a585fa6441cfc240328c2806837409155040e2e6c
  • MD5: c1b956ce158e5b836cc6de165b84f6ea
  • BLAKE2b-256: d0ff37e6a7b3aa6e547f0e9ff54c73350ba1d46816a4f5f558a52dc0e2d6ab86

See more details on using hashes here.
