A powerful timing and performance measurement library for Python applications

TimingStack

A simple yet powerful timing library for Python that helps you measure and analyze code performance with nested timer support.

Why TimingStack?

I built TimingStack because I needed something more than just basic timing functions. I wanted to:

  • Track nested operations and see how they relate to each other
  • Get aggregated statistics across multiple runs
  • Use timers in different ways (decorators, context managers, manual)
  • Handle async/await code gracefully
  • Keep memory usage under control with bounded collections

Features

  • Multiple usage patterns: Decorator, context manager, or manual start/stop
  • Async/await support: Works with both sync and async code
  • Nested timers: Track parent-child relationships between operations
  • Aggregated statistics: See counts, totals, averages, min/max for each timer
  • Memory safe: Uses bounded lists to prevent memory leaks
  • Configurable: Adjust time units, precision, and error handling
  • Global disable: Turn off all timing for zero production overhead
  • Thread-safe: Works in multi-threaded environments

Installation

pip install timingstack

Quick Start

As a decorator (most common)

import time

from timingstack import Timer

@Timer.measure
def process_data():
    # your code here
    time.sleep(0.1)
    return "processed"

result = process_data()
Timer.print_report()

Custom timer names

You can specify custom names for your timers:

# Using Timer.measure with custom name
@Timer.measure("data_processing")
def process_data():
    return "processed"

# Using Timer constructor directly
@Timer("user_authentication")
def authenticate_user():
    return authenticate()

# All three styles work the same
@Timer.measure                # Uses function name: "my_function"
def my_function():
    pass

@Timer.measure()              # Uses function name: "my_function"
def my_function():
    pass

@Timer.measure("custom_name") # Uses custom name: "custom_name"
def my_function():
    pass
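Supporting all three call styles in one decorator is a small dispatch trick: check whether the decorator received a function directly or a name argument. Here is a generic sketch of how such a decorator can be written (illustrative only, not TimingStack's actual implementation):

```python
import functools
import time

def measure(arg=None):
    """Usable as @measure, @measure(), or @measure("name")."""
    def decorate(func, name=None):
        label = name or func.__name__
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                elapsed = time.perf_counter() - start
                print(f"{label}: {elapsed:.6f}s")
        return wrapper

    if callable(arg):
        # Bare @measure: arg is the decorated function itself.
        return decorate(arg)
    # @measure() or @measure("name"): arg is None or the custom name.
    return lambda func: decorate(func, name=arg)
```

The `callable(arg)` check is what distinguishes `@measure` (where Python passes the function as the sole argument) from the called forms.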

As a context manager

with Timer("database_query"):
    # your code here
    results = db.query("SELECT * FROM users")

With async code

@Timer.measure("api_fetch")
async def fetch_data():
    async with Timer("http_request"):
        async with httpx.AsyncClient() as client:
            response = await client.get("https://api.example.com")
    return response.json()

result = asyncio.run(fetch_data())  # or `await fetch_data()` from async code
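Timing itself involves no awaiting, so a timer object can support both `with` and `async with` by implementing the async context-manager protocol as a thin wrapper over the sync one. A minimal stdlib-only sketch of the general technique (not TimingStack's internals):

```python
import asyncio
import time

class SimpleTimer:
    """Sketch of a timer usable with both `with` and `async with`."""

    def __init__(self, name):
        self.name = name
        self.elapsed = None

    def __enter__(self):
        self._start = time.perf_counter()
        return self

    def __exit__(self, *exc):
        self.elapsed = time.perf_counter() - self._start
        return False

    # The async protocol delegates to the sync one: measuring a clock
    # never needs to await anything itself.
    async def __aenter__(self):
        return self.__enter__()

    async def __aexit__(self, *exc):
        return self.__exit__(*exc)

async def demo():
    async with SimpleTimer("sleep") as t:
        await asyncio.sleep(0.01)
    return t.elapsed

elapsed = asyncio.run(demo())
```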

Manual start/stop

Timer.start("complex_operation")
# do some work
Timer.end("complex_operation")

Nested Timers

This is where TimingStack shines:

@Timer.measure("order_processing")
def process_order():
    with Timer("validate_order"):
        validate(order_data)

    with Timer("calculate_price"):
        price = calculate_price(order_data)

    with Timer("save_to_database"):
        with Timer("create_connection"):
            conn = db.connect()

        with Timer("execute_query"):
            db.execute(conn, query)

When you run this, you'll see how long each step took AND how much time was spent in child operations.
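The parent-child tracking can be implemented with a stack of open timers: entering a timer attaches it as a child of whichever timer is currently on top, and a timer that closes with an empty stack is recorded as a root. A minimal stdlib-only sketch of the idea (illustrative only, not TimingStack's actual code):

```python
import time

class NestedTimer:
    """Sketch of stack-based parent/child timing."""

    _stack = []   # currently open timers, innermost last
    roots = []    # finished top-level timers

    def __init__(self, name):
        self.name = name
        self.children = []
        self.elapsed = None

    def __enter__(self):
        if NestedTimer._stack:
            # Attach to the currently open (parent) timer.
            NestedTimer._stack[-1].children.append(self)
        NestedTimer._stack.append(self)
        self._start = time.perf_counter()
        return self

    def __exit__(self, *exc):
        self.elapsed = time.perf_counter() - self._start
        NestedTimer._stack.pop()
        if not NestedTimer._stack:
            # No parent left: this was a root timer.
            NestedTimer.roots.append(self)
        return False

def report(node, indent=0):
    """Print the timer tree with indentation showing nesting."""
    print(" " * indent + f"{node.name}: {node.elapsed:.6f}s")
    for child in node.children:
        report(child, indent + 2)
```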

Statistics and Reporting

Get a statistical report with:

Timer.print_report()

Configuration

Customize how TimingStack behaves:

from timingstack import configure, ErrorHandling

# Show times in seconds with 6 decimal places
configure(time_unit="seconds", precision=6)

# Change error handling
configure(on_mismatch=ErrorHandling.RAISE)  # or WARN, IGNORE

# Adjust memory limits
configure(max_length=1000)  # Keep only 1000 root timers

# Disable timers (useful for production)
configure(enabled=False)    # or True to re-enable

Enable/Disable Timers

Perfect for production environments where you want zero overhead:

from timingstack import configure, Timer

# Disable all timing (no performance overhead)
configure(enabled=False)

@Timer.measure  # This decorator does nothing now
def production_function():
    return expensive_operation()

with Timer("this_is_ignored"):  # This context manager does nothing
    do_something()

# Re-enable for debugging
configure(enabled=True)

Error Handling

TimingStack can handle timer mismatches in three ways:

from timingstack import configure, ErrorHandling

# Warn about mismatches (default)
configure(on_mismatch=ErrorHandling.WARN)

# Raise exceptions for mismatches
configure(on_mismatch=ErrorHandling.RAISE)

# Ignore mismatches silently
configure(on_mismatch=ErrorHandling.IGNORE)

Memory Management

TimingStack uses bounded lists to prevent memory leaks:

# Keep only the last 100 root timers
configure(max_length=100)

# When the limit is reached, old timers are automatically removed
# and a warning is logged
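This bounded behavior is the same idea the standard library's `collections.deque` provides via `maxlen`: once full, each append silently drops the oldest entry. A tiny illustration of the technique (TimingStack's internals may differ):

```python
from collections import deque

# Keep at most 3 entries; older ones fall off the left end.
root_timers = deque(maxlen=3)

for name in ["t1", "t2", "t3", "t4", "t5"]:
    root_timers.append(name)

print(list(root_timers))  # ['t3', 't4', 't5']
```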

Advanced Usage

Multiple Timer Stacks

Sometimes you need separate timing contexts:

from timingstack import TimerStack

web_stack = TimerStack()
bg_stack = TimerStack()

# Use them independently
web_stack.start("http_request")
bg_stack.start("file_processing")

# Get separate reports
web_stack.print_report()
bg_stack.print_report()

Timer Counts

See how many times each timer was called:

stack = TimerStack()
# ... run some timers ...

counts = stack.get_timer_counts()
# {'http_request': 15, 'database_query': 8, 'validation': 45}

Examples

Check out the example.py file for comprehensive examples showing all the features.

Common Patterns

Performance Profiling

def profile_function():
    configure(time_unit="milliseconds", precision=3)

    @Timer.measure
    def function_to_profile():
        # your function code
        pass

    # Run it multiple times to get good stats
    for _ in range(100):
        function_to_profile()

    Timer.print_report()

Database Operations

def save_user(user_data):
    with Timer("save_user"):
        with Timer("validate"):
            validate_user_data(user_data)

        with Timer("hash_password"):
            hashed = hash_password(user_data.password)

        with Timer("db_insert"):
            db.execute("INSERT INTO users ...", user_data)

API Endpoints

@app.route("/api/users/<int:user_id>")
@Timer.measure("user_endpoint")
def get_user(user_id):
    with Timer("database_lookup"):
        user = db.get_user(user_id)

    with Timer("serialize_response"):
        return jsonify(user.to_dict())

That's It!

TimingStack is designed to be simple to use but powerful enough for real-world performance analysis. Start with decorators, then explore nested timers and statistics as you need them.

If you run into issues or have ideas for improvements, feel free to open an issue or submit a pull request.
