Profilis

A high performance, non-blocking profiler for Python web applications.


Overview

Profilis provides drop-in observability across APIs, functions, and database queries with minimal performance impact. It's designed to be:

  • Non-blocking: Async collection with configurable batching and backpressure handling
  • Framework-agnostic: Flask, FastAPI, and Sanic with optional ASGI middleware for any ASGI app
  • Database-aware: SQLAlchemy (sync & async), MongoDB (PyMongo), Neo4j, and pyodbc
  • Production-ready: Configurable sampling, error tracking, and multiple export formats

Star This Repository

If you find Profilis helpful for your projects, please consider giving it a star! It helps others discover this tool and motivates continued development.


Features

  • Request Profiling: Automatic HTTP request/response timing and status tracking
  • Frameworks: Flask, FastAPI (ASGI middleware), and Sanic with built-in dashboard (Flask blueprint, FastAPI router, Sanic blueprint)
  • Function Profiling: Decorator-based function timing with exception tracking
  • Database Instrumentation: SQLAlchemy (sync & async), MongoDB (PyMongo), Neo4j, pyodbc with query/command monitoring
  • Built-in UI: Real-time dashboard for monitoring and debugging
  • Multiple Exporters: JSONL (with rotation), Console
  • Runtime Context: Distributed tracing with trace/span ID management
  • Configurable Sampling: Control data collection volume (Flask, ASGI, Sanic)

Installation

Install the core package with optional dependencies for your specific needs:

Option 1: Using pip with extras (Recommended)

# Core package only
pip install profilis

# With Flask support
pip install "profilis[flask]"

# With FastAPI support
pip install "profilis[fastapi]"

# With Sanic support
pip install "profilis[sanic]"

# With Flask and SQLAlchemy support
pip install "profilis[flask,sqlalchemy]"

# With all integrations
pip install "profilis[all]"

Option 2: Using requirements files

# Minimal setup (core only)
pip install -r requirements-minimal.txt

# Flask integration
pip install -r requirements-flask.txt

# SQLAlchemy integration
pip install -r requirements-sqlalchemy.txt

# All integrations
pip install -r requirements-all.txt

Option 3: Manual installation

# Core dependencies
pip install "typing_extensions>=4.0"

# Flask support
pip install "flask[async]>=3.0"

# FastAPI support
pip install "fastapi>=0.110" "starlette>=0.37" "httpx>=0.24.0"

# Sanic support
pip install "sanic>=23.0"

# SQLAlchemy support
pip install "sqlalchemy>=2.0" aiosqlite greenlet

# Performance optimization
pip install "orjson>=3.8"

Quick Start

Flask Integration

from flask import Flask
from profilis.flask.adapter import ProfilisFlask
from profilis.exporters.jsonl import JSONLExporter
from profilis.core.async_collector import AsyncCollector

# Setup exporter and collector
exporter = JSONLExporter(dir="./logs", rotate_bytes=1024*1024, rotate_secs=3600)
collector = AsyncCollector(exporter, queue_size=2048, batch_max=128, flush_interval=0.1)

# Create Flask app and integrate Profilis
app = Flask(__name__)
profilis = ProfilisFlask(
    app,
    collector=collector,
    exclude_routes=["/health", "/metrics"],
    sample=1.0  # 100% sampling
)

@app.route('/api/users')
def get_users():
    return {"users": ["alice", "bob"]}

# Visit /_profilis for the dashboard (if you mount the UI blueprint)
if __name__ == "__main__":
    app.run(debug=True)

FastAPI Integration

from fastapi import FastAPI
from profilis.fastapi.adapter import instrument_fastapi
from profilis.fastapi.ui import make_ui_router
from profilis.exporters.jsonl import JSONLExporter
from profilis.core.async_collector import AsyncCollector
from profilis.core.emitter import Emitter
from profilis.core.stats import StatsStore

exporter = JSONLExporter(dir="./logs", rotate_bytes=1024*1024, rotate_secs=3600)
collector = AsyncCollector(exporter, queue_size=2048, batch_max=128, flush_interval=0.1)
emitter = Emitter(collector)
stats = StatsStore()

app = FastAPI()
instrument_fastapi(app, emitter, route_excludes=["/profilis"])
app.include_router(make_ui_router(stats, prefix="/profilis"))

@app.get("/api/users")
async def get_users():
    return {"users": ["alice", "bob"]}

# Run with: uvicorn your_module:app --reload
# Visit http://localhost:8000/profilis for the dashboard

Function Profiling

from profilis.decorators.profile import profile_function
from profilis.core.emitter import Emitter
from profilis.exporters.console import ConsoleExporter
from profilis.core.async_collector import AsyncCollector

# Setup profiling
exporter = ConsoleExporter(pretty=True)
collector = AsyncCollector(exporter, queue_size=128, flush_interval=0.2)
emitter = Emitter(collector)

@profile_function(emitter)
def expensive_calculation(n: int) -> int:
    """This function will be automatically profiled."""
    result = sum(i * i for i in range(n))
    return result

@profile_function(emitter)
async def async_operation(data: list) -> list:
    """Async functions are also supported."""
    processed = [item * 2 for item in data]
    return processed

# Use the profiled functions
result = expensive_calculation(1000)
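For intuition, the decorator pattern above can be sketched in plain Python. This is a simplified stand-in using only the standard library, not Profilis's actual implementation; the `timed` and `events` names are illustrative:

```python
import functools
import time

events = []  # stand-in sink for profiling records

def timed(record):
    """Minimal stand-in for @profile_function: reports (name, duration_ns) to `record`."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter_ns()
            try:
                return fn(*args, **kwargs)
            finally:
                # Record timing even when the function raises, mirroring exception tracking
                record((fn.__name__, time.perf_counter_ns() - start))
        return wrapper
    return decorator

@timed(events.append)
def expensive_calculation(n: int) -> int:
    return sum(i * i for i in range(n))

result = expensive_calculation(1000)
# events now holds one ("expensive_calculation", <duration in ns>) tuple
```

The `try`/`finally` is the important detail: a raising function still produces a timing record, which is what makes exception tracking possible.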

Manual Event Emission

from profilis.core.emitter import Emitter
from profilis.exporters.jsonl import JSONLExporter
from profilis.core.async_collector import AsyncCollector
from profilis.runtime import use_span, span_id

# Setup
exporter = JSONLExporter(dir="./logs")
collector = AsyncCollector(exporter)
emitter = Emitter(collector)

# Create a trace context
with use_span(trace_id=span_id()):
    # Emit custom events
    emitter.emit_req("/api/custom", 200, dur_ns=15000000)  # 15ms
    emitter.emit_fn("custom_function", dur_ns=5000000)      # 5ms
    emitter.emit_db("SELECT * FROM users", dur_ns=8000000, rows=100)

# Close collector to flush remaining events
collector.close()
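The dur_ns values above are nanoseconds. A tiny helper (hypothetical, not part of Profilis) makes the conversion explicit:

```python
def ms_to_ns(ms: float) -> int:
    """Convert milliseconds to the nanosecond durations dur_ns expects."""
    return int(ms * 1_000_000)

# ms_to_ns(15) == 15_000_000, matching the emit_req call above
```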

Built-in Dashboard

The dashboard is available per framework:

  • Flask: make_ui_blueprint(stats, ui_prefix="/_profilis"), then app.register_blueprint(ui_bp)
  • FastAPI: make_ui_router(stats, prefix="/profilis"), then app.include_router(router)
  • Sanic: make_ui_blueprint(stats, ui_prefix="/profilis"), then app.blueprint(bp)

# Example: Flask
from flask import Flask
from profilis.flask.ui import make_ui_blueprint
from profilis.core.stats import StatsStore

app = Flask(__name__)
stats = StatsStore()
ui_bp = make_ui_blueprint(stats, ui_prefix="/_profilis")
app.register_blueprint(ui_bp)
# Visit http://localhost:5000/_profilis

Advanced Usage

Custom Exporters

from profilis.core.async_collector import AsyncCollector
from profilis.exporters.base import BaseExporter

class CustomExporter(BaseExporter):
    def export(self, events: list[dict]) -> None:
        for event in events:
            # Custom export logic
            print(f"Custom export: {event}")

# Use custom exporter
exporter = CustomExporter()
collector = AsyncCollector(exporter)

Runtime Context Management

from profilis.runtime import use_span, span_id, get_trace_id, get_span_id

# Create distributed trace context
with use_span(trace_id="trace-123", span_id="span-456"):
    current_trace = get_trace_id()  # "trace-123"
    current_span = get_span_id()    # "span-456"

    # Nested spans inherit trace context
    with use_span(span_id="span-789"):
        nested_span = get_span_id()  # "span-789"
        parent_trace = get_trace_id() # "trace-123"

Performance Tuning

from profilis.core.async_collector import AsyncCollector

# High-throughput configuration
collector = AsyncCollector(
    exporter,
    queue_size=8192,        # Large queue for high concurrency
    batch_max=256,          # Larger batches for efficiency
    flush_interval=0.05,    # More frequent flushing
    drop_oldest=True        # Drop events under backpressure
)

# Low-latency configuration
collector = AsyncCollector(
    exporter,
    queue_size=512,         # Smaller queue for lower latency
    batch_max=32,           # Smaller batches for faster processing
    flush_interval=0.01,    # Very frequent flushing
    drop_oldest=False       # Don't drop events
)
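The drop_oldest trade-off can be illustrated with a bounded deque: under backpressure, a full queue evicts its oldest entry instead of blocking the caller. This illustrates the policy only, not the collector's code:

```python
from collections import deque

queue = deque(maxlen=3)  # plays the role of a queue_size-bounded queue
for event in range(5):
    queue.append(event)  # when full, the deque silently evicts the oldest entry

assert list(queue) == [2, 3, 4]  # events 0 and 1 were dropped, newest kept
```

With drop_oldest=False the trade-off reverses: no events are lost, but producers can stall when the queue fills.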

Configuration

Environment Variables

# Note: Environment variable support is planned for future releases
# Currently, all configuration is done programmatically

Sampling Strategies

# Random sampling
profilis = ProfilisFlask(app, collector=collector, sample=0.1)  # 10% of requests

# Route-based sampling
profilis = ProfilisFlask(
    app,
    collector=collector,
    exclude_routes=["/health", "/metrics", "/static"],
    sample=1.0
)
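Conceptually, per-request sampling at rate sample reduces to a single random draw per request. This is a sketch of the idea; Profilis's actual decision logic may differ:

```python
import random

def should_sample(sample: float) -> bool:
    """Keep a request with probability `sample` (0.0 = never, 1.0 = always)."""
    return random.random() < sample

# sample=1.0 records every request; sample=0.1 records roughly 10% of them
```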

Exporters

JSONL Exporter

from profilis.exporters.jsonl import JSONLExporter

# With rotation
exporter = JSONLExporter(
    dir="./logs",
    rotate_bytes=1024*1024,  # 1MB per file
    rotate_secs=3600         # Rotate every hour
)
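With these parameters, a new file is started when either threshold is crossed. Assuming the conventional size-or-age rotation semantics, the decision reduces to:

```python
def should_rotate(bytes_written: int, age_secs: float,
                  rotate_bytes: int = 1024 * 1024,
                  rotate_secs: int = 3600) -> bool:
    """Rotate when the current file exceeds either the size or the age limit."""
    return bytes_written >= rotate_bytes or age_secs >= rotate_secs

assert should_rotate(2_000_000, 10)   # over the size limit
assert should_rotate(0, 7200)         # over the age limit
assert not should_rotate(512, 60)     # neither limit reached
```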

Console Exporter

from profilis.exporters.console import ConsoleExporter

# Pretty-printed output for development
exporter = ConsoleExporter(pretty=True)

# Compact output for production
exporter = ConsoleExporter(pretty=False)

Performance Characteristics

  • Event Creation: ≤15µs per event
  • Memory Overhead: ~100 bytes per event
  • Throughput: 100K+ events/second on modern hardware
  • Latency: Sub-millisecond collection overhead
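Figures like these depend on hardware and workload, so it is worth measuring in your own environment. A rough micro-benchmark helper (illustrative, stdlib-only):

```python
import time

def avg_call_ns(fn, iterations: int = 100_000) -> float:
    """Average wall-clock cost of fn() in nanoseconds over `iterations` calls."""
    start = time.perf_counter_ns()
    for _ in range(iterations):
        fn()
    return (time.perf_counter_ns() - start) / iterations

# e.g. avg_call_ns(lambda: emitter.emit_fn("noop", dur_ns=0))
# using the emitter from the Quick Start examples
```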

Documentation

Full documentation is available at: Profilis Docs

Docs are written in Markdown under docs/ and built with MkDocs Material.

To preview locally:

pip install mkdocs mkdocs-material mkdocs-mermaid2-plugin
mkdocs serve

Development

Setting up the project

  1. Clone and enter the repo

    git clone https://github.com/ankan97dutta/profilis.git
    cd profilis
    
  2. Create a virtual environment and install in editable mode with dev dependencies

    python -m venv .venv
    source .venv/bin/activate   # Windows: .venv\Scripts\activate
    pip install -e ".[dev]"
    
  3. Install pre-commit hooks (optional but recommended)

    pre-commit install
    
  4. Run the test suite

    pytest
    

    Use pytest -v for verbose output, pytest path/to/test_file.py to run a single file, or pytest -k "test_name" to run tests matching a pattern. Coverage: pytest --cov=profilis --cov-report=term-missing.

Working with TDD

We encourage test-driven development (TDD):

  1. Red — Write a failing test that describes the behaviour you want.
  2. Green — Implement the minimum code to make the test pass.
  3. Refactor — Improve the implementation while keeping tests green.

Run tests frequently (e.g. pytest or pytest tests/ -q) as you work. See Development Guidelines for the full TDD workflow and test layout.

Roadmap

See Profilis – v0 Roadmap Project and docs/overview/roadmap.md.

License

MIT

Contact

Feel free to reach out if you have questions, suggestions, or would like to contribute to Profilis!
