
High-performance, non-blocking profiler for Python web apps.

Project description


Profilis

A high-performance, non-blocking profiler for Python web applications.


Overview

Profilis gives you drop-in observability across APIs, functions, and database queries with minimal performance impact.

  • Non-blocking: async collection with configurable batching and backpressure handling
  • Framework-agnostic: Flask, FastAPI, and Sanic (plus optional ASGI middleware for any ASGI app)
  • Database-aware: SQLAlchemy (sync and async), MongoDB (PyMongo), Neo4j, and pyodbc
  • Production-ready: configurable sampling, error tracking, and export formats

TL;DR

Install:

pip install profilis

Pick a framework integration:

  • Flask: pip install profilis[flask]
  • FastAPI: pip install profilis[fastapi]
  • Sanic: pip install profilis[sanic]
  • Prometheus exporter: pip install profilis[prometheus]
  • Performance extras: pip install profilis[perf]
  • All integrations: pip install profilis[all]

Then jump to Quick Start.


Features

  • Request profiling: automatic HTTP request/response timing and status tracking
  • Frameworks: Flask, FastAPI (ASGI middleware), and Sanic, with a built-in dashboard (Flask blueprint, FastAPI router, Sanic blueprint)
  • Function profiling: decorator-based function timing with exception tracking
  • Database instrumentation: SQLAlchemy (sync and async), MongoDB (PyMongo), Neo4j, pyodbc with query/command monitoring
  • Built-in UI: real-time dashboard for monitoring and debugging
  • Exporters: JSONL (with rotation) and Console; Prometheus is also supported (see docs)
  • Runtime context: trace/span ID management
  • Configurable sampling: control data collection volume (Flask, ASGI, Sanic)

Installation

Install the core package with optional dependencies for your specific needs:

Option 1: Using pip with extras (Recommended)

# Core package only
pip install profilis

# With Flask support
pip install profilis[flask]

# With FastAPI support
pip install profilis[fastapi]

# With Sanic support
pip install profilis[sanic]

# With database support
pip install profilis[flask,sqlalchemy]

# With all integrations
pip install profilis[all]

Option 2: Using requirements files

# Minimal setup (core only)
pip install -r requirements-minimal.txt

# Flask integration
pip install -r requirements-flask.txt

# FastAPI integration
pip install -r requirements-fastapi.txt

# SQLAlchemy integration
pip install -r requirements-sqlalchemy.txt

# MongoDB integration
pip install -r requirements-mongo.txt

# All integrations
pip install -r requirements-all.txt

Option 3: Fine-grained installs

If you need fully explicit dependency control, install your framework/DB libs directly and only install the Profilis extras you need. The authoritative list of extras lives in pyproject.toml under [project.optional-dependencies].

Quick Start

Core concepts (one-minute mental model)

  • Collector: AsyncCollector buffers events off the request hot path and flushes them in batches.
  • Emitter: Emitter creates tiny dict events (REQ/FN/DB) and enqueues them to a collector.
  • Exporter (sink): a callable that consumes batches of events (e.g. JSONL, Console, Prometheus).
  • UI: a small HTTP dashboard that reads from a StatsStore (which you populate).
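The collector/emitter/sink split above can be illustrated with the standard library alone. This is a conceptual sketch of the buffer-and-flush pattern, not Profilis's actual `AsyncCollector` implementation: events are enqueued cheaply on the hot path, and a background thread drains them in batches to a sink callable.

```python
import queue
import threading

class ToyCollector:
    """Stdlib sketch of the buffer-and-flush idea behind an async collector.

    Enqueueing is cheap (no I/O on the caller's path); a background thread
    drains events in batches and hands each batch to the sink callable.
    """
    _CLOSE = object()  # sentinel telling the worker to stop

    def __init__(self, sink, batch_max=128, flush_interval=0.1):
        self._sink = sink
        self._batch_max = batch_max
        self._flush_interval = flush_interval
        self._q = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def enqueue(self, event):
        self._q.put(event)  # hot path: just a queue put

    def close(self):
        self._q.put(self._CLOSE)
        self._thread.join()

    def _run(self):
        batch = []
        while True:
            try:
                item = self._q.get(timeout=self._flush_interval)
            except queue.Empty:
                item = None  # timer expired; flush whatever we have
            if item is self._CLOSE:
                break
            if item is not None:
                batch.append(item)
            # flush on batch size or on timer expiry
            if batch and (item is None or len(batch) >= self._batch_max):
                self._sink(batch)
                batch = []
        if batch:
            self._sink(batch)  # final flush on close

batches = []
c = ToyCollector(batches.append, batch_max=2, flush_interval=0.05)
for i in range(5):
    c.enqueue({"kind": "REQ", "n": i})
c.close()
```

The real `AsyncCollector` adds backpressure handling (`drop_oldest`) on top of this shape; the sink signature (a callable taking a list of event dicts) matches the examples below.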

Flask Integration

from flask import Flask
from profilis.flask.adapter import ProfilisFlask
from profilis.flask.ui import make_ui_blueprint
from profilis.core.stats import StatsStore
from profilis.exporters.jsonl import JSONLExporter
from profilis.core.async_collector import AsyncCollector

stats = StatsStore()
jsonl = JSONLExporter(dir="./logs", rotate_bytes=1024*1024, rotate_secs=3600)

def sink(batch: list[dict]) -> None:
    # Persist events
    jsonl(batch)
    # Feed the UI stats store (record request timing + errors)
    for ev in batch:
        if ev.get("kind") == "REQ":
            status = int(ev.get("status", 0) or 0)
            stats.record(int(ev.get("dur_ns", 0) or 0), error=status >= 500)

collector = AsyncCollector(sink, queue_size=2048, batch_max=128, flush_interval=0.1)

app = Flask(__name__)
ProfilisFlask(
    app,
    collector=collector,
    exclude_routes=["/health", "/metrics"],
    sample=1.0  # 100% sampling
)

app.register_blueprint(make_ui_blueprint(stats, ui_prefix="/_profilis"))

@app.route('/api/users')
def get_users():
    return {"users": ["alice", "bob"]}

# Visit http://localhost:5000/_profilis
if __name__ == "__main__":
    app.run(debug=True)

FastAPI Integration

from fastapi import FastAPI
from profilis.fastapi.adapter import instrument_fastapi
from profilis.fastapi.ui import make_ui_router
from profilis.core.stats import StatsStore
from profilis.exporters.jsonl import JSONLExporter
from profilis.core.async_collector import AsyncCollector
from profilis.core.emitter import Emitter

stats = StatsStore()
jsonl = JSONLExporter(dir="./logs", rotate_bytes=1024*1024, rotate_secs=3600)

def sink(batch: list[dict]) -> None:
    jsonl(batch)
    for ev in batch:
        # FastAPI/ASGI emits kind="HTTP"
        if ev.get("kind") == "HTTP":
            status = int(ev.get("status", 0) or 0)
            stats.record(int(ev.get("dur_ns", 0) or 0), error=status >= 500)

collector = AsyncCollector(sink, queue_size=2048, batch_max=128, flush_interval=0.1)
emitter = Emitter(collector)

app = FastAPI()
instrument_fastapi(app, emitter, route_excludes=["/profilis"])
app.include_router(make_ui_router(stats, prefix="/profilis"))

@app.get("/api/users")
async def get_users():
    return {"users": ["alice", "bob"]}

# Run with: uvicorn your_module:app --reload
# Visit http://localhost:8000/profilis for the dashboard

Sanic Integration

from sanic import Sanic
from profilis.sanic.adapter import SanicConfig, instrument_sanic_app
from profilis.sanic.ui import make_ui_blueprint
from profilis.core.async_collector import AsyncCollector
from profilis.core.emitter import Emitter
from profilis.core.stats import StatsStore
from profilis.exporters.console import ConsoleExporter

app = Sanic("app")
stats = StatsStore()
console = ConsoleExporter(pretty=True)

def sink(batch: list[dict]) -> None:
    console(batch)
    for ev in batch:
        if ev.get("kind") == "HTTP":
            status = int(ev.get("status", 0) or 0)
            stats.record(int(ev.get("dur_ns", 0) or 0), error=status >= 500)

collector = AsyncCollector(sink)
emitter = Emitter(collector)

instrument_sanic_app(app, emitter, SanicConfig(route_excludes=["/profilis"]))
app.blueprint(make_ui_blueprint(stats, ui_prefix="/profilis"))

Function Profiling

from profilis.decorators.profile import profile_function
from profilis.core.emitter import Emitter
from profilis.exporters.console import ConsoleExporter
from profilis.core.async_collector import AsyncCollector

# Setup profiling
exporter = ConsoleExporter(pretty=True)
collector = AsyncCollector(exporter, queue_size=128, flush_interval=0.2)
emitter = Emitter(collector)

@profile_function(emitter)
def expensive_calculation(n: int) -> int:
    """This function will be automatically profiled."""
    result = sum(i * i for i in range(n))
    return result

@profile_function(emitter)
async def async_operation(data: list) -> list:
    """Async functions are also supported."""
    processed = [item * 2 for item in data]
    return processed

# Use the profiled functions
result = expensive_calculation(1000)

Manual Event Emission

from profilis.core.emitter import Emitter
from profilis.exporters.jsonl import JSONLExporter
from profilis.core.async_collector import AsyncCollector
from profilis.runtime import use_span, span_id

# Setup
exporter = JSONLExporter(dir="./logs")
collector = AsyncCollector(exporter)
emitter = Emitter(collector)

# Create a trace context
with use_span(trace_id=span_id()):
    # Emit custom events
    emitter.emit_req("/api/custom", 200, dur_ns=15000000)  # 15ms
    emitter.emit_fn("custom_function", dur_ns=5000000)      # 5ms
    emitter.emit_db("SELECT * FROM users", dur_ns=8000000, rows=100)

# Close collector to flush remaining events
collector.close()

Built-in Dashboard

The dashboard is available per framework:

  • Flask: make_ui_blueprint(stats, ui_prefix="/_profilis"), then app.register_blueprint(ui_bp)
  • FastAPI: make_ui_router(stats, prefix="/profilis"), then app.include_router(router)
  • Sanic: make_ui_blueprint(stats, ui_prefix="/profilis"), then app.blueprint(bp)
# Example: Flask
from flask import Flask
from profilis.flask.ui import make_ui_blueprint
from profilis.core.stats import StatsStore

app = Flask(__name__)
stats = StatsStore()
ui_bp = make_ui_blueprint(stats, ui_prefix="/_profilis")
app.register_blueprint(ui_bp)
# Visit http://localhost:5000/_profilis

Advanced Usage

Custom Exporters

from profilis.core.async_collector import AsyncCollector
from profilis.exporters.base import BaseExporter

class CustomExporter(BaseExporter):
    def export(self, events: list[dict]) -> None:
        for event in events:
            # Custom export logic
            print(f"Custom export: {event}")

# Use custom exporter
exporter = CustomExporter()
collector = AsyncCollector(exporter)

Runtime Context Management

from profilis.runtime import use_span, span_id, get_trace_id, get_span_id

# Create distributed trace context
with use_span(trace_id="trace-123", span_id="span-456"):
    current_trace = get_trace_id()  # "trace-123"
    current_span = get_span_id()    # "span-456"

    # Nested spans inherit trace context
    with use_span(span_id="span-789"):
        nested_span = get_span_id()  # "span-789"
        parent_trace = get_trace_id() # "trace-123"

Performance Tuning

from profilis.core.async_collector import AsyncCollector

# High-throughput configuration
collector = AsyncCollector(
    exporter,
    queue_size=8192,        # Large queue for high concurrency
    batch_max=256,          # Larger batches for efficiency
    flush_interval=0.05,    # More frequent flushing
    drop_oldest=True        # Drop events under backpressure
)

# Low-latency configuration
collector = AsyncCollector(
    exporter,
    queue_size=512,         # Smaller queue for lower latency
    batch_max=32,           # Smaller batches for faster processing
    flush_interval=0.01,    # Very frequent flushing
    drop_oldest=False       # Don't drop events
)

Configuration

Environment Variables

Environment variable support is planned for future releases. Currently, all configuration is done programmatically.

Sampling Strategies

# Random sampling
profilis = ProfilisFlask(app, collector=collector, sample=0.1)  # 10% of requests

# Route-based sampling
profilis = ProfilisFlask(
    app,
    collector=collector,
    exclude_routes=["/health", "/metrics", "/static"],
    sample=1.0
)

Exporters

JSONL Exporter

from profilis.exporters.jsonl import JSONLExporter

# With rotation
exporter = JSONLExporter(
    dir="./logs",
    rotate_bytes=1024*1024,  # 1MB per file
    rotate_secs=3600         # Rotate every hour
)
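Because JSONL is just newline-delimited JSON, rotated log files can be post-processed with the standard library alone. A small sketch (an in-memory file stands in for a `./logs/*.jsonl` file; the field names follow the event dicts shown in the sink examples above):

```python
import io
import json

def iter_events(fp):
    """Yield one event dict per non-empty JSONL line."""
    for line in fp:
        line = line.strip()
        if line:
            yield json.loads(line)

# In-memory stand-in for a rotated JSONL log file
sample = io.StringIO(
    '{"kind": "REQ", "status": 200, "dur_ns": 15000000}\n'
    '{"kind": "REQ", "status": 503, "dur_ns": 8000000}\n'
)
events = list(iter_events(sample))
errors = [ev for ev in events if ev.get("kind") == "REQ" and ev["status"] >= 500]
```

The same loop works over real files with `open(path)` in place of the `StringIO`.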

Console Exporter

from profilis.exporters.console import ConsoleExporter

# Pretty-printed output for development
exporter = ConsoleExporter(pretty=True)

# Compact output for production
exporter = ConsoleExporter(pretty=False)

Performance Characteristics

  • Event Creation: ≤15µs per event
  • Memory Overhead: ~100 bytes per event
  • Throughput: 100K+ events/second on modern hardware
  • Latency: Sub-millisecond collection overhead
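Figures like these depend on workload and hardware, so it is worth sanity-checking event-creation cost yourself. A stdlib micro-benchmark sketch (the dict below only mimics the shape of a small REQ-style event; it is not Profilis's internal event type):

```python
import time

def make_event(route, status, dur_ns):
    # Mimics the shape of a small REQ-style event dict
    return {"kind": "REQ", "route": route, "status": status, "dur_ns": dur_ns}

N = 100_000
t0 = time.perf_counter_ns()
for _ in range(N):
    make_event("/api/users", 200, 15_000_000)
elapsed_ns = time.perf_counter_ns() - t0
per_event_us = elapsed_ns / N / 1_000
print(f"~{per_event_us:.2f} µs per event dict")
```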

Documentation

Full documentation is available at: Profilis Docs

Docs are written in Markdown under docs/ and built with MkDocs Material.

To preview locally:

pip install mkdocs mkdocs-material mkdocs-mermaid2-plugin
mkdocs serve

Development

Setting up the project

  1. Clone and enter the repo

    git clone https://github.com/ankan97dutta/profilis.git
    cd profilis
    
  2. Create a virtual environment and install in editable mode with dev dependencies

    python -m venv .venv
    source .venv/bin/activate   # Windows: .venv\Scripts\activate
    pip install -e ".[dev]"
    
  3. Install pre-commit hooks (optional but recommended)

    pre-commit install
    
  4. Run the test suite

    pytest
    

    Use pytest -v for verbose output, pytest path/to/test_file.py to run a single file, or pytest -k "test_name" to run tests matching a pattern. Coverage: pytest --cov=profilis --cov-report=term-missing.

Working with TDD

We encourage test-driven development (TDD):

  1. Red — Write a failing test that describes the behaviour you want.
  2. Green — Implement the minimum code to make the test pass.
  3. Refactor — Improve the implementation while keeping tests green.

Run tests frequently (e.g. pytest or pytest tests/ -q) as you work. See Development Guidelines for the full TDD workflow and test layout.
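The red-green loop above can be as small as this (`slugify` is a hypothetical helper used for illustration, not part of Profilis):

```python
# Red: write a failing test describing the behaviour you want
def test_slugify():
    assert slugify("Hello World") == "hello-world"

# Green: implement the minimum code that makes it pass
def slugify(text: str) -> str:
    return text.strip().lower().replace(" ", "-")

test_slugify()  # passes once slugify exists; refactor with this still green
```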

Roadmap

See Profilis – v0 Roadmap Project and docs/overview/roadmap.md.

License

MIT

Contact

Feel free to reach out if you have questions, suggestions, or would like to contribute to Profilis!

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

profilis-1.0.0.tar.gz (67.5 kB)


Built Distribution


profilis-1.0.0-py3-none-any.whl (55.3 kB)


File details

Details for the file profilis-1.0.0.tar.gz.

File metadata

  • Download URL: profilis-1.0.0.tar.gz
  • Upload date:
  • Size: 67.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for profilis-1.0.0.tar.gz:

  • SHA256: 4d382e896fd1767c79d90c609561cec07490e8a51afa8a488aec9f3ea431d93b
  • MD5: 380ea4d965c7230a18f129e48947dc5e
  • BLAKE2b-256: 4825e855444ec059a9daa50d1f63b476feeab868d2932f10f0f5bac756df515e


File details

Details for the file profilis-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: profilis-1.0.0-py3-none-any.whl
  • Upload date:
  • Size: 55.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for profilis-1.0.0-py3-none-any.whl:

  • SHA256: cb34a298780d1a22bfc0746ebaa6e71096662688aaabb103565ec8d7fee49cca
  • MD5: 7ce5667fea04628da27bc581a429270a
  • BLAKE2b-256: 66bf2ecb5f68ed0f502ce83fa0a301d59f677cd5b74c1f9fb829137c7008af8e

