
Developer-friendly logging helpers for Azure Functions Python

Project description

azure-functions-logging


Read this in: Korean | Japanese | Simplified Chinese

Invocation-aware observability for Azure Functions Python v2. Surfaces invocation_id, detects cold starts, warns on host.json misconfig, and outputs Application Insights-ready structured logs — without replacing Python's standard logging.


Part of the Azure Functions Python DX Toolkit → Bring a FastAPI-like developer experience to Azure Functions

Why this exists

Azure Functions Python logging has specific failure modes that generic logging libraries don't address:

| Problem | What happens | This library |
| --- | --- | --- |
| host.json log level conflict | Your INFO logs silently disappear in Azure | Detects and warns at startup |
| No invocation_id in logs | Impossible to correlate logs to a specific execution | Auto-injects from context object |
| Cold start invisible | No signal when a new worker instance starts | Detects automatically on first inject_context() |
| Noisy third-party loggers | azure-core, urllib3 flood your Application Insights | SamplingFilter / RedactionFilter |
| Local vs cloud output mismatch | Colorized output breaks in production pipelines | Environment-aware formatter switching |
| PII leaking into logs | Sensitive fields logged in exception tracebacks | RedactionFilter with pattern matching |

What it does

  • Invocation context — auto-injects invocation_id, function_name, cold_start into every log
  • Structured JSON output — Application Insights-ready NDJSON format for production
  • Noise control — SamplingFilter rate-limits chatty third-party loggers
  • PII protection — RedactionFilter masks sensitive fields before they reach log aggregation

Before / After

Without azure-functions-logging — plain print() output, no context, no structure:

Before: plain print output with no context

With azure-functions-logging — colorized local dev output and production-ready JSON:

After: structured color output and JSON

Installation

pip install azure-functions-logging

Quick Start

import azure.functions as func
from azure_functions_logging import get_logger, inject_context, setup_logging

setup_logging()
logger = get_logger(__name__)

app = func.FunctionApp()

@app.route(route="hello")
def hello(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
    inject_context(context)  # binds invocation_id, function_name, cold_start

    logger.info("Request received")
    # {"level": "INFO", "invocation_id": "abc-123", "cold_start": true, ...}

    return func.HttpResponse("OK")

Invocation Context

inject_context(context) should be the first line of every handler. It binds:

  • invocation_id — unique per execution, correlates all logs for one request
  • function_name — the Azure Functions function name
  • trace_id — trace context from the platform
  • cold_start — True on first invocation of this worker process

def my_function(req, context):
    inject_context(context)
    logger.info("handler started")
    # every log from here carries invocation_id and cold_start

Without inject_context(), these fields are None in every log line.
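The mechanics behind this pattern can be approximated with the standard library alone. The sketch below is illustrative, not the library's actual implementation: a contextvars variable holds the per-invocation fields, and a logging.Filter copies them onto every record (inject_context_sketch and ContextFilter are hypothetical names).

```python
import contextvars
import logging

# Per-invocation fields; contextvars keeps them isolated between
# concurrently running invocations.
_invocation: contextvars.ContextVar = contextvars.ContextVar(
    "invocation", default={}
)

class ContextFilter(logging.Filter):
    """Copy the currently bound invocation fields onto every record."""

    def filter(self, record: logging.LogRecord) -> bool:
        for key, value in _invocation.get().items():
            setattr(record, key, value)
        return True  # enrich only; never drop the record

def inject_context_sketch(context) -> None:
    # `context` stands in for azure.functions.Context here.
    _invocation.set({
        "invocation_id": getattr(context, "invocation_id", None),
        "function_name": getattr(context, "function_name", None),
    })
```

Because the filter reads the context variable at log time, any log emitted before the bind sees the defaults, which is why the fields are None when inject_context() never ran.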

with_context Decorator

For less boilerplate, use the with_context decorator instead of calling inject_context() manually:

import azure.functions as func
from azure_functions_logging import get_logger, setup_logging, with_context

setup_logging()
logger = get_logger(__name__)

app = func.FunctionApp()

@app.route(route="hello")
@with_context
def hello(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
    logger.info("Request received")
    return func.HttpResponse("OK")

The decorator finds the context parameter by name, calls inject_context() before your handler runs, and resets context variables in finally after it returns.

Custom parameter name:

@with_context(param="ctx")
def hello(req: func.HttpRequest, ctx: func.Context) -> func.HttpResponse:
    ...

Both sync and async handlers are supported.

Structured JSON Output (Production)

Use JSON format when logs feed Application Insights or any aggregation system:

setup_logging(format="json")

Output per log line (NDJSON — one JSON object per line):

{"timestamp": "2024-01-15T10:30:00Z", "level": "INFO", "logger": "my_module",
 "message": "order accepted", "invocation_id": "abc-123", "function_name": "OrderHandler",
 "cold_start": false, "trace_id": "00-abc...", "exception": null,
 "extra": {"order_id": "o-999"}}

Extra fields appear in extra and are indexable in Application Insights:

logger.info("order accepted", order_id="o-999", tenant_id="t-1")
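A formatter producing this shape can be sketched on top of logging.Formatter and json. The field names mirror the documented output, but this is an illustrative sketch, not the library's formatter (which also emits trace_id, exception, and extra).

```python
import json
import logging
from datetime import datetime, timezone

class NdjsonFormatter(logging.Formatter):
    """Emit one JSON object per log line (illustrative sketch)."""

    def format(self, record: logging.LogRecord) -> str:
        payload = {
            "timestamp": datetime.fromtimestamp(
                record.created, tz=timezone.utc
            ).strftime("%Y-%m-%dT%H:%M:%SZ"),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
            # Context fields default to None when inject_context() never ran.
            "invocation_id": getattr(record, "invocation_id", None),
            "cold_start": getattr(record, "cold_start", None),
        }
        return json.dumps(payload)
```

One object per line with no pretty-printing is what makes the stream NDJSON: aggregators can split on newlines and parse each line independently.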

host.json Conflict Detection

If your host.json suppresses log levels that your app emits, you get this warning at startup:

WARNING: host.json logLevel.default is 'Warning'. Logs below WARNING will be suppressed in Azure.

Recommended host.json baseline:

{
  "version": "2.0",
  "logging": {
    "logLevel": {
      "default": "Information",
      "Function": "Information"
    }
  }
}
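The startup check can be approximated by parsing host.json and comparing the configured default against the level your app logs at. The helper below is a hypothetical sketch of that comparison, not the library's detection code.

```python
import json
from pathlib import Path
from typing import Optional

# Azure Functions log levels, least to most severe; anything to the
# right of your app's level would be suppressed by the host.
_LEVELS = ["Trace", "Debug", "Information", "Warning", "Error", "Critical"]

def check_host_json(path: str, app_level: str = "Information") -> Optional[str]:
    """Return a warning string if host.json's logLevel.default would
    suppress logs at `app_level`, else None (illustrative sketch)."""
    config = json.loads(Path(path).read_text())
    default = (
        config.get("logging", {}).get("logLevel", {}).get("default", "Information")
    )
    if _LEVELS.index(default) > _LEVELS.index(app_level):
        return (
            f"host.json logLevel.default is '{default}'. "
            f"Logs below {default.upper()} will be suppressed in Azure."
        )
    return None
```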

Noise Control

Suppress chatty third-party loggers without removing them:

from azure_functions_logging import SamplingFilter, setup_logging
import logging

setup_logging()

# Only log 1 in 10 azure-core messages
logging.getLogger("azure").addFilter(SamplingFilter(rate=0.1))

# Silence urllib3 completely in production
logging.getLogger("urllib3").setLevel(logging.WARNING)
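A rate-based sampler is a small logging.Filter. The sketch below shows one way it could work; the always-pass rule for WARNING and above is a design choice of this sketch, not necessarily how the library's SamplingFilter decides.

```python
import logging
import random
from typing import Optional

class SamplingFilterSketch(logging.Filter):
    """Pass roughly `rate` of sub-WARNING records through (illustrative)."""

    def __init__(self, rate: float, rng: Optional[random.Random] = None):
        super().__init__()
        self.rate = rate
        self.rng = rng or random.Random()

    def filter(self, record: logging.LogRecord) -> bool:
        # WARNING and above always pass; only chatty low-level logs
        # are sampled away.
        if record.levelno >= logging.WARNING:
            return True
        return self.rng.random() < self.rate
```

Attaching the filter to the "azure" logger (rather than the root) keeps the sampling scoped to that dependency while your own loggers stay untouched.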

PII Redaction

Strip sensitive fields before they reach Application Insights:

from azure_functions_logging import RedactionFilter, setup_logging
import logging

setup_logging()
root = logging.getLogger()
root.addFilter(RedactionFilter(patterns=["password", "token", "secret"]))

Any log record where the message or extra fields match a pattern will have those values replaced with [REDACTED].
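The masking step can be sketched as a logging.Filter that rewrites the rendered message in place. This is an assumption-laden sketch (the class name and the key=value / "key": "value" matching rule are this example's, not the library's):

```python
import logging
import re

class RedactionFilterSketch(logging.Filter):
    """Replace values of sensitive keys in the message (illustrative)."""

    def __init__(self, patterns):
        super().__init__()
        # Match `key=value` or `"key": "value"` style occurrences.
        joined = "|".join(re.escape(p) for p in patterns)
        self.regex = re.compile(
            rf'({joined})(["\']?\s*[:=]\s*["\']?)([^\s,"\']+)', re.IGNORECASE
        )

    def filter(self, record: logging.LogRecord) -> bool:
        message = record.getMessage()  # render %-args before redacting
        record.msg = self.regex.sub(r"\1\2[REDACTED]", message)
        record.args = None  # msg is now fully rendered
        return True
```

Rendering with getMessage() first matters: a sensitive value interpolated via %-formatting would otherwise slip past a regex that only saw the template.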

Local vs Cloud

| Environment | Format | Behavior |
| --- | --- | --- |
| Local terminal | color (default) | Colorized [TIME] [LEVEL] [LOGGER] message |
| Azure / Core Tools | json | NDJSON, no ANSI codes, host-managed handlers |
| CI / pipeline | json | NDJSON, machine-parseable |

setup_logging() detects FUNCTIONS_WORKER_RUNTIME and WEBSITE_INSTANCE_ID to choose the right path automatically. In Azure, it installs context filters without adding handlers (avoids duplicate output from the host pipeline).
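The detection logic can be approximated with a few environment checks. This is a hypothetical helper sketching the decision, not the library's function; the CI variable check is an added common convention, not something the text above promises.

```python
import os

def detect_format_sketch() -> str:
    """Return "json" for Azure or CI environments, "color" for a
    local terminal (illustrative sketch of the decision above)."""
    if os.environ.get("WEBSITE_INSTANCE_ID"):  # set on Azure App Service
        return "json"
    if os.environ.get("CI"):  # set by most CI systems
        return "json"
    return "color"
```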

Context Binding

Attach request-scoped metadata to every log without passing it through every call:

def process_order(order_id: str) -> None:
    order_logger = logger.bind(order_id=order_id, region="eastus")
    order_logger.info("processing started")   # includes order_id + region
    order_logger.info("processing complete")  # same metadata, new message

Create bound loggers per-invocation. Do not cache them at module level.
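A bind()-style API can be built on the standard library's logging.LoggerAdapter; the sketch below shows the pattern (the library's bound logger may be implemented differently).

```python
import logging

class BindableLogger(logging.LoggerAdapter):
    """Sketch of a .bind()-style API on top of LoggerAdapter."""

    def bind(self, **fields):
        # Return a new adapter; merging keeps any fields already bound.
        return BindableLogger(self.logger, {**(self.extra or {}), **fields})

    def process(self, msg, kwargs):
        # Copy bound fields into `extra` so they land on the LogRecord.
        kwargs.setdefault("extra", {}).update(self.extra or {})
        return msg, kwargs
```

Because bind() returns a fresh adapter wrapping the same underlying logger, creating one per invocation is cheap, which is why per-invocation binding (rather than module-level caching) is the right pattern.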

When to use

  • You need structured, queryable logs in Application Insights
  • You want invocation_id correlation across all logs for a single request
  • You need cold start detection without custom instrumentation
  • You want PII redaction or noise control for third-party loggers
  • Your host.json config silently suppresses logs and you don't know why


Ecosystem

Part of the Azure Functions Python DX Toolkit:

| Package | Role |
| --- | --- |
| azure-functions-validation | Request and response validation |
| azure-functions-openapi | OpenAPI spec and Swagger UI |
| azure-functions-logging | Structured logging and observability |
| azure-functions-doctor | Pre-deploy diagnostic CLI |
| azure-functions-scaffold | Project scaffolding |
| azure-functions-python-cookbook | Recipes and examples |

Disclaimer

This project is an independent community project and is not affiliated with, endorsed by, or maintained by Microsoft.

Azure and Azure Functions are trademarks of Microsoft Corporation.

License

MIT

Project details


Download files

Download the file for your platform.

Source Distribution

azure_functions_logging-0.4.1.tar.gz (202.5 kB)

Uploaded Source

Built Distribution


azure_functions_logging-0.4.1-py3-none-any.whl (18.1 kB)

Uploaded Python 3

File details

Details for the file azure_functions_logging-0.4.1.tar.gz.

File metadata

  • Download URL: azure_functions_logging-0.4.1.tar.gz
  • Upload date:
  • Size: 202.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for azure_functions_logging-0.4.1.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 4953409e928fe6f6fee92d8310f8498b69d3da4deef64ce0ecca2443d0869aac |
| MD5 | 5a679060c914c554f778923bf1670894 |
| BLAKE2b-256 | 07da9194bb8db4f877f2af1f7349d89aa9de59bf66663540842d107846e816a5 |


Provenance

The following attestation bundles were made for azure_functions_logging-0.4.1.tar.gz:

Publisher: publish-pypi.yml on yeongseon/azure-functions-logging

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file azure_functions_logging-0.4.1-py3-none-any.whl.

File metadata

File hashes

Hashes for azure_functions_logging-0.4.1-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 7144c568bf1b9070126cc92e262a95413d6f5d0906b1aafb05933c2318a76b3a |
| MD5 | 5da20348379f6cde75d66fe206bb4c78 |
| BLAKE2b-256 | 15c0fe7b1dcee1230f9f2ee24c216df39c9a586fcd6d7a37727df1a310793af5 |


Provenance

The following attestation bundles were made for azure_functions_logging-0.4.1-py3-none-any.whl:

Publisher: publish-pypi.yml on yeongseon/azure-functions-logging

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
