Developer-friendly logging helpers for Azure Functions Python
Azure Functions Logging
Part of the Azure Functions Python DX Toolkit — dogfood-tested by azure-functions-cookbook-python.
Read this in: Korean | Japanese | Simplified Chinese
Invocation-aware observability for Azure Functions Python v2.
Surfaces invocation_id, detects cold starts, warns on host.json misconfig, and outputs Application Insights-ready structured logs — without replacing Python's standard logging.
Part of the Azure Functions Python DX Toolkit → Bring FastAPI-like developer experience to Azure Functions
Why this exists
Azure Functions Python logging has specific failure modes that generic logging libraries don't address:
| Problem | What happens | This library |
|---|---|---|
| `host.json` log level conflict | Your INFO logs silently disappear in Azure | Detects and warns at startup |
| No `invocation_id` in logs | Impossible to correlate logs to a specific execution | Auto-injects from the context object |
| Cold start invisible | No signal when a new worker instance starts | Detects automatically on first `inject_context()` |
| Noisy third-party loggers | azure-core, urllib3 flood your Application Insights | `SamplingFilter` / `RedactionFilter` |
| Local vs cloud output mismatch | Colorized output breaks in production pipelines | Environment-aware formatter switching |
| PII leaking into logs | Sensitive values accidentally logged as extra fields | `RedactionFilter` with key-based redaction |
What it does
- Invocation context — auto-injects `invocation_id`, `function_name`, `cold_start` into every log
- Structured JSON output — Application Insights-ready NDJSON format for production
- Noise control — `SamplingFilter` rate-limits chatty third-party loggers
- PII protection — `RedactionFilter` masks sensitive fields before they reach log aggregation
Scope disclaimer. This package writes structured JSON to Python `logging` / stdout. How those fields appear in Application Insights depends on the Azure Functions host, worker, logging configuration, and ingestion pipeline. The library does not own ingestion or schema mapping — both `customDimensions`-parsed and raw-`message` shapes are valid in production.
Before / After
Without azure-functions-logging — plain print() output, no context, no structure:
import azure.functions as func
app = func.FunctionApp()
@app.route(route="orders")
def process_order(req: func.HttpRequest) -> func.HttpResponse:
print("Processing order") # no invocation_id, no structure
print(f"Order: {req.get_json()}") # PII may leak, no log level
return func.HttpResponse("OK")
Terminal output:
Processing order
Order: {'customer': 'Alice', 'total': 99.99}
No invocation ID. No log level. Hard to correlate in Application Insights.
With azure-functions-logging — structured, queryable, production-ready:
import azure.functions as func
from azure_functions_logging import JsonFormatter, get_logger, logging_context, setup_logging
setup_logging(functions_formatter=JsonFormatter())
logger = get_logger(__name__)
app = func.FunctionApp()
@app.route(route="orders")
def process_order(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
with logging_context(context):
logger.info("Processing order", order_id="o-999")
return func.HttpResponse("OK")
Local terminal output when run standalone (e.g. python app.py, color formatter):
10:30:00 INFO function_app Processing order [invocation_id=abc-123-def, function_name=process_order, cold_start=true]
Production output under func start / Azure (Application Insights NDJSON, applied because functions_formatter is set):
{"timestamp": "2024-01-15T10:30:00+00:00", "level": "INFO", "logger": "function_app",
"message": "Processing order", "invocation_id": "abc-123-def",
"function_name": "process_order", "trace_id": null, "cold_start": true,
"exception": null, "extra": {"order_id": "o-999"}}
Every log carries `invocation_id` and `cold_start`. Queryable in Application Insights. Zero `print()` statements.
Note: The exact Application Insights schema depends on your ingestion pipeline. In some deployments JSON fields are parsed into `customDimensions`; in others the JSON stays inside the `message` column. Examples for both shapes are below.
Query in Application Insights
When JSON fields are parsed into customDimensions
traces
| where customDimensions.invocation_id == "abc-123-def"
| project timestamp, message, customDimensions.cold_start, customDimensions.function_name
| order by timestamp asc
Find all cold starts in the last hour:
traces
| where customDimensions.cold_start == "true"
| where timestamp > ago(1h)
| summarize count() by bin(timestamp, 5m)
When JSON remains in the message column
traces
| extend payload = parse_json(message)
| where tostring(payload.invocation_id) == "abc-123-def"
| project timestamp, tostring(payload.message), tostring(payload.cold_start), tostring(payload.function_name)
| order by timestamp asc
Find all cold starts in the last hour:
traces
| extend payload = parse_json(message)
| where tostring(payload.cold_start) == "true"
| where timestamp > ago(1h)
| summarize count() by bin(timestamp, 5m)
What this package does not do
This package does not own:
- Replacing stdlib logging — it wraps and enriches Python's standard `logging`, never replaces it
- Distributed tracing — use OpenTelemetry or the Application Insights SDK for end-to-end trace correlation
- API documentation — use `azure-functions-openapi` for API documentation and spec generation
Installation
pip install azure-functions-logging
Quick Start
import azure.functions as func
from azure_functions_logging import get_logger, logging_context, setup_logging
setup_logging()
logger = get_logger(__name__)
app = func.FunctionApp()
@app.route(route="hello")
def hello(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
with logging_context(context): # binds invocation_id, function_name, cold_start; restores previous context on exit
logger.info("Request received")
# log record now carries invocation_id, function_name, cold_start
return func.HttpResponse("OK")
logging_context is the recommended primary pattern: it injects context on enter and always restores the previous context on exit (even when the handler raises), which prevents stale context from leaking into the next invocation on a reused worker.
For lower-level control or when integrating with custom middleware, use token-based restore:
from azure_functions_logging import inject_context, restore_context
# Assumes `logger` and `context` are in scope (see Quick Start).
tokens = inject_context(context)
try:
logger.info("Request received")
finally:
restore_context(tokens)
Use reset_context() only when you intentionally want to clear all context (e.g. test teardown).
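A minimal teardown sketch, assuming pytest (the fixture name is illustrative; `reset_context()` is the only library call used):
import pytest
from azure_functions_logging import reset_context

@pytest.fixture(autouse=True)
def _clean_logging_context():
    yield
    reset_context()  # clear any invocation context left over from the test body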
Start the Functions host locally (using the e2e example app):
func start --script-root examples/e2e_app
Verify locally and on Azure
After deploying (see docs/deployment.md), the same request produces the same response in both environments.
Local
curl -s "http://localhost:7071/api/logme?correlation_id=demo-123"
{"logged": true, "correlation_id": "demo-123"}
Azure
curl -s "https://<your-app>.azurewebsites.net/api/logme?correlation_id=demo-123"
{"logged": true, "correlation_id": "demo-123"}
Verified against a temporary Azure Functions deployment in koreacentral (Python 3.12, Consumption plan). Response captured and URL anonymized.
Invocation Context
Use logging_context() to bind invocation context for the duration of a handler. It sets:
- `invocation_id` — unique per execution; correlates all logs for one request
- `function_name` — the Azure Functions function name
- `trace_id` — trace context from the platform; extracted only from valid W3C `traceparent` headers (strict validation, invalid values are ignored)
- `cold_start` — `True` on the first invocation of this worker process
`cold_start` semantics. `cold_start=True` means the first invocation observed by this Python worker process after module load. It is not a platform-level cold start metric and does not correspond to App Service plan / instance allocation cold starts reported by Azure Functions metrics. Subsequent invocations on the same worker emit `cold_start=False` until the worker is recycled.
def my_function(req, context):
with logging_context(context):
logger.info("handler started")
# every log from here carries invocation_id and cold_start
For lower-level control (e.g. middleware), use inject_context() with restore_context():
tokens = inject_context(context)
try:
logger.info("handler started")
finally:
restore_context(tokens)
Without context injection, these fields are None in every log line.
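For illustration, a log emitted outside any injected context serializes with null context fields (a sketch of the JSON shape shown earlier):
logger.info("emitted outside any invocation")
# {"timestamp": "...", "level": "INFO", "message": "emitted outside any invocation",
#  "invocation_id": null, "function_name": null, "trace_id": null, "cold_start": null, ...}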
with_context Decorator
For less boilerplate, use the with_context decorator instead of calling inject_context() manually:
import azure.functions as func
from azure_functions_logging import get_logger, setup_logging, with_context
setup_logging()
logger = get_logger(__name__)
app = func.FunctionApp()
@app.route(route="hello")
@with_context
def hello(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
logger.info("Request received")
return func.HttpResponse("OK")
The decorator finds the `context` parameter by name, calls `inject_context()` before your handler runs, and restores the previous context in a `finally` block after the handler returns (or raises).
Custom parameter name:
@with_context(param="ctx")
def hello(req: func.HttpRequest, ctx: func.Context) -> func.HttpResponse:
...
Both sync and async handlers are supported.
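An async variant looks the same (a sketch reusing `app` and `logger` from the example above; the route name is illustrative):
@app.route(route="hello_async")
@with_context
async def hello_async(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
    logger.info("Request received")  # carries invocation_id and cold_start via the decorator
    return func.HttpResponse("OK")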
Global LogRecordFactory (opt-in)
For applications where handlers may be added after setup_logging(), or where you want
invocation context on every LogRecord regardless of handler/filter configuration,
install the global context factory once at startup:
from azure_functions_logging import install_context_factory, setup_logging
install_context_factory() # injects context at record creation time
setup_logging()
When enabled, invocation_id, function_name, trace_id, and cold_start become
reserved LogRecord attributes. Passing them via stdlib extra= will raise KeyError.
Use FunctionLogger (which sanitizes keys automatically) or choose different key names.
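A sketch of that reserved-attribute behavior (the logger and key names are illustrative):
import logging
from azure_functions_logging import install_context_factory, setup_logging

install_context_factory()
setup_logging()

std_logger = logging.getLogger("my_module")
# With the factory installed, invocation_id is a reserved LogRecord attribute,
# so stdlib extra= cannot overwrite it:
#   std_logger.info("order accepted", extra={"invocation_id": "abc"})  # raises KeyError
# Choose a different key name instead:
std_logger.info("order accepted", extra={"source_invocation_id": "abc"})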
Relationship with `setup_logging()`: `setup_logging()` still installs `ContextFilter` on handlers by default. You can call both — they set the same values, so there is no conflict. `install_context_factory()` ensures coverage even on handlers added later or loggers that bypass the filter chain.
Structured JSON Output (Production)
Use JSON format when logs feed Application Insights or any aggregation system:
Note: The `format` parameter only affects handlers created by this library (local development). In Azure Functions, the host manages handlers. Use `functions_formatter=JsonFormatter()` to set JSON output on host-managed handlers. Passing `format="json"` in Azure emits a warning.
For standalone local development or CI output:
setup_logging(format="json")
For Azure Functions / Core Tools, the host owns the handlers. To force JSON formatting on existing host-managed handlers:
from azure_functions_logging import JsonFormatter, setup_logging
setup_logging(functions_formatter=JsonFormatter())
Output per log line (NDJSON — one JSON object per line):
{"timestamp": "2024-01-15T10:30:00+00:00", "level": "INFO", "logger": "my_module",
"message": "order accepted", "invocation_id": "abc-123", "function_name": "OrderHandler",
"cold_start": false, "trace_id": "4bf92f3577b34da6a3ce929d0e0e4736", "exception": null,
"extra": {"order_id": "o-999"}}
Pass extra fields as keyword arguments:
logger.info("order accepted", order_id="o-999", tenant_id="t-1")
They appear under `extra` in the emitted JSON. Whether they are directly indexable in Application Insights depends on your ingestion pipeline: when the JSON is parsed into `customDimensions` they are queryable directly; when the JSON stays in the `message` column you need `parse_json(message)` first.
host.json Conflict Detection
If your host.json suppresses log levels that your app emits, you get this warning at startup:
host.json logLevel for default is set to 'Warning' which is more restrictive than the configured level 'INFO'. Logs below 'Warning' will be suppressed by the Azure Functions host.
Recommended host.json baseline:
{
"version": "2.0",
"logging": {
"logLevel": {
"default": "Information",
"Function": "Information"
}
}
}
Discovery order
host.json is located by walking up from the current working directory:
1. `cwd/host.json`
2. Each parent directory, up to 5 levels deep.
The first existing file wins. To bypass auto-discovery (e.g. in tests or non-standard layouts), pass an explicit path:
from pathlib import Path
from azure_functions_logging import setup_logging
setup_logging(host_json_path=Path("/site/wwwroot/host.json"))
Noise Control
Suppress chatty third-party loggers without removing them:
from azure_functions_logging import SamplingFilter, setup_logging
import logging
setup_logging()
# Sample noisy azure.* loggers: keep up to 10 records per 1-second window.
# Filters attached to a logger don't run for records propagated from
# descendants, so attach to the root handlers and scope by logger name.
for handler in logging.getLogger().handlers:
handler.addFilter(SamplingFilter(rate=10, name="azure"))
# Silence urllib3 completely in production
logging.getLogger("urllib3").setLevel(logging.WARNING)
PII Redaction
Strip sensitive fields before they reach Application Insights:
from azure_functions_logging import RedactionFilter, setup_logging
import logging
setup_logging()
root = logging.getLogger()
# Attach the filter to handlers so records from named child loggers are also redacted.
for handler in root.handlers:
handler.addFilter(RedactionFilter(sensitive_keys=["password", "token", "secret"]))
Any log record with extra fields whose keys match a sensitive key will have those values replaced with ***.
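For illustration, with the filter above attached, a call like this (field names are illustrative) is masked before emission:
logger.info("user signed in", username="alice", password="hunter2")
# emitted extra (sketch): {"username": "alice", "password": "***"}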
Local vs Cloud
| Environment | Format | Behavior |
|---|---|---|
| Local terminal | color (default) | Colorized human-readable: `HH:MM:SS LEVEL logger message [context...]` |
| Azure / Core Tools | host-managed | Installs context filters only; pass `functions_formatter=JsonFormatter()` to force NDJSON on host handlers |
| CI / pipeline | json | NDJSON, machine-parseable |
setup_logging() detects FUNCTIONS_WORKER_RUNTIME to distinguish Azure Functions / Core Tools from standalone local execution. In Azure mode it installs context filters without adding handlers (avoids duplicate output from the host pipeline).
Context Binding
Attach request-scoped metadata to every log without passing it through every call:
def process_order(order_id: str) -> None:
order_logger = logger.bind(order_id=order_id, region="eastus")
order_logger.info("processing started") # includes order_id + region
order_logger.info("processing complete") # same metadata, new message
Create bound loggers per-invocation. Do not cache them at module level.
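A per-invocation sketch combining `logging_context` and `bind`, assuming the Quick Start setup (`app`, `logger`) is in scope; the route and field names are illustrative:
@app.route(route="orders")
def handle_order(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
    with logging_context(context):
        # Bind request-scoped fields once, inside the invocation scope
        order_logger = logger.bind(order_id=req.params.get("order_id"), region="eastus")
        order_logger.info("processing started")
        return func.HttpResponse("OK")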
When to use
- You need structured, queryable logs in Application Insights
- You want `invocation_id` correlation across all logs for a single request
- You need cold start detection without custom instrumentation
- You want PII redaction or noise control for third-party loggers
- Your `host.json` config silently suppresses logs and you don't know why
Documentation
- Full docs: yeongseon.github.io/azure-functions-logging-python
- Configuration reference
- Troubleshooting guide
- API reference
Ecosystem
This package is part of the Azure Functions Python DX Toolkit.
Design principle: azure-functions-logging owns structured logging and invocation-aware observability. It enriches Python's standard logging — it does not replace it. Adjacent concerns belong to azure-functions-openapi (API documentation and spec generation), azure-functions-validation (request/response validation and serialization), and azure-functions-langgraph (LangGraph runtime exposure).
| Package | Role |
|---|---|
| azure-functions-openapi-python | OpenAPI spec generation and Swagger UI |
| azure-functions-validation-python | Request/response validation and serialization |
| azure-functions-db-python | Database bindings for SQL, PostgreSQL, MySQL, SQLite, and Cosmos DB |
| azure-functions-langgraph-python | LangGraph deployment adapter for Azure Functions |
| azure-functions-scaffold-python | Project scaffolding CLI |
| azure-functions-logging-python | Structured logging and observability |
| azure-functions-doctor-python | Pre-deploy diagnostic CLI |
| azure-functions-durable-graph-python | Manifest-first graph runtime with Durable Functions (experimental) |
| azure-functions-knowledge-python | Knowledge retrieval (RAG) decorators |
| azure-functions-cookbook-python | Dogfood examples — runnable recipes that exercise the full toolkit |
For AI Coding Assistants
This package provides structured logging for Azure Functions with zero modifications to stdlib logging.
LLM-Friendly Resources:
- `llms.txt` — Concise API reference and quick start (repo root)
- `llms-full.txt` — Complete API signatures, patterns, and design principles (repo root)
Key Implementation Details for Code Generation:
- Preserves host configuration — In Azure / Core Tools, no handlers are added and the root logger level is left to `host.json`; `ContextFilter` is installed on existing root handlers and on the root logger itself (so direct calls on the root logger carry context). For records that propagate from named child loggers to handlers attached later (e.g. by the host or third-party libraries), call `install_context_factory()` to guarantee context coverage. In standalone local mode, `setup_logging(logger_name=None)` configures the root logger (sets level, adds a `StreamHandler` if none exist).
- Context injection is contextvar-based — Not thread-local, works with asyncio
- Idempotent setup — Calling `setup_logging()` multiple times is safe
- Two environments, two behaviors:
  - Azure/Core Tools: install `ContextFilter` on existing root handlers and on the root logger itself; do not add handlers or change the root level (respects `host.json`).
  - Standalone local: set the target/root logger level; add a `StreamHandler` (ColorFormatter or JsonFormatter) only if no handlers exist, otherwise just attach filters to existing handlers.
- Test-friendly (see the sketch after this list):
  - `inject_context()` accepts any object (no hard dependency on `azure.functions.Context`)
  - `with_context` decorator works with sync and async handlers
  - Use `reset_context()` in test teardown if needed
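A test sketch based on the duck-typing note above — `SimpleNamespace` stands in for `azure.functions.Context`, and the attribute names are assumed to mirror the real context object:
from types import SimpleNamespace
from azure_functions_logging import get_logger, inject_context, restore_context, setup_logging

setup_logging()
logger = get_logger(__name__)

def test_logs_carry_injected_context():
    # Any object with the right attributes works; no azure.functions.Context required.
    fake_context = SimpleNamespace(invocation_id="test-123", function_name="my_function")
    tokens = inject_context(fake_context)
    try:
        logger.info("from a unit test")  # record carries invocation_id=test-123
    finally:
        restore_context(tokens)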
When generating code:
- Import only from the `azure_functions_logging` public API (no underscores)
- Call `setup_logging()` at module level or handler startup (not per-request)
- Prefer `with logging_context(context):` in handlers; use raw `inject_context(context)` only with `try/finally restore_context(tokens)`
- Use `logger.bind(key=value)` for per-request fields (not direct logger.extra)
- Use the `with_context` decorator if you prefer to inject context implicitly per-handler
- Call `get_logging_metadata(func)` to inspect `@with_context` metadata on a function (returns `dict[str, Any] | None`) — see the sketch after this list
- Apply `RedactionFilter` for PII fields, `SamplingFilter` for high-volume logs
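A minimal sketch of the `get_logging_metadata` inspection mentioned in the list (the handler is illustrative):
from azure_functions_logging import get_logging_metadata, with_context

@with_context
def handler(req, context):
    ...

meta = get_logging_metadata(handler)  # dict[str, Any] | None, per the API notes above
print(meta)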
Example Pattern:
import azure.functions as func
from azure_functions_logging import get_logger, logging_context, setup_logging
# Module level
setup_logging()
logger = get_logger(__name__)
# Per handler
def my_function(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
with logging_context(context):
req_logger = logger.bind(correlation_id=req.params.get("id"))
req_logger.info("Processing")
return func.HttpResponse("OK")
This project is an independent community project and is not affiliated with, endorsed by, or maintained by Microsoft.
Azure and Azure Functions are trademarks of Microsoft Corporation.
License
MIT