
fastapi-pg-logger

Drop-in file + Postgres request and DB-call logging for FastAPI.

One setup_logging() call wires up:

  • HTTP request/response logging: method, path, status, duration, headers, bodies (configurable truncation & sampling)
  • File logging: daily-rotated JSON-line log files
  • Postgres logging: partitioned tables with automatic monthly partitions
  • DB-call logging: correlate individual SQL executions back to the originating HTTP request via request_id
  • Log viewer UI: optional mountable router with a dark-themed AG Grid dashboard

Installation

pip install fastapi-pg-logger

Or install from source:

pip install git+https://github.com/bgeo-gis/fastapi-pg-logger.git

Dependencies

  • fastapi >= 0.100
  • psycopg[binary,pool] >= 3.1
  • Python >= 3.10

Quickstart

from contextlib import asynccontextmanager
from fastapi import FastAPI
from fastapi_pg_logger import setup_logging, LogConfig, create_log_router

config = LogConfig(service_name="my-api")

@asynccontextmanager
async def lifespan(app: FastAPI):
    store = await setup_logging(app, config, db_manager=my_db_manager)
    # Optionally mount the log viewer
    if store:
        app.include_router(create_log_router(store), prefix="/logs")
    yield

app = FastAPI(lifespan=lifespan)

That's it. Every request is now logged to file and Postgres.

db_manager contract

The db_manager argument must expose a get_db() method that is usable as an async context manager and yields a psycopg async connection:

from contextlib import asynccontextmanager

class MyDatabaseManager:
    @asynccontextmanager
    async def get_db(self):
        # Async context manager yielding a psycopg async connection
        async with self.pool.connection() as conn:
            yield conn

If you don't have Postgres (or don't want DB logging), pass db_manager=None or set db_enabled=False in LogConfig. File logging still works.
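For reference, here is a minimal sketch of a conforming db_manager. The class and pool names are illustrative; in practice the pool would typically be a psycopg_pool.AsyncConnectionPool, whose connection() method is itself an async context manager:

```python
from contextlib import asynccontextmanager

class PoolDatabaseManager:
    """Hypothetical db_manager: wraps any pool whose .connection()
    is an async context manager (e.g. psycopg_pool.AsyncConnectionPool)."""

    def __init__(self, pool):
        self.pool = pool

    @asynccontextmanager
    async def get_db(self):
        # Check a connection out of the pool for the duration of the block.
        async with self.pool.connection() as conn:
            yield conn
```

Because get_db() only delegates to the pool, connection lifetime (checkout, return, error handling) stays the pool's responsibility.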


Configuration

All configuration is passed explicitly via LogConfig.

from fastapi_pg_logger import LogConfig

config = LogConfig(
    # Service identity (used in file log naming)
    service_name="my-api",

    # File logging
    log_dir="logs",                # root directory
    log_level="INFO",              # DEBUG, INFO, WARNING, ERROR, CRITICAL
    log_rotate_days=14,            # how many days of backup logs to keep
    log_format="[%(asctime)s] %(levelname)s:%(name)s:%(message)s",
    log_date_format="%d/%m/%y %H:%M:%S",

    # Request body / header capture
    skip_body_prefixes=("/logs", "/health", "/docs", "/openapi.json"),
    header_allowlist=None,         # None = built-in default set
    max_body_bytes=0,              # 0 = no truncation
    request_id_header="X-Request-ID",

    # Postgres logging
    db_enabled=True,
    db_sample_rate=1.0,            # 0.0–1.0, fraction of requests logged to DB
    db_schema="log",
    api_logs_table="api_logs",
    db_logs_table="api_db_logs",
)

Default header allowlist

When header_allowlist is None, these headers are captured:

accept, accept-encoding, accept-language, cache-control, content-length, content-type, etag, user-agent, x-device, x-lang, x-forwarded-for, x-real-ip, x-request-id
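Conceptually, an allowlist filter is a case-insensitive membership test over header names. This sketch is not the package's code, and the small allowlist here is illustrative:

```python
# Hypothetical subset of the default allowlist, for illustration.
ALLOWLIST = {"accept", "content-type", "user-agent", "x-request-id"}

def filter_headers(headers: dict) -> dict:
    """Keep only headers whose (lowercased) name is in the allowlist."""
    return {k: v for k, v in headers.items() if k.lower() in ALLOWLIST}
```

Filtering by name rather than redacting values means sensitive headers such as Authorization or Cookie never reach the log store at all.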


DB-call logging

If your API executes SQL queries or stored procedures, you can log each call and correlate it with the parent HTTP request:

import json
import time
from fastapi_pg_logger import log_db_call

async def execute_procedure(store, db_manager, schema, function_name, sql):
    start = time.monotonic()
    result = await run_sql(db_manager, sql)
    duration_ms = int((time.monotonic() - start) * 1000)

    log_db_call(
        store,
        # request_id auto-read from context if omitted
        schema_name=schema,
        function_name=function_name,
        sql_text=sql,
        response_json=json.dumps(result),
        duration_ms=duration_ms,
        status="ok" if result else "error",
    )
    return result

log_db_call is fire-and-forget: it schedules a background task and never raises.

Reading the request ID manually

from fastapi_pg_logger import get_request_id

rid = get_request_id()  # uuid.UUID | None

Log viewer

Mount the optional log viewer router to get a browser-based dashboard:

from fastapi_pg_logger import create_log_router

router = create_log_router(store, auth_dependency=my_auth_dep)
app.include_router(router, prefix="/logs")

This adds:

Endpoint       Description
GET /logs      Paginated, filterable request logs (JSON)
GET /logs/db   DB-call logs for a given request_id
GET /logs/ui   Dark-themed HTML log viewer

The auth_dependency parameter accepts any FastAPI dependency. Pass None for no authentication.


Postgres schema

The package auto-creates (idempotent) the following structure:

{db_schema}.{api_logs_table} (partitioned by month)

Column Type
ts timestamptz
id bigserial
method text
endpoint text
status integer
duration_ms integer
user_name text
request_id uuid
client_ip inet
query_params jsonb
body_size integer
response_size integer
request_headers jsonb
request_body text
response_headers jsonb
response_body text

{db_schema}.{db_logs_table} (partitioned by month)

Column Type
ts timestamptz
id bigserial
request_id uuid
schema_name text
function_name text
sql_text text
response_json text
duration_ms integer
status text
error text

Partitions are created automatically for the current month on startup and on each insert.
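The shared request_id column is what makes the two tables joinable. As an illustration (assuming the default schema and table names from LogConfig; not shipped with the package), a query to find the slowest requests this month together with their DB calls might look like:

```sql
-- Hypothetical analysis query against the default log.api_logs /
-- log.api_db_logs tables: 20 slowest requests this month, with the
-- DB calls they triggered.
SELECT a.request_id,
       a.method,
       a.endpoint,
       a.duration_ms,
       d.function_name,
       d.duration_ms AS db_duration_ms
FROM log.api_logs a
LEFT JOIN log.api_db_logs d USING (request_id)
WHERE a.ts >= date_trunc('month', now())
ORDER BY a.duration_ms DESC
LIMIT 20;
```

Restricting on ts keeps the scan inside the current monthly partition.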


Production notes

  • All DB writes are wrapped in try/except; logging failures never crash the app.
  • Partition creation handles concurrent races by catching DuplicateTable.
  • File logging works independently of Postgres availability.
  • DB inserts run in asyncio.create_task, so they do not block the response path.
  • Body truncation via max_body_bytes prevents memory spikes on large payloads.
  • db_sample_rate lets you log only a fraction of requests on high-traffic APIs.
  • The root logger is never touched; only the package's own named logger is used.
  • app.state keys are prefixed with _fpgl_ to avoid collisions.
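Sampling of this kind usually reduces to a per-request random draw. A conceptual sketch (not the package's code) of a db_sample_rate-style decision:

```python
import random

def should_log_to_db(sample_rate: float) -> bool:
    """Return True for roughly sample_rate of calls.

    random.random() is uniform over [0.0, 1.0), so a rate of 1.0
    logs every request and 0.0 logs none.
    """
    return random.random() < sample_rate
```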

License

This project is licensed under the GNU General Public License v3.0.
