# fastapi-pg-logger

Drop-in file + Postgres request and DB-call logging for FastAPI.
One `setup_logging()` call wires up:

- **HTTP request/response logging**: method, path, status, duration, headers, bodies (configurable truncation & sampling)
- **File logging**: daily-rotated JSON-line log files
- **Postgres logging**: partitioned tables with automatic monthly partitions
- **DB-call logging**: correlate individual SQL executions back to the originating HTTP request via `request_id`
- **Log viewer UI**: optional mountable router with a dark-themed AG Grid dashboard
## Installation

```bash
pip install fastapi-pg-logger
```

Or install from source:

```bash
pip install git+https://github.com/bgeo-gis/fastapi-pg-logger.git
```
## Dependencies

- fastapi >= 0.100
- psycopg[binary,pool] >= 3.1
- Python >= 3.10
## Quickstart

```python
from contextlib import asynccontextmanager

from fastapi import FastAPI

from fastapi_pg_logger import setup_logging, LogConfig, create_log_router

config = LogConfig(service_name="my-api")

@asynccontextmanager
async def lifespan(app: FastAPI):
    store = await setup_logging(app, config, dsn="postgresql://user:pass@localhost/mydb")
    # Optionally mount the log viewer
    if store:
        app.include_router(create_log_router(store), prefix="/logs")
    yield
    # Close the internally-managed connection pool
    if store:
        await store.close()

app = FastAPI(lifespan=lifespan)
```

That's it. Every request is now logged to file and Postgres.
If you don't have Postgres (or don't want DB logging), omit the `dsn` parameter or set `db_enabled=False` in `LogConfig`. File logging still works.
## Advanced: custom db_manager

If you need to supply your own connection pool or manager (e.g. a shared pool or custom wrappers), use `setup_logging_advanced`:

```python
from fastapi_pg_logger import setup_logging_advanced, LogConfig

store = await setup_logging_advanced(app, config, db_manager=my_db_manager)
```

The `db_manager` must expose an async context manager called `get_db()` that yields a psycopg async connection:

```python
from contextlib import asynccontextmanager

class MyDatabaseManager:
    @asynccontextmanager
    async def get_db(self):
        async with self.pool.connection() as conn:
            yield conn
```
> **Deprecation notice:** Passing `db_manager` directly to `setup_logging()` is deprecated and will be removed in 1.0.0. Use the `dsn` parameter for simple setups, or migrate to `setup_logging_advanced()` for custom connection management.
## Configuration

All configuration is passed explicitly via `LogConfig`:

```python
from fastapi_pg_logger import LogConfig

config = LogConfig(
    # Service identity (used in file log naming)
    service_name="my-api",

    # File logging
    log_dir="logs",        # root directory
    log_level="INFO",      # DEBUG, INFO, WARNING, ERROR, CRITICAL
    log_rotate_days=14,    # how many days of backup logs to keep
    log_format="[%(asctime)s] %(levelname)s:%(name)s:%(message)s",
    log_date_format="%d/%m/%y %H:%M:%S",

    # Request body / header capture
    skip_body_prefixes=("/logs", "/health", "/docs", "/openapi.json"),
    header_allowlist=None,     # None = built-in default set
    max_body_bytes=0,          # 0 = no truncation
    request_id_header="X-Request-ID",

    # Postgres logging
    db_enabled=True,
    db_sample_rate=1.0,        # 0.0–1.0, fraction of requests logged to DB
    db_schema="log",
    api_logs_table="api_logs",
    db_logs_table="api_db_logs",
)
```
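The `db_sample_rate` and `max_body_bytes` options correspond to two common logging patterns: probabilistic sampling and payload truncation. A minimal, illustrative sketch of how such decisions are typically made (this is not the package's actual implementation; the function names here are hypothetical):

```python
import random

def should_log_to_db(sample_rate: float) -> bool:
    # Log every request at 1.0, none at 0.0, a random fraction in between.
    if sample_rate >= 1.0:
        return True
    if sample_rate <= 0.0:
        return False
    return random.random() < sample_rate

def truncate_body(body: bytes, max_body_bytes: int) -> bytes:
    # 0 mirrors the LogConfig default above: no truncation at all.
    if max_body_bytes <= 0:
        return body
    return body[:max_body_bytes]
```

Per-request sampling like this keeps the decision cheap and stateless, which matters when it runs on every request.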
### Default header allowlist

When `header_allowlist` is `None`, these headers are captured:

`accept`, `accept-encoding`, `accept-language`, `cache-control`, `content-length`, `content-type`, `etag`, `user-agent`, `x-device`, `x-lang`, `x-forwarded-for`, `x-real-ip`, `x-request-id`
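Allowlist filtering of this kind is usually a case-insensitive set lookup. A self-contained sketch (illustrative only; the function name and fallback behavior are assumptions modeled on the description above):

```python
# The built-in default set, as listed above.
DEFAULT_ALLOWLIST = frozenset({
    "accept", "accept-encoding", "accept-language", "cache-control",
    "content-length", "content-type", "etag", "user-agent", "x-device",
    "x-lang", "x-forwarded-for", "x-real-ip", "x-request-id",
})

def filter_headers(headers: dict[str, str], allowlist=None) -> dict[str, str]:
    # None falls back to the default set, matching header_allowlist=None.
    allowed = DEFAULT_ALLOWLIST if allowlist is None else {h.lower() for h in allowlist}
    return {k: v for k, v in headers.items() if k.lower() in allowed}
```

Filtering on an allowlist (rather than a denylist) is the safer default: headers like `Authorization` or `Cookie` never reach the logs unless you opt them in.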
## DB-call logging

If your API executes SQL queries or stored procedures, you can log each call and correlate it with the parent HTTP request:

```python
import json
import time

from fastapi_pg_logger import log_db_call

async def execute_procedure(store, schema, function_name, sql):
    start = time.monotonic()
    result = await run_sql(sql)
    duration_ms = int((time.monotonic() - start) * 1000)
    log_db_call(
        store,
        # request_id is auto-read from context if omitted
        schema_name=schema,
        function_name=function_name,
        sql_text=sql,
        response_json=json.dumps(result),
        duration_ms=duration_ms,
        status="ok" if result else "error",
    )
    return result
```
`log_db_call` is fire-and-forget: it schedules a background task and never raises.
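The fire-and-forget pattern described here has a standard shape: schedule the write as a task on the running loop and swallow any exception so logging can never take the request down with it. A generic sketch of that pattern (not the package's internals; `_insert_log_row` is a stand-in for the real DB write):

```python
import asyncio

async def _insert_log_row(row: dict) -> None:
    # Stand-in for the real DB insert; may fail for any reason.
    await asyncio.sleep(0)
    if row.get("boom"):
        raise RuntimeError("insert failed")

def log_fire_and_forget(row: dict) -> None:
    async def _safe():
        try:
            await _insert_log_row(row)
        except Exception:
            pass  # a logging failure must never propagate to the caller

    # Requires a running event loop, which a FastAPI handler always has.
    asyncio.get_running_loop().create_task(_safe())
```

Because the task is wrapped before scheduling, even a failing insert leaves the caller untouched.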
### Reading the request ID manually

```python
from fastapi_pg_logger import get_request_id

rid = get_request_id()  # uuid.UUID | None
```
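Request-scoped IDs like this are conventionally carried in a `contextvars.ContextVar`, which follows each request through `await` points without threading it through every call. A minimal, illustrative sketch of that mechanism (names here are hypothetical, not the package's internals):

```python
import contextvars
import uuid

# One context variable per process; each request gets its own value.
_request_id: contextvars.ContextVar = contextvars.ContextVar(
    "request_id", default=None
)

def bind_request_id() -> uuid.UUID:
    # Typically called by middleware at the start of a request.
    rid = uuid.uuid4()
    _request_id.set(rid)
    return rid

def current_request_id():
    # Returns None outside a request, mirroring get_request_id() above.
    return _request_id.get()
```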
## Log viewer

Mount the optional log viewer router to get a browser-based dashboard:

```python
from fastapi_pg_logger import create_log_router

router = create_log_router(store, auth_dependency=my_auth_dep)
app.include_router(router, prefix="/logs")
```
This adds:

| Endpoint | Description |
|---|---|
| `GET /logs` | Paginated, filterable request logs (JSON) |
| `GET /logs/db` | DB-call logs for a given `request_id` |
| `GET /logs/ui` | Dark-themed HTML log viewer |
The `auth_dependency` parameter accepts any FastAPI dependency. Pass `None` for no authentication.
## Postgres schema

The package auto-creates (idempotently) the following structure:

### `{db_schema}.{api_logs_table}` (partitioned by month)

| Column | Type |
|---|---|
| `ts` | `timestamptz` |
| `id` | `bigserial` |
| `method` | `text` |
| `endpoint` | `text` |
| `status` | `integer` |
| `duration_ms` | `integer` |
| `user_name` | `text` |
| `request_id` | `uuid` |
| `client_ip` | `inet` |
| `query_params` | `jsonb` |
| `body_size` | `integer` |
| `response_size` | `integer` |
| `request_headers` | `jsonb` |
| `request_body` | `text` |
| `response_headers` | `jsonb` |
| `response_body` | `text` |
### `{db_schema}.{db_logs_table}` (partitioned by month)

| Column | Type |
|---|---|
| `ts` | `timestamptz` |
| `id` | `bigserial` |
| `request_id` | `uuid` |
| `schema_name` | `text` |
| `function_name` | `text` |
| `sql_text` | `text` |
| `response_json` | `text` |
| `duration_ms` | `integer` |
| `status` | `text` |
| `error` | `text` |
Partitions are created automatically for the current month on startup and on each insert.
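Automatic monthly partitioning of this kind typically boils down to generating one `CREATE TABLE ... PARTITION OF` statement per month. A self-contained sketch of how such DDL can be derived from a date (illustrative only; the function name and exact statement shape are assumptions, not the package's actual SQL):

```python
from datetime import date

def monthly_partition_ddl(schema: str, table: str, day: date) -> str:
    # Compute the [first-of-month, first-of-next-month) range containing `day`.
    start = day.replace(day=1)
    if start.month == 12:
        end = start.replace(year=start.year + 1, month=1)
    else:
        end = start.replace(month=start.month + 1)
    partition = f"{table}_{start:%Y_%m}"
    # IF NOT EXISTS keeps the statement idempotent across restarts.
    return (
        f"CREATE TABLE IF NOT EXISTS {schema}.{partition} "
        f"PARTITION OF {schema}.{table} "
        f"FOR VALUES FROM ('{start}') TO ('{end}')"
    )
```

Postgres range-partition bounds are inclusive at `FROM` and exclusive at `TO`, so consecutive monthly partitions tile the timeline with no gaps or overlaps.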
## Production notes

- All DB writes are wrapped in `try/except`; logging failures never crash the app.
- Partition creation handles concurrent races (catches `DuplicateTable`).
- File logging works independently of Postgres availability.
- DB inserts run in `asyncio.create_task`, so they do not block the response path.
- Body truncation via `max_body_bytes` prevents memory spikes on large payloads.
- `db_sample_rate` lets you log only a fraction of requests on high-traffic APIs.
- No root-logger manipulation; only the package's own named logger is used.
- `app.state` keys are prefixed with `_fpgl_` to avoid collisions.
## License

This project is licensed under the GNU General Public License v3.0.