# apimemo

Automatic HTTP request logging for Python. Records every outgoing API call to your database — zero config, batch-buffered, production-ready.
## Why apimemo?
Most applications make dozens of outgoing HTTP requests — to payment gateways, notification services, third-party APIs. When something goes wrong, you need answers: What did we send? What came back? How long did it take?
apimemo wraps your HTTP client (httpx or requests) and silently logs every request to your database. No manual logging. No code changes to every call site. Just plug in and see everything in your admin panel.
## Key Features
- Zero-effort logging — wrap your HTTP client once, every request is captured automatically
- Batch-buffered writes — logs are collected in memory and flushed in configurable batches to minimize database overhead
- Thread-safe — safe for multi-threaded applications with proper locking and daemon timers
- Framework-native — works with your existing ORM and migration tool, no separate database needed
- Admin panel included — Django Admin and starlette-admin (FastAPI) views out of the box
- Configurable filtering — ignore health checks, internal services, or specific paths with glob patterns
- Body truncation — automatically truncates large request/response bodies to keep your database lean
- Graceful shutdown — an `atexit` hook ensures no logs are lost when your process exits
## Installation

```bash
pip install apimemo[django]         # Django + httpx
pip install apimemo[sqlalchemy]     # FastAPI + SQLAlchemy (async)
pip install apimemo[tortoise]       # FastAPI + Tortoise ORM
pip install apimemo[httpx]          # httpx interceptor only (bring your own storage)
pip install apimemo[requests]       # requests interceptor only
pip install apimemo[fastapi-admin]  # starlette-admin panel for FastAPI
pip install apimemo[all]            # everything
```
## Quick Start
### Django
Add "apimemo" to your installed apps and run migrations — that's it.
```python
# settings.py
INSTALLED_APPS = [
    "apimemo",
    ...
]
```

```bash
python manage.py migrate
```
```python
# anywhere in your code
from apimemo.integrations.django import DjangoIntegration

integration = DjangoIntegration()
client = integration.get_client()

resp = client.get("https://api.stripe.com/v1/charges")
# → automatically logged to the api_logs table
# → visible at /admin/apimemo/apilog/
```
### FastAPI + SQLAlchemy
```python
from sqlalchemy.ext.asyncio import create_async_engine

from apimemo.integrations.sqlalchemy import SqlAlchemyIntegration, ApiLogMixin

# 1. Define your model with the mixin
class ApiLog(Base, ApiLogMixin):
    __tablename__ = "api_logs"

# 2. Generate & apply the migration:
#    alembic revision --autogenerate -m "add api_logs"
#    alembic upgrade head

# 3. Create the integration
engine = create_async_engine("postgresql+asyncpg://...")
integration = SqlAlchemyIntegration(engine)

# 4. Use the logged client
client = integration.get_async_client()
resp = await client.get("https://api.example.com/users")

# 5. (Optional) Mount the admin panel — requires apimemo[fastapi-admin]
integration.mount_admin(app)
```
### FastAPI + Tortoise ORM
```python
# tortoise config
TORTOISE_ORM = {
    "apps": {
        "apimemo": {
            "models": ["apimemo.integrations.tortoise"],
            "default_connection": "default",
        }
    }
}
# aerich migrate && aerich upgrade

from apimemo.integrations.tortoise import TortoiseIntegration

integration = TortoiseIntegration()
client = integration.get_async_client()
resp = await client.get("https://api.example.com/users")
```
### Using requests instead of httpx

Every integration also provides a `get_session()` method that returns a `requests.Session`:

```python
session = integration.get_session()
resp = session.post("https://api.example.com/webhook", json={"event": "test"})
```
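apimemo's interceptor internals aren't shown here, but the same kind of data can be captured with plain `requests` via a response hook. A minimal sketch, not part of apimemo's API (`record` is any callable you supply, such as `print` or a buffer's `add` method):

```python
import requests

def logged_session(record) -> requests.Session:
    """Sketch: a Session whose responses are passed to `record`."""
    session = requests.Session()

    def hook(resp, **kwargs):
        # Called by requests after every response is received
        record({
            "method": resp.request.method,
            "url": resp.request.url,
            "status_code": resp.status_code,
            "duration_ms": resp.elapsed.total_seconds() * 1000,
        })
        return resp

    session.hooks["response"].append(hook)
    return session

# usage (performs a real request, so it is commented out here):
# session = logged_session(print)
# session.get("https://api.example.com/users")
```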
## Configuration

```python
from apimemo import configure

configure(
    enabled=True,           # kill switch — disable all logging
    max_body_size=10240,    # truncate bodies larger than 10 KB
    batch_size=50,          # flush after collecting 50 entries
    flush_interval=5.0,     # or flush every 5 seconds, whichever comes first
    ignore_hosts=("localhost", "*.internal.io"),  # skip these hosts (glob patterns)
    ignore_paths=("/health", "/metrics*"),        # skip these paths (glob patterns)
    log_request_body=True,  # capture request bodies
    log_response_body=True, # capture response bodies
    log_headers=False,      # off by default — headers may contain auth tokens
)
```
Configuration is incremental — calling `configure(log_headers=True)` preserves all other settings. Call `configure()` with no arguments to reset to defaults.
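The ignore patterns are shell-style globs. Here is a minimal, stdlib-only sketch of how such host/path filtering can work (the `should_ignore` helper is hypothetical, not part of apimemo):

```python
from fnmatch import fnmatch
from urllib.parse import urlsplit

IGNORE_HOSTS = ("localhost", "*.internal.io")
IGNORE_PATHS = ("/health", "/metrics*")

def should_ignore(url: str) -> bool:
    """Return True if the URL's host or path matches any ignore pattern."""
    parts = urlsplit(url)
    if any(fnmatch(parts.hostname or "", pat) for pat in IGNORE_HOSTS):
        return True
    return any(fnmatch(parts.path, pat) for pat in IGNORE_PATHS)

print(should_ignore("http://localhost:8000/api"))            # True  (exact host)
print(should_ignore("https://svc.internal.io/v1/x"))         # True  (host glob)
print(should_ignore("https://api.example.com/metrics/cpu"))  # True  (path glob)
print(should_ignore("https://api.example.com/users"))        # False
```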
## What Gets Logged

Every outgoing HTTP request produces a row in `api_logs`:
| Column | Type | Description |
|---|---|---|
| `id` | UUID | Primary key |
| `method` | VARCHAR(10) | GET, POST, PUT, DELETE, etc. |
| `url` | TEXT | Full request URL |
| `host` | VARCHAR(255) | Target hostname (indexed) |
| `path` | VARCHAR(2048) | URL path |
| `status_code` | INTEGER | Response status code, 0 if the request failed (indexed) |
| `request_headers` | TEXT | JSON-encoded request headers (if enabled) |
| `request_body` | TEXT | Request body (truncated to `max_body_size`) |
| `response_headers` | TEXT | JSON-encoded response headers (if enabled) |
| `response_body` | TEXT | Response body (truncated to `max_body_size`) |
| `duration_ms` | FLOAT | Request duration in milliseconds |
| `error` | TEXT | Exception message if the request failed |
| `created_at` | DATETIME | Timestamp in UTC (indexed) |
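The body truncation mentioned above is, in effect, a simple length cap. A sketch of that behavior under that assumption (`truncate` is illustrative, not an apimemo function):

```python
MAX_BODY_SIZE = 10240  # mirrors the max_body_size setting

def truncate(body: str, limit: int = MAX_BODY_SIZE) -> str:
    """Cap a body at `limit` characters so oversized payloads stay small."""
    return body if len(body) <= limit else body[:limit]

print(len(truncate("x" * 20000)))  # 10240
print(truncate("small"))           # small
```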
## Architecture

```
┌─────────────┐     ┌──────────────┐     ┌───────────┐     ┌──────────┐
│  httpx/     │────▶│  LogBuffer   │────▶│  _flush() │────▶│ Database │
│  requests   │     │ (thread-safe │     │  (batch   │     │          │
│ interceptor │     │  in-memory)  │     │  insert)  │     │          │
└─────────────┘     └──────────────┘     └───────────┘     └──────────┘
                           ▲ batch_size trigger
                           ▲ timer trigger (flush_interval)
                           ▲ atexit trigger
```
- Interceptors wrap httpx transports and requests sessions to capture request/response data
- LogBuffer collects entries in a thread-safe list and flushes on batch size, timer, or process exit
- Integrations provide the `_flush()` implementation specific to each ORM (SQLAlchemy async, Tortoise bulk_create, Django raw SQL)
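The buffering pattern described above can be sketched in a few lines. This is illustrative only, not apimemo's actual code; the real buffer also runs a daemon timer for `flush_interval`:

```python
import atexit
import threading

class LogBuffer:
    """Thread-safe in-memory buffer that flushes entries in batches."""

    def __init__(self, flush, batch_size=50):
        self._flush_fn = flush        # e.g. a batch INSERT callable
        self._batch_size = batch_size
        self._entries = []
        self._lock = threading.Lock()
        atexit.register(self.flush)   # don't lose logs on process exit

    def add(self, entry):
        with self._lock:
            self._entries.append(entry)
            full = len(self._entries) >= self._batch_size
        if full:
            self.flush()

    def flush(self):
        # Swap the list out under the lock, write outside it
        with self._lock:
            batch, self._entries = self._entries, []
        if batch:
            self._flush_fn(batch)

# usage: collect entries, observe batched writes
written = []
buf = LogBuffer(written.append, batch_size=2)
buf.add({"url": "https://a.example"})
buf.add({"url": "https://b.example"})  # reaching batch_size triggers a flush
```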
## Supported Frameworks

| Framework | ORM | Migration Tool | Admin Panel |
|---|---|---|---|
| Django | Django ORM | manage.py migrate | Django Admin |
| FastAPI | SQLAlchemy 2.0 (async) | Alembic (--autogenerate) | starlette-admin |
| FastAPI | Tortoise ORM | Aerich | — |
| Any | — | — | — (bring your own storage) |
## License
MIT