Reusable API response and exception handling utilities with FastAPI integration.

aniket_tools

A reusable Python library for three things:

  • building consistent API responses (create_response, value_correction)
  • translating any exception into a safe, structured JSON payload (ErrorHandler, ApiError)
  • structured logging with levels, context, redaction, and timers (logs, get_logger)

Install

pip install aniket_tools

For local development:

pip install -e ".[full]"

Import

from aniket_tools import (
    ApiError,
    ErrorHandler,
    ExceptionHandler,
    PaginationRes,
    create_response,
    explain_error,
    get_logger,
    get_status_code,
    handle_exception,
    logs,
    unified_exception_handler,
    value_correction,
)

Quick Start — FastAPI

from fastapi import FastAPI, HTTPException
from fastapi.exceptions import RequestValidationError
from aniket_tools import create_response, unified_exception_handler

app = FastAPI()
app.add_exception_handler(HTTPException, unified_exception_handler)
app.add_exception_handler(Exception, unified_exception_handler)
app.add_exception_handler(RequestValidationError, unified_exception_handler)

@app.get("/health")
async def health():
    return create_response(200, data={"status": "ok"})

create_response

Builds the standard success or error payload and returns a JSONResponse (or a plain dict with as_json_response=False).

Parameters:

Param             Type                   Default   Purpose
response_code     int                    required  HTTP status code
data              Any                    None      Response body data
schema            Pydantic model class   None      Validates data before returning
pagination        dict | PaginationRes   None      Pagination metadata
error_message     str                    None      Error description
error_code        str                    None      Machine-readable error code
details           list[dict]             None      Field-level error details
meta              dict | model           None      Request metadata (request_id, trace_id, …)
as_json_response  bool                   True      Return JSONResponse vs plain dict

Success responses

# Minimal
create_response(200, data={"status": "ok"})

# With meta
create_response(200, data={"name": "Aniket"}, meta={"request_id": "req-1", "trace_id": "t-2"})

# With pagination — dict form
create_response(200, data=rows, pagination={"page": 1, "rows": 25, "total_rows": 250})

# With pagination — typed model
create_response(200, data=rows, pagination=PaginationRes(page=1, rows=25, total_rows=250))

# Schema validation — Pydantic model class validates each item
create_response(200, data=raw_rows, schema=UserSchema)

# No content
create_response(204)   # returns FastAPI Response(status_code=204) — no body

# Return plain dict instead of JSONResponse (useful in tests or non-FastAPI code)
payload = create_response(200, data={"ok": True}, as_json_response=False)

Error responses

# Simple 404
create_response(404, error_message="Report not found.", error_code="report_not_found")

# 422 with field-level details
create_response(
    422,
    error_message="One or more fields are invalid.",
    error_code="validation_error",
    details=[
        {"field": "email", "message": "field required"},
        {"field": "age",   "message": "must be a positive integer"},
    ],
)

Success response shape

{
  "success": true,
  "response_code": 200,
  "meta": {"request_id": "req-1"},
  "data": {"name": "Aniket"},
  "pagination": {"page": 1, "rows": 25, "total_rows": 250}
}

Error response shape

{
  "success": false,
  "response_code": 422,
  "error_message": "One or more fields are invalid.",
  "meta": {},
  "error": {
    "code": "validation_error",
    "message": "One or more fields are invalid.",
    "details": [{"field": "email", "message": "field required"}]
  },
  "errors": [{"field": "email", "message": "field required"}]
}

error_message and error.message are aliases. errors and error.details are aliases. Both exist for backward compatibility.
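These invariants can be checked directly on a payload dict shaped like the error example above (the literal below just copies that documented shape):

```python
# Error payload shaped like the documented example.
payload = {
    "success": False,
    "response_code": 422,
    "error_message": "One or more fields are invalid.",
    "meta": {},
    "error": {
        "code": "validation_error",
        "message": "One or more fields are invalid.",
        "details": [{"field": "email", "message": "field required"}],
    },
    "errors": [{"field": "email", "message": "field required"}],
}

# The top-level fields mirror the nested error object.
assert payload["error_message"] == payload["error"]["message"]
assert payload["errors"] == payload["error"]["details"]
```

Clients can therefore read whichever pair of keys they already depend on.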


PaginationRes

Typed dataclass for pagination metadata.

from aniket_tools import PaginationRes

p = PaginationRes(page=1, rows=25, total_rows=250)
create_response(200, data=rows, pagination=p)

Validation rules inside create_response:

  • page ≥ 1
  • rows ≥ 0
  • total_rows ≥ 0
  • All three are required integers
  • Extra keys on the dict form are preserved

If any rule fails, create_response returns a 422 validation error instead of a broken payload.
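The rules above can be sketched as a stand-alone check (validate_pagination is a hypothetical helper for illustration, not the library's internal code):

```python
# Hypothetical stand-alone version of the pagination checks described above.
def validate_pagination(p: dict) -> list:
    """Return a list of problems for a pagination dict; an empty list means valid."""
    errors = []
    for key in ("page", "rows", "total_rows"):
        value = p.get(key)
        # bool is a subclass of int, so reject it explicitly
        if not isinstance(value, int) or isinstance(value, bool):
            errors.append(f"{key} must be an integer")
    if not errors:
        if p["page"] < 1:
            errors.append("page must be >= 1")
        if p["rows"] < 0:
            errors.append("rows must be >= 0")
        if p["total_rows"] < 0:
            errors.append("total_rows must be >= 0")
    return errors

print(validate_pagination({"page": 1, "rows": 25, "total_rows": 250}))  # []
print(validate_pagination({"page": 0, "rows": 25, "total_rows": 250}))  # ['page must be >= 1']
```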


value_correction

Recursively normalizes Python values into JSON-safe types. Called automatically by create_response on all data.

Parameters:

Param            Type         Default               Purpose
data             Any          required              Value to normalize
mode             str          "response"            "response" enables all conversions; "storage" disables most
float_precision  int | None   2 in response mode    Decimal places to round floats to
datetime_format  str          "%Y-%m-%d %H:%M:%S"   Output format for datetime values
date_format      str          "%Y-%m-%d"            Output format for date values
strip_strings    bool         True                  Strip whitespace from strings

What it converts by default:

Input type         Output
str                stripped string
bytes              UTF-8 decoded string
Decimal            float (rounded to 2 dp)
datetime           "2024-01-15 09:30:00"
date               "2024-01-15"
timedelta          "0:01:30"
float (NaN / Inf)  None
float              rounded to 2 dp
UUID               "550e8400-..."
Enum               enum value (recursed)
dataclass          dict (recursed)
Pydantic model     dict (recursed)
numpy.integer      int
numpy.floating     float (recursed)
numpy.ndarray      list (recursed)
dict               keys and values recursed
list / tuple / set recursed to list
None, bool, int    unchanged

from decimal import Decimal
from datetime import datetime
from uuid import UUID
from aniket_tools import value_correction

value_correction({
    "amount":   Decimal("10.567"),
    "created":  datetime(2024, 1, 15, 9, 30),
    "name":     "  Aniket  ",
    "rate":     float("nan"),
    "id":       UUID("550e8400-e29b-41d4-a716-446655440000"),
})
# → {
#     "amount":  10.57,
#     "created": "2024-01-15 09:30:00",
#     "name":    "Aniket",
#     "rate":    None,
#     "id":      "550e8400-e29b-41d4-a716-446655440000",
# }

# Custom float precision
value_correction(3.14159, float_precision=4)  # → 3.1416

# Storage mode — most conversions disabled
value_correction(Decimal("10.5"), mode="storage")  # → Decimal("10.5") unchanged

logs

Unified logging function. Handles plain messages, structured context, redaction, SQL queries, ASCII tables, JSON pretty-print, timers, and file output.

Parameters:

Param        Type               Default   Purpose
msg          object             ""        Message, data structure, or SQL statement
type         str                "info"    Log level / mode (see table below)
file_name    str | Path         None      Also write to this file (auto-creates dirs)
logger       Logger             None      Use a specific logger instead of the default
dialect      object             None      SQLAlchemy dialect for type="query"
context      dict               None      Key-value fields appended to the log line
exc_info     bool | Exception   False     Attach exception traceback
redact       list[str]          None      Context keys to mask as ***
sample_rate  float              None      0.0–1.0; emit only roughly this fraction of calls
indent       int                4         JSON indent size for type="json"

Log types / levels

type         Level   Color      Use for
"trace"      5       dim        Very fine-grained internal tracing
"debug"      10      cyan       Developer debug info
"info"       20      default    General status messages
"success"    25      green      Positive confirmations
"warning"    30      yellow     Non-critical concerns
"error"      40      red        Errors and failures
"critical"   50      bold red   System-level failures
"audit"      45      magenta    Compliance / security events
"exception"  40      red        Same as error + auto-attaches traceback
"query"      20      default    SQL statements (auto-compiles with literals)
"table"      20      default    list[dict] → ASCII table
"divider"    20      default    Section separator line
"timer"      20      default    Context manager — logs elapsed seconds
"json"       20      default    Pretty-prints any JSON-serializable object

Examples

from aniket_tools import logs, get_logger

# Standard levels
logs("Server started")
logs("Connecting to DB", type="debug")
logs("Disk above 80%",  type="warning")
logs("Save failed",     type="error")
logs("Out of memory",   type="critical")

# Custom levels
logs("Entering resolve_user",         type="trace")
logs("Payment processed",             type="success")
logs("User admin deleted record #42", type="audit")

# Exception with traceback
try:
    raise ValueError("bad input")
except Exception as e:
    logs("Caught error", type="error", exc_info=e)

# Or the shorthand
logs("DB failed", type="exception")   # auto-attaches current exception

# Context fields
logs("User logged in", context={"user_id": 42, "ip": "10.0.0.1"})
# → ... | INFO | User logged in | user_id=42 ip=10.0.0.1

# Redaction
logs("API call", context={"api_key": "secret123", "endpoint": "/v1"}, redact=["api_key"])
# → ... | INFO | API call | api_key=*** endpoint=/v1

# SQL query (plain string)
logs("SELECT * FROM users WHERE id = 1", type="query")

# SQL query (SQLAlchemy statement with bound params)
from sqlalchemy import select
stmt = select(User).where(User.id == 7)
logs(stmt, type="query", dialect="postgresql")
logs(stmt, type="query", dialect=session)   # session / engine also accepted

# ASCII table
logs([{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}], type="table")

# Divider
logs("Auth Section", type="divider")   # ──── Auth Section ─────────────────────

# Timer
import time
with logs("heavy query", type="timer"):
    time.sleep(0.1)
# → ... | INFO | TIMER[heavy query]: 0.1012s

# JSON pretty-print
logs({"user": "alice", "roles": ["admin", "editor"]}, type="json")

# Write to file (also still logs to console)
logs("Report generated", file_name="logs/app.log")

# Sampling — only ~10% of calls produce output
logs("high-frequency event", sample_rate=0.1)
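Sampling is presumably a per-call probabilistic keep/drop decision. A stand-alone sketch of that idea (the sampled helper is hypothetical, not the library's actual implementation):

```python
import random

def sampled(sample_rate, rng=random.random):
    """Return True when a log call should be emitted."""
    if sample_rate is None:
        return True             # no sampling configured: always emit
    return rng() < sample_rate  # emit roughly sample_rate of all calls

random.seed(0)
kept = sum(sampled(0.1) for _ in range(10_000))
# kept lands near 1_000: roughly 10% of 10_000 calls pass
```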

get_logger

Returns a configured logging.Logger.

Parameters:

Param        Type         Default    Purpose
name         str          "aniket"   Logger name
file_name    str | Path   None       Log file path
json         bool         False      Emit structured JSON lines instead of plain text
rotate       bool         False      Rotate at 10 MB, keep 5 backups
sample_rate  float        None       Emit only roughly this fraction of logs from this logger

from aniket_tools import get_logger, logs

# Plain logger
lg = get_logger("my_app")
logs("started", logger=lg)

# File + rotation
lg = get_logger("my_app", file_name="logs/app.log", rotate=True)
logs("started", logger=lg)

# JSON output — ready for Datadog, Loki, ELK
lg = get_logger(json=True)
logs("User created", context={"user_id": 99, "env": "prod"}, logger=lg)
# → {"time": "...", "name": "aniket", "level": "INFO", "msg": "User created", "user_id": 99, "env": "prod"}

# JSON + redaction
logs("Login", context={"user": "admin", "password": "hunter2"}, redact=["password"], logger=lg)
# → {"time": "...", ..., "user": "admin", "password": "***"}

# 50% sampling at the logger level
lg = get_logger(sample_rate=0.5)
logs("background event", logger=lg)

ApiError

Raise a controlled API error from anywhere in your code.

Parameters:

Param        Type         Default       Purpose
message      str          required      User-facing error text
status_code  int          400           HTTP status code
code         str          "api_error"   Machine-readable error code
details      list[dict]   None          Field-level details
log_message  str          None          Extra developer context (logged, not returned)

from aniket_tools import ApiError

# Simple
raise ApiError("Report not found.", status_code=404, code="report_not_found")

# With field-level details
raise ApiError(
    "Validation failed.",
    status_code=422,
    code="validation_error",
    details=[{"field": "email", "message": "already registered"}],
)

# With a private log message (not sent to the client)
raise ApiError(
    "Something went wrong.",
    status_code=500,
    code="internal_error",
    log_message=f"DB query failed on table=billing sql={raw_sql}",
)

When unified_exception_handler catches an ApiError, the message, status_code, code, and details are returned exactly as provided. The log_message is written to the error log but never included in the response.
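The public/private split can be sketched in plain Python (ApiErrorSketch and to_client_payload are hypothetical stand-ins for illustration, not the library's code):

```python
# Hypothetical stand-ins for ApiError and the handler's serialization step.
class ApiErrorSketch(Exception):
    def __init__(self, message, status_code=400, code="api_error",
                 details=None, log_message=None):
        super().__init__(message)
        self.message = message
        self.status_code = status_code
        self.code = code
        self.details = details or []
        self.log_message = log_message  # developer-only; never serialized

def to_client_payload(err):
    # Only the public fields reach the response body.
    return {
        "success": False,
        "response_code": err.status_code,
        "error_message": err.message,
        "error": {"code": err.code, "message": err.message, "details": err.details},
        "errors": err.details,
    }

err = ApiErrorSketch("Report not found.", status_code=404, code="report_not_found",
                     log_message="report_id=42 missing in table=reports")
payload = to_client_payload(err)
assert "report_id=42" not in str(payload)  # private context stays out of the response
```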


ErrorHandler

Core exception classifier. Understands 60+ exception types across all major Python libraries.

Parameters:

Param                                Type   Default                 Purpose
logger_name                          str    "aniket_tools.errors"   Name of the logger used for log_exception
use_default_message_for_long_errors  bool   True                    Replace long raw messages with safe defaults

from aniket_tools import ErrorHandler

handler = ErrorHandler()

# Classify any exception into a structured ErrorInfo
info = handler.describe(some_exception)
print(info.status_code)   # e.g. 422
print(info.code)          # e.g. "duplicate_resource"
print(info.message)       # e.g. "A record with this email already exists."
print(info.retryable)     # True / False / None

# Build the full JSON payload
payload = handler.build_payload(some_exception, meta={"trace_id": "t-1"})

# Log the raw exception and return JSONResponse
response = handler.handle_exception(some_exception, request=request)

# Log only (no response)
handler.log_exception(some_exception, request=request)

Exception error response shape

{
  "success": false,
  "response_code": 409,
  "error_message": "A record with this email already exists.",
  "error_type": "IntegrityError",
  "meta": {"request_id": "req-1", "path": "/users"},
  "error": {
    "code": "duplicate_resource",
    "type": "IntegrityError",
    "message": "A record with this email already exists.",
    "retryable": false,
    "details": [
      {"type": "duplicate_resource", "field": "email", "value": "a@b.com", "constraint": "users_email_key"}
    ]
  },
  "errors": [
    {"type": "duplicate_resource", "field": "email", "value": "a@b.com", "constraint": "users_email_key"}
  ]
}

retryable field:

  • true — client should retry (timeouts, deadlocks, transient unavailability, cache/queue conflicts)
  • false — retrying will not help (duplicate key, bad input, auth failure, SSL error)
  • absent — not determined for this error type
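A client can key its retry policy off this flag. A minimal sketch against the payload shape documented above (should_retry and call_with_retry are hypothetical client-side helpers):

```python
import time

def should_retry(payload):
    """Retry only when the error payload explicitly marks itself retryable."""
    return payload.get("error", {}).get("retryable") is True

def call_with_retry(fn, attempts=3, backoff=0.0):
    for attempt in range(attempts):
        payload = fn()
        if payload.get("success") or not should_retry(payload):
            return payload
        time.sleep(backoff * (2 ** attempt))  # exponential backoff between tries
    return payload

calls = {"n": 0}
def flaky():
    # Fails twice with a retryable deadlock-style error, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        return {"success": False,
                "error": {"code": "database_retryable_conflict", "retryable": True}}
    return {"success": True}

result = call_with_retry(flaky)
assert result == {"success": True} and calls["n"] == 3
```

Non-retryable errors (retryable false or absent) are returned to the caller immediately.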

Exception families covered

  • HTTP / Validation (FastAPI, Starlette, Pydantic): validation_error, http_404
  • Database (SQLAlchemy, psycopg2, psycopg3, asyncpg, MySQL Connector, PyMySQL, MySQLdb, sqlite3, PyMongo): duplicate_resource, invalid_reference, database_timeout, database_unavailable, database_retryable_conflict
  • Upstream HTTP (requests, httpx, aiohttp, urllib3): upstream_timeout, upstream_unavailable, upstream_ssl_error, upstream_bad_response
  • Auth (PyJWT): token_expired, invalid_token, invalid_token_claim
  • Cloud (botocore / boto3): cloud_timeout, cloud_not_found, cloud_rate_limited, cloud_forbidden
  • Cache (Redis): cache_timeout, cache_conflict, cache_unavailable, cache_auth_failed
  • Queue / Tasks (kafka-python, confluent_kafka, Celery, Kombu): queue_timeout, queue_unavailable, task_timeout, invalid_queue_payload
  • Data tools (Pandas, NumPy, PyArrow, Polars, SciPy): invalid_data, data_backend_unavailable
  • Python builtins (stdlib): invalid_json, invalid_yaml, resource_not_found, bad_request, internal_error

Common error codes and status codes

Code                          Status   Retryable   Cause
duplicate_resource            409      false       Unique constraint violation
invalid_reference             422      false       Foreign key violation
missing_required_field        422      false       NOT NULL violation
constraint_violation          422      false       CHECK constraint
database_retryable_conflict   409      true        Deadlock / serialization failure
database_timeout              504      true        Statement / network timeout
database_unavailable          503      true        Cannot connect to DB server
database_programming_error    500      false       Undefined table / SQL syntax bug
validation_error              422      false       Request field validation
upstream_timeout              504      true        HTTP client timeout
upstream_unavailable          503      true        Cannot reach upstream service
upstream_ssl_error            502      false       TLS / certificate failure
token_expired                 401      false       JWT expired
invalid_token                 401      false       JWT invalid signature / decode
cloud_rate_limited            429      true        Cloud SDK throttle
cache_timeout                 504      true        Redis timeout
cache_conflict                409      true        Redis WATCH / lock conflict
queue_timeout                 504      true        Kafka / Kombu timeout
task_timeout                  504      false       Celery time limit hit
invalid_json                  400      false       Malformed JSON body
bad_request                   400      false       ValueError, TypeError, etc.
internal_error                500      false       Uncaught programming bug

unified_exception_handler

FastAPI exception handler. Logs the raw error and returns the standard error JSON.

from fastapi import FastAPI, HTTPException
from fastapi.exceptions import RequestValidationError
from aniket_tools import unified_exception_handler

app = FastAPI()
app.add_exception_handler(HTTPException,           unified_exception_handler)
app.add_exception_handler(Exception,               unified_exception_handler)
app.add_exception_handler(RequestValidationError,  unified_exception_handler)

ExceptionHandler

Route-level helper. Converts any exception into a FastAPI HTTPException so FastAPI's own handler picks it up.

from aniket_tools import ExceptionHandler

try:
    result = db.query(...)
except Exception as exc:
    ExceptionHandler(exc)   # raises an HTTPException mapped from exc

handle_exception

Returns the standard error payload directly (as JSONResponse or dict).

from aniket_tools import handle_exception

response = handle_exception(ValueError("bad id"))
payload  = handle_exception(ValueError("bad id"), as_json_response=False)  # plain dict

# With request context (extracts request_id and path automatically)
response = handle_exception(exc, request=request, meta={"trace_id": "t-1"})

explain_error / get_status_code

Quick one-liners when you only need the message or the status code.

from aniket_tools import explain_error, get_status_code

msg    = explain_error(ValueError("bad input"))    # "The request data is invalid."
status = get_status_code(ValueError("bad input"))  # 400

Standard Response Shapes

Success with pagination

{
  "success": true,
  "response_code": 200,
  "meta": {"request_id": "req-1"},
  "data": [{"id": 1, "name": "Alice"}],
  "pagination": {"page": 1, "rows": 25, "total_rows": 250}
}

Validation error (422)

{
  "success": false,
  "response_code": 422,
  "error_message": "One or more fields are invalid.",
  "error_type": "RequestValidationError",
  "meta": {"request_id": "req-1", "path": "/users"},
  "error": {
    "code": "validation_error",
    "type": "RequestValidationError",
    "message": "One or more fields are invalid.",
    "details": [
      {"type": "missing", "field": "email", "message": "field required", "source": "body"},
      {"type": "missing", "field": "page",  "message": "field required", "source": "query"}
    ]
  },
  "errors": [
    {"type": "missing", "field": "email", "message": "field required", "source": "body"}
  ]
}

Retryable error (deadlock / timeout)

{
  "success": false,
  "response_code": 409,
  "error_message": "The database could not complete the operation because of a temporary concurrency conflict.",
  "error_type": "OperationalError",
  "meta": {},
  "error": {
    "code": "database_retryable_conflict",
    "type": "OperationalError",
    "message": "The database could not complete the operation because of a temporary concurrency conflict.",
    "retryable": true,
    "details": [{"type": "database_retryable_conflict", "message": "...", "retryable": true}]
  }
}

Code Structure

src/aniket_tools/
  __init__.py            ← public exports
  _compat.py             ← optional import helpers
  responses.py           ← create_response, value_correction, PaginationRes
  logging.py             ← logs, get_logger
  exceptions.py          ← ApiError, ErrorHandler, unified_exception_handler
  exception_handlers/
    base.py              ← ErrorInfo dataclass, message helpers
    api_http_validation.py   ← FastAPI/Starlette/Pydantic
    database_family.py       ← all SQL and MongoDB drivers
    http_auth_cloud_family.py ← requests, httpx, aiohttp, urllib3, PyJWT, botocore, gRPC, Elasticsearch
    cache_queue_family.py    ← Redis, Kafka, Celery, Kombu
    data_tool_family.py      ← Pandas, NumPy, PyArrow, Polars, SciPy
    python_family.py         ← stdlib builtins, asyncio, ssl, socket

Safe Editing Rules

  • Add specific exception checks before generic ones (e.g. redis.TimeoutError before Python TimeoutError)
  • Keep message logic in _database_message(...), status logic in _database_status(...)
  • Keep JSON shape logic in build_payload(...) or create_response(...)
  • If you add a new public function, also export it from __init__.py
  • create_response(...) is for normal route returns; unified_exception_handler(...) is for exceptions — they are separate code paths
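The first rule exists because exception classes nest. A sketch with a local stand-in subclass shows how a generic isinstance check would shadow a specific one if ordered first (RedisTimeoutError and the returned codes here are illustrative, not the library's actual classifier):

```python
class RedisTimeoutError(TimeoutError):
    """Local stand-in for a library timeout that subclasses the builtin TimeoutError."""

def classify(exc):
    # Specific check first: a RedisTimeoutError is also a TimeoutError,
    # so the generic branch below would shadow it if it came first.
    if isinstance(exc, RedisTimeoutError):
        return "cache_timeout"
    if isinstance(exc, TimeoutError):
        return "database_timeout"   # illustrative generic fallback
    return "internal_error"

assert classify(RedisTimeoutError()) == "cache_timeout"
assert classify(TimeoutError()) == "database_timeout"
```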

HTML Output References

File                               Shows
examples/logging_results.html      All logs() types and options with rendered output
examples/responses_results.html    All create_response and value_correction use cases
examples/exceptions_results.html   All ErrorHandler / ApiError exception families and payloads
