py_aide
Modern Python 3.11+ Framework: Flask/FastAPI Orchestration & Strict Runtime Enforcement
py_aide is a robust, developer-centric framework designed to bring safety, structure, and consistency to Python web applications. It provides a unique "Strict Runtime Enforcement" layer that ensures your code remains clean, documented, and type-safe at execution time.
🚀 Key Features
- Strict Runtime Enforcement: A powerful decorator that enforces type hints, docstrings, and calling conventions (positional-only/keyword-only) at runtime.
- Unified Server Portal: Seamlessly orchestrate Flask and FastAPI applications with shared security gates and auto-discovery of routes.
- WebSocket Excellence: High-performance, identity-aware WebSockets with support for User and Group-based messaging, automatic handshake mapping, and tiered delivery.
- Thread-Safe SQL: A thread-level multiton SQLite manager with automatic JSON serialization, transaction tracking, and schema management.
- Modern Security: First-class support for Bearer tokens, API keys, and Fernet-based encryption.
- Enterprise HTTP Client: A standardized, "always resolve" HTTP client with built-in retries, interceptors, and dual sync/async support.
- Rich Utilities: Built-in handlers for images (base64/files), dates (aware/naive conversions), and custom data structures.
📦 Installation
```bash
pip install py_aide
```
Note: `py_aide` requires Python 3.11+ and is currently optimized for Linux environments.
⚙️ Core Philosophy: Strict Enforcement
At the heart of py_aide is the @enforce_requirements decorator. It's designed to prevent "sloppy" code by failing early if:
- A function is missing a docstring.
- A parameter or return value is missing a type hint.
- A function uses more than 8 arguments (promoting better decomposition).
- Calling conventions (`/` or `*`) are not explicitly defined.
```python
from py_aide.enforcer import enforce_requirements

@enforce_requirements
def create_user(name: str, age: int, /) -> dict:
    """Creates a new user dict."""
    return {"name": name, "age": age}
```
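To illustrate the idea behind such a decorator — this is a minimal stdlib sketch, not py_aide's actual implementation — docstring and type-hint enforcement can be done at decoration time with `inspect`:

```python
import inspect

def enforce_requirements_sketch(func):
    """Illustrative-only enforcement: fail fast if a function lacks
    a docstring, a parameter type hint, or a return type hint."""
    if not func.__doc__:
        raise TypeError(f"{func.__name__} is missing a docstring")
    sig = inspect.signature(func)
    for name, param in sig.parameters.items():
        if param.annotation is inspect.Parameter.empty:
            raise TypeError(f"{func.__name__}: parameter '{name}' lacks a type hint")
    if sig.return_annotation is inspect.Signature.empty:
        raise TypeError(f"{func.__name__} is missing a return type hint")
    return func

@enforce_requirements_sketch
def create_user(name: str, age: int, /) -> dict:
    """Creates a new user dict."""
    return {"name": name, "age": age}
```

Because the checks run when the decorator is applied (usually at import time), violations surface immediately rather than deep inside a request handler.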
🌐 Unified Server Example (Flask)
```python
from py_aide.servers.flask import ServicePortal, GateConfig

portal = ServicePortal()

@portal.endpoint("/api/greet", gate=GateConfig(auth_required=False))
def greet(name: str, /) -> dict:
    """Returns a greeting message."""
    return {"message": f"Hello, {name}!"}

if __name__ == "__main__":
    portal.run(port=5000)
```
📡 Standardized HTTP Client (requests)
py_aide provides a resilient HTTP client designed to work across both Flask and FastAPI. It follows an "Always Resolve" philosophy: it never raises exceptions for network errors or HTTP failures. Instead, it returns a standardized Response object.
Core Features:
- Resilient: Automatic retries with `exponential`, `linear`, or `fixed` backoff.
- Secure: Auto-injects Bearer tokens and sanitizes outgoing payloads.
- Silent Mode: Suppress internal framework logging for specific requests.
- Middleware: Register global `request` and `response` interceptors.
- Dual-Mode: Dedicated paths for Synchronous (Flask) and Asynchronous (FastAPI) logic.
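The three backoff strategies map to simple delay formulas. As an illustration (the base delay and cap used internally by py_aide are not documented here, so this helper and its defaults are assumptions):

```python
def backoff_delay(attempt: int, strategy: str = "exponential", base: float = 1.0) -> float:
    """Wait time before retry number `attempt` (0-based), for each strategy."""
    if strategy == "exponential":
        return base * (2 ** attempt)   # 1s, 2s, 4s, 8s, ...
    if strategy == "linear":
        return base * (attempt + 1)    # 1s, 2s, 3s, 4s, ...
    return base                        # fixed: 1s, 1s, 1s, ...
```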
1. Synchronous Usage (Flask)
Ideal for standard routes where you want simple, linear logic.
```python
from py_aide import send_request, RequestConfig

# Optional: Set global defaults
RequestConfig.base_url = "https://api.myapp.com"
RequestConfig.max_retries = 3

def get_data():
    # Parameters are keyword-only for strictness
    res = send_request(
        method="GET",
        endpoint="/external-data",
        headers={"X-Custom-Header": "Value"},  # Custom headers
        silent=True,  # Suppress all framework logs for this request
        on_success=lambda data: print("Got data!"),
        on_error=lambda err: print(f"Error: {err}"),
    )
    if res:
        print(f"Success: {res.data}")
    else:
        print(f"Failed: {res.log}")  # Detailed error log
```
2. Asynchronous Usage (FastAPI)
Optimized for high-concurrency environments using asyncio.
```python
from py_aide import send_request_async

async def fetch_profile(user_token: str):
    # Automatically injects Bearer token
    res = await send_request_async(
        method="GET",
        endpoint="/profile",
        auth_token=user_token,
    )
    return res.to_dict()
```
3. Interceptors (Middleware)
Globally modify requests before they leave or responses before they reach your logic.
```python
from py_aide import Interceptors

# Add a header to every outgoing request
Interceptors.add_request_hook(lambda data: {
    **data,
    "headers": {**data["headers"], "X-Client-ID": "py-aide-v1"},
})

# Log or transform every response
def log_response(res):
    print(f"Response from {res.errorLogs.get('endpoint')}")
    return res

Interceptors.add_response_hook(log_response)
```
4. Authentication Strategies
py_aide handles common auth patterns out-of-the-box.
```python
# 1. Bearer Token (Mandatory auth_type)
send_request(method="GET", endpoint="/", auth_token="my-token", auth_type="bearer")

# 2. API Key (Uses X-API-Key header by default)
send_request(method="GET", endpoint="/", auth_token="sk_123", auth_type="api_key")

# 3. Basic Auth (Automatically Base64 encodes)
send_request(method="GET", endpoint="/", auth_token=("user", "pass"), auth_type="basic")

# Custom API Key Header
RequestConfig.api_key_header = "X-My-Custom-Auth"
```
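For reference, the `basic` strategy corresponds to standard HTTP Basic authentication (RFC 7617). The encoding step the framework performs for you is equivalent to this stdlib snippet (the helper function itself is illustrative):

```python
import base64

def basic_auth_header(user: str, password: str) -> dict:
    """Build the standard `Authorization: Basic <base64(user:pass)>` header."""
    creds = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {creds}"}
```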
📥 Persistent Buffering
py_aide provides two ways to offload work to the background. Choosing the right one is critical for performance and data integrity.
🏁 Choosing the Right Buffer
| Feature | DatabaseBuffer | TaskBuffer |
|---|---|---|
| Best For | Simple, high-frequency writes | Complex logic & transactions |
| Connection | Fresh connection per task | One connection for entire logic |
| State | No state between tasks | Supports ATTACH DATABASE & Temp Tables |
| Flexibility | Standard SQL methods only | Any Python code |
📦 Database Write Buffer (SQLite Lock Prevention)
For high-concurrency write operations, py_aide provides a DatabaseBuffer. It serializes writes to prevent "Database is Locked" errors.
```python
from py_aide import DatabaseBuffer

# 1. Initialize with your main DB path (Defaults to _py_aide_queues.db for storage)
buffer = DatabaseBuffer(main_db="app.db")

# 2. Start the background worker
buffer.start()

# 3. Queue writes (non-blocking)
buffer.insert(table="users", data=[{"name": "Alice"}])
buffer.update(table="users", columns=["name"], column_data=["Bob"], condition="id=?", condition_data=[1])

# 4. Graceful shutdown
buffer.stop()
```
> [!WARNING]
> State Limitation: `buffer.execute()` is for single-statement custom SQL. Because the background worker opens a fresh connection for every task, connection-level state (like `ATTACH DATABASE` or `PRAGMA` settings) will not persist between separate calls. For stateful or multi-step transactions, always use the `TaskBuffer`.
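The lock-prevention idea behind this buffer can be sketched with the stdlib alone: funnel every write through one background thread, so the database only ever sees a single writer. This is an illustration of the pattern, not py_aide's code:

```python
import queue
import threading

class WriteSerializer:
    """Toy write serializer: all submitted tasks are applied in order
    by a single background thread, eliminating concurrent writers."""
    def __init__(self, apply_fn):
        self.q = queue.Queue()
        self.apply_fn = apply_fn  # called with each task, on the worker thread
        self.worker = threading.Thread(target=self._run, daemon=True)

    def start(self):
        self.worker.start()

    def submit(self, task):
        self.q.put(task)  # non-blocking for callers

    def _run(self):
        while True:
            task = self.q.get()
            if task is None:  # sentinel signals shutdown
                break
            self.apply_fn(task)

    def stop(self):
        self.q.put(None)
        self.worker.join()
```

Because the queue is FIFO, writes are applied in submission order even when many threads call `submit` concurrently.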
⚙️ Generic Task Buffer (Async Jobs)
The TaskBuffer allows you to run any arbitrary function in the background. Ideal for emails, webhooks, or file processing.
```python
from py_aide import TaskBuffer
from py_aide.database import Api

# 1. Define your handlers
def send_email(payload):
    # logic to send email to payload['to']
    print(f"Email sent to {payload['to']}")

# 2. Initialize with handlers and 5 parallel workers (Defaults to _py_aide_queues.db)
tasks = TaskBuffer(
    handlers={"SEND_EMAIL": send_email},
    workers=5
)
tasks.start()

# 3. Push tasks from anywhere
tasks.push(task_type="SEND_EMAIL", data={"to": "user@example.com"})

# 💡 Pro Tip: Automatic Retries
# If a task fails, py_aide automatically retries it with
# Exponential Backoff (1s, 2s, 4s, 8s...).

# 💡 Pro Tip: Maintenance
# Keep your queue database slim by purging old failed tasks:
tasks.cleanup(max_age_days=7)

# 💡 Pro Tip: Use TaskBuffer for Complex SQL Transactions
# Since you control the 'with Api' block, you can handle multiple tables
# or even ATTACH DATABASE, which requires a persistent connection.
def sync_to_archive(payload):
    with Api(db_path="main.db", transactionMode=True) as db:
        db.execute(query="ATTACH DATABASE ? AS archive", data=["archive.db"])
        db.execute(query="INSERT INTO archive.logs SELECT * FROM main.logs WHERE id = ?", data=[payload['id']])
        return db.commit_transaction()

# 4. Graceful shutdown
tasks.stop()
```
🚨 Error Hooks & Monitoring
Since buffers run in the background, you can register an on_error hook to be notified of failures. This is ideal for logging full Tracebacks to your database or alerting systems.
```python
# The hook receives the task type, the data, the error message, and the traceback
def my_error_logger(task_type, payload, error, traceback):
    if traceback:
        # 'traceback' contains the full Python stack trace string
        print(f"CRASH in {task_type}: {traceback}")
    else:
        print(f"LOGIC ERROR in {task_type}: {error}")

# Register the hook in the constructor
tasks = TaskBuffer(
    handlers={"EMAIL": send_email},
    on_error=my_error_logger
)
```
🧵 Global Hooks in ThreadPoolExecutor
The ThreadPoolExecutor also supports a global on_error hook to catch failures in parallel tasks.
```python
def global_handler(func_name, payload, error, traceback):
    print(f"Parallel task {func_name} failed: {error}")

with ThreadPoolExecutor(on_error=global_handler) as executor:
    executor.submit(func=heavy_task, args=[data])
```
🗓️ Scheduler Hooks
The scheduler functions (run_once, run_every, run_at) are strictly keyword-only and support on_error hooks.
```python
from py_aide.threading import run_every

def scheduler_error_handler(name, type, error, traceback):
    print(f"Scheduled task {name} ({type}) failed: {error}")

# Mandatory keyword arguments: interval_seconds and func
run_every(interval_seconds=3600, func=sync_data, on_error=scheduler_error_handler)
```
📦 Persistent Queuing
py_aide includes a robust, SQLite-backed persistent queue for tasks that must survive application restarts.
```python
from py_aide import PersistentQueue

# 1. Initialize (Defaults to _py_aide_queues.db)
queue = PersistentQueue(queue_name="email_tasks")

# 2. Push a task (Keyword arguments mandatory)
queue.push(payload={"to": "user@example.com", "body": "Hello!"}, priority=10)

# 3. Pop and Process
res = queue.pop()
if res:
    task = res.data[0]
    task_id = task["id"]
    payload = task["payload"]
    try:
        # Process task...
        queue.complete(task_id)
    except Exception as e:
        # Record failure with error message
        queue.fail(task_id=task_id, err=str(e), retry=True)

# 4. Peek without locking
upcoming = queue.peek(limit=5)
```
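To make the push/pop/complete lifecycle concrete, here is a toy SQLite-backed queue built on the stdlib. It is not py_aide's implementation — the table and column names are invented for illustration — but it shows why the queue survives restarts: every state transition is a committed row update.

```python
import json
import sqlite3

class MiniPersistentQueue:
    """Toy durable queue: tasks are rows; status tracks the lifecycle."""
    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS tasks ("
            "id INTEGER PRIMARY KEY, payload TEXT, priority INTEGER, "
            "status TEXT DEFAULT 'pending')"
        )

    def push(self, payload: dict, priority: int = 0) -> int:
        cur = self.db.execute(
            "INSERT INTO tasks (payload, priority) VALUES (?, ?)",
            (json.dumps(payload), priority),
        )
        self.db.commit()
        return cur.lastrowid

    def pop(self):
        # Highest priority first; mark as processing so no other worker takes it
        row = self.db.execute(
            "SELECT id, payload FROM tasks WHERE status='pending' "
            "ORDER BY priority DESC, id LIMIT 1"
        ).fetchone()
        if row is None:
            return None
        self.db.execute("UPDATE tasks SET status='processing' WHERE id=?", (row[0],))
        self.db.commit()
        return {"id": row[0], "payload": json.loads(row[1])}

    def complete(self, task_id: int):
        self.db.execute("UPDATE tasks SET status='done' WHERE id=?", (task_id,))
        self.db.commit()
```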
⚡ Real-Time Identity & Groups (WebSockets)
py_aide moves beyond anonymous broadcasts. It allows you to target users and groups directly using their business IDs (e.g., userId), handling the underlying session mapping automatically. This powerful API is unified across both FastAPI and Flask.
Unified Delivery Methods
Both FastAPIServicePortal and ServicePortal (Flask) expose identical methods for targeted communication:
- `send_to_user(user_id, event, data)`: Reach all active sessions of a specific user.
- `send_to_group(group_id, event, data)`: Sync messages within a room or collaborative group.
- `broadcast(event, data)`: Send a message to every connected client.
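The core of identity-aware delivery is a mapping from business IDs to socket session IDs. This stdlib sketch (not py_aide's internals; all names are illustrative) shows the bookkeeping that makes `send_to_user` and `send_to_group` possible:

```python
from collections import defaultdict

class SessionRegistry:
    """Toy registry: a user may have many sessions; a group holds sessions."""
    def __init__(self):
        self.user_sessions = defaultdict(set)  # user_id -> {session_id, ...}
        self.groups = defaultdict(set)         # group_id -> {session_id, ...}

    def connect(self, user_id: str, sid: str):
        self.user_sessions[user_id].add(sid)

    def join_group(self, group_id: str, sid: str):
        self.groups[group_id].add(sid)

    def targets_for_user(self, user_id: str) -> set:
        return self.user_sessions[user_id]

    def targets_for_group(self, group_id: str) -> set:
        return self.groups[group_id]
```

Sending to a user then fans out one message to every session ID the registry holds for them.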
Flask Example
```python
from py_aide.servers.flask import ServicePortal
from py_aide.servers.gate import AuthType

portal = ServicePortal(enable_websocket=True)

@portal.on_event("join_team", auth_required=True, auth_type=AuthType.BEARER)
def handle_join(data: dict):
    """Adds the user to a collaborative group."""
    team_id = data.get("teamId")
    # Identity mapping is handled automatically upon successful event auth
    portal.join_group(team_id)
    return {"status": "success", "team": team_id}

# Sending targeted messages from anywhere (even sync contexts)
def notify_user(user_id: str, message: str):
    portal.send_to_user(user_id, "notification", {"text": message})
```
FastAPI Example
```python
from py_aide.servers.fastApi import FastAPIServicePortal

portal = FastAPIServicePortal(enable_websocket=True)

@portal.on_event("join_team", auth_required=True)
async def handle_join(data: dict, sid: str):
    team_id = data.get("teamId")
    await portal.join_group(sid, team_id)
    return {"status": "success"}

async def sync_team(team_id: str, update: dict):
    await portal.send_to_group(team_id, "team_update", update)
```
🛡️ Public API Security & Hardening
py_aide is built for external-facing APIs. It provides a robust, production-ready security pipeline that is unified across Flask and FastAPI.
🌐 Unified CORS Orchestration
CORS is handled identically on both frameworks. You can define allowed origins in the constructor to prevent unauthorized cross-origin requests.
```python
# Works for both ServicePortal (Flask) and FastAPIServicePortal
portal = ServicePortal(
    cors_origins=["https://myapp.com", "https://api.myapp.com"],
    debug=False
)
```
🚦 Rate Limiting (Atomic & Precise)
Protect your server from brute force and DoS attacks by enforcing request limits. py_aide uses atomic operations to prevent race conditions.
```python
from py_aide.servers.gate import GateConfig

@portal.endpoint("/api/search", gate=GateConfig(
    auth_required=False,
    rate_limit="ip",       # Options: "user", "ip", "endpoint"
    rate_limit_value=10,   # 10 requests
    rate_limit_window=60   # per 60 seconds
))
def search_api(query: str) -> dict:
    return {"results": []}
```
🔑 Fine-Grained Permissions (Scopes)
Move beyond simple Roles to granular, OAuth2-style Scopes. You can enforce multiple required scopes for sensitive endpoints.
```python
@portal.endpoint("/api/admin/delete", gate=GateConfig(
    auth_required=True,
    required_scopes=["admin:write", "system:purge"]
))
def delete_item() -> dict:
    """Only succeeds if the user's token has BOTH scopes."""
    return {"status": True}
```
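Conceptually, a scope gate reduces to a subset check over the token's granted scopes, something like this illustrative helper:

```python
def has_required_scopes(token_scopes: set, required: list) -> bool:
    """True only if every required scope is among the token's scopes."""
    return set(required).issubset(token_scopes)
```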
🔄 Identity Symmetrics (Token Handover)
py_aide automatically handles the transformation between raw database IDs and masked session tokens. If you define a payload_auth_token_key, the framework will:
- Unmask incoming tokens so your handler sees the raw ID (e.g. `1234`).
- Re-mask the ID in the response so the client only sees the token (e.g. `tok_abc...`).
```python
@portal.endpoint("/login", gate=GateConfig(
    auth_required=False,
    payload_auth_token_key="userId"  # The field to mask/unmask
))
def login_user() -> dict:
    # Framework sees 'userId': 1234
    # Client receives 'userId': 'eyJhbGciOiJIUzI1...'
    return {"status": True, "data": {"userId": 1234}}
```
🚪 Explicit Logout (Token Revocation)
py_aide supports immediate session invalidation. Revoked tokens are stored in a hybrid memory/SQLite store (_py_aide_security.db) and are rejected even if their signature is valid.
```python
from flask import request

@portal.endpoint("/logout", auth_required=True)
def logout() -> dict:
    # Get the raw token from the request
    token = request.headers.get("Authorization").split(" ")[1]
    # Invalidate immediately
    portal.revoke_token(token)
    return {"status": True, "message": "Logged out"}
```
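The hybrid memory/SQLite idea can be sketched as follows — revocations are persisted so they survive restarts, while membership checks stay O(1) via an in-memory set. This is an illustration only; py_aide's actual schema and API are not reproduced here:

```python
import sqlite3

class RevocationStore:
    """Toy hybrid revocation store: SQLite for durability, a set for speed."""
    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute("CREATE TABLE IF NOT EXISTS revoked (token TEXT PRIMARY KEY)")
        # Warm the in-memory cache from disk on startup
        self.cache = {t for (t,) in self.db.execute("SELECT token FROM revoked")}

    def revoke(self, token: str):
        self.db.execute("INSERT OR IGNORE INTO revoked VALUES (?)", (token,))
        self.db.commit()
        self.cache.add(token)

    def is_revoked(self, token: str) -> bool:
        return token in self.cache
```

The gate would then reject any request whose token is in the store, even if its signature still verifies.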
🗄️ Database Management
py_aide provides a thread-local multiton pattern for SQLite, ensuring each thread has its own connection while sharing the same configuration.
```python
from py_aide.database import Api

db_config = {
    'users': 'id INTEGER PRIMARY KEY, name TEXT, meta JSON'
}

with Api(db_path="data.db", tables=db_config) as db:
    db.insert(table="users", data=[(1, "Alice", {"role": "admin"})])
    result = db.fetch(table="users", columns=["name", "meta::role as role"])
    print(result.data)  # [{'name': 'Alice', 'role': 'admin'}]
```
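The "thread-local multiton" pattern means each thread lazily receives its own connection to the same database path. A minimal stdlib sketch of the idea (not py_aide's implementation; note that with `:memory:` each connection is a separate database, so a real deployment would use a file path):

```python
import sqlite3
import threading

class ThreadLocalDB:
    """Each thread gets its own sqlite3 connection, created on first access."""
    def __init__(self, path: str):
        self.path = path
        self._local = threading.local()

    @property
    def conn(self) -> sqlite3.Connection:
        if not hasattr(self._local, "conn"):
            self._local.conn = sqlite3.connect(self.path)
        return self._local.conn
```

This avoids sharing one SQLite connection across threads (which sqlite3 forbids by default) while keeping the configuration in a single object.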
⚠️ Important Note: Eventlet Monkey Patching
By default, importing src.py_aide (or the top-level package) immediately performs eventlet.monkey_patch(). This is required for reliable WebSocket support and some threading features. If you need to avoid this side-effect, ensure you understand the dependencies of your modules.
📄 License
This project is licensed under the MIT License. See the LICENSE file for more details.
👥 Authors
- Kakuru Douglas - vicaniddouglas@gmail.com