# DBDuck

Universal Data Object Model (UDOM) for SQL, NoSQL, graph, vector, and AI databases. DBDuck provides one API for data operations across engines.
## Current Stage

- Stable focus: SQL + NoSQL (MongoDB)
- Next phase: Graph, AI, Vector
## Supported Backends

Current production-capable backends in DBDuck:

- SQLite
- MySQL
- PostgreSQL
- SQL Server
- MongoDB

These are the officially supported backends for DBDuck core UDOM workflows. Backend names that appear in config but are not yet production-complete should be treated as planned or experimental.
## Install

```bash
pip install .

# for tests and tooling
pip install .[dev]

# for MongoDB support
pip install .[mongo]

# for SQL Server support
pip install .[mssql]

# install all optional backend extras
pip install .[all]
```
## Backend Hardening Config

Runtime behavior is environment-configurable with secure defaults.

- See `.env.example` for all settings.
- Sensitive deployment values should come from your secret manager.

Key options:

- `DBDUCK_SQL_POOL_SIZE`, `DBDUCK_SQL_MAX_OVERFLOW`
- `DBDUCK_MONGO_MAX_POOL_SIZE`, `DBDUCK_MONGO_CONNECT_TIMEOUT_MS`
- `DBDUCK_MONGO_RETRY_ATTEMPTS`, `DBDUCK_MONGO_RETRY_BACKOFF_MS`
- `DBDUCK_ALLOW_UNSAFE_WHERE_STRINGS=false` (recommended for production)
- `DBDUCK_HASH_SENSITIVE_FIELDS=true`
- `DBDUCK_BCRYPT_ROUNDS=12`
- `DBDUCK_SECURITY_AUDIT_ENABLED=true`
- `DBDUCK_SECURITY_AUDIT_ENTITY=security_logs`
- `DBDUCK_RATE_LIMIT_ENABLED=false`
- `DBDUCK_RATE_LIMIT_MAX_REQUESTS=60`
- `DBDUCK_RATE_LIMIT_WINDOW_SECONDS=60`
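Reading these options follows the usual environment-with-defaults pattern. A minimal stdlib sketch of how such settings can be parsed (the `env_int`/`env_bool` helpers and the default values here are illustrative assumptions, not DBDuck's internal API):

```python
import os

def env_int(name: str, default: int) -> int:
    """Read an integer setting from the environment, falling back to a default."""
    raw = os.environ.get(name)
    return int(raw) if raw is not None else default

def env_bool(name: str, default: bool) -> bool:
    """Read a boolean setting; accepts 1/true/yes/on (case-insensitive)."""
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() in {"1", "true", "yes", "on"}

# Secure defaults mirror the documented recommendations.
pool_size = env_int("DBDUCK_SQL_POOL_SIZE", 5)
allow_unsafe_where = env_bool("DBDUCK_ALLOW_UNSAFE_WHERE_STRINGS", False)
bcrypt_rounds = env_int("DBDUCK_BCRYPT_ROUNDS", 12)
```

Keeping the unsafe-where flag off by default means a misconfigured deployment fails safe rather than open.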
## Quick Start

```python
from DBDuck import UDOM

# SQL (MySQL / PostgreSQL / SQLite)
db = UDOM(db_type="sql", db_instance="mysql", url="mysql+pymysql://user:pass@localhost:3306/udom")
db.create("Orders", {"order_id": 101, "customer": "A", "paid": True})
print(db.find("Orders", where={"paid": True}, limit=10))

# Explicit transactions
db.begin()
db.create("Orders", {"order_id": 102, "customer": "B", "paid": False})
db.commit()

# Transaction context manager
with db.transaction():
    db.create("Orders", {"order_id": 103, "customer": "C", "paid": True})

# NoSQL (MongoDB)
nosql_db = UDOM(db_type="nosql", db_instance="mongodb", url="mongodb://localhost:27017/udom")
print(nosql_db.execute("ping"))
print(nosql_db.create("events", {"type": "login", "ok": True}))
print(nosql_db.find("events", where={"ok": True}))

# Aggregation
print(
    db.aggregate(
        "Orders",
        group_by="paid",
        metrics={"total_orders": "count(*)"},
        order_by="paid DESC",
    )
)

# BCrypt secret verification
db.create("users", {"id": 1, "username": "veeresh", "password": "plain-secret"})
user = db.find("users", where={"id": 1})[0]
assert db.verify_secret("plain-secret", user["password"]) is True
```
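The `verify_secret` flow follows the standard hash-on-write, verify-on-read pattern: the plaintext is never stored, and verification re-derives the hash and compares it in constant time. A self-contained sketch of that pattern, using stdlib PBKDF2 as a stand-in for BCrypt (which DBDuck actually uses; the helper names below are illustrative, not DBDuck's API):

```python
import hashlib
import hmac
import os

def hash_secret(secret: str, *, iterations: int = 100_000) -> str:
    """Hash a secret with PBKDF2-HMAC-SHA256 and a random per-secret salt."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, iterations)
    return f"{iterations}${salt.hex()}${digest.hex()}"

def verify_secret(secret: str, stored: str) -> bool:
    """Re-derive the hash from the stored salt/iterations and compare in constant time."""
    iterations, salt_hex, digest_hex = stored.split("$")
    digest = hashlib.pbkdf2_hmac(
        "sha256", secret.encode(), bytes.fromhex(salt_hex), int(iterations)
    )
    return hmac.compare_digest(digest.hex(), digest_hex)
```

The constant-time comparison (`hmac.compare_digest`) matters: a naive `==` can leak timing information about how much of the hash matched.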
Model-level sensitive fields:

```python
from DBDuck.udom.models.umodel import UModel

class Member(UModel):
    __sensitive_fields__ = ["pin"]

    id: int
    username: str
    pin: str
```
## Core API

- `create(entity, data)`
- `create_many(entity, rows)`
- `find(entity, where=None, order_by=None, limit=None)`
- `find_page(entity, page=1, page_size=20, where=None, order_by=None)`
- `delete(entity, where)`
- `update(entity, data, where)`
- `count(entity, where=None)`
- `aggregate(entity, group_by=None, metrics=None, where=None, having=None, order_by=None, limit=None, pipeline=None)`
- `execute(native_query)`
- `uquery(uql)` / `uexecute(uql)`
- `begin()` / `commit()` / `rollback()` / `transaction()`
- `ping()` / `close()`
- `ensure_indexes(entity, indexes)` (NoSQL/Mongo)
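`find_page` exposes backend-native pagination behind a `page`/`page_size` interface. The usual translation onto SQL `LIMIT`/`OFFSET` (or Mongo `skip()`/`limit()`) looks like the sketch below; this is an assumed illustration of the arithmetic, not DBDuck's actual implementation:

```python
def page_to_window(page: int = 1, page_size: int = 20) -> tuple[int, int]:
    """Translate a 1-based page/page_size pair into (limit, offset),
    suitable for SQL LIMIT/OFFSET or Mongo cursor.skip()/limit()."""
    if page < 1 or page_size < 1:
        raise ValueError("page and page_size must be >= 1")
    return page_size, (page - 1) * page_size
```

So page 3 with a page size of 10 reads rows 21-30: limit 10, offset 20.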
## Production Architecture

```
DBDuck/
  core/
    adapter_router.py
    base_adapter.py
    connection_manager.py
    exceptions.py
    mongo_connection_manager.py
    schema.py
    transaction.py
  adapters/
    mysql_adapter.py
    mssql_adapter.py
    postgres_adapter.py
    sqlite_adapter.py
  udom/
    udom.py
  utils/
    logger.py
```

Design highlights:

- The adapter pattern keeps backend-specific logic out of `UDOM`.
- SQL adapters use SQLAlchemy with parameterized execution and connection pooling.
- The MongoDB NoSQL adapter supports pooled client management, safe filter parsing, and transactions.
- `ConnectionManager` provides lazy, thread-safe engine/session reuse.
- Structured logging captures query, error, and connection events.
## Recent Changes

- Enforced the full `BaseAdapter` abstract contract for all adapters.
- Added an adapter auto-router that selects the SQL dialect from `db_instance` / URL.
- Added thread-safe SQL and Mongo connection managers with lifecycle cleanup.
- Added transaction safety for SQL + Mongo (`begin`, `commit`, `rollback`, `transaction`).
- Added centralized schema validation for `create`/`find`/`delete`.
- Added stronger injection defenses:
  - SQL string `where` parsing + parameter binding.
  - Mongo filter parsing with unsafe token rejection.
- Added automatic BCrypt hashing for sensitive fields such as `password`, `secret`, and token fields.
- Added security audit trail persistence to `security_logs` for blocked injection attempts and rate-limit events.
- Added in-memory rate limiting controls for UDOM operations.
- Added custom exception mapping across SQL + Mongo: `DatabaseError`, `ConnectionError`, `QueryError`, `TransactionError`.
- Added structured logging for connection/query/transaction events and errors.
- Added masked execution errors for SQL and Mongo so raw driver/database details are not exposed to callers.
- Added batch operations (`create_many`) for SQL + Mongo.
- Added health/lifecycle methods: `ping()` and `close()`.
- Added `verify_secret(...)` for BCrypt password/secret verification.
- Added `UModel.__sensitive_fields__` for model-level sensitive field hashing.
- Added real backend integration test scaffolding for MySQL, PostgreSQL, SQL Server, and MongoDB.
- Added native backend pagination support for SQL and Mongo-backed `find_page()`.
- Added test coverage for routing, transactions, validation, error handling, hashing, audit logs, rate limiting, and integration scaffolding.
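The "unsafe token rejection" for Mongo filters typically means refusing operator injection: keys that start with `$` (such as `$where`, which can execute JavaScript server-side) or contain `.` (dotted-path traversal). A minimal sketch of that idea, with an allowlist for nested comparison operators; this is illustrative, and DBDuck's actual parser may accept or reject a different set:

```python
# Nested dicts may only use a small allowlist of comparison operators.
ALLOWED_OPERATORS = {"$eq", "$ne", "$gt", "$gte", "$lt", "$lte", "$in"}

def sanitize_mongo_filter(flt: dict) -> dict:
    """Reject filters that could smuggle Mongo operators in as field names."""
    for key, value in flt.items():
        if key.startswith("$") or "." in key:
            raise ValueError(f"unsafe filter key: {key!r}")
        if isinstance(value, dict):
            bad = set(value) - ALLOWED_OPERATORS
            if bad:
                raise ValueError(f"unsafe operators: {sorted(bad)}")
    return flt
```

A caller passing `{"ok": True}` or `{"age": {"$gt": 18}}` gets through; `{"$where": "..."}` raises before the query ever reaches the driver.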
## CI/CD (Tests)

A GitHub Actions workflow is included at `.github/workflows/ci.yml`. It runs on pushes and pull requests:

- Python `3.10`, `3.11`, `3.12`
- `pip install .[dev]`
- `pytest -q`
- `pip-audit --desc`
- `bandit -q -r DBDuck -c .bandit`

Real backend integration tests are available under `tests/integration` for:

- mongodb
- mysql
- postgresql
- mssql

They are opt-in via environment flags, so the default suite stays local and deterministic. The integration suite covers CRUD, transaction commit/rollback, native pagination, and connection-failure mapping for the current production backends.
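An environment-flag opt-in gate usually boils down to a small predicate plus a skip decorator. A stdlib sketch of the pattern (the `DBDUCK_IT_*` flag names and the test body are assumptions for illustration; the actual flag names live in `tests/integration`):

```python
import os
import unittest

def backend_enabled(name: str) -> bool:
    """True when the opt-in flag for this backend is set (flag names assumed)."""
    return os.environ.get(f"DBDUCK_IT_{name.upper()}", "").strip().lower() in {"1", "true", "yes"}

class MySQLIntegrationTest(unittest.TestCase):
    @unittest.skipUnless(backend_enabled("mysql"), "set DBDUCK_IT_MYSQL=1 to run")
    def test_roundtrip(self) -> None:
        # A real test would connect via UDOM(db_type="sql", db_instance="mysql", ...)
        pass
```

With the flag unset, the test is reported as skipped rather than failed, so `pytest -q` stays green on machines with no database running.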
## SQL Migration Baseline

An Alembic baseline scaffold is included:

- `migrations/sql/alembic.ini`
- `migrations/sql/env.py`
- `migrations/sql/versions/`

Usage:

```bash
alembic -c migrations/sql/alembic.ini revision -m "init"
alembic -c migrations/sql/alembic.ini upgrade head
```
## Mongo Indexes

```python
db.ensure_indexes(
    "events",
    [
        {
            "fields": [{"name": "type", "order": "asc"}, {"name": "ts", "order": "desc"}],
            "options": {"name": "idx_type_ts"},
        }
    ],
)
```

Model-driven indexes:

```python
from DBDuck.udom.models.umodel import UModel

class Event(UModel):
    __collection__ = "events"
    __indexes__ = [
        {"fields": [{"name": "type", "order": "asc"}], "options": {"name": "idx_type"}},
    ]

Event.bind(db)
Event.ensure_indexes()
```
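The index spec format above maps naturally onto PyMongo-style key tuples, where `"asc"` becomes `1` and `"desc"` becomes `-1`. A sketch of that assumed translation (DBDuck's internal mapping may differ):

```python
def to_pymongo_keys(spec: dict) -> list[tuple[str, int]]:
    """Convert a DBDuck-style index spec into PyMongo (field, direction) pairs."""
    order_map = {"asc": 1, "desc": -1}
    return [(field["name"], order_map[field["order"]]) for field in spec["fields"]]

spec = {
    "fields": [{"name": "type", "order": "asc"}, {"name": "ts", "order": "desc"}],
    "options": {"name": "idx_type_ts"},
}
# With a live client, roughly:
# collection.create_index(to_pymongo_keys(spec), **spec["options"])
```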
## Production Readiness Snapshot

- Current readiness estimate for the supported backends: 88%
- Coverage includes robust SQL + Mongo core operations, security controls, and real backend integration scaffolding for MySQL, PostgreSQL, SQL Server, and MongoDB.
- Remaining work for higher confidence:
  - Migrations and schema evolution strategy.
  - Full real-backend integration execution in CI infrastructure.
  - Observability dashboards/alerts and SLOs.
  - Performance/load testing with real infra.
  - Release/versioning policy and a backend compatibility matrix.
## Initialize Guide

See `docs/INITIALIZE.md` for full initialization steps.

## Logo

Place your logo file here: `docs/assets/dbduck-logo.png`
## File details

Details for the file `dbduck-0.1.0.tar.gz`.

### File metadata

- Download URL: dbduck-0.1.0.tar.gz
- Upload date:
- Size: 56.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.10

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `809ea0a3bef94c378566b3d05d3ad58307d0f610748d3d598cb321175d243b56` |
| MD5 | `ea95c6a3a86cdc6ec4d53372d1302342` |
| BLAKE2b-256 | `a0fb4136bd5d524c3421d94e294abc11e350a36e7e7c535f2cd1c2a601e4cc5c` |
## File details

Details for the file `dbduck-0.1.0-py3-none-any.whl`.

### File metadata

- Download URL: dbduck-0.1.0-py3-none-any.whl
- Upload date:
- Size: 58.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.10

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `57ebb1470729c845990a17ab96ffb64ea9fc460f69cf4c77b16bf008241ee16f` |
| MD5 | `eac34ef33ae9b349a74a7cc7d95f8c49` |
| BLAKE2b-256 | `cc669bf38d03c2b51c7bf0cbd6e299a2dda284d5767634f95842a2116a3bbf59` |