hawkapi-sqlalchemy

SQLAlchemy integration for HawkAPI. Async sessions, multi-database routing (primary/replica/shards), Alembic helpers, and pytest fixtures.

Install

pip install hawkapi-sqlalchemy                     # SQLite included
pip install 'hawkapi-sqlalchemy[postgres]'         # + asyncpg
pip install 'hawkapi-sqlalchemy[mysql]'            # + aiomysql

Quickstart

from hawkapi import Depends, HawkAPI
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import Mapped, mapped_column

from hawkapi_sqlalchemy import Base, TimestampMixin, get_session, init_database


class User(Base, TimestampMixin):
    __tablename__ = "users"
    id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True)
    email: Mapped[str] = mapped_column(unique=True)


app = HawkAPI()
init_database(app, url="postgresql+asyncpg://user:pw@localhost/app")


@app.post("/users")
async def create(email: str, sess: AsyncSession = Depends(get_session)):
    sess.add(User(email=email))
    await sess.flush()
    return {"ok": True}

The get_session dependency opens a fresh session per request, commits on success, and rolls back on exception — no boilerplate.
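The lifecycle that sentence describes can be sketched with a stub session so it runs without a database; `request_session` and `StubSession` are illustrative names, not the library's internals:

```python
import asyncio
from contextlib import asynccontextmanager


class StubSession:
    """Stand-in for AsyncSession so the sketch runs without a database."""

    def __init__(self):
        self.committed = False
        self.rolled_back = False

    async def commit(self):
        self.committed = True

    async def rollback(self):
        self.rolled_back = True

    async def close(self):
        pass


@asynccontextmanager
async def request_session(factory):
    # One fresh session per request: commit on success, roll back on error,
    # always close.
    sess = factory()
    try:
        yield sess
        await sess.commit()
    except Exception:
        await sess.rollback()
        raise
    finally:
        await sess.close()


async def demo():
    async with request_session(StubSession) as ok:
        pass  # handler succeeded -> commit
    failed = None
    try:
        async with request_session(StubSession) as failed:
            raise RuntimeError("handler blew up")  # -> rollback
    except RuntimeError:
        pass
    return ok.committed, failed.rolled_back


print(asyncio.run(demo()))  # (True, True)
```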

Multiple databases

from hawkapi_sqlalchemy import DatabaseConfig, init_database, session_for

init_database(
    app,
    databases={
        "primary": DatabaseConfig(url="postgresql+asyncpg://…/primary"),
        "replica": DatabaseConfig(url="postgresql+asyncpg://…/replica"),
        "analytics": DatabaseConfig(url="postgresql+asyncpg://…/analytics"),
    },
)

# DI helpers:
from hawkapi_sqlalchemy import get_session, get_replica_session

get_analytics = session_for("analytics", commit=False)


@app.get("/report")
async def report(sess: AsyncSession = Depends(get_analytics)):
    ...

get_replica_session falls back to primary if no replica is registered, so you can add a replica later without touching handlers.
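The fallback rule itself fits in a few lines; the registry dict and resolve_database below are illustrative stand-ins for the app's internal state, not part of the package's API:

```python
def resolve_database(registry, name, fallback="primary"):
    # Use the named database if registered, otherwise fall back to primary.
    return registry[name] if name in registry else registry[fallback]


dbs = {"primary": "primary-engine"}
print(resolve_database(dbs, "replica"))  # primary-engine (no replica yet)

dbs["replica"] = "replica-engine"
print(resolve_database(dbs, "replica"))  # replica-engine
```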

Mixins

from hawkapi_sqlalchemy import Base, TimestampMixin, UUIDMixin


class Doc(Base, UUIDMixin, TimestampMixin):
    __tablename__ = "docs"
    title: Mapped[str] = mapped_column()

  • TimestampMixin — created_at / updated_at with DB-side defaults + Python onupdate.
  • UUIDMixin — string id column with a uuid4() default.
  • Prefer DataclassBase over Base to get SQLAlchemy 2.0's dataclass-style declarative.
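A rough, SQLAlchemy-free sketch of the behavior those two mixins describe, using dataclasses so it runs standalone; Doc mirrors the example above, and the field factories are illustrative, not the library's implementation:

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class Doc:
    title: str
    # UUIDMixin-style: string id with a uuid4() default
    id: str = field(default_factory=lambda: str(uuid.uuid4()))
    # TimestampMixin-style: created_at filled in on insert
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


d = Doc(title="hello")
print(len(d.id))  # 36 — canonical uuid4 string length
```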

Alembic

In your alembic/env.py:

from hawkapi_sqlalchemy.alembic import run_migrations
from myapp.db import Base, settings  # your Base + URL

run_migrations(target_metadata=Base.metadata, url=settings.database_url)

That's it: run_migrations handles both online (live connection) and offline (--sql) modes, uses NullPool for migration connections, and enables render_as_batch=True automatically when the target is SQLite.
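With env.py delegating to run_migrations, the day-to-day workflow is plain Alembic CLI usage (these commands come from Alembic itself, not this package):

```shell
alembic revision --autogenerate -m "create users"   # diff models against the DB
alembic upgrade head                                # online mode: apply over a live connection
alembic upgrade head --sql > upgrade.sql            # offline mode: emit SQL without connecting
```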

Healthchecks

from hawkapi_sqlalchemy import all_healthy


@app.get("/healthz")
async def healthz():
    return await all_healthy(app.state.db)

Returns {"primary": True, "replica": True, ...}.
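A handler will often want to map that dict onto an HTTP status. A runnable sketch, with probe standing in for a real per-database check (all names here are illustrative):

```python
import asyncio


async def probe(name):
    # Stand-in for a real connectivity check; pretend analytics is down.
    return name != "analytics"


async def health_report(names):
    # Probe all databases concurrently, then aggregate into one status code.
    results = await asyncio.gather(*(probe(n) for n in names))
    report = dict(zip(names, results))
    status = 200 if all(report.values()) else 503
    return status, report


print(asyncio.run(health_report(["primary", "replica", "analytics"])))
# (503, {'primary': True, 'replica': True, 'analytics': False})
```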

Testing

import pytest
from hawkapi_sqlalchemy import Base, temporary_database


@pytest.fixture
async def db():
    async with temporary_database(Base.metadata) as database:
        yield database


async def test_something(db):
    async with db.session() as sess:
        ...

temporary_database creates an in-memory SQLite engine, runs Base.metadata.create_all, yields, then drops the schema and disposes the engine.
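That lifecycle can be sketched synchronously with the stdlib sqlite3 module; temporary_db below is an illustrative stand-in for the fixture's behavior, not the package's implementation:

```python
import sqlite3
from contextlib import contextmanager


@contextmanager
def temporary_db(schema_sql):
    # create -> yield -> drop -> dispose, all against an in-memory database.
    conn = sqlite3.connect(":memory:")
    conn.executescript(schema_sql)  # stands in for create_all
    try:
        yield conn
    finally:
        conn.executescript("DROP TABLE IF EXISTS users;")  # drop schema
        conn.close()  # dispose


with temporary_db("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT);") as db:
    db.execute("INSERT INTO users (email) VALUES ('a@b.c')")
    rows = db.execute("SELECT email FROM users").fetchall()
print(rows)  # [('a@b.c',)]
```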

DatabaseConfig

DatabaseConfig(
    url="postgresql+asyncpg://…",
    echo=False,
    pool_size=5,
    max_overflow=10,
    pool_timeout=30.0,
    pool_recycle=3600,
    pool_pre_ping=True,
    connect_args={"server_settings": {"jit": "off"}},
    engine_kwargs={...},     # forwarded to create_async_engine
    session_kwargs={...},    # forwarded to async_sessionmaker
)

Development

git clone https://github.com/ashimov/hawkapi-sqlalchemy.git
cd hawkapi-sqlalchemy
uv sync --extra dev
uv run pytest -q
uv run ruff check . && uv run ruff format --check .
uv run pyright src/

License

MIT.
