
Applipy PostgreSQL

An applipy library for working with PostgreSQL.

It lets you declare connections in your application's configuration; these are turned into PostgreSQL connection pools that your classes can access by declaring the dependency.

The connection pools are created the first time they are used and closed on application shutdown.

Usage

You can define connections to databases in your application config file:

# dev.yaml
app:
  name: demo
  modules:
  - applipy_pg.PgModule

pg:
  connections:
  # Defines an anonymous db connection pool
  - user: username
    host: mydb.local
    port: 5432
    dbname: demo
    password: $3cr37
  # Defines a db connection pool with name "db2"
  # which is also aliased to names "db3" and "db4"
  - name: db2
    user: username
    host: mydb.local
    port: 5432
    dbname: demo
    password: $3cr37
    aliases: [db3, db4]

The configuration above defines two database connection pools. These can be accessed through applipy's dependency injection system:

from applipy_pg import PgPool

class DoSomethingOnDb:
    def __init__(self, pool: PgPool) -> None:
        self._pool = pool

    async def do_something(self) -> None:
        async with self._pool.cursor() as cur:
            # cur is an aiopg.Cursor
            await cur.execute('SELECT 1')
            await cur.fetchone()

from typing import Annotated
from applipy_inject import name
from applipy_pg import PgPool

class DoSomethingOnDb2:
    def __init__(self, pool: Annotated[PgPool, name('db2')]) -> None:
        self._pool = pool

    async def do_something(self) -> None:
        async with self._pool.cursor() as cur:
            # cur is an aiopg.Cursor
            await cur.execute('SELECT 2')
            await cur.fetchone()

Aliased pools can also be accessed using their aliases:

from typing import Annotated
from applipy_inject import name
from applipy_pg import PgPool

class DoSomethingOnDb2:
    def __init__(
        self,
        pool2: Annotated[PgPool, name('db2')],
        pool4: Annotated[PgPool, name('db4')],
    ) -> None:
        assert pool2 is pool4

The aiopg.Pool instance can be accessed using the PgPool.pool() method.
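A minimal sketch of dropping down to the raw pool, assuming PgPool.pool() is awaitable (the pools are created lazily on first use, so a coroutine seems likely; check the library source if this does not match):

```python
from applipy_pg import PgPool


class VersionReporter:
    def __init__(self, pool: PgPool) -> None:
        self._pool = pool

    async def report(self) -> str:
        # PgPool.pool() exposes the underlying aiopg.Pool; awaiting it here
        # is an assumption of this sketch, not confirmed by the docs above.
        aiopg_pool = await self._pool.pool()
        # From here on this is the standard aiopg API.
        async with aiopg_pool.acquire() as conn:
            async with conn.cursor() as cur:
                await cur.execute('SELECT version()')
                row = await cur.fetchone()
                return row[0]
```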

Each connection pool can be further configured by setting a config attribute with a dict containing the extra parameters to be passed to aiopg.create_pool():

pg:
  connections:
  - user: username
    host: mydb.local
    port: 5432
    dbname: demo
    password: $3cr37
    config:
      minsize: 5
      timeout: 100.0

You can also define a global configuration that serves as a base for all defined database connections by setting pg.global_config:

pg:
  global_config:
    minsize: 5
    timeout: 100.0
  connections:
  # ...
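The two levels can be combined. Assuming per-connection config values override the global ones (the conventional merge order; not stated explicitly above), the pool below would end up with minsize 10 and timeout 100.0:

```yaml
pg:
  global_config:
    minsize: 5
    timeout: 100.0
  connections:
  - user: username
    host: mydb.local
    port: 5432
    dbname: demo
    password: $3cr37
    config:
      # assumption: this overrides global_config's minsize of 5
      minsize: 10
```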

Migrations

This library also includes migration functionality. To use it:

First, define your migrations:

class DemoMigrationSubject_20240101(PgClassNameMigration):
    def __init__(self, pool: PgPool) -> None:
        self._pool = pool  # Inject whatever resources you need

    async def migrate(self) -> None:
        # Do your migrations...
        async with self._pool.cursor() as cur:
            ...

If you want more control over how the version and subject of a migration are defined, you can extend PgMigration and implement your own logic.
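A hypothetical sketch of such a subclass. The subject()/version() hook names below are assumptions, not confirmed by this document; check the PgMigration base class for the real interface:

```python
from applipy_pg import PgMigration, PgPool


class AddUsersTable(PgMigration):
    # NOTE: subject() and version() are placeholder names for whatever
    # hooks PgMigration actually defines.

    def __init__(self, pool: PgPool) -> None:
        self._pool = pool

    def subject(self) -> str:
        return 'users'

    def version(self) -> str:
        return '20240101'

    async def migrate(self) -> None:
        async with self._pool.cursor() as cur:
            await cur.execute(
                'CREATE TABLE IF NOT EXISTS users (id BIGSERIAL PRIMARY KEY)'
            )
```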

Then, create your migrations module:

class MyMigrationsModule(Module):
    def configure(self, bind: BindFunction, register: RegisterFunction) -> None:
        bind(PgMigration, DemoMigrationSubject_20240101)

    @classmethod
    def depends_on(cls) -> tuple[type[Module], ...]:
        return (PgMigrationsModule,)

Finally, you can optionally set the name of the connection to use for the migrations audit table. This table tracks which migrations have already been run and which still need to be run:

pg:
  connections:
  # Defines a db connection pool with name "db2"
  - name: db2
    # ...
  migrations:
    # sets the connection named "db2" as the connection to use for the
    # migrations audit table
    connection: db2

Loading and Performing Migrations

To load your migrations into the application you can either:

  1. Bind them to the PgMigration type in an applipy module, using the injector.
  2. Put them in one or more Python modules and set the config pg.migrations.modules to a list of strings naming the modules that contain the migration classes.
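Option 2 can be sketched as a config fragment; "myapp.migrations" below is an example module path, not something defined by this library:

```yaml
# dev.yaml
app:
  name: demo
  modules:
  - applipy_pg.PgModule
  - applipy_pg.PgMigrationsModule

pg:
  migrations:
    # list every Python module that contains migration classes
    modules:
    - myapp.migrations
```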

Then just include the module applipy_pg.PgMigrationsModule somewhere in your app (e.g. in the config file) and your migrations will be run during the on_init step of your application's lifecycle.
