Apache Airflow Alembic provider containing Operators & Hooks.

Project description

Alembic Airflow Provider

An Airflow provider for managing database migrations with Alembic.

Setup

Locally

Install the Alembic CLI with pip install alembic

In Airflow

Add airflow-provider-alembic to your requirements.txt or equivalent
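
For example, the requirements.txt entry is just the distribution name (pin a version if you want reproducible builds):

# requirements.txt
airflow-provider-alembic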

Usage

  • Create the required files for Alembic in either your dags folder or the include folder
mkdir dags/migrations
cd dags/migrations
alembic init .
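
Running alembic init . should leave you with roughly this layout (file names from the standard Alembic template; contents may vary by Alembic version):

dags/migrations/
├── alembic.ini
├── env.py
├── README
├── script.py.mako
└── versions/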
  • Create a revision
alembic revision -m "My Database Revision"
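
This writes a new file under versions/, named from a generated revision id plus a slug of your message, for example (the id here is made up):

versions/ae1027a6acf_my_database_revision.py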
  • Edit the revision, adding, modifying, or removing objects as needed (the generated file's docstring and revision identifiers are elided below; op and sa are the standard Alembic template imports)
...

from alembic import op
import sqlalchemy as sa

def upgrade():
    # Use SQLAlchemy schema constructs to create objects
    op.create_table(
        'foo',
        sa.Column('id', sa.Integer, primary_key=True),
        sa.Column('name', sa.String(50), nullable=False),
        sa.Column('description', sa.Unicode(200)),
    )
    # Or run raw SQL
    op.execute("SELECT 1;")


def downgrade():
    # Specify the opposite of your upgrade, to roll back
    op.drop_table('foo')
  • Add a Connection to Airflow. For demo purposes, we will add an in-memory SQLite connection named sqlite via our .env file:
AIRFLOW_CONN_SQLITE="sqlite:///:memory:"
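
If you prefer the Airflow CLI over an .env file, the equivalent (assuming a running Airflow environment) is:

airflow connections add sqlite --conn-uri "sqlite:///:memory:"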
  • Restart (or start) your project with astro dev restart
  • Add a DAG to run your revision. Because the schedule is @once, it runs as soon as the DAG is turned on. Runs for future revisions will need to be triggered manually.
from datetime import datetime

from airflow.models import DAG
from airflow.models.param import Param

from airflow_provider_alembic.operators.alembic import AlembicOperator

with DAG(
    "example_alembic",
    schedule="@once",  # or schedule=None, to run only when triggered
    start_date=datetime(1970, 1, 1),
    params={
        "command": Param("upgrade"),
        "revision": Param("head")
    }
) as dag:
    AlembicOperator(
        task_id="alembic_op",
        conn_id="sqlite",
        command="{{ params.command }}",
        revision="{{ params.revision }}",
        script_location="/usr/local/airflow/dags/migrations",
    )
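
Later revisions can reuse this DAG by overriding its params at trigger time, e.g. with the Airflow CLI (the conf keys match the params block above):

airflow dags trigger example_alembic --conf '{"command": "upgrade", "revision": "head"}'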

Extra Capabilities

  • You can use any of the Alembic commands in the AlembicOperator, such as downgrade (see the sketch below)
  • The AlembicHook has methods to run any Alembic command
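
As a sketch of the downgrade case, only the command and revision arguments change relative to the example DAG above (the task id and the relative revision "-1" are illustrative; "-1" is Alembic's syntax for stepping back one revision):

AlembicOperator(
    task_id="alembic_downgrade",
    conn_id="sqlite",
    command="downgrade",
    revision="-1",  # step back one revision
    script_location="/usr/local/airflow/dags/migrations",
)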

Download files

Download the file for your platform.

Source Distribution

airflow-provider-alembic-1.0.0.tar.gz (4.7 kB)

Uploaded Source

Built Distribution

airflow_provider_alembic-1.0.0-py3-none-any.whl (5.4 kB)

Uploaded Python 3

File details

Details for the file airflow-provider-alembic-1.0.0.tar.gz.

File metadata

File hashes

Hashes for airflow-provider-alembic-1.0.0.tar.gz
Algorithm Hash digest
SHA256 1048b583e7ce01a41260ab6e15494e0f2422a62d7b7728003611ef9735b3dbbb
MD5 acdb8672e01116d4a9bfe542f5b7fc12
BLAKE2b-256 017e90f5032a8236080fc16221049cd8535787abd471d83734f0cbaaa16d7e68


File details

Details for the file airflow_provider_alembic-1.0.0-py3-none-any.whl.

File metadata

File hashes

Hashes for airflow_provider_alembic-1.0.0-py3-none-any.whl
Algorithm Hash digest
SHA256 789ab271d38e4e60b7f6ced084da748497d391d28fb260bace39f1ab778d4c6b
MD5 25dd48196b0a4a13197a9e8dc8ed6e37
BLAKE2b-256 8a5b6b645a9268ed0b8a4a5f8fa6b1d1c3ef10856e023f849b11d3de0f25a224

