Migration library for Clickhouse

Clickhouse Migrator

ClickHouse is known for its ability to store and query very large datasets at scale.

Developing and maintaining large-scale database systems often requires ongoing changes to the schema. Managing the scripts that apply these changes by hand quickly becomes painful.

We found no existing migration tool for ClickHouse, so we developed one, inspired by Flyway and Alembic.

This is a Python library, which you can execute as a pre-deployment hook using the system Python, or as a migration step before deployment/server startup in your application, as required.

Publishing to PyPI

  • python -m build
  • python -m twine upload --verbose --repository pypi dist/*

Installation

You can install from PyPI using pip install clickhouse-migrator.

Usage

from migration_lib.migrate import migrate

migrate(db_name, migrations_home, db_host, db_user, db_password, create_db_if_no_exists)
Parameter               Description                                                Default
db_name                 ClickHouse database name                                   None
migrations_home         Path to the migration files                                <project_root>
db_host                 ClickHouse database hostname                               localhost
db_password             *****                                                      ****
create_db_if_no_exists  If db_name is not present, enabling this creates the db    True

Folder and Migration file patterns

The filename convention is very similar to Flyway's.

Your first version's filename should be prefixed with V1__ (double underscore). Migrations are executed one by one, in version order; if one fails, execution stops and no further version files are run.
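The ordering rule above can be sketched as follows. This is an illustrative sketch only, not the library's actual implementation; the function name and regex are assumptions:

```python
import re

def migration_order(filenames):
    """Sort migration files by the numeric version in their V<n>__ prefix.

    Files that do not match the V<n>__ pattern are ignored here; the
    actual library may handle such files differently.
    """
    pattern = re.compile(r"^V(\d+)__.+\.(sql|json)$")
    versioned = []
    for name in filenames:
        match = pattern.match(name)
        if match:
            # Sort numerically, so V10 comes after V2, not after V1.
            versioned.append((int(match.group(1)), name))
    return [name for _, name in sorted(versioned)]

files = ["V2__add_index.sql", "V1__create_tables.sql", "V10__alter.json"]
print(migration_order(files))
# → ['V1__create_tables.sql', 'V2__add_index.sql', 'V10__alter.json']
```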

Multi statement and single statement migrations

If your migration is a single statement, you can create a file in the migration folder using the .sql extension and push your migration statement in there.
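For example, a single-statement migration file (the filename and table are hypothetical) named V1__create_sample.sql could contain:

```sql
CREATE TABLE pytest.sample(id UInt32, name String) ENGINE MergeTree PARTITION BY tuple() ORDER BY tuple()
```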

If you want to execute more than one statement in a migration, use a .json file containing a JSON array of statements. Note that when using a JSON file, the contents must be a valid JSON array, as shown below. Keep migrations logical: it is not good practice to push all migrations into one JSON file, nor, in most cases, to put each individual statement in its own file.

[
  "CREATE TABLE pytest.sample1(id UInt32, name String) ENGINE MergeTree PARTITION BY tuple() ORDER BY tuple()",
  "CREATE TABLE pytest.sample2(id UInt32, name String) ENGINE MergeTree PARTITION BY tuple() ORDER BY tuple()",
  "CREATE TABLE pytest.sample3(id UInt32, name String) ENGINE MergeTree PARTITION BY tuple() ORDER BY tuple()"
]
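A multi-statement JSON migration like the one above can be loaded and run statement by statement, roughly as sketched below. The function name is an assumption and the ClickHouse client call is omitted; this only illustrates the JSON-array contract:

```python
import json

def load_statements(raw_json):
    """Parse a JSON migration file's contents into a list of SQL statements."""
    statements = json.loads(raw_json)
    if not isinstance(statements, list):
        raise ValueError("multi-statement migration must be a JSON array")
    return statements

raw = """[
  "CREATE TABLE pytest.sample1(id UInt32, name String) ENGINE MergeTree PARTITION BY tuple() ORDER BY tuple()",
  "CREATE TABLE pytest.sample2(id UInt32, name String) ENGINE MergeTree PARTITION BY tuple() ORDER BY tuple()"
]"""

# Each statement would then be sent to ClickHouse in order;
# a failure partway through stops the migration.
for statement in load_statements(raw):
    print(statement)
```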
