Generated from aind-library-template

aind-data-migration-utils

Installation

pip install aind-data-migration-utils

Usage

To use the Migrator object, you need to create a DocDB query and a callback. The callback takes a full metadata record as input and returns the same record with any modifications applied. Note that you will only have access to the core metadata files you explicitly request via Migrator(files: List[str]).

There are two main arguments that control the Migrator class and how it runs:

  • Migrator(test_mode: bool) controls whether the migrator runs over all matching records or just a single record. This is useful when you are preparing a large migration and want to first verify the change on a single record in production.
  • .run(full_run: bool) controls whether records are actually modified on the DocDB server.

Running a dry run stores a hash that tracks what the dry run covered. You cannot start a full run until a hash from a completed dry run exists.
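The gating idea can be pictured with a short sketch. This is a hypothetical illustration only, not the library's actual internals: fingerprint the query so that a later full run can check that a matching dry run was completed first.

```python
import hashlib
import json

# Hypothetical sketch of dry-run gating (not the library's actual
# implementation): serialize the query deterministically and hash it,
# so a full run can be refused until a matching dry-run hash exists.
query = {"_id": "your-id-to-fix"}
fingerprint = hashlib.sha256(
    json.dumps(query, sort_keys=True).encode("utf-8")
).hexdigest()
print(fingerprint)
```

Because the serialization is deterministic (sorted keys), the same query always yields the same fingerprint, which is what makes hash-based gating possible.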

The full process of running a migration is:

  1. Define your query and callback. Use logging to clearly record what happened to each record, and use the files parameter to limit your request to just the core files you are modifying.
  2. Run your dry run; the hash file should be generated so that you can run your full run.
  3. Open your PR and get confirmation that your code works properly.
  4. Run your full run.
  5. Merge the PR.

If your code modifies a large number of records, split step 4 into three partial steps: (a) re-run the dry run with the --test flag so only a single record is targeted, (b) run the full run with the --test flag and check at metadata-portal.allenneuraldynamics.org/view?name=<your-asset-name> that the record was modified properly, then (c) re-run the full dry run and full run without --test.
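Assuming your migration script is saved as run.py (as in the example below), the --test workflow looks like:

```shell
python run.py --test               # dry run on a single record
python run.py --test --full-run    # modify that single record, then verify it
python run.py                      # full dry run over all matching records
python run.py --full-run           # full run
```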

Example

from aind_data_migration_utils.migrate import Migrator
import argparse
import logging

# Create a docdb query
query = {
    "_id": "your-id-to-fix"
}

def your_callback(record: dict) -> dict:
    """ Make changes to a record """

    # For example, convert a subject ID that wasn't a string to a string
    if not isinstance(record["subject"]["subject_id"], str):
        original_type = type(record["subject"]["subject_id"])
        record["subject"]["subject_id"] = str(record["subject"]["subject_id"])
        logging.info(f"Modified type of subject_id field for record {record['name']} from {original_type} to str")
    
    # Note: raising Exceptions inside a callback will log errors in the results.csv file

    return record


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--full-run", action=argparse.BooleanOptionalAction, required=False, default=False)
    parser.add_argument("--test", action=argparse.BooleanOptionalAction, required=False, default=False)
    args = parser.parse_args()

    migrator = Migrator(
        query=query,
        migration_callback=your_callback,
        test_mode=args.test,
        files=["subject"],
        prod=True,
    )
    migrator.run(full_run=args.full_run)

Run your script to perform the dry run. You can run multiple dry runs as needed.

python run.py

After completing a dry run for your specific query, pass the --full-run argument to push changes to DocDB.
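Assuming the script is saved as run.py, the full run is:

```shell
python run.py --full-run    # push changes to DocDB
```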
