Generated from aind-library-template

aind-data-migration-utils


Installation

pip install aind-data-migration-utils

Usage

To use the Migrator object, you need to create a DocDB query and a callback. The callback should take a full metadata record as input and return the same metadata record with any modifications you need to make. Note that you will only have access to the core metadata files you specifically request via the files parameter, i.e. Migrator(files: List[str]).
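The callback contract can be sketched as follows. This is a generic illustration (the function name and field access are examples, not part of the library's API):

```python
from typing import Any, Dict

def example_callback(record: Dict[str, Any]) -> Dict[str, Any]:
    """Receives a full metadata record; returns the (possibly modified) record."""
    # Only the core files requested via Migrator(files=[...]) are present,
    # e.g. files=["subject"] makes record["subject"] available.
    subject = record.get("subject", {})
    if subject.get("subject_id") is not None:
        # Coerce the field to the expected type
        subject["subject_id"] = str(subject["subject_id"])
    return record
```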

There are two main arguments that control the Migrator class and how it runs:

  • Migrator(test_mode: bool) controls whether the migrator runs over all matching records or just a single record. This is useful when you are running a large migration and want to modify just a single record in production first.
  • .run(full_run: bool) controls whether records are actually modified on the DocDB server; without it, the run is a dry run.
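Taken together, the two flags give four run modes. The mapping below is a summary for reference, not library code:

```python
# (test_mode, full_run) -> behavior
modes = {
    (False, False): "dry run over all matching records",
    (True, False): "dry run over a single record",
    (True, True): "modify a single record on the DocDB server",
    (False, True): "modify all matching records on the DocDB server",
}
```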

Running a dry run stores a hash that tracks what the dry run covered. You cannot run a full run until a hash for a completed dry run exists.
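The dry-run gate can be pictured with a minimal sketch. This illustrates the idea only; the function names and hashing scheme are hypothetical, not the library's actual implementation:

```python
import hashlib
import json
import os

def dry_run_hash(query: dict) -> str:
    """Stable fingerprint of the query a dry run was performed with."""
    return hashlib.sha256(json.dumps(query, sort_keys=True).encode()).hexdigest()

def record_dry_run(query: dict, hash_file: str) -> None:
    """Called after a dry run completes: persist the fingerprint."""
    with open(hash_file, "w") as f:
        f.write(dry_run_hash(query))

def full_run_allowed(query: dict, hash_file: str) -> bool:
    """A full run is only allowed once a dry run with the same query completed."""
    if not os.path.exists(hash_file):
        return False
    with open(hash_file) as f:
        return f.read().strip() == dry_run_hash(query)
```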

The full process of running a migration is:

  1. Define your query and callback. Use logging to clearly explain what happens to each record, and use the files parameter to limit your request to just the core files you are modifying.
  2. Run your dry run; the hash file should get generated so that you can run your full run.
  3. Open your PR and get confirmation that your code works properly.
  4. Run your full run.
  5. Merge the PR.

If your code modifies a large number of records, split step 4 into three partial steps: (a) re-run the dry run with the --test flag so it covers only a single record, (b) run the full run with the --test flag and check via metadata-portal.allenneuraldynamics.org/view?name=<your-asset-name> that the record was modified properly, and (c) re-run the dry run and the full run without the --test flag.
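Assuming your script is named run.py as in the example below, the sequence of commands for a large migration might look like:

```shell
python run.py --test             # dry run on a single record
python run.py --test --full-run  # modify that single record; verify in the metadata portal
python run.py                    # full dry run
python run.py --full-run         # full run
```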

Example

from aind_data_migration_utils.migrate import Migrator
import argparse
import logging

# Create a DocDB query matching the record(s) to fix
query = {
    "_id": "your-id-to-fix"
}

def your_callback(record: dict) -> dict:
    """ Make changes to a record """

    # For example, convert a subject ID that wasn't a string to a string
    if not isinstance(record["subject"]["subject_id"], str):
        original_type = type(record["subject"]["subject_id"]).__name__
        record["subject"]["subject_id"] = str(record["subject"]["subject_id"])
        logging.info(f"Modified type of subject_id field for record {record['name']} from {original_type} to str")
    
    # Note: raising Exceptions inside a callback will log errors in the results.csv file

    return record


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--full-run", action=argparse.BooleanOptionalAction, required=False, default=False)
    parser.add_argument("--test", action=argparse.BooleanOptionalAction, required=False, default=False)
    args = parser.parse_args()

    migrator = Migrator(
        query=query,
        migration_callback=your_callback,
        test_mode=args.test,
        files=["subject"],
        prod=True,
    )
    migrator.run(full_run=args.full_run)

Run your script to perform the dry run. You can run multiple dry runs as needed.

python run.py

After completing a dry run for your specific query, pass the --full-run argument to push changes to DocDB.
