
Generated from aind-library-template


aind-data-migration-utils


Installation

pip install aind-data-migration-utils

Usage

To use the Migrator object, you need to create a DocDB query and a callback. The callback should take a full metadata record as input and return the same metadata record with any modifications you need to make. Note that you will only have access to the core metadata files you specifically request via Migrator(files: List[str]).

There are two main arguments that control the Migrator class and how it runs:

  • Migrator(test_mode: bool) controls whether the migrator runs over all matched records or just a single record. This is useful when you are running a large migration and want to modify just a single record in production first.
  • .run(full_run: bool) controls whether to actually modify records on the DocDB server; without it, the run is a dry run.

Running a dry run stores a hash that records which query the dry run was completed for. You cannot start a full run until the hash for a completed dry run exists.
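The Migrator manages this hash internally. As an illustration only (not the library's actual implementation), a dry-run hash might be derived from the query like this:

```python
import hashlib
import json

def dry_run_hash(query: dict) -> str:
    """Illustrative sketch: derive a stable fingerprint for a DocDB query.

    The real Migrator manages its own hash file; this only shows how a
    deterministic hash can tie a full run to a completed dry run.
    """
    # sort_keys=True makes the serialization (and thus the hash) stable
    canonical = json.dumps(query, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# The same query always produces the same hash, so a full run can check
# that a dry run was completed for exactly this query.
h1 = dry_run_hash({"_id": "your-id-to-fix"})
h2 = dry_run_hash({"_id": "your-id-to-fix"})
```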

The full process of running a migration is:

  1. Define your query and callback. Use logging to clearly explain what happened to each record, and use the files parameter to limit your request to just the core files you are modifying.
  2. Run your dry run; the hash file will be generated so that you can run your full run.
  3. Open your PR and get confirmation that your code works properly.
  4. Run your full run.
  5. Merge the PR.

If your code modifies large numbers of records, split step 4 into three partial steps: (a) re-run the dry run with the --test flag to restrict it to a single record, (b) run the full run with the --test flag and confirm at metadata-portal.allenneuraldynamics.org/view?name=<your-asset-name> that the record was modified properly, (c) re-run the full dry run and then the full run.
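Assuming a run.py script like the example below, whose --full-run and --test flags come from its argparse setup, the staged sequence for a large migration might look like this:

```shell
# Step 2: dry run over all matched records (generates the hash file)
python run.py

# Step 4a: dry run restricted to a single record
python run.py --test

# Step 4b: full run on that single record, then verify it in the metadata portal
python run.py --test --full-run

# Step 4c: re-run the full dry run, then the full run
python run.py
python run.py --full-run
```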

Example

from aind_data_migration_utils.migrate import Migrator
import argparse
import logging

# Create a docdb query
query = {
    "_id": "your-id-to-fix"
}

def your_callback(record: dict) -> dict:
    """ Make changes to a record """

    # For example, convert a subject ID that wasn't a string to a string
    if not isinstance(record["subject"]["subject_id"], str):
        original_type = type(record["subject"]["subject_id"])
        record["subject"]["subject_id"] = str(record["subject"]["subject_id"])
        logging.info(f"Modified type of subject_id field for record {record['name']} from {original_type} to str")
    
    # Note: raising Exceptions inside a callback will log errors in the results.csv file

    return record


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--full-run", action=argparse.BooleanOptionalAction, required=False, default=False)
    parser.add_argument("--test", action=argparse.BooleanOptionalAction, required=False, default=False)
    args = parser.parse_args()

    migrator = Migrator(
        query=query,
        migration_callback=your_callback,
        test_mode=args.test,
        files=["subject"],
        prod=True,
    )
    migrator.run(full_run=args.full_run)

Run your script to perform the dry run. You can repeat dry runs as needed.

python run.py

After completing a dry run for your specific query, pass the --full-run argument to push changes to DocDB.
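With the example script above, whose argparse setup defines the --full-run flag, the full run would be:

```shell
python run.py --full-run
```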
