
Heimdall: A Comprehensive Paradigm for Evaluating Single-Cell Representations within Foundational Models


Installation

# Clone repository
git clone https://github.com/gkrieg/Heimdall && cd Heimdall

# Create conda env
conda create --name heimdall python=3.10 && conda activate heimdall

# Install dependencies
pip install torch==2.0.1+cu118 --index-url https://download.pytorch.org/whl/cu118
pip install -r requirements.txt

# Install Heimdall (in editable `-e` mode)
pip install -e .

Quickstart

train.py provides a clear overview of the required inputs: it shows how to prepare the data, model, and optimizer, and how to run the trainer.

python train.py +experiments=cta_pancreas
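
As a rough illustration of the flow described above, the script's shape is roughly as follows. Every name in this sketch is a hypothetical stand-in, not the actual Heimdall API:

```python
# Hypothetical, heavily simplified sketch of the flow in train.py
# (all names below are stand-ins, not the real Heimdall API):
# a config drives data preparation, model setup, and training.

def prepare_data(cfg):
    return list(range(cfg["n_samples"]))  # stand-in for dataset loading

def build_model(cfg):
    return {"name": cfg["model"], "loss": None}  # stand-in for model setup

def run_trainer(model, data, epochs):
    for _ in range(epochs):  # stand-in for the optimization loop
        model["loss"] = sum(data) * 0.0
    return model

cfg = {"n_samples": 4, "model": "toy", "epochs": 2}
model = run_trainer(build_model(cfg), prepare_data(cfg), cfg["epochs"])
```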

Make sure to edit the global config file config/global_vars.yaml to match your setup.
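
For orientation, a sketch of what config/global_vars.yaml might contain. Only cache_preprocessed_dataset_dir is referenced elsewhere in this README; the other key is an illustrative placeholder:

```yaml
# Hypothetical sketch of config/global_vars.yaml. Only
# cache_preprocessed_dataset_dir appears elsewhere in this README;
# data_dir is a purely illustrative placeholder.
cache_preprocessed_dataset_dir: /path/to/cache  # set to null to disable caching
data_dir: /path/to/datasets                     # assumed key, adjust to your setup
```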

Sweeps

scripts/create_sweep.py takes the arguments --experiment-name (the Hydra experiment file name), --project-name (the W&B project name), and --fg and --fc (the names of the Hydra configs). It is a short script that loads sweeps/base.yaml, updates it accordingly, and creates and returns a sweep argument. It can be used in tandem with deploy_sweep.sh to submit multiple sweeps on SLURM systems.

python scripts/create_sweep.py --experiment-name cta_pancreas --project-name Pancreas-Celltype-Classification
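
The logic the paragraph above describes can be sketched as follows. This is a hypothetical reconstruction, not the actual script: the base-config structure and all key names are assumptions:

```python
# Hypothetical sketch of what scripts/create_sweep.py does (the base-config
# structure and key names are assumptions, not taken from the actual script):
# load the base sweep config, fill in experiment- and project-specific
# fields, and return the result.
import copy

BASE_SWEEP = {  # stand-in for the contents of sweeps/base.yaml
    "program": "train.py",
    "method": "grid",
    "parameters": {},
}

def create_sweep(experiment_name, project_name, fg=None, fc=None):
    sweep = copy.deepcopy(BASE_SWEEP)
    sweep["name"] = experiment_name
    sweep["project"] = project_name
    # Hydra overrides passed through as fixed sweep parameters.
    sweep["parameters"]["+experiments"] = {"value": experiment_name}
    if fg is not None:
        sweep["parameters"]["fg"] = {"value": fg}
    if fc is not None:
        sweep["parameters"]["fc"] = {"value": fc}
    return sweep

sweep = create_sweep("cta_pancreas", "Pancreas-Celltype-Classification")
```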

Dev Notes

Dev installation

pip install -r requirements.txt

Once the pre-commit command line tool is installed, every time you commit changes it performs several code-style checks and automatically applies fixes where possible. When auto-fixes are applied, you need to re-commit those changes; note that this can take more than one round.

After you are done committing changes and are ready to push the commits to the remote branch, run nox to perform a final quality check. Note that nox only lints and does not fix issues for you; you need to address them manually based on the instructions provided.

Cheatsheet

# Run cell type classification dev experiment with wandb disabled
WANDB_MODE=disabled python train.py +experiments=cta_pancreas

# Run cell type classification dev experiment with wandb offline mode
WANDB_MODE=offline python train.py +experiments=cta_pancreas

# Run cell type classification dev experiment with wandb disabled and overwrite epochs
WANDB_MODE=disabled python train.py +experiments=cta_pancreas tasks.args.epochs=2

# Run cell type classification dev experiment with user profile (dev has wandb disabled by default)
python train.py +experiments=cta_pancreas user=lane-remy-dev

Nox

Run code linting and unittests:

nox

Run the dev experiment tests on a Lane compute node with CUDA (using the lane-shared-dev user profile):

nox -e test_experiments

Run fast dev experiments (only selected small datasets):

nox -e test_experiments -- quick_run

Run full dev experiments (all datasets, including those skipped by quick_run):

nox -e test_experiments -- full_run

Run dev experiments with a different user profile:

nox -e test_experiments -- user=box-remy-dev

Local tests

We use pytest to write local tests. New test suites can be added under tests/test_{suite_name}.py.

Run a particular test suite with:

python -m pytest tests/test_{suite_name}.py

Run all tests except the integration test:

python -m pytest -m 'not integration'

Note: to run the integration test, you'll need to specify the Hydra user using a .env file. The contents of the file should be like so:

HYDRA_USER=test
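
Following the conventions above, a new test suite might look like this. The suite name and the use of the `integration` marker here are assumptions for illustration, not taken from the actual test code:

```python
# tests/test_example.py -- hypothetical minimal test suite (the suite name
# and the `integration` marker usage are illustrative assumptions).
import pytest

def test_addition():
    # An ordinary unit test, always selected.
    assert 1 + 1 == 2

@pytest.mark.integration
def test_full_pipeline():
    # Deselected when running: python -m pytest -m 'not integration'
    assert True
```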

Turning off caching

To turn off dataset caching for dev purposes, set cache_preprocessed_dataset_dir: null in config/global_vars.yaml. Alternatively, pass cache_preprocessed_dataset_dir=null through the command line, e.g.,

python train.py +experiments=cta_pancreas cache_preprocessed_dataset_dir=null

