
Project description

BrainGlobe array batch script creation

The following function writes an array job file and also creates a file called commands.txt listing every command that will be run when the array script is launched.

Do not edit commands.txt while the job is running.

Existing output directories are skipped.
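A typical pattern for such array scripts is that each SLURM task executes one line of commands.txt, selected by its array index. A minimal sketch, assuming 1-based task IDs and one command per line (the demo commands.txt below stands in for the generated one):

```shell
# Demo commands.txt (the real one is generated by the job constructor):
printf 'echo processing mouse_01\necho processing mouse_02\n' > commands.txt

# Inside the array batch script, each task picks its own line of commands.txt
# using the SLURM array index (defaults to 1 outside of SLURM for testing):
TASK_ID="${SLURM_ARRAY_TASK_ID:-1}"
CMD=$(sed -n "${TASK_ID}p" commands.txt)
echo "Task ${TASK_ID} running: ${CMD}"
eval "${CMD}"
```

This is why editing commands.txt mid-run is unsafe: later tasks would pick up the edited lines.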

The rawdata directory is assumed to follow the NIU NeuroBlueprint specification: https://neuroblueprint.neuroinformatics.dev/latest/specification.html

In practice, the only things that matter are:

  • the rawdata folder contains one folder per mouse
  • the derivatives path is identical to the rawdata path, with /rawdata/ replaced by /derivatives/ (changing this may break things)
  • whole-brain images can be kept in a different location from the rawdata, as long as the mouse folder names match
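Under that convention, the derivatives path for any rawdata path can be derived mechanically. A minimal sketch (`derivatives_path` is a hypothetical helper, not part of the package):

```python
from pathlib import Path


def derivatives_path(rawdata_path: Path) -> Path:
    """Map a path under /rawdata/ to the matching /derivatives/ path."""
    parts = [
        "derivatives" if part == "rawdata" else part
        for part in rawdata_path.parts
    ]
    return Path(*parts)
```

For example, `derivatives_path(Path("/data/project/rawdata/sub-001_id-mouse1"))` gives `/data/project/derivatives/sub-001_id-mouse1`.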

Go to hpc_scripts/slurm_config and set the parameters to suit your needs. Make sure to set user_email to your own email address.

slurm_params = {
    "time_limit": "3-0:0",  # 3 days, 0 hours, 0 minutes
    "n_jobs": 10,
    "n_jobs_at_a_time": 4,
    "user_email": "",
    "memory_limit": 60,
}
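The mapping from these parameters to SLURM directives can be sketched roughly as follows (a hypothetical helper, not the package's actual code; it assumes memory_limit is in GB):

```python
# Hypothetical sketch of how slurm_params could translate into SBATCH
# directives; the real generated script may differ.
slurm_params = {
    "time_limit": "3-0:0",
    "n_jobs": 10,
    "n_jobs_at_a_time": 4,
    "user_email": "you@example.com",
    "memory_limit": 60,  # assumed GB
}


def sbatch_header(params: dict) -> str:
    """Build the #SBATCH header lines for an array job."""
    return "\n".join([
        "#!/bin/bash",
        f"#SBATCH --time={params['time_limit']}",
        f"#SBATCH --mem={params['memory_limit']}G",
        # n_jobs array tasks, at most n_jobs_at_a_time running concurrently:
        f"#SBATCH --array=1-{params['n_jobs']}%{params['n_jobs_at_a_time']}",
        f"#SBATCH --mail-user={params['user_email']}",
        "#SBATCH --mail-type=END,FAIL",
    ])
```

The `%` in `--array=1-10%4` is SLURM's throttle syntax, which is what n_jobs_at_a_time controls.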

Open hpc_scripts/brainglobe/brainreg_array_job_constructor.py and edit the arguments in the main() function, changing the data paths to your own.


from pathlib import Path


def main():
    atlas = "allen_mouse_10um"
    overwrite_existing = False
    rawdata_directory = Path("/path/to/rawdata/")
    serial2p_directory_raw = Path("/path/to/whole_brains/")
    array_job_outpath = Path("/path/to/batch_scripts/")


Running scripts

To run scripts, submit jobs via SLURM. You will need your Python code installed on the remote system, plus a batch script, written in bash, that is used to call jobs on the HPC.

First, SSH to the SWC network.

ssh user@ssh.swc.ucl.ac.uk

SSH again to the HPC.

ssh user@hpc-gw2

Then call the sbatch command on the batch script file.

sbatch /path/to/batch_script.sh

Download files


Source Distribution

hpc_scripts-0.1.0.tar.gz (8.0 kB)

Uploaded Source

Built Distribution


hpc_scripts-0.1.0-py3-none-any.whl (10.4 kB)

Uploaded Python 3

File details

Details for the file hpc_scripts-0.1.0.tar.gz.

File metadata

  • Download URL: hpc_scripts-0.1.0.tar.gz
  • Size: 8.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.0

File hashes

Hashes for hpc_scripts-0.1.0.tar.gz

  Algorithm     Hash digest
  SHA256        0044e1a9d78fdaf40a99d3c88fc06c9b031d68bcd563f197ccb7e5aadbbc25ab
  MD5           8f3d12a640cb8c4d98e3aa8f97b15c11
  BLAKE2b-256   8f4f601e39a354013f3e5f60de2c2fba3b13907271254c00cc2690c49ccd407a


File details

Details for the file hpc_scripts-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: hpc_scripts-0.1.0-py3-none-any.whl
  • Size: 10.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.0

File hashes

Hashes for hpc_scripts-0.1.0-py3-none-any.whl

  Algorithm     Hash digest
  SHA256        9b25b0f9127a4a2680ad30b9e642534ac89283a449ecb672ddc9c1db98b6ae65
  MD5           c81206749c7e49342e3726ea76f62eff
  BLAKE2b-256   7893a64264a5db7239bc9639d626ff4a1ba46938950219b17166380821b10010

