
Top-level package for xefab.

Project description

Code style: black

Fabric-based task execution for the XENON dark matter experiment.

Installation

To install xefab, it is recommended to use pipx:

$ pipx install xefab

Alternatively you can install it using pip:

$ pip install xefab
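
Once installed, you can check that the command-line tool is available; the -V/--version flag listed under the core options below prints the installed version and exits:

$ xf -V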

Usage

You can list the available tasks and options by running xf (or xefab) without any arguments.

$ xf
Usage: xf [--core-opts] [<host>] [<subcommand>] task1 [--task1-opts] ... taskN [--taskN-opts]

Core options:

--complete                      Print tab-completion candidates for given parse remainder.
--hide=STRING                   Set default value of run()'s 'hide' kwarg.
--print-completion-script=STRING Print the tab-completion script for your preferred shell (bash|zsh|fish).
--prompt-for-login-password     Request an upfront SSH-auth password prompt.
--prompt-for-passphrase         Request an upfront SSH key passphrase prompt.
--prompt-for-sudo-password      Prompt user at start of session for the sudo.password config value.
--write-pyc                     Enable creation of .pyc files.
-d, --debug                     Enable debug output.
-D INT, --list-depth=INT        When listing tasks, only show the first INT levels.
-e, --echo                      Echo executed commands before running.
-f STRING, --config=STRING      Runtime configuration file to use.
-F STRING, --list-format=STRING Change the display format used when listing tasks. Should be one of: flat (default), nested,
                                json.
-h [STRING], --help[=STRING]    Show core or per-task help and exit.
-H STRING, --hosts=STRING       Comma-separated host name(s) to execute tasks against.
-i, --identity                  Path to runtime SSH identity (key) file. May be given multiple times.
-l [STRING], --list[=STRING]    List available tasks, optionally limited to a namespace.
-p, --pty                       Use a pty when executing shell commands.
-R, --dry                       Echo commands instead of running.
-S STRING, --ssh-config=STRING  Path to runtime SSH config file.
-t INT, --connect-timeout=INT   Specifies default connection timeout, in seconds.
-T INT, --command-timeout=INT   Specify a global command execution timeout, in seconds.
-V, --version                   Show version and exit.
-w, --warn-only                 Warn, instead of failing, when shell commands fail.


Subcommands:

show-context                      Show the context being used for tasks.
admin.user-db
dali.download-file                Download a file from a remote server.
dali.sbatch                       Create and submit a job to SLURM job queue on the remote host.
dali.show-context                 Show the context being used for tasks.
dali.squeue (dali.job-queue)      Get the job-queue status.
dali.start-jupyter                Start a jupyter analysis notebook on the remote host and forward to local port via ssh-tunnel.
dali.upload-file                  Upload a file to a remote server.
github.clone
github.is-private
github.is-public
github.token
github.xenon1t-members
github.xenonnt-keys
github.xenonnt-members
install.chezmoi
install.get-system
install.github-cli (install.gh)
install.gnupg (install.gpg)
install.go
install.gopass
install.miniconda (install.conda)
install.which
midway.download-file              Download a file from a remote server.
midway.sbatch                     Create and submit a job to SLURM job queue on the remote host.
midway.show-context               Show the context being used for tasks.
midway.squeue (midway.job-queue)  Get the job-queue status.
midway.start-jupyter              Start a jupyter analysis notebook on the remote host and forward to local port via ssh-tunnel.
midway.upload-file                Upload a file to a remote server.
midway3.download-file             Download a file from a remote server.
midway3.sbatch                    Create and submit a job to SLURM job queue on the remote host.
midway3.show-context              Show the context being used for tasks.
midway3.squeue (midway3.job-queue) Get the job-queue status.
midway3.start-jupyter             Start a jupyter analysis notebook on the remote host and forward to local port via ssh-tunnel.
midway3.upload-file               Upload a file to a remote server.
osg.condorq (osg.job-queue)
osg.mc-chain                      Run a full chain MC simulation
secrets.setup
secrets.setup-utilix-config
sh.exists
sh.get-system
sh.is-dir
sh.is-file
sh.path
sh.shell (sh)                     Open interactive shell on remote host.
sh.which
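
Because the task list is long, the -l/--list core option accepts an optional namespace to narrow it down. For example, to show only the midway3 tasks (the = form is used here since the value is optional):

$ xf --list=midway3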

You can get help for a specific task by running, e.g.:

$ xf --help midway3.start-jupyter
╭─ start-jupyter ───────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ xf [--core-opts] start-jupyter [--options][other tasks here ...]                                                              │
│                                                                                                                               │
│ Start a jupyter analysis notebook on the remote host and forward to local port via ssh-tunnel.                                │
│                                                                                                                               │
│ Options:                                                                                                                      │
│ --image-dir=STRING                              Directory to look for singularity images                                      │
│ --remote-port=STRING                            Port to use for jupyter server to on the worker node                          │
│ --=INT, --local-port=INT                        Local port to attempt to forward to (if free)                                 │
│ -a INT, --max-hours=INT                         Maximum number of hours to run for                                            │
│ -b, --bypass-reservation                        Dont attempt to use the xenon notebook reservation                            │
│ -c INT, --cpu=INT                               Number of CPUs to request                                                     │
│ -d, --detached                                  Run the job and exit, dont perform cleanup tasks.                             │
│ -e STRING, --env=STRING                         Environment to run on                                                         │
│ -f, --force-new                                 Force a new job to be started                                                 │
│ -g, --gpu                                       Use a GPU                                                                     │
│ -i STRING, --binds=STRING                       Directories to bind to the container                                          │
│ -j STRING, --jupyter=STRING                     Type of jupyter server to start (lab or notebook)                             │
│ -l, --local-cutax                               Use user installed cutax (from ~/.local)                                      │
│ -m INT, --timeout=INT                           Timeout for the job to start                                                  │
│ -n STRING, --node=STRING                        Node to run on                                                                │
│ -o STRING, --notebook-dir=STRING                Directory to start the notebook in                                            │
│ -p STRING, --partition=STRING                   Partition to run on (xenon1t or dali)                                         │
│ -r INT, --ram=INT                               Amount of RAM to allocate (in MB)                                             │
│ -t STRING, --tag=STRING                         Tag of the container to use                                                   │
│ -u, --debug                                     Print debug information                                                       │
│ -w, --no-browser                                Dont open the browser automatically when done                                 │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
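
A typical invocation might look like the following; every flag comes from the help text above, the partition and jupyter values ("dali", "lab") are taken from it as well, and the numeric values are only illustrative:

$ xf midway3 start-jupyter --partition dali --jupyter lab --cpu 2 --ram 8000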

Some tasks are registered to run on a specific host. When you run them, the --hosts option will be ignored.

For example, if you run

$ xf midway3 start-jupyter

The task will be run on the midway3 host, not the host you specified with --hosts.
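
Tasks that are not pinned to a host should respect -H/--hosts. For instance, sh.shell (listed above as "Open interactive shell on remote host") can be pointed at a host of your choosing; the host name below is only a placeholder:

$ xf -H username@example-host sh.shell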

Features

  • TODO

Credits

This package was created with Cookiecutter and the briggySmalls/cookiecutter-pypackage project template.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

xefab-0.1.13.tar.gz (39.0 kB)


Built Distribution

xefab-0.1.13-py3-none-any.whl (45.1 kB)


File details

Details for the file xefab-0.1.13.tar.gz.

File metadata

  • Download URL: xefab-0.1.13.tar.gz
  • Upload date:
  • Size: 39.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.6.1 CPython/3.11.0 Linux/6.2.0-1012-azure

File hashes

Hashes for xefab-0.1.13.tar.gz:

  • SHA256: 0bb8eca7100a8f862835e763daf491e8f9809e4ffd8383b0637afa11c0b04b99
  • MD5: eff6c91770a19304a05266f6131ea284
  • BLAKE2b-256: 2ded718aa5f3b5e5e4d945ac4f7b2e0e384d2a612dbc0664fb44eefa60757346

See more details on using hashes here.
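
If you download the source distribution manually, a standard checksum tool can be used to compare it against the SHA256 digest above, e.g.:

$ sha256sum xefab-0.1.13.tar.gz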

File details

Details for the file xefab-0.1.13-py3-none-any.whl.

File metadata

  • Download URL: xefab-0.1.13-py3-none-any.whl
  • Upload date:
  • Size: 45.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.6.1 CPython/3.11.0 Linux/6.2.0-1012-azure

File hashes

Hashes for xefab-0.1.13-py3-none-any.whl:

  • SHA256: 9a1495c0002f93a9030001df7d49f864ce51c3306c6888e5b897f4952105f628
  • MD5: 7c16603de4122a3d47b9a5c040ce8676
  • BLAKE2b-256: 6db414222b4a534a7370e39e157de86d9b27d581b07c0c25d9fa17842e37daa8

See more details on using hashes here.
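
pip can also compute a downloaded file's digest for comparison with the values above, e.g.:

$ pip hash xefab-0.1.13-py3-none-any.whl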
