Q-Cloud CLI for users

Q-Cloud User Documentation


Setup

Before submitting any calculations, you will need to install and configure the Q-Cloud command line interface:

python3 -m pip install qcloud_user
qcloud --configure

You will be prompted for several configuration values that can be obtained from your Q-Cloud administrator. Alternatively, if your administrator has provided these details in a file, then you can provide the file name as an argument:

qcloud --configure user_info.txt

You should have received an email with an initial password for your account, and you will be prompted to change this the first time you attempt to submit a job.

Job Control

Submitting Jobs

Use the --submit option to submit Q-Chem jobs to the cluster, e.g.:

qcloud --submit job1.inp job2.inp [...]

Several jobs can be submitted at the same time, and all will use the default queue parameters. If no compute nodes are available, jobs will sit in the QUEUED state for a couple of minutes while a fresh compute node is launched and configured. Once the queue has cleared, idle compute nodes automatically shut down after the configured time frame (five minutes by default).

The default queue parameters are determined during cluster setup (contact your Q-Cloud administrator for details). Scratch space is set explicitly, memory is determined by the selected instance type, and compute time is unlimited. These defaults apply when each job runs on a separate instance (i.e. when all the instance cores are requested).

If you want to override these values, or pass additional parameters to the SLURM scheduler, add them to the first line of the Q-Chem input file, exactly as you would specify command line options to sbatch. For example, the following limits the job to 1 hour and the memory to 4G:

--time=1:00:00  --mem=4G
$molecule
0  1
he
$end
$rem
...
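To build such a file from the shell, the sbatch header line and the Q-Chem sections can be written together with a here-document. The $rem settings below (a single-point HF/STO-3G calculation) are illustrative only; any valid Q-Chem input works:

```shell
# Write a minimal input whose first line carries SLURM overrides.
# The quoted 'EOF' prevents the shell from expanding $molecule/$rem.
cat > helium.inp <<'EOF'
--time=1:00:00  --mem=4G
$molecule
0  1
he
$end
$rem
   jobtype  sp
   method   hf
   basis    sto-3g
$end
EOF

# Submit as usual; the options on line 1 are handed to the scheduler:
# qcloud --submit helium.inp
```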

The number of threads can be specified by using the --ncpu flag, for example:

qcloud --submit --ncpu 4 job1.inp 

Note that if the number of threads specified exceeds the number of cores on an individual compute node, the job will not run. Your Q-Cloud administrator will be able to tell you what this limit is.

If the job submission is successful, a unique job identifier will be returned:

[✓] Submitted job id gv6uqutvNmU0:             helium

A local registry of these IDs is kept, so it is not essential to use them in the commands below. However, they may be required to disambiguate multiple jobs submitted with the same input file basename.
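For instance, if two jobs were submitted from inputs both named helium.inp, the job ID returned at submission distinguishes them (the ID below is the example from above):

```shell
qcloud --status helium           # matches every registered job whose name contains "helium"
qcloud --status gv6uqutvNmU0     # matches only the job with this exact ID
```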

Monitoring Jobs

To monitor the progress of jobs, use the --status option along with a pattern, which can be the input file name, the job ID, or a substring of either:

qcloud --status <jobid|jobname> 

The progress of jobs in the RUNNING state can be obtained using:

qcloud --tail <jobid|jobname> 

A job in the QUEUED or RUNNING state can be cancelled, which will remove it from the queue:

qcloud --cancel <jobid|jobname>

Downloading Results

Once a job is in the ARCHIVED state, its results can be downloaded from the S3 bucket onto the local machine:

qcloud --get <pattern> 

The download creates a new directory, with the same basename as the input file, containing the output from the calculation.

Jobs in the DOWNLOADED state can be cleared from the job registry on the local machine:

qcloud --clear <pattern> 

Note that this does not remove the results from the S3 bucket. If you want to remove the job from the registry regardless of status, use the --remove option.
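Putting these commands together, a typical session looks like the following sketch. The input name water.inp is hypothetical; only the commands documented above are used:

```shell
qcloud --submit water.inp   # returns a unique job ID
qcloud --status water       # QUEUED -> RUNNING -> ARCHIVED
qcloud --tail water         # stream output while the job is RUNNING
qcloud --get water          # download results into ./water/
qcloud --clear water        # drop the DOWNLOADED job from the local registry
```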

Other commands

The following will give a full list of commands available using the CLI:

qcloud --help

Troubleshooting

If you encounter additional problems not covered in this guide, please contact your Q-Cloud administrator or email support@q-chem.com for assistance.
