
Collection of NOMAD parsers for workflow engines.

Project description

This is a collection of NOMAD parsers for the following workflow codes:

  1. AFLOW
  2. ASR
  3. Atomate
  4. ElaStic
  5. FHI-vibes
  6. LOBSTER
  7. phonopy
  8. QuantumEspressoEPW
  9. QuantumEspressoPhonon
  10. QuantumEspressoXSpectra

Preparing code input and output files for uploading to NOMAD

An upload is basically a directory structure with files. If you have all the files locally, you can upload everything as a single .zip or .tar.gz file in one step. While the upload is in the staging area (i.e. before it is published), you can also easily add or remove files in the directory tree via the web interface. NOMAD will automatically try to choose the right parser for your files.

For each parser there is one type of file that the respective parser can recognize. We call these files mainfiles. For each mainfile that NOMAD discovers, it will create an entry in the database, which users can search, view, and download. NOMAD will consider all files in the same directory as auxiliary files that are also associated with that entry. Parsers might also read information from these auxiliary files. This way you can add more files to an entry, even if the respective parser/code might not use them. However, we strongly recommend not having multiple mainfiles in the same directory. For CMS calculations, we recommend having a separate directory for each code run, as sketched below.
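
For example, an upload with one directory per code run could be packaged like this (the directory and file names are purely illustrative; which file is recognized as a mainfile depends on the respective parser):

my_upload/
    code_run_1/
        output_file     # mainfile recognized by one of the parsers
        input_file      # auxiliary file associated with the same entry
    code_run_2/
        output_file     # mainfile for a second, independent entry

zip -r my_upload.zip my_upload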

Go to the NOMAD upload page to upload files or find instructions about how to upload files from the command line.

Using the parser

You can use NOMAD's parsers and normalizers locally on your computer. First, install NOMAD's PyPI package:

pip install nomad-lab

To parse code input/output files from the command line, you can use NOMAD's command-line interface (CLI) and print the processing results to stdout:

nomad parse --show-archive <path-to-file>
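
For example, since the archive is printed to stdout, you could redirect it to a file for later inspection (the path below is just a placeholder for an actual output file of one of the codes listed above):

nomad parse --show-archive path/to/mainfile > archive.json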

To parse a file in Python, you can write something like this:

import sys
from nomad.cli.parse import parse, normalize_all

# match and run the parser
archive = parse(sys.argv[1])
# run all normalizers
normalize_all(archive)

# get the 'main section' section_run as a metainfo object
section_run = archive.section_run[0]

# get the same data as JSON serializable Python dict
python_dict = section_run.m_to_dict()
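
As a small follow-up sketch (plain standard-library usage, not part of the parser API), the resulting dict can be written to a JSON file for inspection:

import json

# python_dict comes from the snippet above and is JSON serializable
with open('section_run.json', 'w') as f:
    json.dump(python_dict, f, indent=2)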

Developing the parser

Create a virtual environment to install the parser in development mode:

pip install virtualenv
virtualenv -p `which python3` .pyenv
source .pyenv/bin/activate

Install NOMAD's PyPI package:

pip install nomad-lab

Clone the parser project and install it in development mode:

git clone https://github.com/nomad-coe/workflow-parsers.git workflow-parsers
pip install -e workflow-parsers

Running the parser now will use the parser's Python code from the cloned project.
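
To double-check which projects are installed in editable mode (and therefore that the cloned sources are the ones being used), you can list them with pip:

pip list --editable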

Release history

This version: 1.0

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nomad_parser_plugins_workflow-1.0.tar.gz (80.5 kB)

Built Distribution

nomad_parser_plugins_workflow-1.0-py3-none-any.whl (107.6 kB)

File details

Details for the file nomad_parser_plugins_workflow-1.0.tar.gz.


File hashes for nomad_parser_plugins_workflow-1.0.tar.gz:

SHA256:      1485d0a8d14a293cd22f873d7486ca528eddf20fc6964f4b99b12e73fc2167b3
MD5:         54a23206112aedcc1ab5b41e0bfeb20d
BLAKE2b-256: fe1466e9027d404f07c106d8698e4a8e1e0cf4504a13bd948f7a01c999368122


File details

Details for the file nomad_parser_plugins_workflow-1.0-py3-none-any.whl.


File hashes for nomad_parser_plugins_workflow-1.0-py3-none-any.whl:

SHA256:      221915010af3d3f25e6bcd009101da1430eb9955ef4fc6b88114f3b7a482e01a
MD5:         e5eaa968d10ba4be53f3b7fb7f2f9eb6
BLAKE2b-256: 3159016925d317e2dda3092b5a353db018c8cbdf38315256a56f7545ffb21fe9

