
Tool for automated MEG data quality control


MEGqc - a standardized pipeline for MEG data quality control

Magnetoencephalography (MEG) data are susceptible to noise and artifacts, which can severely corrupt data quality. These can originate from environmental noise sources, e.g. powerline noise; internal noise sources, e.g. data contamination due to eye movements of the subject; or systemic noise sources, e.g. a malfunctioning sensor or vibrations. For this reason, quality control of the data is an important step for valid and reproducible science (Niso et al., 2022). However, the visual detection and annotation of artifacts in MEG data requires expertise, is a tedious and time-intensive task, and hardly follows a standardized procedure. Since quality control is commonly done manually, it may also be subject to biases induced by the person inspecting the data. Beyond minimizing human biases, a standardized workflow for quality control also makes datasets more comparable, allowing for between-dataset comparisons rather than quality control within a single dataset only. Hence, an automated and standardized approach to quality control is desirable for the quality assessment of both in-house and shared datasets. To address this issue we developed MEGqc, a software tool for automated and standardized quality control of MEG recordings, inspired by MRIQC, a quality-control tool from the fMRI domain (Esteban et al., 2017).

MEGqc is designed to detect specific noise patterns in the data and to visualize them in easily interpretable, human-readable reports. Additionally, the calculated metrics are provided in machine-readable JSON files, which allows for better machine interoperability and integration into workflows. Among other measures, we calculate the relative power of noise frequencies in the Power Spectral Density (PSD); several metrics describing the ‘noisiness’ of channels and/or epochs, e.g. STD or peak-to-peak amplitudes; and a quantification of EOG- and ECG-related noise, both averaged over all channels and on a per-channel basis (see the architecture UML for a list of all computed metrics). The software strives to help researchers standardize and speed up their quality control workflow. Accordingly, usability is a central aspect of MEGqc: it requires only minimal user input, namely a path to the dataset and the tuning of a handful of parameters in a human- and machine-readable configuration file. More experienced users can adapt the pipeline to their needs by overriding the default values of the respective parameters in that file. This simple interface, where a single path suffices and the software locates and loads the files needed for the workflow, only works if the structural organization of the dataset is known to the software.
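To illustrate one of the metrics mentioned above, the following sketch computes the relative power of powerline noise in a PSD on a synthetic channel. This is not MEGqc's exact implementation; the sampling rate, band width, and Welch parameters are assumptions chosen for illustration.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
sfreq = 1000.0                              # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / sfreq)             # 10 s of one synthetic channel
signal = rng.standard_normal(t.size) + 2.0 * np.sin(2 * np.pi * 50 * t)

# Welch estimate of the channel's Power Spectral Density.
freqs, psd = welch(signal, fs=sfreq, nperseg=2048)

# Relative noise power: PSD summed in a narrow band around 50 Hz,
# divided by the total power (uniform frequency grid, so sums suffice).
band = (freqs >= 49) & (freqs <= 51)
relative_power = psd[band].sum() / psd.sum()
print(f"relative 50 Hz power: {relative_power:.2f}")
```

A value close to 0 indicates little powerline contamination in this channel; values approaching 1 indicate that the powerline peak dominates the spectrum.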

Since neuroimaging data can be very diverse in their structural organization, the software is tailored to the BIDS standard (Gorgolewski et al., 2016; Niso et al., 2018). Thus, MEGqc requires the data to be organized according to BIDS.

MEGqc strongly relies on the MNE-Python software package (Gramfort et al., 2013).

Documentation, Installation Guide and Tutorial: https://ancplaboldenburg.github.io/megqc_documentation/index.html

The following derivatives are produced as the result of the analysis for each data file (.fif):

  • HTML report with all metrics presented as interactive figures that can be scrolled through and enlarged;
  • TSV file with the results of the analysis for some of the metrics;
  • machine-readable JSON file with the results of the analysis for all metrics.
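The machine-readable JSON derivatives can be consumed programmatically. The exact schema is defined by MEGqc; the metric and field names in this sketch are hypothetical placeholders used only to show the general pattern of nested metric results.

```python
import json

# Hypothetical excerpt of a MEGqc JSON derivative (field names assumed,
# not the actual schema): metrics keyed by name, then by channel type.
report_json = """
{
  "STD": {"mag": {"number_of_noisy_channels": 3},
          "grad": {"number_of_noisy_channels": 1}},
  "PSD": {"mag": {"relative_noise_power_50hz": 0.12}}
}
"""

metrics = json.loads(report_json)
for metric_name, channel_types in metrics.items():
    for ch_type, values in channel_types.items():
        print(metric_name, ch_type, values)
```

In practice one would `json.load()` the derivative file produced for each `.fif` recording instead of parsing an inline string.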

Between-sample analysis

The package includes a small utility to compare quality metrics between datasets. Assuming you have the per-sample TSV tables created by the MEGqc pipeline, run::

    python -m meg_qc.calculation.between_sample_analysis \
        --tsv sample1/group_metrics.tsv sample2/group_metrics.tsv \
        --names sample1 sample2 \
        --output-dir results

All violin plots and regression results will be written to the results directory. Significant regression coefficients are marked with asterisks.

To add a mutual information (MI) analysis, include the --mi flag. The number of permutations for the significance test is controlled via --mi-permutations (use 0 to disable permutation testing). For example::

    python -m meg_qc.calculation.between_sample_analysis \
        --tsv sample1/group_metrics.tsv sample2/group_metrics.tsv \
        --names sample1 sample2 \
        --output-dir results \
        --mi --mi-permutations 1000

MI results (raw, net, z-scores, p-values, normalized variants and entropies) are stored in the mutual_information folder for each sample and for the combined dataset.
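To make the permutation-based significance test concrete, here is a minimal sketch of the general technique that --mi-permutations controls: estimate MI between two metric columns, then build a null distribution by shuffling one column. The histogram-based estimator and all variable names are assumptions for illustration; MEGqc's own estimator and outputs may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def mi_hist(x, y, bins=16):
    """Plug-in mutual information estimate (in nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Two correlated metric columns (e.g. STD vs. peak-to-peak amplitude).
x = rng.normal(size=500)
y = x + 0.5 * rng.normal(size=500)

observed = mi_hist(x, y)

# Null distribution: shuffle y to break any dependence on x.
n_perm = 1000
null = np.array([mi_hist(x, rng.permutation(y)) for _ in range(n_perm)])
p_value = (1 + np.sum(null >= observed)) / (1 + n_perm)
z_score = (observed - null.mean()) / null.std()
print(f"MI={observed:.3f}  p={p_value:.4f}  z={z_score:.1f}")
```

Setting the permutation count to 0, as the --mi-permutations flag allows, would correspond to reporting only the raw MI without the null distribution, p-value, or z-score.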
