
# AutoCleanEEG Pipeline


A modular framework for automated EEG data processing, built on MNE‑Python.

## Features

  • Framework for automated EEG preprocessing with "lego block" modularity
  • Support for multiple EEG paradigms (ASSR, Chirp, MMN, Resting State)
  • BIDS-compatible data organization and comprehensive quality control
  • Extensible plugin system for file formats, montages, and event processing
  • Research-focused workflow: single file testing → parameter tuning → batch processing
  • Detailed output: BIDS‑compatible derivatives, single task log file, stage files, exports, and QA visualizations

## Installation (uv)

Use Astral's uv for fast, isolated installs. If you don't have uv yet, see https://docs.astral.sh/uv/.

- Install the CLI (recommended for users):

```bash
uv tool install autocleaneeg-pipeline
autocleaneeg-pipeline --help
```

- Upgrade or remove:

```bash
uv tool upgrade autocleaneeg-pipeline
uv tool uninstall autocleaneeg-pipeline
```

- Development install from source:

```bash
git clone https://github.com/cincibrainlab/autocleaneeg_pipeline.git
cd autocleaneeg_pipeline
uv tool install -e .
```

## Quick Start

Process a file using a built-in task:

```bash
autocleaneeg-pipeline process RestingEyesOpen /path/to/data.raw
```

List tasks and show overrides:

```bash
autocleaneeg-pipeline task list
```


## Output Structure

Each processing task writes to a self‑contained folder under your chosen output directory. The structure is designed to keep task‑level artifacts at the task root while maintaining a clean BIDS derivatives tree.

Example (per task):

```
<task>/
├── bids/
│   ├── dataset_description.json
│   └── derivatives/
│       ├── dataset_description.json
│       ├── 01_import/
│       ├── 02_resample/
│       ├── ...
│       ├── 16_comp/
│       └── sub-*/eeg/...            # primary BIDS data written by mne-bids
├── exports/                         # final exported files and convenience copies (CSV/log)
├── ica/                             # ICA FIF files + ica_control_sheet.csv
├── logs/
│   └── pipeline.log                 # single consolidated log for all runs in this task
├── qa/
│   └── *_fastplot_summary.(tiff|png)
├── reports/
│   └── run_reports/
│       ├── *_autoclean_report.pdf
│       ├── *_processing_log.csv     # per-file processing CSVs
│       └── *_autoclean_report_flagged_channels.tsv
└── preprocessing_log.csv            # combined, task-level processing log (no task prefix)
```

Key points:
- Task‑root folders use concise names: `exports/`, `ica/`, `logs/`, `qa/`, `reports/`.
- Stage files go directly under `bids/derivatives/` as numbered folders (no `intermediate/`).
- No reports or per‑subject folders are created in derivatives.
- `dataset_description.json` is present at both `bids/` and `bids/derivatives/`.
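The task-root layout above can be sanity-checked with a few lines of Python. This helper is illustrative only (it is not part of the pipeline API); the folder names are taken from the list above.

```python
from pathlib import Path

# Task-root folders listed in this README.
EXPECTED_DIRS = ["bids", "exports", "ica", "logs", "qa", "reports"]

def missing_task_dirs(task_root):
    """Return the expected task-root folders that are absent."""
    root = Path(task_root)
    return [d for d in EXPECTED_DIRS if not (root / d).is_dir()]
```

For example, `missing_task_dirs("/path/to/output/MyTask")` returns an empty list for a complete task folder and names any missing folders otherwise.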


## BIDS + Branding

- The BIDS `dataset_description.json` is post‑processed to:
  - Set `Name` to the task name.
  - Add `GeneratedBy` entry for `autocleaneeg-pipeline` with version.
  - Remove placeholder Authors inserted by MNE‑BIDS if present.
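As a sketch, the post-processing above amounts to a small JSON rewrite. The `brand_dataset_description` helper and the `[Unspecified...]` placeholder-author pattern are assumptions for illustration, not the pipeline's actual code.

```python
def brand_dataset_description(desc, task_name, version):
    """Return a branded copy of a BIDS dataset_description dict."""
    desc = dict(desc)
    desc["Name"] = task_name  # Name <- task name
    # Record the pipeline under GeneratedBy, preserving existing entries.
    generated_by = list(desc.get("GeneratedBy", []))
    generated_by.append({"Name": "autocleaneeg-pipeline", "Version": version})
    desc["GeneratedBy"] = generated_by
    # Drop Authors only if every entry looks like an MNE-BIDS stub
    # (the "[Unspecified" prefix here is an assumption).
    authors = desc.get("Authors", [])
    if authors and all(str(a).startswith("[Unspecified") for a in authors):
        desc.pop("Authors")
    return desc
```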


## Logs

- A single log file per task lives at `<task>/logs/pipeline.log`.
- Console output level matches your `--verbose` choice; file logs capture the same level.
- Log rotation (e.g., at 10 MB) can be enabled; the default is a single growing file.
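If rotation were enabled, it would look roughly like the stdlib `RotatingFileHandler` below. The 10 MB cap and backup count are illustrative numbers, and this is not the pipeline's actual logging setup.

```python
import logging
from logging.handlers import RotatingFileHandler

def make_task_logger(log_path, rotate=False):
    """Create a logger writing to log_path; rotate=True caps the file size."""
    logger = logging.getLogger(f"autoclean.sketch.{log_path}")
    logger.setLevel(logging.INFO)
    if rotate:
        # Roll over at ~10 MB, keeping 3 old files (illustrative values).
        handler = RotatingFileHandler(log_path, maxBytes=10 * 1024 * 1024,
                                      backupCount=3)
    else:
        handler = logging.FileHandler(log_path)  # default: single growing file
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger.addHandler(handler)
    return logger
```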


## Processing Logs (CSV)

- Per‑file: `<task>/reports/run_reports/<basename>_processing_log.csv`.
- Combined (task‑level): `<task>/preprocessing_log.csv` (no taskname prefix).
- A convenience copy of the per‑file CSV is dropped into `exports/`.
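Loading either CSV needs nothing beyond the stdlib. The column names in the commented example (`status`) are hypothetical; check the real CSV header for the actual fields.

```python
import csv

def load_processing_log(csv_path):
    """Read a processing-log CSV into a list of dicts (one per row)."""
    with open(csv_path, newline="") as fh:
        return list(csv.DictReader(fh))

# Example with made-up columns; the real header may differ:
# rows = load_processing_log("preprocessing_log.csv")
# failed = [r for r in rows if r.get("status") == "failed"]
```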


## ICA Artifacts

- ICA FIF files and the editable control sheet live in `<task>/ica/`:
  - `<task>/ica/<basename>-ica.fif`
  - `<task>/ica/ica_control_sheet.csv`


## QA Visualizations

- Fastplot summary images go to `<task>/qa/`.
- The review GUI auto‑discovers images from `reports/` and `qa/`.


## Removed Legacy Folders

This release removes the old locations and naming used during development:
- No `metadata/` folder at the task root (JSONs are in `reports/run_reports/`).
- No `final_files/` or `final_exports/` (use `exports/`).
- No `ica_fif/` (use `ica/`).
- No `qa_review_plots/` (use `qa/`).
- No versioned derivatives folder (e.g., `autoclean-vX`) — derivatives are directly under `bids/derivatives/`.


## CLI Tips

- Process a single file:

```bash
autocleaneeg-pipeline process RestingEyesOpen /path/to/file.set
```

- Open the review GUI for an output directory:

```bash
autocleaneeg-pipeline review --output /path/to/output
```

- Apply ICA control-sheet edits (reads `<task>/ica/ica_control_sheet.csv` by default when a metadata path is provided):

```bash
autocleaneeg-pipeline process ica --metadata-dir /path/to/task/reports
```


## Documentation

Full documentation is available at [https://docs.autocleaneeg.org](https://docs.autocleaneeg.org)

## Contributing

We welcome contributions! Please see our [Contributing Guide](CONTRIBUTING.md) for details.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Acknowledgments

- Cincinnati Children's Hospital Research Foundation
- Built with [MNE-Python](https://mne.tools/)
