

Project description

Process DCM


About The Project

Python library and CLI app to extract images and metadata from DCM (DICOM) files in private-eye format.

Installation and Usage

pip install process-dcm
 Usage: process-dcm [OPTIONS] INPUT_DIR

 Process DICOM files in subfolders, extract images and metadata using parallel processing.
 Version: 0.4.2

╭─ Arguments ─────────────────────────────────────────────────────────────────────────────────────────────────╮
│ *    input_dir      TEXT  Input directory containing subfolders with DICOM files. [default: None]           │
│                           [required]                                                                        │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ───────────────────────────────────────────────────────────────────────────────────────────────────╮
│ --image_format        -f      TEXT     Image format for extracted images (png, jpg, webp). [default: png]   │
│ --output_dir          -o      TEXT     Output directory for extracted images and metadata.                  │
│                                        [default: exported_data]                                             │
│ --group               -g               Re-group DICOM files in a given folder by AcquisitionDateTime.       │
│ --tol                 -t      INTEGER  Tolerance in seconds for grouping DICOM files by                     │
│                                        AcquisitionDateTime.                                                 │
│                                        [default: 2]                                                         │
│ --relative            -r               Save extracted data in folders relative to _input_dir_.              │
│ --n_jobs              -j      INTEGER  Number of parallel jobs. [default: 1]                                │
│ --mapping             -m      TEXT     Path to CSV containing patient_id to study_id mapping. If not        │
│                                        provided and patient_id is not anonymised, a 'study_2_patient.csv'   │
│                                        file will be generated                                               │
│ --keep                -k      TEXT     Keep the specified fields (p: patient_key, n: names, d:              │
│                                        date_of_birth, D: year-only DOB, g: gender)                          │
│ --overwrite           -w               Overwrite existing images if found.                                  │
│ --quiet               -q               Silence verbosity.                                                   │
│ --version             -V               Prints app version.                                                  │
│ --install-completion                   Install completion for the current shell.                            │
│ --show-completion                      Show completion for the current shell, to copy it or customize the   │
│                                        installation.                                                        │
│ --help                -h               Show this message and exit.                                          │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
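
For example, to extract PNG images from all subfolders of an input directory into a custom output folder using four parallel jobs (the paths below are illustrative):

    process-dcm /data/dicom_exports -o /data/exported_data -f png -j 4

To re-group DICOM files by AcquisitionDateTime with a 5-second tolerance while keeping only a year-only date of birth:

    process-dcm /data/dicom_exports -g -t 5 -k D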

For Developers

To run this project locally, you will need to install the prerequisites and follow the installation section.

Prerequisites

This project depends on Poetry.

  1. Install Poetry via Homebrew or pipx:

    brew install poetry
    

    or

    pipx install poetry
    
  2. Don't forget to use the Python environment you set up before and, if using VS Code, select it there as well.

  3. It's optional, but we strongly recommend commitizen, which follows Conventional Commits.

Installation

  1. Clone the repo

    git clone https://github.com/pontikos-lab/process-dcm
    cd process-dcm
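
  2. Install the project dependencies (standard Poetry workflow; this creates a virtual environment if one is not already active):

    poetry install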
    

Bumping Version

We use commitizen. The instructions below are only for exceptional cases.

  1. We use poetry-bumpversion. Bump the version number by running poetry version [part] [--dry-run], where [part] is major, minor, or patch, depending on which part of the version number you want to bump.

    Use the --dry-run option to check the result in advance.
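
    For example, to preview a patch bump without modifying any files:

    poetry version patch --dry-run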

  2. Push the tagged commit created above and the tag itself, i.e.:

    ver_tag=$(poetry version | cut -d ' ' -f2)
    git tag -a v"$ver_tag" -m "Tagged version $ver_tag"
    git push
    git push --tags
    



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

process_dcm-0.4.5.tar.gz (16.6 kB)

Uploaded Source

Built Distribution

process_dcm-0.4.5-py3-none-any.whl (16.4 kB)

Uploaded Python 3

File details

Details for the file process_dcm-0.4.5.tar.gz.

File metadata

  • Download URL: process_dcm-0.4.5.tar.gz
  • Upload date:
  • Size: 16.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.4 CPython/3.11.10 Linux/6.5.0-1025-azure

File hashes

Hashes for process_dcm-0.4.5.tar.gz

  • SHA256: fd1333506a9735bd2ca77cc0504d802693f9425ea45d64931752b146494bcad8
  • MD5: 0d734928088a5db554f5eb8628d17b87
  • BLAKE2b-256: 1abac5346fe637fb11b211391a2e0cd5b7af3c8a53b2d75df68bfd2f640ff0d7

See more details on using hashes here.

File details

Details for the file process_dcm-0.4.5-py3-none-any.whl.

File metadata

  • Download URL: process_dcm-0.4.5-py3-none-any.whl
  • Upload date:
  • Size: 16.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.4 CPython/3.11.10 Linux/6.5.0-1025-azure

File hashes

Hashes for process_dcm-0.4.5-py3-none-any.whl

  • SHA256: d00a449cd69e3c013554e91629b527c5d0dbf0f53c124b69300f5ec2c4840bcb
  • MD5: 2708d1947567086ae0ca7ead78f6f864
  • BLAKE2b-256: 29ef25257a545e7a5d6a8173e70b722fd9e139a4cece9dc08c04dc0dd1eca207

See more details on using hashes here.
