

Process DCM


About The Project

Python library and app to extract images from DCM files with metadata in private-eye format

Installation and Usage

pip install process-dcm
 Usage: process-dcm [OPTIONS] INPUT_DIR

 Process DICOM files in subfolders, extract images and metadata using parallel processing.
 Version: 0.4.0

╭─ Arguments ──────────────────────────────────────────────────────────────────────────────────────────╮
│ *    input_dir      TEXT  Input directory containing subfolders with DICOM files. [default: None]    │
│                           [required]                                                                 │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ────────────────────────────────────────────────────────────────────────────────────────────╮
│ --image_format        -f      TEXT     Image format for extracted images (png, jpg, webp). Defaults  │
│                                        to: png                                                       │
│                                        [default: png]                                                │
│ --output_dir          -o      TEXT     Output directory for extracted images and metadata. Defaults  │
│                                        to: exported_data                                             │
│                                        [default: exported_data]                                      │
│ --group               -g               Re-group DICOM files in a given folder by                     │
│                                        AcquisitionDateTime.                                          │
│ --relative            -r               Save extracted data in folders relative to _input_dir_.       │
│ --n_jobs              -j      INTEGER  Number of parallel jobs. Defaults to: 1 [default: 1]          │
│ --mapping             -m      TEXT     Path to CSV containing patient_id to study_id mapping. If not │
│                                        provided and patient_id is not anonymised, a                  │
│                                        'patient_2_study_id.csv' file will be generated               │
│ --keep                -k      TEXT     Keep the specified fields (p: patient_key, n: names, d:       │
│                                        date_of_birth, D: year-only DOB, g: gender)                   │
│ --overwrite           -w               Overwrite existing images if found.                           │
│ --quiet               -q               Silence verbosity.                                            │
│ --version             -V               Prints app version.                                           │
│ --install-completion                   Install completion for the current shell.                     │
│ --show-completion                      Show completion for the current shell, to copy it or          │
│                                        customize the installation.                                   │
│ --help                -h               Show this message and exit.                                   │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────╯
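
For example (a sketch; the input and output paths below are only placeholders, and every flag used is taken from the help text above):

    # extract PNGs from every subfolder of /data/dicoms into ./exported_data, using 4 parallel jobs
    process-dcm /data/dicoms -o exported_data -j 4

    # keep the patient key, re-group files by AcquisitionDateTime,
    # mirror the input folder layout, and overwrite previously extracted images
    process-dcm /data/dicoms -k p -g -r -w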

For Developers

To run this project locally, you will need to install the prerequisites and follow the installation section.

Prerequisites

This project depends on poetry.

  1. Install poetry via Homebrew or pipx:

    brew install poetry
    

    or

    pipx install poetry
    
  2. Don't forget to use the Python environment you set up before and, if using VS Code, select it there.

  3. It's optional, but we strongly recommend commitizen, which follows Conventional Commits (see the example below).
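
A minimal sketch of that optional setup, assuming commitizen is also installed via pipx:

    pipx install commitizen
    # write a Conventional Commits-style message interactively
    cz commit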

Installation

  1. Clone the repo

    git clone https://github.com/pontikos-lab/process-dcm
    cd process-dcm
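
With the repository cloned, a typical Poetry workflow for setting up a development environment looks like this (a sketch; any dependency groups or extras are defined in the project's pyproject.toml):

    # create a Poetry-managed virtualenv and install the project with its dependencies
    poetry install

    # run the CLI (or your tests) inside that environment
    poetry run process-dcm --help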
    

Bumping Version

We use commitizen. The instructions below are only for exceptional cases.

  1. Using poetry-bumpversion, bump the version number by running poetry version [part] [--dry-run], where [part] is major, minor, or patch, depending on which part of the version number you want to bump.

    Use the --dry-run option to check the result in advance.

  2. Push the tagged commit created above and the tag itself, i.e.:

    ver_tag=$(poetry version | cut -d ' ' -f2)
    git tag -a v"$ver_tag" -m "Tagged version $ver_tag"
    git push
    git push --tags
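
Putting both steps together, a patch release done by hand might look like this (a sketch; choose major, minor, or patch as appropriate, and adjust the commit message to your conventions):

    # preview the bump first, then apply it
    poetry version patch --dry-run
    poetry version patch

    # commit the change, then tag and push as shown above
    git commit -am "chore: bump version"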
    

Download files


Source Distribution

process_dcm-0.4.2.tar.gz (16.0 kB)

Built Distribution

process_dcm-0.4.2-py3-none-any.whl (15.8 kB)

File details

Details for the file process_dcm-0.4.2.tar.gz.

File metadata

  • Download URL: process_dcm-0.4.2.tar.gz
  • Upload date:
  • Size: 16.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.11.10 Linux/6.8.0-1014-azure

File hashes

Hashes for process_dcm-0.4.2.tar.gz
Algorithm Hash digest
SHA256 ee250f679b470665f03b4a9fba02cc338618b37e6b41b4289ee52247d2eabd9e
MD5 e06aa09538b504f6ce908686c3610ad9
BLAKE2b-256 9689c6d28336c3fd7017b2c6a9f0c9cbb427874f9e1bb6ba48edfe25345264e0
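
To check a downloaded archive against the SHA256 digest above, something like the following works in a typical Linux or macOS shell (on macOS, sha256sum may be shasum -a 256):

    # fetch only the source distribution, then print its SHA256 digest for comparison
    pip download process-dcm==0.4.2 --no-deps --no-binary :all:
    sha256sum process_dcm-0.4.2.tar.gz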


File details

Details for the file process_dcm-0.4.2-py3-none-any.whl.

File metadata

  • Download URL: process_dcm-0.4.2-py3-none-any.whl
  • Upload date:
  • Size: 15.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.11.10 Linux/6.8.0-1014-azure

File hashes

Hashes for process_dcm-0.4.2-py3-none-any.whl
Algorithm Hash digest
SHA256 9d26044dcf361a464ccac91a235530a4c87f07e16b420fd2276eb2021e2c4915
MD5 a26570ca093738adcbcbdbd489581349
BLAKE2b-256 303ed09ecc24db9a49bda434c10251c7ebc9a12aa413c307ae0633ea07d5059f

