
Python library and app to extract images from DCM in a JSON-based standard format

Project description

Process DCM


About The Project

Python library and app to extract images from DCM files with metadata in a JSON-based standard format

Installation and Usage

pip install process-dcm
 Usage: process-dcm [OPTIONS] INPUT_PATH

 Process DICOM files in subfolders, extract images and metadata.
 Version: 0.9.0

╭─ Arguments ───────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ *    input_path      PATH  Input path to either a DCM file or a folder containing DICOM files. [default: None] [required] │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ─────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ --image_format        -f      TEXT     Image format for extracted images (png, jpg, webp). [default: png]                 │
│ --output_dir          -o      PATH     Output directory for extracted images and metadata. [default: exported_data]       │
│ --group               -g               Re-group DICOM files in a given folder by AcquisitionDateTime.                     │
│ --tol                 -t      FLOAT    Tolerance in seconds for grouping DICOM files by AcquisitionDateTime. Only used    │
│                                        when --group is set.                                                               │
│                                        [default: None]                                                                    │
│ --n_jobs              -j      INTEGER  Number of parallel jobs. [default: 1]                                              │
│ --mapping             -m      TEXT     Path to CSV containing patient_id to study_id mapping. If not provided and         │
│                                        patient_id is anonymised, a 'study_2_patient.csv' file will be generated.          │
│ --keep                -k      TEXT     Keep the specified fields (p: patient_key, n: names, d: date_of_birth, D:          │
│                                        year-only DOB, g: gender)                                                          │
│ --overwrite           -w               Overwrite existing images if found.                                                │
│ --reset               -r               Reset the output directory if it exists.                                           │
│ --quiet               -q               Silence verbosity.                                                                 │
│ --version             -V               Prints app version.                                                                │
│ --install-completion                   Install completion for the current shell.                                          │
│ --show-completion                      Show completion for the current shell, to copy it or customize the installation.   │
│ --help                -h               Show this message and exit.                                                        │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
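
Some example invocations, using only the options documented above (the input paths are illustrative):

```shell
# Extract images as PNG (the default) into ./exported_data
process-dcm /data/dicom_scans

# WebP output, custom output directory, 4 parallel jobs
process-dcm /data/dicom_scans -f webp -o my_export -j 4

# Re-group files by AcquisitionDateTime with a 2-second tolerance
process-dcm /data/dicom_scans -g -t 2
```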

For Developers

To run this project locally, you will need to install the prerequisites and follow the installation section.

Prerequisites

This project depends on Poetry.

  1. Install Poetry via Homebrew or pipx:

    brew install poetry
    

    or

    pipx install poetry
    
  2. Don't forget to use the Python environment you set up earlier and, if using VS Code, select it there.

  3. Optional, but strongly recommended: install commitizen, which follows Conventional Commits.
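
A typical commitizen workflow looks like this (assuming commitizen is installed, e.g. with pipx install commitizen):

```shell
cz commit   # interactive prompt that builds a Conventional Commits message
cz bump     # bump the version and create a tag based on the commit history
```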

Installation

  1. Clone the repo

    git clone https://github.com/pontikos-lab/process-dcm
    cd process-dcm
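
After cloning, the standard Poetry workflow (assumed here; not spelled out in the steps above) installs the dependencies and runs the app inside the managed environment:

```shell
poetry install                   # create the virtualenv and install dependencies
poetry run process-dcm --help    # run the CLI inside Poetry's environment
```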
    

Bumping Version

We use commitizen, which follows Conventional Commits. The instructions below are only for exceptional cases.

  1. Using poetry-bumpversion, bump the version number by running poetry version [part] [--dry-run], where [part] is major, minor, or patch, depending on which part of the version number you want to bump.

    Use the --dry-run option to preview the change before applying it.

  2. Push the tagged commit created above and the tag itself, i.e.:

    ver_tag=$(poetry version | cut -d ' ' -f2)
    git tag -a v"$ver_tag" -m "Tagged version $ver_tag"
    git push
    git push --tags
    



Download files

Download the file for your platform.

Source Distribution

process_dcm-0.10.0.tar.gz (17.8 kB)


Built Distribution


process_dcm-0.10.0-py3-none-any.whl (18.0 kB)


File details

Details for the file process_dcm-0.10.0.tar.gz.

File metadata

  • Download URL: process_dcm-0.10.0.tar.gz
  • Size: 17.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.10.18 Linux/6.11.0-1015-azure

File hashes

Hashes for process_dcm-0.10.0.tar.gz
Algorithm Hash digest
SHA256 2296a31c5bb33aeee3453380a666e6df63ed247c96e27d7a4ff8f7352179299d
MD5 118c91d24c11fdebd2d5f58c00befa0c
BLAKE2b-256 10846635056c030bef62523240f689b5dd40451d44f0c8ad8bc63e6bdae08600

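A downloaded file can be checked against the published digests with Python's standard hashlib; a minimal sketch (the filename assumes the sdist was downloaded into the current directory):

```python
import hashlib


def sha256_of(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


# Compare against the published SHA256 digest for the sdist:
expected = "2296a31c5bb33aeee3453380a666e6df63ed247c96e27d7a4ff8f7352179299d"
# assert sha256_of("process_dcm-0.10.0.tar.gz") == expected
```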

File details

Details for the file process_dcm-0.10.0-py3-none-any.whl.

File metadata

  • Download URL: process_dcm-0.10.0-py3-none-any.whl
  • Size: 18.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.10.18 Linux/6.11.0-1015-azure

File hashes

Hashes for process_dcm-0.10.0-py3-none-any.whl
Algorithm Hash digest
SHA256 51fec359f4c0d703c92fdc675a2fb7564c4f359162be21f915b43d4e2c80e316
MD5 3db8e3cae531440825d3e46ad55e946f
BLAKE2b-256 1a1d482833b9c71bac7c9c7d495b6ad39428e49ca3f3000cb716868f603d741e

