
Radiomics-related modules for feature extraction and experimentation

Project description


AutoRadiomics


Simple pipeline for experimenting with radiomics features


Demo

docker run -p 8501:8501 -v <your_data_dir>:/data -it piotrekwoznicki/autorad:0.2

Installation

pip install --upgrade autorad

Installation from source

git clone https://github.com/pwoznicki/AutoRadiomics.git
cd AutoRadiomics
pip install -e .

Example - Hydronephrosis detection from CT images:

Extract radiomics features

import pandas as pd

# autorad import paths as of v0.2; they may differ between versions
from autorad.data import ImageDataset
from autorad.feature_extraction import FeatureExtractor

df = pd.read_csv(table_dir / "paths.csv")
image_dataset = ImageDataset(
    df=df,
    image_colname="image path",
    mask_colname="mask path",
    ID_colname="patient ID"
)
extractor = FeatureExtractor(
    dataset=image_dataset,
    out_path=(table_dir / "features.csv"),
)
extractor.extract_features()
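ImageDataset expects paths.csv to hold one row per case, with the image path, mask path, and patient ID columns named above. A minimal sketch of building such a table with pandas (the directory layout and file names below are hypothetical):

```python
import tempfile
from pathlib import Path

import pandas as pd

# Hypothetical layout: <data_dir>/<patient>/image.nii.gz + mask.nii.gz
data_dir = Path(tempfile.mkdtemp())
for pid in ["case_001", "case_002"]:
    (data_dir / pid).mkdir()
    (data_dir / pid / "image.nii.gz").touch()
    (data_dir / pid / "mask.nii.gz").touch()

# One row per case; column names mirror the ImageDataset arguments
rows = [
    {
        "patient ID": case.name,
        "image path": str(case / "image.nii.gz"),
        "mask path": str(case / "mask.nii.gz"),
    }
    for case in sorted(data_dir.iterdir())
    if case.is_dir()
]
df = pd.DataFrame(rows)
df.to_csv(data_dir / "paths.csv", index=False)
```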

Load, split and preprocess extracted features

# Create a dataset from the radiomics features
feature_df = pd.read_csv(table_dir / "features.csv")
feature_dataset = FeatureDataset(
    dataframe=feature_df,
    target="Hydronephrosis",
    task_name="Hydronephrosis detection"
)

# Split data and load splits
splits_path = result_dir / "splits.json"
feature_dataset.full_split(save_path=splits_path)
feature_dataset.load_splits_from_json(splits_path)

# Preprocessing
preprocessor = Preprocessor(
    normalize=True,
    feature_selection_method="boruta",
    oversampling_method="SMOTE",
)
feature_dataset._data = preprocessor.fit_transform(feature_dataset.data)
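The Preprocessor chains normalization, Boruta feature selection, and SMOTE oversampling. SMOTE's core idea is to synthesize new minority-class samples by interpolating between a minority sample and one of its nearest minority neighbors; a didactic NumPy sketch of that idea (not imbalanced-learn's implementation):

```python
import numpy as np


def smote_sketch(X_min, n_new, k=3, rng=None):
    """Synthesize n_new minority samples by interpolating toward neighbors."""
    rng = np.random.default_rng(rng)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # distances from sample i to every minority sample
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbors = np.argsort(d)[1:k + 1]  # skip the sample itself
        j = rng.choice(neighbors)
        lam = rng.random()  # interpolation factor in [0, 1)
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(out)


X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
X_new = smote_sketch(X_min, n_new=5, rng=0)
```

Each synthetic point lies on a segment between two real minority samples, so the oversampled class stays inside its original feature-space region.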

Train the model for hydronephrosis classification

# Select classifiers to compare
classifier_names = [
    "Gaussian Process Classifier",
    "Logistic Regression",
    "SVM",
    "Random Forest",
    "XGBoost",
]
classifiers = [MLClassifier.from_sklearn(name) for name in classifier_names]

model = MLClassifier.from_sklearn(name="Random Forest")
model.set_optimizer("optuna", n_trials=5)

trainer = Trainer(
    dataset=feature_dataset,
    models=[model],
    result_dir=result_dir,
    experiment_name="Hydronephrosis detection"
)
trainer.run()
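model.set_optimizer("optuna", n_trials=5) asks the trainer to tune the model's hyperparameters over five Optuna trials. The underlying loop (propose a configuration, score it, keep the best) can be sketched as a plain random search; the parameter ranges and the scoring function below are hypothetical stand-ins:

```python
import random


def score(params):
    # Hypothetical stand-in for cross-validated AUC of a Random Forest
    # with these hyperparameters; peaks at n_estimators=300, max_depth=8.
    return (1.0
            - abs(params["n_estimators"] - 300) / 1000
            - abs(params["max_depth"] - 8) / 100)


def random_search(n_trials, seed=0):
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        # propose a configuration from the search space
        params = {
            "n_estimators": rng.randrange(50, 1000),
            "max_depth": rng.randrange(2, 20),
        }
        s = score(params)
        if s > best_score:  # keep the best configuration seen so far
            best_params, best_score = params, s
    return best_params, best_score


best_params, best_score = random_search(n_trials=5)
```

Optuna improves on pure random search by modeling which regions of the search space look promising, but the trial/score/keep-best loop is the same.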

Create an evaluator to compare the selected classifiers

evaluator = Evaluator(dataset=feature_dataset, models=classifiers)
evaluator.evaluate_cross_validation()
evaluator.boxplot_by_class()
evaluator.plot_all_cross_validation()
evaluator.plot_test()
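For binary classification tasks like this one, cross-validation metrics typically center on ROC AUC, which equals the probability that a randomly chosen positive case is scored above a randomly chosen negative one (ties counted as half). A self-contained sketch:

```python
def roc_auc(y_true, y_score):
    """AUC as the probability a positive outranks a negative (ties = 1/2)."""
    pos = [s for y, s in zip(y_true, y_score) if y == 1]
    neg = [s for y, s in zip(y_true, y_score) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))


auc = roc_auc([1, 1, 0, 0], [0.9, 0.4, 0.6, 0.2])  # → 0.75
```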

Commands

MLFlow

To browse the experiment results tracked in <result_dir>, start an MLflow server:

mlflow server -h 0.0.0.0 -p 5000 --backend-store-uri <result_dir>

Dependencies:

  • MONAI
  • pyRadiomics
  • MLFlow
  • Optuna
  • scikit-learn
  • imbalanced-learn
  • XGBoost
  • Boruta
  • Medpy
  • NiBabel
  • nilearn
  • plotly
  • seaborn

App dependencies:

  • Streamlit
  • Docker


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

autorad-0.1.2.tar.gz (46.1 kB)

Uploaded Source

Built Distribution

autorad-0.1.2-py3-none-any.whl (54.3 kB)

Uploaded Python 3

File details

Details for the file autorad-0.1.2.tar.gz.

File metadata

  • Download URL: autorad-0.1.2.tar.gz
  • Upload date:
  • Size: 46.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.10.2

File hashes

Hashes for autorad-0.1.2.tar.gz

  • SHA256: 3c450204bb9657adca8993ba54547afa8dbd8be6e5a27c135c347d12e2d714db
  • MD5: 0542d2cbffc2bd910701ce9b17767031
  • BLAKE2b-256: 1b3d2dbecfe10141fb49e858c7d3435a13e2d1c9d7c5336e1869247b8214a3f0


File details

Details for the file autorad-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: autorad-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 54.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.10.2

File hashes

Hashes for autorad-0.1.2-py3-none-any.whl

  • SHA256: d997c1f2a0fc37b27323edda60e29e7264c3b07e30f776391b62e2f664ce02ec
  • MD5: 82f9470218c53bdbdbb4b5aeb8ec2378
  • BLAKE2b-256: d8b7d662032b352285116098038bebf40bc44d8df2fb264c4d5cc965d7bd8c5e

