
Trainable Bilateral Filter Layer (PyTorch)

Project description

This repository contains our GPU-accelerated trainable bilateral filter layer (three spatial and one range filter dimension) that can be included directly in any PyTorch graph, just like any conventional layer (fully connected, convolutional, …). Because the analytical derivative of the bilateral filter with respect to its parameters and its input is calculated, the filter parameters, which so far had to be treated as hyperparameters, can be optimized automatically via backpropagation on a computed loss.
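
As a rough illustration, the snippet below sketches how such a layer could be dropped into a PyTorch graph. The import path, the class name BilateralFilter3d, and the constructor arguments (three spatial sigmas and one range sigma) are assumptions based on this description and the repository's example scripts; check those scripts for the exact interface.

    import torch
    from bilateralfilter_torch import BilateralFilter3d  # assumed import path and class name

    device = "cuda" if torch.cuda.is_available() else "cpu"

    # 5D input tensor (batch, channel, x, y, z), as expected by a 3D filter layer
    noisy = torch.randn(1, 1, 64, 64, 32, device=device)

    # Assumed constructor: three spatial sigmas and one range (intensity) sigma
    layer = BilateralFilter3d(1.0, 1.0, 1.0, 0.1).to(device)

    # The output is differentiable w.r.t. the input and the four filter parameters
    denoised = layer(noisy)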

Our corresponding paper, "Ultra low-parameter denoising: Trainable bilateral filter layers in computed tomography", is available in Medical Physics (open access) and on arXiv (preprint).

Citation:

If you find our code useful, please cite our work:

@article{wagner2022ultra,
  title={Ultra low-parameter denoising: Trainable bilateral filter layers in computed tomography},
  author={Wagner, Fabian and Thies, Mareike and Gu, Mingxuan and Huang, Yixing and Pechmann, Sabrina and Patwari, Mayank and Ploner, Stefan and Aust, Oliver and Uderhardt, Stefan and Schett, Georg and Christiansen, Silke and Maier, Andreas},
  journal={Medical Physics},
  year={2022},
  doi={10.1002/mp.15718}
}

Setup:

The C++/CUDA-implemented forward and backward functions are compiled via the setup.py script using setuptools:

  1. Create and activate a Python environment (Python >= 3.7).

  2. Install PyTorch (tested versions: 1.7.1, 1.9.0).

  3. Install the bilateral filter layer via pip:

pip install bilateralfilter_torch

If you encounter problems with step 3, install the layer directly from our GitHub repository:

  1. Download the repository.

  2. Navigate into the extracted repo.

  3. Compile/install the bilateral filter layer by calling

python setup.py install

Example scripts:

  • All example scripts can be found in our GitHub repository.

  • Try out the forward pass by running example_filter.py (requires Matplotlib and scikit-image).

  • Run the gradcheck.py script to verify the correct gradient implementation.

  • Run example_optimization.py to optimize the parameters of a bilateral filter layer to automatically denoise an image.
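
At its core, such an optimization is a standard PyTorch training loop over the four filter parameters. A condensed sketch follows, again assuming the BilateralFilter3d interface from above and that the sigmas are registered as trainable parameters, so a stock optimizer can update them; example_optimization.py in the repository is the authoritative version.

    import torch
    from bilateralfilter_torch import BilateralFilter3d  # assumed import path and class name

    device = "cuda" if torch.cuda.is_available() else "cpu"

    # Synthetic stand-in data: a smooth reference volume and a noisy observation of it
    target = torch.rand(1, 1, 32, 32, 16, device=device)
    noisy = target + 0.1 * torch.randn_like(target)

    layer = BilateralFilter3d(1.0, 1.0, 1.0, 0.1).to(device)  # initial sigma values
    optimizer = torch.optim.Adam(layer.parameters(), lr=0.01)
    loss_fn = torch.nn.MSELoss()

    for step in range(100):
        optimizer.zero_grad()
        prediction = layer(noisy)
        loss = loss_fn(prediction, target)
        loss.backward()   # analytical gradients w.r.t. the input and the four sigmas
        optimizer.step()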

Optimized bilateral filter prediction:

https://github.com/faebstn96/trainable-bilateral-filter-source/blob/main/out/example_optimization.png?raw=true

Implementation:

The general structure of the implementation follows the PyTorch documentation for creating custom C++ and CUDA extensions. The forward pass implementation of the layer is based on code from the Project MONAI framework, originally published under the Apache License, Version 2.0. The correct implementation of the analytical forward and backward pass can be verified by running the gradcheck.py script, comparing numerical gradients with the derived analytical gradient using the PyTorch built-in gradcheck function.
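
For illustration, a stripped-down version of such a check could look as follows. It only verifies the input gradient; the repository's gradcheck.py is the authoritative script and also covers the parameter gradients. Double precision is used because torch.autograd.gradcheck relies on finite differences, so this sketch additionally assumes the compiled kernels accept float64 tensors.

    import torch
    from bilateralfilter_torch import BilateralFilter3d  # assumed import path and class name

    # Small double-precision input so the numerical finite-difference check is accurate
    x = torch.rand(1, 1, 5, 5, 3, dtype=torch.float64, device="cuda", requires_grad=True)
    layer = BilateralFilter3d(1.0, 1.0, 1.0, 0.1).to(device="cuda", dtype=torch.float64)

    # Compares the analytical input gradient against numerical gradients
    assert torch.autograd.gradcheck(lambda inp: layer(inp), (x,), eps=1e-6, atol=1e-4)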

Troubleshooting

  1. Compiling the filter layers requires the NVIDIA CUDA toolkit. Check its version with

    nvcc --version

    or install it via, e.g.,

    sudo apt update
    sudo apt install nvidia-cuda-toolkit
  2. The NVIDIA CUDA toolkit 11.6 caused problems on a Windows machine in combination with pybind. Downgrading the toolkit to version 11.3 fixed the problem (see this discussion).

Windows-related problems:

Make sure the cl.exe environment variable is set correctly.

Download files

Source distribution: bilateralfilter_torch-1.1.0.tar.gz (17.9 kB)

Hashes for bilateralfilter_torch-1.1.0.tar.gz:

Algorithm   Hash digest
SHA256      0098740c00c6465325d620523211fb92cd33f64051e2ed483cc83a7da53acfe4
MD5         edcd29244b89c78ca1de91c6bf4c9032
BLAKE2b-256 5c7ea5bf5b9f1855ee9fec57742f8fa68cfd870a46926efda874996717d031fd
