Gaussian processes for image analysis
Project description
Under active development (expect some breaking changes)
What is GPim
GPim is a Python package that provides an easy way to apply Gaussian processes (GP), implemented in Pyro and GPyTorch, to images and hyperspectral data, and to perform GP-based Bayesian optimization on grid data. The intended audience is domain scientists (for example, microscopists) with a basic knowledge of how to work with NumPy arrays in Python.
Scientific papers where GPim was used:

GP for 3D hyperspectral microscopy data: paper

GP for 4D hyperspectral microscopy data: paper

GP and GP-based BO for the Ising model: paper

GP-based BO for hysteresis loop engineering in ferroelectrics: paper
Installation
First install PyTorch. Then install GPim using
pip install gpim
How to use
GP reconstruction
Below is a simple example of applying GPim to reconstruct a sparse 2D image. It can be applied in the same way to 3D and 4D hyperspectral data. The missing data points in sparse data must be represented as NaNs. In the absence of missing observations, GPim can be used for cleaning/smoothing image and spectroscopic data in all dimensions simultaneously, as well as for resolution enhancement.
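For example, a sparse observation matrix can be produced by assigning NaNs to the unmeasured pixels of a NumPy array. This is an illustrative sketch only; `R_full` is a hypothetical fully observed image standing in for real measurement data, not part of GPim:

```python
import numpy as np

np.random.seed(0)
R_full = np.random.rand(50, 50)       # stand-in for a fully measured 2D image
mask = np.random.rand(50, 50) < 0.8   # True where a pixel is "missing" (~80%)
R_sparse = R_full.copy()
R_sparse[mask] = np.nan               # NaNs mark the unobserved points for GPim

print(np.isnan(R_sparse).mean())      # fraction of missing pixels
```

An array prepared this way can be passed directly as the observation matrix `R` in the reconstruction example below.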
import gpim
import numpy as np

# Load dataset
R = np.load('sparse_exp_data.npy')
# Get full (ideal) grid indices
X_full = gpim.utils.get_full_grid(R, dense_x=1)
# Get sparse grid indices
X_sparse = gpim.utils.get_sparse_grid(R)
# Kernel lengthscale constraints (optional)
lmin, lmax = 1., 4.
lscale = [[lmin, lmin], [lmax, lmax]]
# Run GP reconstruction to obtain mean prediction and uncertainty for each predicted point
mean, sd, hyperparams = gpim.reconstructor(
    X_sparse, R, X_full, lengthscale=lscale,
    learning_rate=0.1, iterations=250,
    use_gpu=True, verbose=False).run()
# Plot reconstruction results
gpim.utils.plot_reconstructed_data2d(R, mean, cmap='jet')
# Plot evolution of kernel hyperparameters during training
gpim.utils.plot_kernel_hyperparams(hyperparams)
GP-based Bayesian optimization
When performing measurements (real or simulated), one can use the expected function value and uncertainty from the GP reconstruction to select the next measurement point. This is usually referred to as an exploration-exploitation approach in the context of Bayesian optimization. A simple example with a "dummy" function is shown below.
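For intuition, the 'expected improvement' acquisition function used in the example below can be written in closed form from the GP posterior mean and standard deviation. The following is a minimal NumPy sketch of that formula (for a maximization problem), not GPim's internal implementation:

```python
import numpy as np
from math import erf, sqrt

def expected_improvement(mean, sd, best, xi=0.01):
    """Closed-form EI from the GP posterior mean and standard deviation
    at candidate points; `best` is the current best observed value."""
    mean, sd = np.asarray(mean, float), np.asarray(sd, float)
    sd = np.maximum(sd, 1e-12)                     # guard against zero variance
    z = (mean - best - xi) / sd
    pdf = np.exp(-0.5 * z**2) / sqrt(2 * np.pi)    # standard normal pdf
    cdf = np.array([0.5 * (1 + erf(v / sqrt(2)))   # standard normal cdf via erf
                    for v in z.ravel()]).reshape(z.shape)
    return (mean - best - xi) * cdf + sd * pdf

# Two candidates with equal predicted mean: the more uncertain one
# scores higher, which is what drives exploration.
ei = expected_improvement(mean=np.array([1.0, 1.0]),
                          sd=np.array([0.1, 0.5]), best=1.0)
print(ei)
```

In GPim this trade-off is handled internally when `acquisition_function='ei'` is passed to the optimizer.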
import gpim
import numpy as np

np.random.seed(42)

# Create a dummy 2D function
def trial_func(idx):
    """Takes a list of indices as input and returns the function value at these indices"""
    return np.exp(-4*np.log(2) * ((idx[0]-5)**2 + (idx[1]-10)**2) / 4.5**2)

# Create an empty observation matrix
grid_size = 25
Z_sparse = np.ones((grid_size, grid_size)) * np.nan
# Seed it with several random observations
idx = np.random.randint(0, grid_size, size=(4, 2))
for i in idx:
    Z_sparse[tuple(i)] = trial_func(i)
# Get full and sparse grid indices for GP
X_full = gpim.utils.get_full_grid(Z_sparse)
X_sparse = gpim.utils.get_sparse_grid(Z_sparse)
# Initialize Bayesian optimizer with an 'expected improvement' acquisition function
boptim = gpim.boptimizer(
    X_sparse, Z_sparse, X_full, trial_func,
    acquisition_function='ei', exploration_steps=30,
    use_gpu=False, verbose=1)
# Run Bayesian optimization
boptim.run()
# Plot exploration history
gpim.utils.plot_query_points(boptim.indices_all, plot_lines=True)
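As a sanity check on the dummy objective above (reproduced here standalone, without GPim), its maximum over the 25×25 grid sits at index (5, 10) with value 1.0, which is the point the optimizer should converge toward:

```python
import numpy as np

def trial_func(idx):
    """2D Gaussian peak centered at (5, 10) with peak value 1."""
    return np.exp(-4*np.log(2) * ((idx[0]-5)**2 + (idx[1]-10)**2) / 4.5**2)

# Evaluate the function on the full 25x25 grid and locate its maximum
grid = np.array([[trial_func((i, j)) for j in range(25)] for i in range(25)])
peak = tuple(int(p) for p in np.unravel_index(grid.argmax(), grid.shape))
print(peak, grid.max())   # -> (5, 10) 1.0
```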
Running GPim notebooks in the cloud
Executable Google Colab notebook with examples of applying GP to sparse spiral 2D scans in piezoresponse force microscopy (PFM), a simulated 2D atomic image in electron microscopy, and hyperspectral 3D data in Band Excitation PFM.
Executable Google Colab notebook with an example of applying the "parallel" GP method to the analysis of EELS data.
Executable Google Colab notebook with an example of applying GP to a 4D spectroscopic dataset for smoothing and resolution enhancement in contact Kelvin Probe Force Microscopy (cKPFM).
Executable Google Colab notebook with a simple example of performing GP-based exploration-exploitation on a toy dataset.
Requirements
It is strongly recommended to run the code with a GPU hardware accelerator (such as NVIDIA's P100 or V100 GPU). If you don't have a GPU on your local machine, you can rent a cloud GPU from Google Cloud AI Platform. Running the example notebooks once from top to bottom costs about 1 USD with a standard deep learning VM instance (one P100 GPU and 15 GB of RAM).
Project details
Download files
Download the file for your platform.
Filename, size | File type | Python version
gpim-0.3.5-py3-none-any.whl (33.7 kB) | Wheel | py3
gpim-0.3.5.tar.gz (27.4 kB) | Source | None