
Gaussian processes for image analysis

Project description



What is GPim

GPim is a Python package that provides an easy way to apply Gaussian processes (GP), implemented in Pyro and GPyTorch, to images and hyperspectral data, and to perform GP-based Bayesian optimization on grid data. The intended audience is domain scientists (for example, microscopists) with a basic knowledge of how to work with NumPy arrays in Python.

Scientific papers where GPim was used:

Installation

First install PyTorch. Then install GPim using

pip install gpim

How to use

GP reconstruction

Below is a simple example of applying GPim to reconstruct a sparse 2D image. The same approach applies to 3D and 4D hyperspectral data. Missing data points in sparse data must be represented as NaNs. In the absence of missing observations, GPim can be used for cleaning/smoothing image and spectroscopic data in all dimensions simultaneously, as well as for resolution enhancement.
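If you want to try the reconstruction on your own dense data, a sparse input can be simulated by masking random pixels with NaNs. A minimal sketch (the mask fraction and random data here are purely illustrative):

```python
import numpy as np

np.random.seed(0)
# Dense "ground truth" image (random data here, just for illustration)
R_dense = np.random.rand(50, 50)

# Mask ~70% of pixels with NaNs to emulate a sparse measurement
mask = np.random.rand(*R_dense.shape) < 0.7
R_sparse = R_dense.copy()
R_sparse[mask] = np.nan
```

The resulting `R_sparse` array can then be passed to `gpim.reconstructor` in place of the experimental data loaded below.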

import gpim
import numpy as np

# Load dataset
R = np.load('sparse_exp_data.npy')

# Get full (ideal) grid indices
X_full = gpim.utils.get_full_grid(R, dense_x=1)
# Get sparse grid indices
X_sparse = gpim.utils.get_sparse_grid(R)
# Kernel lengthscale constraints (optional)
lmin, lmax = 1., 4.
lscale = [[lmin, lmin], [lmax, lmax]] 

# Run GP reconstruction to obtain mean prediction and uncertainty for each predicted point
mean, sd, hyperparams = gpim.reconstructor(
    X_sparse, R, X_full, lengthscale=lscale,
    learning_rate=0.1, iterations=250, 
    use_gpu=True, verbose=False).run()

# Plot reconstruction results
gpim.utils.plot_reconstructed_data2d(R, mean, cmap='jet')
# Plot evolution of kernel hyperparameters during training
gpim.utils.plot_kernel_hyperparams(hyperparams)

GP-based Bayesian optimization

When performing measurements (real or simulated), one can use the predicted function values and uncertainties from the GP reconstruction to select the next measurement point. In the context of Bayesian optimization, this is usually referred to as the exploration-exploitation approach. A simple example with a "dummy" function is shown below.

import gpim
import numpy as np
np.random.seed(42)

# Create a dummy 2D function
def trial_func(idx):
    """
    Takes a list of indices as input and returns function value at these indices
    """
    def func(x0, y0, a, b, fwhm): 
        return np.exp(-4*np.log(2) * (a*(idx[0]-x0)**2 + b*(idx[1]-y0)**2) / fwhm**2)
    Z1 = func(5, 10, 1, 1, 4.5)
    Z2 = func(10, 8, 0.75, 1.5, 7)
    Z3 = func(18, 18, 1, 1.5, 10)
    return Z1 + Z2 + Z3

# Create an empty observation matrix
grid_size = 25
Z_sparse = np.ones((grid_size, grid_size)) * np.nan
# Seed it with several random observations
idx = np.random.randint(0, grid_size, size=(4, 2))
for i in idx:
    Z_sparse[tuple(i)] = trial_func(i) 

# Get full and sparse grid indices for GP
X_full = gpim.utils.get_full_grid(Z_sparse)
X_sparse = gpim.utils.get_sparse_grid(Z_sparse)
# Initialize Bayesian optimizer with an 'expected improvement' acquisition function
boptim = gpim.boptimizer(
    X_sparse, Z_sparse, X_full, 
    trial_func, acquisition_function='ei',
    exploration_steps=30,
    use_gpu=False, verbose=1)
# Run Bayesian optimization
boptim.run()

# Plot exploration history
gpim.utils.plot_query_points(boptim.indices_all, plot_lines=True)
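The 'expected improvement' (EI) acquisition function used above balances exploration and exploitation by scoring each candidate point from the GP mean and standard deviation. The sketch below is the textbook EI formula for maximization, not GPim's internal implementation, shown purely to illustrate the idea:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mean, sd, best_observed, xi=0.01):
    """Textbook expected improvement for maximization, computed pointwise.

    mean, sd: GP posterior mean and standard deviation on the grid.
    best_observed: best function value measured so far.
    xi: small exploration bonus.
    """
    sd = np.maximum(sd, 1e-9)  # avoid division by zero
    z = (mean - best_observed - xi) / sd
    return (mean - best_observed - xi) * norm.cdf(z) + sd * norm.pdf(z)

# The next measurement point is the argmax of the acquisition
# over the not-yet-measured grid points.
```

Points with either a high predicted value (exploitation) or a large uncertainty (exploration) receive high EI scores.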

Running GPim notebooks in the cloud

  1. Executable Google Colab notebook with the examples of applying GP to sparse spiral 2D scans in piezoresponse force microscopy (PFM), simulated 2D atomic image in electron microscopy, and hyperspectral 3D data in Band Excitation PFM.
  2. Executable Google Colab notebook with the example of applying "parallel" GP method to analysis of EELS data.
  3. Executable Google Colab notebook with the example of applying GP to 4D spectroscopic dataset for smoothing and resolution enhancement in contact Kelvin Probe Force Microscopy (cKPFM).
  4. Executable Google Colab notebook with a simple example of performing GP-based exploration-exploitation on a toy dataset.

Requirements

It is strongly recommended to run the codes with a GPU hardware accelerator (such as NVIDIA's P100 or V100 GPU). If you don't have a GPU on your local machine, you may rent a cloud GPU from Google Cloud AI Platform. Running the example notebooks one time from top to bottom will cost about 1 USD with a standard deep learning VM instance (one P100 GPU and 15 GB of RAM).
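Before setting `use_gpu=True`, you can check whether PyTorch actually sees a CUDA device. A minimal check:

```python
import torch

# True only if PyTorch was built with CUDA support and a GPU is visible
use_gpu = torch.cuda.is_available()
if use_gpu:
    print("GPU:", torch.cuda.get_device_name(0))
else:
    print("No GPU found; GPim will run on CPU (slower for large grids).")
```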

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

gpim-0.3.8.tar.gz (28.4 kB)

Uploaded Source

Built Distribution

gpim-0.3.8-py3-none-any.whl (34.5 kB)

Uploaded Python 3

File details

Details for the file gpim-0.3.8.tar.gz.

File metadata

  • Download URL: gpim-0.3.8.tar.gz
  • Upload date:
  • Size: 28.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/51.0.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.7.6

File hashes

Hashes for gpim-0.3.8.tar.gz

  • SHA256: e2e0af327dc1c0770973d70915f3570f8f85b41f7fea78285b42684f92b90fc1
  • MD5: e61c640bdd82a00576e5551b8d303f7c
  • BLAKE2b-256: 10a408058878214f9663742ef4fb27f6a7d61ab0db2b13c3b141d219c8513df1
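To verify a downloaded archive against the published digest, you can compute its SHA256 locally with the standard library. A minimal sketch (the file path is illustrative):

```python
import hashlib

def sha256_of(path):
    """Return the hex SHA256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest listed above before installing, e.g.:
# sha256_of("gpim-0.3.8.tar.gz") == "e2e0af327dc1c0770973d70915f3570f..."
```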


File details

Details for the file gpim-0.3.8-py3-none-any.whl.

File metadata

  • Download URL: gpim-0.3.8-py3-none-any.whl
  • Upload date:
  • Size: 34.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/51.0.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.7.6

File hashes

Hashes for gpim-0.3.8-py3-none-any.whl

  • SHA256: 1284b05aff1a637602a42970dace83fa7a021374ac5143c9b01e288b5cbfb575
  • MD5: a9c3448353b807f1c24bb86bb9385724
  • BLAKE2b-256: 66bd662bea463c2f404ce47e747c3d38a8a8a95fbdf8e00efcbc4073d292489d

