Multi-Output Gaussian Process Toolkit
Paper - API Documentation - Tutorials & Examples
The Multi-Output Gaussian Process Toolkit is a Python toolkit for training and interpreting Gaussian process models with multiple data channels. It builds upon PyTorch to provide an easy way to train multi-output models effectively on CPUs and GPUs. The main authors are Taco de Wolff, Alejandro Cuevas, and Felipe Tobar as part of the Center for Mathematical Modelling at the University of Chile.
Installation
With Anaconda installed on your system, open a command prompt and create a virtual environment:
conda create -n myenv python=3.7
conda activate myenv
where myenv is the name of your environment and the Python version can be 3.6 or above. Next, install the toolkit, which will automatically install the necessary dependencies such as PyTorch:
pip install mogptk
To upgrade to a new version of MOGPTK or any of its dependencies, use --upgrade as follows:
pip install --upgrade mogptk
For developers of the library, or for users who need the latest changes, we recommend cloning the git master or develop branch and running the following command inside the repository folder:
pip install --upgrade -e .
See Tutorials & Examples to get started.
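As a quick check that everything installed correctly, a minimal single-output example could look like the sketch below. It follows the spirit of the Quick Start notebook; the method names (init_parameters, train, predict, plot_prediction) are taken from the tutorials and API documentation, so treat the exact signatures as assumptions that may differ between versions.

import numpy as np
import mogptk

# Toy single-channel signal: a noisy sinusoid
t = np.linspace(0.0, 10.0, 200)
y = np.sin(2.0 * np.pi * 0.5 * t) + 0.1 * np.random.randn(len(t))

# Wrap the observations and remove 20% of the points to use as test data
data = mogptk.Data(t, y, name='toy')
data.remove_randomly(pct=0.2)

# Single-output spectral mixture (SM) kernel with Q=2 components
model = mogptk.SM(data, Q=2)
model.init_parameters()   # data-driven initialization of the kernel parameters
model.train(iters=500)    # gradient-based training through PyTorch
model.predict()           # posterior mean and variance over the removed points
model.plot_prediction()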
Introduction
This repository provides a toolkit to perform multi-output GP regression with kernels designed to exploit the correlations between channels in order to better model the signals. The toolkit is mainly targeted at time series and includes plotting functions for the case of a single input with multiple outputs (time series with several channels).
The main kernel is the Multi-Output Spectral Mixture (MOSM) kernel, which correlates every pair of data points (irrespective of their channel of origin) to model the signals. This kernel is specified in detail in the following publication: G. Parra, F. Tobar, "Spectral Mixture Kernels for Multi-Output Gaussian Processes", Advances in Neural Information Processing Systems, 2017. Proceedings link: https://papers.nips.cc/paper/7245-spectral-mixture-kernels-for-multi-output-gaussian-processes
The kernel learns the cross-channel correlations of the data, so it is particularly well-suited for the task of signal reconstruction in the event of sporadic data loss. All other included kernels can be derived from the Multi Output Spectral Mixture kernel by restricting some parameters or applying some transformations.
One of the main advantages of this toolkit is its GPU support, which enables the user to train models through PyTorch, speeding up computation significantly. It also includes sparse variational GP regression to reduce computation time even further.
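As an illustration of the signal-reconstruction use case described above, the sketch below removes a block of observations from one of two correlated synthetic channels and fits a MOSM model to fill the gap. The DataSet construction and the remove_range, init_parameters, train, and predict calls follow the tutorials; treat the exact signatures as assumptions for your installed version.

import numpy as np
import mogptk

# Two correlated synthetic channels
t = np.linspace(0.0, 10.0, 300)
y1 = np.sin(2.0 * np.pi * 0.5 * t)
y2 = np.sin(2.0 * np.pi * 0.5 * t + 0.7) + 0.05 * np.random.randn(len(t))

dataset = mogptk.DataSet([
    mogptk.Data(t, y1, name='channel 1'),
    mogptk.Data(t, y2, name='channel 2'),
])

# Simulate sensor failure: drop a contiguous block of observations from channel 1
dataset[0].remove_range(start=4.0, end=6.0)

# The MOSM kernel exploits the intact channel 2 to reconstruct the gap in channel 1
model = mogptk.MOSM(dataset, Q=2)
model.init_parameters()
model.train(iters=500)
model.predict()
model.plot_prediction()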
See MOGPTK: The Multi-Output Gaussian Process Toolkit for our publication in Neurocomputing.
Implementation
Implemented models:
- Exact
- Snelson (E. Snelson, Z. Ghahramani, "Sparse Gaussian Processes using Pseudo-inputs", 2005)
- OpperArchambeau (M. Opper, C. Archambeau, "The Variational Gaussian Approximation Revisited", 2009)
- Titsias (Titsias, "Variational learning of induced variables in sparse Gaussian processes", 2009)
- Hensman (J. Hensman, et al., "Scalable Variational Gaussian Process Classification", 2015)
Implemented likelihoods (see the sketch after this list for combining a model and a likelihood):
- Gaussian
- Student-T
- Exponential
- Laplace
- Bernoulli
- Beta
- Gamma
- Poisson
- Weibull
- Log-Logistic
- Log-Gaussian
- Chi
- Chi-Squared
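To show how a model from the first list can be combined with a likelihood from the second, here is a hypothetical sketch. The keyword arguments inference= and likelihood=, and the class names mogptk.Hensman and mogptk.gpr.StudentTLikelihood, are assumptions inferred from the lists above; check the API documentation for the exact names and signatures in your version.

import numpy as np
import mogptk

# Toy single-channel data with heavy-tailed noise, motivating a Student-T likelihood
t = np.linspace(0.0, 10.0, 200)
y = np.sin(2.0 * np.pi * 0.5 * t) + 0.1 * np.random.standard_t(df=3, size=len(t))
data = mogptk.Data(t, y, name='noisy')

# Hypothetical: pass a variational inference scheme and a heavier-tailed likelihood
# to the model constructor (keyword and class names are assumptions, not verified API)
model = mogptk.SM(
    data,
    Q=2,
    inference=mogptk.Hensman(),                  # scalable variational inference (Hensman et al., 2015)
    likelihood=mogptk.gpr.StudentTLikelihood(),  # robust to outliers
)
model.init_parameters()
model.train(iters=1000)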
Tutorials
00 - Quick Start: Short notebook showing the basic use of the toolkit.
01 - Data Loading: Functionality to load CSVs and DataFrames while using formatters for dates.
02 - Data Preparation: Handle data by removing observations to simulate sensor failure and applying transformations to the data (see the sketch after this list).
03 - Parameter Initialization: Parameter initialization using different methods, for single-output regression with the spectral mixture kernel and for the multi-output case with the MOSM kernel.
04 - Model Training: Training of models while keeping certain parameters fixed.
05 - Error Metrics: Obtain different metrics in order to compare models.
06 - Custom Kernels and Mean Functions: Use or create custom kernels, as well as train custom mean functions.
07 - Sparse Multi Input: Use 8 input dimensions to train on the Abalone data set using sparse GPs.
08 - Multi Likelihood Classification: Use a different likelihood for each channel, a Bernoulli likelihood for classification and a Student-T likelihood for regression.
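For a flavour of the data preparation covered in tutorial 02, the sketch below removes observations and applies transformations before training. The remove_range, remove_randomly, and transform calls and the TransformDetrend/TransformNormalize classes follow the tutorials; treat the exact names and signatures as assumptions for your installed version.

import numpy as np
import mogptk

# A trending, periodic toy signal
t = np.linspace(0.0, 10.0, 250)
y = 0.5 * t + np.sin(2.0 * np.pi * t) + 0.1 * np.random.randn(len(t))

data = mogptk.Data(t, y, name='sensor')

# Simulate sensor failure (a contiguous gap) and random data loss
data.remove_range(start=3.0, end=4.0)
data.remove_randomly(pct=0.2)

# Apply transformations to the observations before training
data.transform(mogptk.TransformDetrend())
data.transform(mogptk.TransformNormalize())

data.plot()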
Examples
Airline passengers: Regression using a single output spectral mixture on the yearly number of passengers of an airline.
Seasonal CO2 of Mauna-Loa: Regression using a single output spectral mixture on the CO2 concentration at Mauna-Loa throughout many years.
Currency Exchange: Model training, interpretation and comparison on a dataset of 11 currency exchange rates (against the dollar) from 2017 and 2018. These 11 channels are fitted with the MOSM, SM-LMC, CSM, and CONV kernels and their results are compared and interpreted.
Gold, Oil, NASDAQ, USD-index: Using the commodity indices for gold and oil, together with the indices for the NASDAQ and for the USD against a basket of other currencies, we train multiple models to find correlations between these macroeconomic indicators.
Human Activity Recognition: Using the Inertial Measurement Unit (IMU) of an Apple iPhone 4, the accelerometer, gyroscope and magnetometer 3D data were recorded for different activities resulting in nine channels.
Bramblemet tidal waves: Tidal wave data set of four locations in the south of England. We model the tidal wave periods of approximately 12.5 hours using different multi-output Gaussian processes.
Documentation
See the API documentation for documentation of our toolkit, including usage and examples of functions and classes.
Authors
- Taco de Wolff
- Alejandro Cuevas
- Felipe Tobar
Users
This is a list of users of this toolkit; feel free to add your project!
Contributing
We accept and encourage contributions to the toolkit in the form of pull requests (PRs), bug reports, and discussions (GitHub issues). We advise starting an open discussion before proposing large PRs. For small PRs we suggest that they address only one issue or add one new feature. All PRs should keep documentation and notebooks up to date.
Citing
Please use our publication at arXiv to cite our toolkit: MOGPTK: The Multi-Output Gaussian Process Toolkit. We recommend the following BibTeX entry:
@article{mogptk,
  author  = {T. {de Wolff} and A. {Cuevas} and F. {Tobar}},
  title   = {{MOGPTK: The Multi-Output Gaussian Process Toolkit}},
  journal = {Neurocomputing},
  year    = {2020},
  issn    = {0925-2312},
  doi     = {10.1016/j.neucom.2020.09.085},
  url     = {https://github.com/GAMES-UChile/mogptk},
}
Citations
- A.I. Cowen-Rivers, et al., SAMBA: Safe Model-Based & Active Reinforcement Learning
- O.A. Guerrero, et al., Subnational Sustainable Development: The Role of Vertical Intergovernmental Transfers in Reaching Multidimensional Goals
- O.A. Guerrero, G. Castañeda, How Does Government Expenditure Impact Sustainable Development? Studying the Multidimensional Link between Budgets and Development Gaps
- T.V. Vo, et al., Federated Estimation of Causal Effects from Observational Data
- Q. Lin, et al., Multi-output Gaussian process prediction for computationally expensive problems with multiple levels of fidelity
- S. Covino, et al., Detecting the periodicity of highly irregularly sampled light-curves with GPs
- Y. Jung, J. Park, Scalable Inference for Hybrid Bayesian HMM using GP Emission
- H. Liu, et al., Scalable multi-task GPs with neural embedding of coregionalization
- L.M. Rivera-Muñoz, et al., Missing Data Estimation in a Low-Cost Sensor network for Measuring Air Quality
- G. Caballero, et al., Synergy of Sentinel-1 and Sentinel-2 Time Series for Cloud-Free Vegetation Water Content Mapping with Multi-Output Gaussian Processes
Used in code
- https://github.com/jdjmoon/TRF
- https://github.com/ErickRosete/Multivariate_regression
- https://github.com/clara-risk/fire_weather_interpolate
- https://github.com/becre2021/multichannels-corrnp
- https://github.com/ArthurLeroy/MAGMAclust
- https://github.com/nicdel-git/master_thesis
License
Released under the MIT license.