Improved Kernel Partial Least Squares (IKPLS) and Fast Cross-Validation
The ikpls software package provides fast and efficient tools for PLS (Partial Least Squares) modeling. It is designed to help researchers and practitioners perform PLS modeling faster than previously possible, particularly on large datasets.
Citation
If you use the ikpls software package for your work, please cite this Journal of Open Source Software article. If you use the fast cross-validation algorithm implemented in ikpls.fast_cross_validation.numpy_ikpls, please also cite this Journal of Chemometrics article.
Unlock the Power of Fast and Stable Partial Least Squares Modeling with IKPLS
Dive into cutting-edge Python implementations of the IKPLS (Improved Kernel Partial Least Squares) Algorithms #1 and #2 [1] for CPUs, GPUs, and TPUs. IKPLS is both fast [2] and numerically stable [3], making it well suited for PLS modeling.
- Use our NumPy-based [4] CPU implementations for seamless integration with scikit-learn's [5] ecosystem of machine learning algorithms and pipelines. As the implementations subclass scikit-learn's BaseEstimator, they can be used with scikit-learn's cross_validate.
- Use our JAX [6] implementations on CPUs, or leverage powerful GPUs and TPUs, for PLS modeling. Our JAX implementations are end-to-end differentiable, allowing gradient propagation when using PLS as a layer in a deep learning model.
- Use our combination of IKPLS with Engstrøm and Jensen's fast cross-validation algorithm [7] to quickly determine the optimal combination of preprocessing and number of PLS components.
- Use any of the above in combination with sample-weighted PLS [8].
- Use our NumPy or JAX implementations for dimensionality reduction to score space with their respective transform methods.
- Use our NumPy or JAX implementations for reconstruction of the original space from score space with their respective inverse_transform methods (a brief sketch follows this list).
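As a brief sketch of the last two points (assuming, as in the Quick Start below, that fit takes the data matrices and the number of components, that the sample-weight argument is optional, and that transform and inverse_transform take only the data matrix and the score matrix, respectively; consult the documentation for the exact signatures):
import numpy as np

from ikpls.numpy_ikpls import PLS

X = np.random.uniform(size=(100, 50))  # 100 samples, 50 features.
Y = np.random.uniform(size=(100, 10))  # 100 samples, 10 targets.

pls = PLS(algorithm=1)
pls.fit(X, Y, 20)  # Fit with 20 PLS components.
T = pls.transform(X)  # Project X onto the 20-dimensional score space.
X_reconstructed = pls.inverse_transform(T)  # Map the scores back to the original feature space.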
The documentation is available at https://ikpls.readthedocs.io/en/latest/; examples can be found at https://github.com/Sm00thix/IKPLS/tree/main/examples.
Fast Cross-Validation
In addition to the standalone IKPLS implementations, this package
contains an implementation of IKPLS combined with the novel, fast cross-validation
algorithm by Engstrøm and Jensen [7]. The fast cross-validation
algorithm benefits both IKPLS algorithms, especially Algorithm #2, and is
mathematically equivalent to classical cross-validation while being much quicker.
The fast cross-validation algorithm correctly handles (column-wise)
centering and scaling of the $\mathbf{X}$ and $\mathbf{Y}$ input matrices using training set means and
standard deviations, avoiding data leakage from the validation set. This centering
and scaling can be enabled or disabled independently of each other and for $\mathbf{X}$ and $\mathbf{Y}$
by setting the parameters center_X, center_Y, scale_X, and scale_Y, respectively.
In addition to correctly handling (column-wise) centering and scaling,
the fast cross-validation algorithm correctly handles row-wise preprocessing
that operates independently on each sample, such as (row-wise) centering and scaling
of the $\mathbf{X}$ and $\mathbf{Y}$ input matrices, convolution, or other preprocessing. Row-wise
preprocessing can safely be applied before passing the data to the fast
cross-validation algorithm.
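The fast cross-validation implementation is imported as shown in the Installation section below. A hypothetical usage sketch follows; the center_X, center_Y, scale_X, and scale_Y parameters are as described above, but the name and signature of the cross-validation method, the fold specification, and the metric callback are illustrative assumptions, so consult the documentation for the actual interface:
import numpy as np

from ikpls.fast_cross_validation.numpy_ikpls import PLS as NpPLS_FastCV

N, K, M, A = 100, 50, 10, 20  # Samples, features, targets, PLS components.
X = np.random.uniform(size=(N, K))
Y = np.random.uniform(size=(N, M))
folds = np.arange(N) % 5  # Hypothetical fold assignment giving 5-fold cross-validation.

def mse_per_component(Y_true, Y_pred):
    # Hypothetical metric callback: Y_pred is assumed to hold validation
    # predictions for every number of components up to A, giving one MSE
    # per component count.
    return np.mean((Y_true[np.newaxis] - Y_pred) ** 2, axis=(1, 2))

fast_cv = NpPLS_FastCV(
    algorithm=2, center_X=True, center_Y=True, scale_X=True, scale_Y=True
)
results = fast_cv.cross_validate(X, Y, A, folds, mse_per_component)  # Assumed call.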
Installation
- Install the package for Python 3 using the following command:
  pip3 install ikpls
- Now you can import the NumPy implementations with:
  from ikpls.numpy_ikpls import PLS as NpPLS
  from ikpls.fast_cross_validation.numpy_ikpls import PLS as NpPLS_FastCV
- You can also install the optional JAX dependency to get the JAX implementations of IKPLS:
  pip3 install "ikpls[jax]"
- Now you can import the JAX implementations with:
  from ikpls.jax_ikpls_alg_1 import PLS as JAXPLS_Alg_1
  from ikpls.jax_ikpls_alg_2 import PLS as JAXPLS_Alg_2
Prerequisites for JAX
The JAX implementations support running on CPU, GPU, and TPU.
- To enable NVIDIA GPU execution, install JAX and CUDA with:
  pip3 install -U "jax[cuda13]"
- To enable Google Cloud TPU execution, install JAX with:
  pip3 install -U "jax[tpu]"
These are the typical installation instructions that most users need. For customized installations, follow the instructions in the JAX Installation Guide.
To ensure that the JAX implementations use float64, set the environment
variable JAX_ENABLE_X64=True, as described in JAX's Common Gotchas.
Alternatively, float64 can be enabled with the following function call:
import jax
jax.config.update("jax_enable_x64", True)
Quick Start
Use the ikpls package for PLS modeling
import numpy as np

from ikpls.numpy_ikpls import PLS

N = 100  # Number of samples.
K = 50  # Number of features.
M = 10  # Number of targets.
A = 20  # Number of latent variables (PLS components).

X = np.random.uniform(size=(N, K))  # Predictor variables.
Y = np.random.uniform(size=(N, M))  # Target variables.
w = np.random.uniform(size=(N,))  # Sample weights.

# The other PLS algorithms and implementations have the same interface for fit()
# and predict(). The fast cross-validation implementation with IKPLS has a
# different interface.
np_ikpls_alg_1 = PLS(algorithm=1)
np_ikpls_alg_1.fit(X, Y, A, w)

# Has shape (A, N, M) = (20, 100, 10). Contains a prediction for all possible
# numbers of components up to and including A.
y_pred = np_ikpls_alg_1.predict(X)

# Has shape (N, M) = (100, 10).
y_pred_20_components = np_ikpls_alg_1.predict(X, n_components=20)
(y_pred_20_components == y_pred[19]).all()  # True

# The internal model parameters can be accessed as follows:

# Regression coefficients tensor of shape (A, K, M) = (20, 50, 10).
np_ikpls_alg_1.B

# X weights matrix of shape (K, A) = (50, 20).
np_ikpls_alg_1.W

# X loadings matrix of shape (K, A) = (50, 20).
np_ikpls_alg_1.P

# Y loadings matrix of shape (M, A) = (10, 20).
np_ikpls_alg_1.Q

# X rotations matrix of shape (K, A) = (50, 20).
np_ikpls_alg_1.R

# Mapping from n_components to the Y rotations matrix of shape (M, n_components).
# This is not required to compute np_ikpls_alg_1.B and is therefore lazily
# evaluated and cached.
np_ikpls_alg_1.R_Y

# Y rotations matrix of shape (M, A) = (10, 20).
np_ikpls_alg_1.R_Y[20]  # R_Y is now cached for 20 components.

# Y rotations matrix for 19 components of shape (M, 19) = (10, 19).
# This is NOT the same as np_ikpls_alg_1.R_Y[20][:, :19].
np_ikpls_alg_1.R_Y[19]  # R_Y is now cached for 20 and 19 components.

# X scores matrix of shape (N, A) = (100, 20).
# This is only computed for IKPLS Algorithm #1.
np_ikpls_alg_1.T
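The JAX implementations can be used in much the same way. Below is a minimal sketch, assuming the JAX classes share the fit() and predict() interface shown above and that the default constructor arguments are acceptable; consult the documentation for the full set of options:
import jax
import numpy as np

from ikpls.jax_ikpls_alg_1 import PLS as JAXPLS_Alg_1

jax.config.update("jax_enable_x64", True)  # Use float64 as recommended above.

X = np.random.uniform(size=(100, 50))  # 100 samples, 50 features.
Y = np.random.uniform(size=(100, 10))  # 100 samples, 10 targets.

jax_pls_alg_1 = JAXPLS_Alg_1()
jax_pls_alg_1.fit(X, Y, 20)  # Fit with 20 PLS components.
y_pred = jax_pls_alg_1.predict(X, n_components=20)  # Has shape (100, 10).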
Examples
In examples, you will find:
- Fit and Predict with NumPy.
- Fit and Predict with JAX.
- Cross-validate with NumPy.
- Cross-validate with NumPy and scikit-learn.
- Cross-validate with NumPy and fast cross-validation.
- Cross-validate with NumPy and weighted fast cross-validation.
- Cross-validate with JAX.
- Compute the gradient of a preprocessing convolution filter with respect to the RMSE between the target value and the value predicted by PLS after fitting with JAX.
- Weighted Fit and Predict with NumPy.
- Weighted Fit and Predict with JAX.
- Weighted cross-validation with NumPy.
- Weighted cross-validation with JAX.
- Fit, transform to score space, and inverse transform with NumPy.
- Fit, transform to score space, and inverse transform with JAX.
Changelog
See CHANGELOG.md for version history and release notes.
Contribute
To contribute, please read the Contribution Guidelines.
References
- [1] Dayal, B. S. and MacGregor, J. F. (1997). Improved PLS algorithms. Journal of Chemometrics, 11(1), 73-85.
- [2] Alin, A. (2009). Comparison of PLS algorithms when the number of objects is much larger than the number of variables. Statistical Papers, 50, 711-720.
- [3] Andersson, M. (2009). A comparison of nine PLS1 algorithms. Journal of Chemometrics, 23(10), 518-529.
- [4] NumPy
- [5] scikit-learn
- [6] JAX
- [7] Engstrøm, O.-C. G. and Jensen, M. H. (2025). Fast Partition-Based Cross-Validation With Centering and Scaling for $\mathbf{X}^\mathbf{T}\mathbf{X}$ and $\mathbf{X}^\mathbf{T}\mathbf{Y}$. Journal of Chemometrics.
- [8] Becker and Ismail (2016). Accounting for sampling weights in PLS path modeling: Simulations and empirical examples. European Management Journal, 34(6), 606-617.
Funding
- Until May 31st, 2025, this work was carried out as part of an industrial PhD project funded by FOSS Analytical A/S and Innovation Fund Denmark under grant number 1044-00108B.
- From June 1st, 2025, onward, this work is sponsored by FOSS Analytical A/S.
File details
Details for the file ikpls-4.0.1.post1.tar.gz.
File metadata
- Download URL: ikpls-4.0.1.post1.tar.gz
- Upload date:
- Size: 37.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | bcf00d08e34505d9ed4557e7e23590266c534a1f25de1f21a315b624c8298135 |
| MD5 | 96c197882d35c52091365ac93e5bcac0 |
| BLAKE2b-256 | 8bd6458f19a9844edc206b263f6bfc6b0470867406fe1f7d1c14b7a1ebc3f075 |
Provenance
The following attestation bundles were made for ikpls-4.0.1.post1.tar.gz:
Publisher: package_workflow.yml on sm00thix/ikpls
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: ikpls-4.0.1.post1.tar.gz
- Subject digest: bcf00d08e34505d9ed4557e7e23590266c534a1f25de1f21a315b624c8298135
- Sigstore transparency entry: 896554694
- Sigstore integration time:
- Permalink: sm00thix/ikpls@e038d1fbc5aa19c48d9e243f1c73c1524bd13571
- Branch / Tag: refs/heads/main
- Owner: https://github.com/sm00thix
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: package_workflow.yml@e038d1fbc5aa19c48d9e243f1c73c1524bd13571
- Trigger Event: workflow_run
File details
Details for the file ikpls-4.0.1.post1-py3-none-any.whl.
File metadata
- Download URL: ikpls-4.0.1.post1-py3-none-any.whl
- Upload date:
- Size: 43.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 318ff5f0dc935709aa042096569f5c39f0c9291cb9197307f1bee246fd9af3bc |
| MD5 | ff9a1d8556802861ff83925aa3fbfd2d |
| BLAKE2b-256 | c6ad3fa56d595bb7d169a3cf1c461799b5a91bd3b929bb9d3973855ddab156af |
Provenance
The following attestation bundles were made for ikpls-4.0.1.post1-py3-none-any.whl:
Publisher: package_workflow.yml on sm00thix/ikpls
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: ikpls-4.0.1.post1-py3-none-any.whl
- Subject digest: 318ff5f0dc935709aa042096569f5c39f0c9291cb9197307f1bee246fd9af3bc
- Sigstore transparency entry: 896554779
- Sigstore integration time:
- Permalink: sm00thix/ikpls@e038d1fbc5aa19c48d9e243f1c73c1524bd13571
- Branch / Tag: refs/heads/main
- Owner: https://github.com/sm00thix
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: package_workflow.yml@e038d1fbc5aa19c48d9e243f1c73c1524bd13571
- Trigger Event: workflow_run