
Towards Efficient Updating of Truncated Singular Value Decomposition

Project description

IncSVD: A python package for dynamic Truncated SVD

Paper: https://arxiv.org/pdf/2401.09703.pdf

IncSVD is a Python package for incrementally maintaining the truncated singular value decomposition (truncated SVD) of evolving matrices.

This is the code implementation of the paper "Fast Updating Truncated SVD for Representation Learning with Sparse Matrices" (ICLR'24).

Install and Setup

We recommend Python 3.8 or higher. Install IncSVD with the following command.

pip install IncSVD

User Guide

Initialize IncSVD with an $m \times n$ matrix data_init and the rank (truncated dimension) k at which the SVD should be maintained. The following example initializes a $1000 \times 1000$ matrix with truncated dimension $k = 64$.

class EvolvingMatrix(data, k, sparse = True, method = "RPI")

  • data: data matrix Data matrix used to initialize the truncated SVD. It can be an np.ndarray or a scipy.sparse matrix.
  • k: int Truncated dimension of the maintained SVD. It should be a positive integer no greater than either dimension of the initial matrix.
  • sparse: bool, default=True If True, updates use the accelerated methods proposed in [1]; if False, the original methods are used (Zha-Simon's [2], Vecharynski's [3], Yamazaki's [4]).
  • method: {'ZhaSimon', 'GKL', 'RPI'}, default='RPI' Method used for the SVD update.
import scipy.sparse
import IncSVD.EvolvingMatrix as EM

m, n, k = 1000, 1000, 64
data_init = scipy.sparse.rand(m, n, density=0.005).tocsr()
M = EM.EvolvingMatrix(data_init, k, sparse=True, method="ZhaSimon")

[!TIP]

We recommend using GKL or RPI when the rank of each update matrix is large (e.g., when adding many rows at once).
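For intuition on why these methods help, here is a plain NumPy sketch of randomized power iteration (illustrative only, not IncSVD's internal code): an update matrix E is compressed to a rank-l orthonormal basis Q, refined by t power iterations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a tall update matrix E with quickly decaying singular values.
U = np.linalg.qr(rng.standard_normal((1000, 40)))[0]
V = np.linalg.qr(rng.standard_normal((40, 40)))[0]
E = (U * 0.8 ** np.arange(40)) @ V.T           # 1000 x 40, singular values 0.8^i

l, t = 20, 3                                   # sketch rank and iteration count
Q = np.linalg.qr(E @ rng.standard_normal((40, l)))[0]   # initial rank-l sketch
for _ in range(t):
    Q = np.linalg.qr(E @ (E.T @ Q))[0]         # power iterations refine span(Q)

# Relative error of projecting E onto the rank-l subspace span(Q)
rel_err = np.linalg.norm(E - Q @ (Q.T @ E)) / np.linalg.norm(E)
```

Larger t drives span(Q) toward the leading left singular subspace of E, which is why a larger t is more accurate but more expensive.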

Row / Column Update

Expand the original matrix in row or column dimensions and update the corresponding SVD results.


add_row(update_matrix, l=None, t=None)

  • update_matrix: data matrix Data matrix to be appended. It should have the same number of columns as the original matrix. If sparse is True, it should be a scipy.sparse.csr_matrix.
  • l: {int, None} Approximate dimension for the update matrix. GKL and RPI require an approximation space, so l must be set when using them: an update matrix of rank s is approximated in a smaller space of rank l for further speedup. Leave as None when neither GKL nor RPI is used.
  • t: {int, None} Number of random power iterations to perform; must be set when using RPI. The approximation is more accurate for larger t, at a higher computational cost. Leave as None when RPI is not used.

add_column(update_matrix, l=None, t=None)

  • update_matrix: data matrix Data matrix to be appended. It should have the same number of rows as the original matrix. If sparse is True, it should be a scipy.sparse.csc_matrix.
  • l: {int, None} Approximate dimension for the update matrix. Leave as None when neither GKL nor RPI is used.
  • t: {int, None} Number of random power iterations to perform. Leave as None when RPI is not used.
''' Add rows '''
data_append_row = scipy.sparse.rand(10, n, density=0.005).tocsr() * 10
M.add_row(data_append_row)

''' Add columns '''
data_append_col = scipy.sparse.rand(m, 10, density=0.005).tocsc() * 10
M.add_column(data_append_col)
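For intuition, the classic Zha-Simon row update can be sketched in a few lines of dense NumPy (illustrative only; IncSVD adds the sparsity-aware accelerations of [1] on top of schemes like this). Appending rows E to a matrix with rank-k factors Uk, Sk, Vk reduces to the SVD of a small (k+s) x (k+s) core matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

m, n, k, s = 100, 80, 30, 5
A = rng.standard_normal((m, n))
E = rng.standard_normal((s, n))          # rows to append, same n as A

U, S, Vt = np.linalg.svd(A, full_matrices=False)
Uk, Sk, Vk = U[:, :k], S[:k], Vt[:k].T   # current rank-k factors

# Split E into its component in span(Vk) and the orthogonal remainder
EVk = E @ Vk                              # s x k
R_t = E - EVk @ Vk.T                      # remainder, orthogonal to span(Vk)
Q, R = np.linalg.qr(R_t.T)                # Q: n x s, R: s x s

# Small (k+s) x (k+s) core matrix whose SVD updates the factors
F = np.block([[np.diag(Sk), np.zeros((k, s))],
              [EVk,         R.T]])
Fu, Fs, Fvt = np.linalg.svd(F)

Uk_new = np.block([[Uk, np.zeros((m, s))],
                   [np.zeros((s, k)), np.eye(s)]]) @ Fu[:, :k]
Sk_new = Fs[:k]
Vk_new = np.hstack([Vk, Q]) @ Fvt[:k].T
```

Only the small core matrix F is decomposed from scratch; the large factors are updated by matrix products, which is what makes incremental maintenance cheap.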

[!NOTE]

If sparse is True:

  • [Add row] data_append_row must have the same number of columns as the original matrix and be a scipy.sparse.csr_matrix.
  • [Add column] data_append_col must have the same number of rows as the original matrix and be a scipy.sparse.csc_matrix.

Low-rank Update

Perform low-rank update $\overline{\mathbf{A}} = \mathbf{A} + \mathbf{D}_m \cdot \mathbf{E}_m^{\top}$, where $\mathbf{A}$ is the original matrix, $\mathbf{D}_m \in \mathbb{R}^{m \times s}$ and $\mathbf{E}_m \in \mathbb{R}^{n \times s}$.
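As a dimension check in plain NumPy (hypothetical shapes, not IncSVD code): with $\mathbf{D}_m \in \mathbb{R}^{m \times s}$ and $\mathbf{E}_m \in \mathbb{R}^{n \times s}$, the term $\mathbf{D}_m \mathbf{E}_m^{\top}$ is an $m \times n$ perturbation of rank at most $s$.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, s = 100, 80, 5
A  = rng.standard_normal((m, n))
Dm = rng.standard_normal((m, s))       # m x s
Em = rng.standard_normal((n, s))       # n x s

A_bar = A + Dm @ Em.T                  # low-rank update of A

# The perturbation has the same shape as A but rank at most s,
# which is what allows the SVD to be updated cheaply.
assert A_bar.shape == A.shape
assert np.linalg.matrix_rank(A_bar - A) <= s
```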


update_weight(update_matrix_B, update_matrix_C, l=None, t=None)

  • update_matrix_B: data matrix The matrix $\mathbf{D}_m$ of the update. It should have the same number of rows as the original matrix ($m \times s$). If sparse is True, it should be a scipy.sparse.csc_matrix.
  • update_matrix_C: data matrix The matrix $\mathbf{E}_m$ of the update. It should have as many rows as the original matrix has columns ($n \times s$). If sparse is True, it should be a scipy.sparse.csc_matrix.
  • l: {int, None} Approximate dimension for the update matrix. Leave as None when neither GKL nor RPI is used.
  • t: {int, None} Number of random power iterations to perform. Leave as None when RPI is not used.
''' Low-rank update '''
data_dm = scipy.sparse.rand(m, 10, density=0.005).tocsc() * 10
data_em = scipy.sparse.rand(n, 10, density=0.005).tocsc() * 10
M.update_weight(data_dm, data_em)

[!NOTE]

If sparse is True, data_dm and data_em are required to be of type scipy.sparse.csc_matrix.

Get the Result

Query the current SVD result.

import numpy as np

''' Get the result. '''
Uk, Sigmak, Vk = M.Uk, M.Sigmak, M.Vk

''' Get a single row of the singular vectors. '''
i = 0
Uk0, Vk0 = M.Uki(i), M.Vki(i)

''' A low-rank approximation of the data matrix can be obtained from: '''
Ak = Uk @ np.diag(Sigmak) @ Vk.T

Contact Us

If you have any questions about this code repository, feel free to raise an issue or email denghaoran@zju.edu.cn.

Citations

If you find our work useful, please consider citing the following paper:

@inproceedings{deng2024incsvd,
title={Fast Updating Truncated {SVD} for Representation Learning with Sparse Matrices},
author={Haoran Deng and Yang Yang and Jiahe Li and Cheng Chen and Weihao Jiang and Shiliang Pu},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=CX2RgsS29V}
}
