
Implementation of sparse matrix SVD for GPU

Computational Science Asked on September 5, 2021

I have a sparse matrix $W$ which is almost square ($(N+1) \times N$) and I would like to know the eigenvalues of $A = W^T W$. $A$ is Hermitian, so its eigenvalues are real and non-negative.

The usual approach would be to do svd(W), but I found no sparse SVD implementation for the GPU. I'm working in Python, but I'm fine with any language, hoping I can find C/C++ code to wrap and call.

I've looked into cuSPARSE and cuSOLVER and I only found:

  • eigenvalue solver
  • LU
  • QR
  • Cholesky

$W$ is an $(N+1) \times N$ complex sparse matrix with sparsity $1 - 2^{-M}$ for $M \in \{9, 10, 11\}$.

I've tried using CPU libraries (NumPy and SciPy), but they're very slow since the fraction of nonzero singular values is more than 20% for $M = 9$. I've looked into the randomized solver implemented in scikit-learn, but I cannot use it since that method is not proven to work on complex matrices.

Any tip is more than welcome.

One Answer

I'm no expert on software and certainly not on GPU software, but I can hopefully give some advice of a mathematical nature that might be helpful to you.

Given a matrix $W$, one can embed $W$ in the larger square and Hermitian matrix

$$ B = \begin{bmatrix} 0 & W \\ W^* & 0 \end{bmatrix}. $$

$B$ is a square matrix with twice the number of nonzeros as $W$. The nonzero eigenvalues of $B$ are $\pm$ the nonzero singular values of $W$. If $\sigma$ is a singular value of $W$ satisfying $Wv = \sigma u$ and $W^*u = \sigma v$, then

$$ B \begin{bmatrix} u \\ v \end{bmatrix} = \begin{bmatrix} 0 & W \\ W^* & 0 \end{bmatrix} \begin{bmatrix} u \\ v \end{bmatrix} = \begin{bmatrix} Wv \\ W^* u \end{bmatrix} = \sigma \begin{bmatrix} u \\ v \end{bmatrix}. $$

Thus $\begin{bmatrix} u \\ v \end{bmatrix}$ is an eigenvector of $B$ with eigenvalue $\sigma$. Likewise, $\begin{bmatrix} u \\ -v \end{bmatrix}$ is an eigenvector of $B$ with eigenvalue $-\sigma$. This should allow you to use any Hermitian eigenvalue code to get a (full or truncated) SVD. This embedding might not be as fast as a dedicated SVD code, but it lets you take advantage of existing, highly optimized eigenvalue code to compute an SVD.
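For concreteness, here is a minimal CPU sketch of this embedding with SciPy; the sizes, the random stand-in for $W$, and the choice of `eigsh` are only illustrative, and the same matrix $B$ can be handed to whatever Hermitian eigensolver (GPU or otherwise) you end up using:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

N, M = 1000, 9                       # hypothetical size / sparsity parameter
density = 2.0 ** -M
Wre = sp.random(N + 1, N, density=density, random_state=0)
Wim = sp.random(N + 1, N, density=density, random_state=1)
W = (Wre + 1j * Wim).tocsr()         # stand-in for the actual complex sparse W

# Hermitian embedding B = [[0, W], [W^*, 0]]; twice the nonzeros of W.
B = sp.bmat([[None, W], [W.conj().T, None]], format="csr")

k = 10                               # number of singular values wanted
vals, vecs = spla.eigsh(B, k=k, which="LA")   # largest (algebraic) eigenvalues of B
sigma = np.sort(vals)[::-1]                   # = the k largest singular values of W
eig_A = sigma ** 2                            # eigenvalues of A = W^* W
# Each column of vecs stacks u (first N+1 rows) over v (last N rows).
```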

Also, regarding randomized methods, every method I'm aware of works for both real and complex matrices. Software support for complex values might be a different story.
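For example, the basic randomized range-finder needs only matrix products, a QR factorization, and a small dense SVD, all of which are well defined over the complex numbers. A bare-bones sketch with NumPy (the target rank `k` and oversampling `p` below are arbitrary illustrative choices, not a tuned implementation):

```python
import numpy as np

def randomized_svd(W, k, p=10, seed=0):
    """Bare-bones randomized truncated SVD; W may be real or complex,
    dense or a scipy.sparse matrix. k = target rank, p = oversampling."""
    rng = np.random.default_rng(seed)
    m, n = W.shape
    # Complex Gaussian test matrix, so the range is sampled over the complex field.
    Omega = rng.standard_normal((n, k + p)) + 1j * rng.standard_normal((n, k + p))
    Y = W @ Omega                            # dense (m, k+p) sample of range(W)
    Q, _ = np.linalg.qr(Y)                   # orthonormal basis for that range
    B = (W.conj().T @ Q).conj().T            # small (k+p, n) projection Q^* W
    Uhat, s, Vh = np.linalg.svd(B, full_matrices=False)
    return Q @ Uhat[:, :k], s[:k], Vh[:k, :]
```

The squares of the returned singular values then approximate the leading eigenvalues of $A = W^* W$.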

Answered by eepperly16 on September 5, 2021
