The relative performance of the CSI-MSVD algorithm is compared with that of other commonly used eigenvalue methods [11] for computing the SVD of sparse matrices. As discussed in [3,22], matrix-vector multiplications are the fundamental and most expensive operations in both iterative methods (CSI) and Krylov subspace methods (Lanczos, Arnoldi). Since the test matrices are large, sparse, and unstructured, the memory requirements of any candidate scheme should also be kept to a minimum.
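To make the cost model concrete, the sketch below shows a sparse matrix-vector product in compressed sparse row (CSR) storage, the kernel that dominates the runtime of the iterative and Krylov methods discussed above. The matrix and storage layout here are a small illustrative example, not one of the paper's test matrices; only the nonzero entries are stored, which is why memory usage scales with the number of nonzeros rather than with the matrix dimensions.

```python
import numpy as np

def csr_matvec(data, indices, indptr, x):
    """Compute y = A @ x for a matrix A stored in CSR format.

    data    -- nonzero values, row by row
    indices -- column index of each nonzero
    indptr  -- indptr[i]:indptr[i+1] delimits the nonzeros of row i
    """
    y = np.zeros(len(indptr) - 1)
    for i in range(len(y)):
        for k in range(indptr[i], indptr[i + 1]):
            y[i] += data[k] * x[indices[k]]
    return y

# A = [[4, 0, 1],
#      [0, 3, 0]]  -- only the three nonzeros are stored
data    = np.array([4.0, 1.0, 3.0])
indices = np.array([0, 2, 1])
indptr  = np.array([0, 2, 3])
x = np.array([1.0, 2.0, 3.0])
print(csr_matvec(data, indices, indptr, x))   # [7. 6.]
```

Each iteration of CSI or Lanczos requires one such product (two, with the transpose, when working directly with a rectangular matrix), so the per-iteration cost is proportional to the number of nonzeros.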

Another aspect of algorithm performance is the magnitude of the error in the approximate singular triplet. When the SVD is obtained from the two-cyclic eigenvalue problem, the error in the singular triplet can be translated into the error in the eigenpair approximations obtained from CSI-MSVD [22]. Following Section 3.5, the error in CSI-MSVD can be approximated by the norm of a residual whose two terms are defined in Equation (32) and Equation (37), respectively.
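The connection between eigenpair error and singular-triplet error rests on the two-cyclic formulation: the symmetric augmented matrix B = [[0, A], [A^T, 0]] has eigenvalues that are plus and minus the singular values of A, with eigenvectors built from the corresponding singular vectors. The snippet below is a minimal numerical check of this standard fact (it is not the paper's algorithm); the random matrix is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
m, n = A.shape

# Two-cyclic augmented matrix: symmetric, so its eigenproblem is
# well conditioned and amenable to Lanczos-type methods.
B = np.block([[np.zeros((m, m)), A],
              [A.T, np.zeros((n, n))]])

sigma = np.linalg.svd(A, compute_uv=False)   # singular values of A
eigs = np.sort(np.linalg.eigvalsh(B))        # eigenvalues of B, ascending

# The n largest eigenvalues of B coincide with the singular values of A
# (the remaining spectrum is the negated singular values plus m - n zeros).
print(np.allclose(eigs[-n:], np.sort(sigma)))   # True
```

Because the map from eigenpairs of B to singular triplets of A is direct, a bound on the eigenpair residual of the CSI-MSVD iterates immediately bounds the error in the computed singular triplets.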

Michael W. Berry (berry@cs.utk.edu)

Sun May 19 11:34:27 EDT 1996