
1.1 Singular Value Decomposition

Given an $m \times n$ matrix $A$, where $m \geq n$ and $\mathrm{rank}(A) = r$, the singular value decomposition of $A$, denoted by SVD($A$), is defined as

$$A = U \Sigma V^{T},$$

where $U^{T}U = I_m$, $V^{T}V = I_n$, and $\Sigma = \mathrm{diag}(\sigma_1, \ldots, \sigma_n)$ with $\sigma_i > 0$ for $1 \leq i \leq r$ and $\sigma_j = 0$ for $j \geq r+1$. The first $r$ columns of the orthogonal matrices $U$ and $V$ define the orthonormal eigenvectors associated with the $r$ nonzero eigenvalues of $AA^{T}$ and $A^{T}A$, respectively. The columns of $U$ and $V$ are referred to as the left and right singular vectors, respectively. The singular values of $A$ are defined as the diagonal elements of $\Sigma$, which are the non-negative square roots of the $n$ eigenvalues of $A^{T}A$. A discussion of the properties and applications of the SVD can be found elsewhere [11,21].
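These defining properties can be checked numerically. The following NumPy sketch uses an illustrative random $5 \times 3$ matrix (the dimensions and seed are assumptions, not from the source) and verifies that the singular values are the non-negative square roots of the eigenvalues of $A^{T}A$ and that $U$ and $V$ are orthogonal:

```python
import numpy as np

# Hypothetical 5x3 matrix (m >= n), used only for illustration.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# Full SVD: A = U @ Sigma @ V^T, with U (m x m) and V (n x n) orthogonal.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# The singular values are the non-negative square roots of the
# eigenvalues of A^T A (eigvalsh returns ascending order, so reverse).
eigvals = np.linalg.eigvalsh(A.T @ A)[::-1]
assert np.allclose(s, np.sqrt(np.clip(eigvals, 0, None)))

# U and V are orthogonal: U^T U = I_m and V^T V = I_n.
assert np.allclose(U.T @ U, np.eye(5))
assert np.allclose(Vt @ Vt.T, np.eye(3))
```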

The SVD can reveal important information about the structure of a matrix, as illustrated by the following well-known theorem [3].

Theorem. Let the SVD of $A$ be given as above. Then:

1. (rank property) $\mathrm{rank}(A) = r$, the number of nonzero singular values of $A$; and
2. (dyadic decomposition) $A = \sum_{i=1}^{r} \sigma_i u_i v_i^{T}$, where $u_i$ and $v_i$ denote the $i$-th columns of $U$ and $V$, respectively.
The rank property illustrates how the singular values of A can be used as quantitative measures of the qualitative notion of rank. The dyadic decomposition, which is the rationale for data reduction or compression in many scientific applications, provides a canonical description of a matrix as a sum of r rank-one matrices of decreasing importance, as measured by the singular values.
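The dyadic decomposition and its use for compression can be sketched in NumPy (the matrix and truncation level $k$ are illustrative assumptions). Summing all $r$ rank-one terms reproduces $A$ exactly, while truncating the sum after the $k$ largest singular values keeps only the most important terms:

```python
import numpy as np

# Hypothetical 6x4 matrix, used only for illustration.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Dyadic decomposition: A is the sum of r rank-one matrices
# sigma_i * u_i v_i^T, ordered by decreasing singular value.
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(len(s)))
assert np.allclose(A, A_rebuilt)

# Compression: truncating the sum after k terms gives a rank-k
# approximation whose 2-norm error is the (k+1)-st singular value.
k = 2
A_k = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(k))
assert np.isclose(np.linalg.norm(A - A_k, 2), s[k])
```

The final assertion reflects why the terms are "of decreasing importance": the error incurred by dropping the tail of the sum is governed by the first singular value discarded.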

Michael W. Berry
Sun May 19 11:34:27 EDT 1996