
#### Selective Reorthogonalization.

As the size $j$ of the basis grows, full reorthogonalization needs substantial arithmetic work, and we look for a more economical scheme. It can be proved that most of the desired properties of the Lanczos algorithm are preserved as long as the basis $V_j$ is semiorthogonal, i.e., orthogonal to half the machine precision,

$$W_j = V_j^* V_j = I_j + E, \qquad \|E\| < \sqrt{\epsilon_M}. \tag{32}$$

The reader is referred to the investigation by Simon [402] for a detailed discussion of these matters. It is proved there that if the tridiagonal matrix $T_j$ is computed by a Lanczos algorithm with a semiorthogonal basis $V_j$ (4.18), then $T_j$ is a fully accurate projection of $A$ onto the subspace spanned by the computed basis $V_j$,

$$T_j = N_j^* A N_j + G_j, \qquad \|G_j\| = O(\epsilon_M \|A\|).$$

Here $N_j$ is an orthonormal basis of the subspace spanned by $V_j$. Even if $N_j$ is not known explicitly, we know that the eigenvalues of $T_j$ are accurate approximations to those of $A$.
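The semiorthogonality condition (32) is easy to observe experimentally. The sketch below (the helper `lanczos` and the random test matrix are illustrative assumptions, implementing the basic three-term recursion without any reorthogonalization) measures $\|I_j - V_j^* V_j\|$ for a modest basis size, where in double precision it typically still satisfies (32):

```python
import numpy as np

def lanczos(A, v0, m):
    """Basic Lanczos recursion, no reorthogonalization (illustrative sketch)."""
    n = A.shape[0]
    V = np.zeros((n, m + 1))
    alpha, beta = np.zeros(m), np.zeros(m)
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A @ V[:, j]                      # A v_j
        alpha[j] = V[:, j] @ w               # alpha_j = v_j^* A v_j
        w -= alpha[j] * V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]   # three-term recurrence
        beta[j] = np.linalg.norm(w)
        V[:, j + 1] = w / beta[j]
    return V, alpha, beta

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                            # Hermitian (real symmetric) test matrix

V, _, _ = lanczos(A, rng.standard_normal(n), 20)
loss = np.linalg.norm(np.eye(V.shape[1]) - V.T @ V)
print(loss < np.sqrt(np.finfo(float).eps))   # semiorthogonality (32) still holds
```

For a larger basis, or a matrix whose extreme eigenvalues converge quickly, `loss` eventually exceeds $\sqrt{\epsilon_M}$, which is what the monitoring scheme below is designed to detect.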

One way of making sure that the computed basis is semiorthogonal is to use a recurrence derived by Paige to monitor how losses of orthogonality in earlier steps propagate to later steps, and to reorthogonalize when this recurrence indicates that a threshold of $\sqrt{\epsilon_M}$ is exceeded. The recurrence is between successive elements $\omega_{j,k} = v_k^* v_j$ of the product matrix $W_j$ (4.18) and is derived from the basic recursion (4.10) in the following way.

First we note that the computed Lanczos vector $v_{j+1}$ satisfies (4.10):

$$v_{j+1}\beta_j = A v_j - v_j\alpha_j - v_{j-1}\beta_{j-1} - f_j,$$

where $f_j$ is a vector of rounding errors. To get elements of the product matrix $W_j$ (4.18), we premultiply this with $v_k^*$ and get

$$\omega_{j+1,k}\beta_j = v_k^* A v_j - \omega_{j,k}\alpha_j - \omega_{j-1,k}\beta_{j-1} - v_k^* f_j.$$

The same multiplication with the indices $j$ and $k$ exchanged gives

$$\omega_{k+1,j}\beta_k = v_j^* A v_k - \omega_{j,k}\alpha_k - \omega_{k-1,j}\beta_{k-1} - v_j^* f_k,$$

and subtracting the two gives

$$\omega_{j+1,k}\beta_j = \omega_{k+1,j}\beta_k + \omega_{j,k}(\alpha_k - \alpha_j) + \omega_{k-1,j}\beta_{k-1} - \omega_{j-1,k}\beta_{j-1} + v_j^* f_k - v_k^* f_j. \tag{33}$$

Note that the terms involving $A$ cancel out, since for a Hermitian matrix $v_k^* A v_j = (v_j^* A v_k)^*$.
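The recursion (33) can be checked numerically. Running a plain Lanczos process (the helper below is an illustrative sketch, not part of the text) and forming the true inner products $\omega_{j,k} = v_k^* v_j$, the two sides of (33) agree up to the rounding-error terms $v_j^* f_k - v_k^* f_j$, which are of order $\epsilon_M\|A\|$:

```python
import numpy as np

def lanczos(A, v0, m):
    """Basic Lanczos recursion, no reorthogonalization (illustrative sketch)."""
    n = A.shape[0]
    V = np.zeros((n, m + 1))
    alpha, beta = np.zeros(m), np.zeros(m)
    V[:, 0] = v0 / np.linalg.norm(v0)
    for i in range(m):
        w = A @ V[:, i]
        alpha[i] = V[:, i] @ w
        w -= alpha[i] * V[:, i]
        if i > 0:
            w -= beta[i - 1] * V[:, i - 1]
        beta[i] = np.linalg.norm(w)
        V[:, i + 1] = w / beta[i]
    return V, alpha, beta

rng = np.random.default_rng(1)
n = 200
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                             # Hermitian test matrix

V, alpha, beta = lanczos(A, rng.standard_normal(n), 15)
W = V.T @ V                                   # true omega_{j,k} = v_k^* v_j

j, k = 10, 5                                  # 0-based step indices, k < j
lhs = W[j + 1, k] * beta[j]
rhs = (W[k + 1, j] * beta[k] + W[j, k] * (alpha[k] - alpha[j])
       + W[k - 1, j] * beta[k - 1] - W[j - 1, k] * beta[j - 1])
print(abs(lhs - rhs))                         # at the rounding error level
```

The point of the monitoring scheme is that the right-hand side involves only rows $j-1$ and $j$ of $W$, so row $j+1$ can be estimated without touching the stored basis vectors.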

We use this recursion to fill the lower triangular part of the symmetric matrix $W_j$ one row at a time. The diagonal elements $\omega_{j,j} = 1$, and the elements closest above and below the diagonal, $\omega_{j,j\pm 1}$, are at the rounding error level, due to symmetry and the explicit orthogonalization of successive vectors. This gives starting values for the recursion (4.19). We estimate the absolute value of the sum of the two last terms in (4.19) by $2\epsilon\|A\|$, with an appropriate guess for the norm of $A$, and feed these values, which are also at the rounding error level, into the recursion with the sign chosen so that the absolute value of the new $\omega_{j+1,k}$ is maximized,

$$\omega_{j+1,k} = \left[\omega_{k+1,j}\beta_k + \omega_{j,k}(\alpha_k - \alpha_j) + \omega_{k-1,j}\beta_{k-1} - \omega_{j-1,k}\beta_{j-1} \pm 2\epsilon\|A\|\right]/\beta_j.$$
As soon as one element $\omega_{j+1,k}$ exceeds the threshold $\sqrt{\epsilon_M}$, we orthogonalize the two current vectors $v_j$ and $v_{j+1}$ against all the previous vectors $v_k$, and put the corresponding estimates $\omega_{j,k}$ and $\omega_{j+1,k}$ down at the rounding error level. We need to orthogonalize two vectors because of the three-term recurrence in the Lanczos method and, as a consequence, in the recursion (4.19). It is not necessary to store more than the two latest rows, $\omega_{j,k}$ and $\omega_{j+1,k}$, of the matrix of estimates $W$.
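The whole monitoring scheme can be sketched as follows. All names and the particular choices here (the 1-norm as the guess for $\|A\|$, the $2\epsilon\|A\|$ error feed, and orthogonalizing the newest vector on two successive steps once the threshold is passed) are illustrative assumptions, not a prescribed implementation:

```python
import numpy as np

def lanczos_periodic(A, v0, m):
    """Lanczos with orthogonality monitoring via the recursion (4.19)
    and periodic reorthogonalization (a sketch, not production code)."""
    n = A.shape[0]
    eps = np.finfo(float).eps
    threshold = np.sqrt(eps)                  # semiorthogonality threshold
    normA = np.linalg.norm(A, 1)              # rough guess for ||A||
    V = np.zeros((n, m + 1))
    alpha, beta = np.zeros(m), np.zeros(m)
    V[:, 0] = v0 / np.linalg.norm(v0)
    omega_old = np.zeros(m + 1)               # row j-1 of the estimates
    omega_new = np.zeros(m + 1)               # row j; omega_{0,0} = 1
    omega_new[0] = 1.0
    pending = False                           # orthogonalize next step as well
    for j in range(m):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]
        beta[j] = np.linalg.norm(w)
        # estimate row j+1 from the recursion, feeding in +/- 2*eps*||A||
        # with the sign that maximizes the new estimate
        omega_next = np.zeros(m + 1)
        omega_next[j + 1] = 1.0
        omega_next[j] = eps                   # local orthogonality is explicit
        for k in range(j):
            s = omega_new[k + 1] * beta[k] + omega_new[k] * (alpha[k] - alpha[j])
            if k > 0:
                s += omega_new[k - 1] * beta[k - 1]
            if j > 0:
                s -= omega_old[k] * beta[j - 1]
            omega_next[k] = (s + np.copysign(2 * eps * normA, s)) / beta[j]
        trigger = j > 0 and np.max(np.abs(omega_next[:j])) > threshold
        if trigger or pending:
            # orthogonalize the newest vector against all previous ones;
            # a pair of gemv-type (Level 2 BLAS) operations
            Vp = V[:, :j + 1]
            w -= Vp @ (Vp.T @ w)
            beta[j] = np.linalg.norm(w)
            omega_next[:j + 1] = eps          # estimates reset to eps level
        pending = trigger                     # cover the second of the two vectors
        V[:, j + 1] = w / beta[j]
        omega_old, omega_new = omega_new, omega_next
    return V, alpha, beta

# a matrix with one dominant eigenvalue loses orthogonality quickly
A = np.diag(np.concatenate(([100.0], np.linspace(0.0, 1.0, 99))))
rng = np.random.default_rng(2)
V, alpha, beta = lanczos_periodic(A, rng.standard_normal(100), 30)
loss = np.linalg.norm(np.eye(V.shape[1]) - V.T @ V)
print(loss)                                   # stays at the semiorthogonal level
```

Because only the two latest estimate rows are kept, the monitoring itself costs $O(j)$ per step; the reorthogonalizations, when triggered, are the Level 2 BLAS operations mentioned below.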

The above is one of the techniques described in [402]. There is another that applies explicit orthogonalization only to those vectors $v_k$ for which the estimate $\omega_{j+1,k}$ exceeds the threshold. This is what Simon [402] called partial reorthogonalization; the variant described here, periodic reorthogonalization, which was first described by Grcar [204], is simpler to implement and can use Level 2 BLAS operations.

Susan Blackford 2000-11-20