A simple variant of a PCG method for the pencil $B - \mu A$ can be written as
$$x^{(i+1)} = w^{(i)} + \tau^{(i)} x^{(i)} + \gamma^{(i)} x^{(i-1)}, \qquad w^{(i)} = T\bigl(B x^{(i)} - \mu^{(i)} A x^{(i)}\bigr), \qquad \gamma^{(0)} = 0,$$
where $\mu^{(i)}$ is the Rayleigh quotient of $x^{(i)}$, $T$ is the preconditioner, and $\tau^{(i)}$, $\gamma^{(i)}$ are scalar iteration parameters.
For the locally optimal version of the PCG method, in which the scalar parameters are chosen by the Rayleigh-Ritz method on the trial subspace, we can apply convergence rate estimate (11.13) directly, since the method converges no more slowly than preconditioned steepest ascent in terms of maximizing the Rayleigh quotient.
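As a concrete illustration, the locally optimal three-term iteration can be sketched in Python with NumPy/SciPy. The function name, the dense-matrix setting, and the preconditioner passed as a callable `T` (roughly approximating the inverse of the matrix defining the inner product) are assumptions of this sketch, not part of the original algorithm:

```python
import numpy as np
import scipy.linalg

def locally_optimal_pcg(A, B, T, x0, maxit=500, tol=1e-6):
    """Locally optimal PCG sketch for the pencil B - mu*A: maximize the
    Rayleigh quotient mu(x) = (x, B x) / (x, A x) for symmetric positive
    definite A, B, with preconditioner T (a callable, roughly A^{-1}).
    All names and the stopping rule are illustrative."""
    x = x0 / np.linalg.norm(x0)
    x_prev = None
    mu = (x @ B @ x) / (x @ A @ x)
    for _ in range(maxit):
        r = B @ x - mu * (A @ x)          # residual of the pencil
        if np.linalg.norm(r) < tol:
            break
        w = T(r)                          # preconditioned residual
        # Trial subspace: span{w, x} initially, span{w, x, x_prev} afterwards.
        S = np.column_stack([w, x] if x_prev is None else [w, x, x_prev])
        # Rayleigh-Ritz on the trial subspace: small generalized eigenproblem
        # (S^T B S) y = mu (S^T A S) y; take the largest Ritz pair.
        mus, Y = scipy.linalg.eigh(S.T @ B @ S, S.T @ A @ S)
        x_prev, x = x, S @ Y[:, -1]
        x /= np.linalg.norm(x)
        mu = mus[-1]
    return mu, x
```

On the first step the trial subspace is only the span of the preconditioned residual and the current iterate; afterwards the previous iterate joins the basis, which realizes the three-term recurrence: the scalar parameters are determined implicitly by the Rayleigh-Ritz step rather than by an explicit conjugacy condition.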
We have collected the dominant costs per iteration for Algorithm 11.10 in terms of storage and floating point operations, respectively, in the following tables.
Item | Storage |
Residual | 1 $n$-vector |
Approximate eigenvector | 1 $n$-vector |
Temporary vectors | 3-5 $n$-vectors |
Action | Major cost |
Rayleigh-Ritz method | dot products and matrix-vector products |
Residual | matrix-vector products and a vector update |
Preconditioned residual | depends on the preconditioner |
Approximate eigenvector | vector update |
Conjugate gradient methods for eigenvalue problems were suggested by Bradbury and Fletcher [61] and have recently attracted renewed attention; different versions have been proposed, e.g., [433,394,429,186,464,185,167,48]. They are usually constructed as general conjugate gradient methods applied to minimization of the Rayleigh quotient and do not exploit the specific property of the Rayleigh quotient that local minimization problems can be solved easily by the Rayleigh-Ritz method. They are typically based on two coupled two-term recurrences, now standard for linear systems, where one of the scalar parameters is chosen to make the search directions conjugate in some sense. The peculiarity of Algorithm 11.10 is that it uses a three-term recurrence, which allows us both to choose the scalar parameters by solving a local optimization problem and to avoid the issue of finding conjugate directions altogether.
We finally note that Algorithm 11.10 is somewhat unstable, since the basis of the trial subspace becomes almost linearly dependent near convergence. This can be cured by orthogonalizing the basis prior to applying the Rayleigh-Ritz method.
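A minimal sketch of such an orthogonalization step, assuming NumPy and a plain QR factorization in the Euclidean inner product (practical implementations may instead orthogonalize in the $A$- or $B$-inner product); the helper name and drop tolerance are illustrative:

```python
import numpy as np

def orthonormal_trial_basis(w, x, x_prev=None, drop_tol=1e-12):
    """Orthonormalize the trial basis {w, x, x_prev} before Rayleigh-Ritz,
    dropping directions that have become numerically linearly dependent.
    Helper name and tolerance are illustrative, not from Algorithm 11.10."""
    cols = [w, x] if x_prev is None else [w, x, x_prev]
    S = np.column_stack([c / np.linalg.norm(c) for c in cols])
    Q, R = np.linalg.qr(S)                 # reduced QR of the normalized basis
    keep = np.abs(np.diag(R)) > drop_tol   # crude rank-revealing filter
    return Q[:, keep]
```

The Rayleigh-Ritz step is then applied to the returned orthonormal columns instead of the raw basis, which keeps the small Gram matrices well conditioned even when the current and previous iterates are nearly parallel.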
We present a block version of Algorithm 11.10 in the next subsection.