

Transfer Residual Error to Backward Error.

It can be proved that there are Hermitian matrices $E$, e.g.,
\begin{displaymath}
E = -r\wtd x^*-\wtd x r^*
+\left(\wtd x^*A\wtd x-\wtd\lambda \wtd x^*B\wtd x \right)
\wtd x\wtd x^*,
\end{displaymath} (93)

such that $\wtd\lambda$ and $\wtd x$ are an exact eigenvalue and its corresponding eigenvector of $\{A+E,B\}$. We are interested in such matrices $E$ with the smallest possible norms. It turns out that the best possible $E=E_2$ for the spectral norm $\Vert\cdot\Vert _2$ and the best possible $E=E_{F}$ for the Frobenius norm $\Vert\cdot\Vert _{F}$ satisfy
\begin{displaymath}
\Vert E_2\Vert _2=\Vert r\Vert _2, \quad
\Vert E_{F}\Vert _{F}=\sqrt{2\Vert r\Vert^2_2
- (\wtd x^* A\wtd x-\wtd\lambda\wtd x^* B\wtd x)^2}.
\end{displaymath} (94)

See [256,431,473]. In fact, $E_{F}$ is given explicitly by (5.29). So if $\Vert r\Vert _2$ is small, the computed $\wtd\lambda$ and $\wtd x$ are an exact eigenpair of a nearby pair. Error analysis of this kind is called backward error analysis, and the matrices $E$ are called backward errors.
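As a sanity check, the construction in (93) and the Frobenius norm formula in (94) can be verified numerically. The sketch below (plain NumPy; the random test matrices and the perturbation of the Rayleigh quotient are illustrative assumptions, not from the text) builds a Hermitian definite pair, forms $E$ from a unit-norm approximate eigenpair, and confirms that $(\wtd\lambda,\wtd x)$ is an exact eigenpair of $\{A+E,B\}$ up to roundoff.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# Hermitian A and Hermitian positive definite B (hypothetical test matrices).
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2
N = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = N @ N.conj().T + n * np.eye(n)  # diagonal shift keeps B positive definite

# An approximate eigenpair: a unit vector x and a real lam near the
# Rayleigh quotient (perturbed so the correction term in (93) is nonzero).
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
x = x / np.linalg.norm(x)
lam = ((x.conj() @ A @ x) / (x.conj() @ B @ x)).real + 0.01

r = A @ x - lam * (B @ x)        # residual vector
delta = (x.conj() @ r).real      # = x^* A x - lam x^* B x, real here

# Backward error E from (93); Hermitian because delta is real.
E = (-np.outer(r, x.conj()) - np.outer(x, r.conj())
     + delta * np.outer(x, x.conj()))

# (lam, x) should be an exact eigenpair of {A+E, B} up to roundoff,
# and ||E||_F should match the closed form in (94).
residual_of_perturbed = np.linalg.norm((A + E) @ x - lam * (B @ x))
frob_gap = abs(np.linalg.norm(E, 'fro')
               - np.sqrt(2 * np.linalg.norm(r)**2 - delta**2))
```

Note that the cancellation $(A+E)\wtd x = \wtd\lambda B\wtd x$ relies on $\wtd x$ having unit 2-norm, which is why the sketch normalizes $x$ first.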

We say an algorithm that delivers an approximate eigenpair $(\wtd\lambda,\wtd x)$ is $\tau $-backward stable for the pair with respect to the norm $\Vert\cdot\Vert$ if the pair is an exact eigenpair of $\{A+E,B\}$ for some $E$ with $\Vert E\Vert\le\tau$. With this in mind, statements can be made about the backward stability of the algorithm that computes the eigenpair $(\wtd\lambda,\wtd x)$. By convention, an algorithm is called backward stable if $\tau = O(\epsilon_M \Vert(A,B)\Vert)$, where $\epsilon_M$ is the machine precision.
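Since $\Vert E_2\Vert _2=\Vert r\Vert _2$ by (94), a crude backward-stability test reduces to comparing the residual norm against $\epsilon_M$ times a norm of the pair. A minimal sketch follows; the helper name, the constant $c$, and the choice $\Vert(A,B)\Vert = (\Vert A\Vert_F^2+\Vert B\Vert_F^2)^{1/2}$ are assumptions for illustration, not prescribed by the text.

```python
import numpy as np

def is_backward_stable(A, B, lam, x, c=100.0):
    """Crude tau-backward-stability check (hypothetical helper).

    Uses ||E_2||_2 = ||r||_2 from (94) and tests whether
    ||r||_2 <= c * eps_M * ||(A,B)||, with ||(A,B)|| taken as
    sqrt(||A||_F^2 + ||B||_F^2) -- an assumed, convenient choice.
    """
    x = x / np.linalg.norm(x)                 # work with a unit vector
    r = A @ x - lam * (B @ x)                 # residual vector
    norm_AB = np.hypot(np.linalg.norm(A, 'fro'), np.linalg.norm(B, 'fro'))
    return np.linalg.norm(r) <= c * np.finfo(A.dtype).eps * norm_AB
```

For an eigenpair computed by a backward stable solver the test passes, while a generic vector with an arbitrary shift fails by many orders of magnitude.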


Susan Blackford 2000-11-20