Consider the problem of solving systems of linear algebraic equations Ax = b with a real symmetric positive definite matrix A using the conjugate gradient (CG) method. To stop the algorithm at the appropriate moment, it is important to monitor the quality of the approximate solution.
One of the most relevant quantities for measuring the quality of the approximate solution is the A-norm of the error. This quantity cannot be easily computed; however, it can be estimated.
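For concreteness, in standard notation (not fixed by the text above): with $x$ the exact solution, $x_k$ the $k$-th CG iterate, and $r_k = b - Ax_k$ the residual, the quantity in question is
\[
\|x - x_k\|_A = \big( (x - x_k)^T A \, (x - x_k) \big)^{1/2} = \big( r_k^T A^{-1} r_k \big)^{1/2},
\]
which cannot be evaluated directly, since $x$ (equivalently, the action of $A^{-1}$) is unknown.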
In this paper we discuss and analyze the behavior of the Gauss-Radau upper bound on the A-norm of the error, based on viewing CG as a procedure for approximating a certain Riemann-Stieltjes integral. This upper bound depends on a prescribed underestimate mu of the smallest eigenvalue of A.
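Schematically, in the standard Golub-Meurant framework (the notation here is ours, not taken from the text above): with $f(\lambda) = \lambda^{-1}$ and $\omega$ the piecewise constant distribution function determined by $A$ and $r_0$, one has
\[
\frac{\|x - x_k\|_A^2}{\|r_0\|^2} \;=\; \int f(\lambda)\, d\omega(\lambda) \;-\; \sum_{i=1}^{k} \omega_i^{(k)} f\big(\theta_i^{(k)}\big),
\]
where the sum is the $k$-point Gauss quadrature rule whose nodes $\theta_i^{(k)}$ are the Ritz values. Since the Gauss-Radau rule with one node prescribed at $\mu \le \lambda_{\min}(A)$ overestimates the integral for this $f$, substituting it for the integral yields a computable upper bound on $\|x - x_k\|_A^2$.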
We concentrate on explaining a phenomenon observed in computations: in later CG iterations, the upper bound loses its accuracy and becomes almost independent of mu. We construct a model problem that is used to demonstrate and study the behavior of the upper bound in dependence on mu, and we develop formulas that are helpful in understanding this behavior.
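As a purely illustrative sketch (this is not the paper's model problem, and the bound used here is the crude estimate ||r_k||/sqrt(mu), valid whenever 0 < mu <= lambda_min(A), rather than the tighter Gauss-Radau quadrature bound), the following Python/NumPy snippet runs CG on a small diagonal SPD matrix and compares the true A-norm of the error with this mu-dependent bound; all names and parameter values are arbitrary choices for the demonstration.

```python
import numpy as np

# Small diagonal SPD model matrix with eigenvalues spread in [lmin, lmax]
n = 48
lmin, lmax = 0.1, 100.0
eigs = lmin + (lmax - lmin) * np.linspace(0.0, 1.0, n) ** 2
A = np.diag(eigs)
b = np.ones(n)
x_exact = np.linalg.solve(A, b)

mu = 0.5 * lmin          # prescribed underestimate of the smallest eigenvalue

x = np.zeros(n)
r = b - A @ x
p = r.copy()
rr = r @ r
for k in range(1, 31):
    Ap = A @ p
    gamma = rr / (p @ Ap)                     # CG step length
    x = x + gamma * p
    r = r - gamma * Ap
    rr_new = r @ r
    err = x_exact - x
    anorm_err = np.sqrt(err @ (A @ err))      # true A-norm of the error
    crude_bound = np.sqrt(rr_new / mu)        # ||r_k|| / sqrt(mu) >= ||x - x_k||_A
    print(f"{k:3d}  ||x-x_k||_A = {anorm_err:.3e}   bound = {crude_bound:.3e}")
    p = r + (rr_new / rr) * p                 # new search direction
    rr = rr_new
```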
We show that the above-mentioned phenomenon is closely related to the convergence of the smallest Ritz value to the smallest eigenvalue of A. It occurs when the smallest Ritz value is a better approximation to the smallest eigenvalue than the prescribed underestimate mu.
We also suggest an adaptive strategy for improving the accuracy of the upper bounds in the previous iterations.