Error Analysis: Definition and Meaning
In numerical analysis, the term refers to the mathematical analysis that describes
the various aspects of error behavior in numerical methods (or algorithms).
Convergence of an algorithm is a fundamental requirement. Most algorithms
construct a sequence of approximations; if this sequence tends to the true
solution of the problem, the algorithm is convergent. How fast the algorithm
converges is important for its efficiency; some insight is provided by the
'order' of the method. Since most algorithms are terminated before convergence is
reached, the size of the error after a finite number of steps must be estimated.
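To make 'order' concrete: a method has order p if successive errors satisfy
e_{k+1} ≈ C·e_k^p, so p can be read off from three consecutive errors as
p ≈ log(e_{k+1}/e_k) / log(e_k/e_{k-1}). The following minimal Python sketch
(the test problem, Newton's method for sqrt(2), is an illustrative choice, not
taken from the text) measures this observed order:

```python
import math

def newton_sqrt(a, x0, n_steps):
    """Newton's method for x^2 = a; returns the full list of iterates."""
    xs = [x0]
    for _ in range(n_steps):
        x = xs[-1]
        xs.append(0.5 * (x + a / x))
    return xs

true_root = math.sqrt(2.0)
errors = [abs(x - true_root) for x in newton_sqrt(2.0, 1.0, 5)]

# e_{k+1} ~ C * e_k^p  =>  p ~ log(e_{k+1}/e_k) / log(e_k/e_{k-1})
for k in range(1, len(errors) - 1):
    if errors[k + 1] == 0 or errors[k] == 0:
        break  # at machine precision the error ratios stop being meaningful
    p = math.log(errors[k + 1] / errors[k]) / math.log(errors[k] / errors[k - 1])
    print(f"step {k}: observed order ≈ {p:.2f}")
```

On this problem the printed order settles near 2, the known (quadratic) order
of Newton's method.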
How big the error can be at most is given by an error bound. This must be
reasonably "sharp", i.e. it must not grossly overestimate the error. How big the
error is approximately is given by an error estimate, usually derived from an
asymptotic formula. Such estimates are widely used in step-by-step methods for
ordinary differential equations; there the stepsize, h, must be small enough for
the estimate to be accurate.
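One standard way to obtain such an estimate is step doubling (Richardson
extrapolation): one step of size h is compared with two steps of size h/2, and
for a method of order p the error of the more accurate result is roughly
(y_two - y_one)/(2^p - 1). The sketch below applies this to the explicit Euler
method; the test problem y' = -y and the stepsizes are illustrative assumptions,
not from the text:

```python
import math

def euler_step(f, t, y, h):
    """One explicit Euler step for y' = f(t, y)."""
    return y + h * f(t, y)

def step_with_estimate(f, t, y, h, p=1):
    """Advance by h, estimating the local error by step doubling.

    One step of size h is compared with two steps of size h/2; for a
    method of order p, the error of the more accurate result is about
    (y_two - y_one) / (2**p - 1).
    """
    y_one = euler_step(f, t, y, h)
    y_mid = euler_step(f, t, y, h / 2)
    y_two = euler_step(f, t + h / 2, y_mid, h / 2)
    return y_two, (y_two - y_one) / (2**p - 1)

f = lambda t, y: -y  # y(0) = 1, exact solution exp(-t)
for h in (0.1, 0.05, 0.025):
    y_new, est = step_with_estimate(f, 0.0, 1.0, h)
    true_err = math.exp(-h) - y_new
    print(f"h={h:<6} estimate {est:+.3e}   true error {true_err:+.3e}")
```

As the text notes, the estimate is only trustworthy for small h: halving h in
this run makes the estimate track the true error progressively more closely.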
In numerical linear algebra, backward error analysis has proved very successful
in analyzing errors. In this approach it is shown that the numerical solution
satisfies exactly a perturbed form of the original problem. Bounds for the
perturbations are determined, and these can be inserted into standard
perturbation results, producing a bound for the error in the numerical solution.
The approach can be applied to other areas.
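For a linear system Ax = b, for example, a computed solution x̂ with residual
r = b - A·x̂ exactly satisfies the perturbed system (A + ΔA)x̂ = b for some ΔA
with ||ΔA|| ≤ ||r||/||x̂||, so ||r||/(||A|| ||x̂||) serves as a normwise relative
backward error; multiplying by the condition number κ(A) then bounds the forward
error. A minimal NumPy sketch (the random test matrix and its size are
illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
A = rng.standard_normal((n, n))
x_true = rng.standard_normal(n)
b = A @ x_true

x_hat = np.linalg.solve(A, b)   # computed (rounded) solution
r = b - A @ x_hat               # residual

# Normwise relative backward error: size of the smallest perturbation of A
# (relative to ||A||) for which x_hat solves the perturbed system exactly.
backward_err = np.linalg.norm(r) / (np.linalg.norm(A, 2) * np.linalg.norm(x_hat))

# Standard perturbation result: forward error <~ condition number * backward error.
cond = np.linalg.cond(A, 2)
forward_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)

print(f"backward error       {backward_err:.2e}")
print(f"kappa(A) * bwd err   {cond * backward_err:.2e}  (bound)")
print(f"actual forward error {forward_err:.2e}")
```

For a stable solver the backward error lands near machine precision, while the
forward error may be larger by up to a factor of κ(A), exactly as the standard
perturbation result predicts.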