Order of accuracy

In numerical analysis, order of accuracy quantifies the rate of convergence of a numerical approximation of a differential equation to the exact solution.

Consider $u$, the exact solution to a differential equation in an appropriate normed space $(V, \|\cdot\|)$. Consider a numerical approximation $u_h$, where $h$ is a parameter characterizing the approximation, such as the step size in a finite difference scheme or the diameter of the cells in a finite element method.

The numerical solution $u_h$ is said to be $n$th-order accurate if the error $E(h) := \|u - u_h\|$ is proportional to the step-size $h$ to the $n$th power:[1]

$E(h) = \|u - u_h\| \le C h^{n}$

where the constant $C$ is independent of $h$ and usually depends on the solution $u$.[2] Using the big O notation, an $n$th-order accurate numerical method is notated as

$\|u - u_h\| = O(h^{n})$

This definition is strictly dependent on the norm used in the space; the choice of such norm is fundamental to estimating the rate of convergence and, in general, all numerical errors correctly.
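In practice the order $n$ is often estimated empirically from errors measured at a sequence of step sizes, since $E(h) \approx C h^{n}$ implies $n \approx \log(E(h_1)/E(h_2)) / \log(h_1/h_2)$. The following is a minimal Python sketch of that estimate (the helper name `observed_order` is illustrative, not taken from any particular library):

```python
import numpy as np

def observed_order(step_sizes, errors):
    """Estimate the order of accuracy n from measured errors.

    Assuming E(h) ~ C * h**n, each consecutive pair of step sizes gives
        n ~ log(E(h1) / E(h2)) / log(h1 / h2).
    Returns one estimate per consecutive pair.
    """
    h = np.asarray(step_sizes, dtype=float)
    e = np.asarray(errors, dtype=float)
    return np.log(e[:-1] / e[1:]) / np.log(h[:-1] / h[1:])

# Example: errors that scale like h**2 should give estimates close to 2.
print(observed_order([0.1, 0.05, 0.025], [4.1e-3, 1.0e-3, 2.6e-4]))
```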

The size of the error of a first-order accurate approximation is directly proportional to $h$. Partial differential equations which vary over both time and space are said to be accurate to order $n$ in time and to order $p$ in space.
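As a concrete illustration (a minimal sketch, not drawn from the article's sources), a forward difference approximation of a first derivative is first-order accurate, while a central difference is second-order accurate: halving $h$ roughly halves the error of the former and quarters the error of the latter.

```python
import numpy as np

f, df, x = np.sin, np.cos, 1.0  # test function, its exact derivative, evaluation point

for h in (1e-1, 5e-2, 2.5e-2):
    fwd = (f(x + h) - f(x)) / h            # forward difference, error O(h)
    cen = (f(x + h) - f(x - h)) / (2 * h)  # central difference, error O(h**2)
    print(f"h={h:.4f}  forward error={abs(fwd - df(x)):.2e}  "
          f"central error={abs(cen - df(x)):.2e}")
```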
