[1] Limits of functions are essential to calculus and mathematical analysis, and are used to define continuity, derivatives, and integrals.
This means that the value of the function f can be made arbitrarily close to L by choosing x sufficiently close to c. Alternatively, the fact that a function f approaches the limit L as x approaches c is sometimes denoted by a right arrow (→), as in f(x) → L as x → c.[2][3]
Grégoire de Saint-Vincent gave the first definition of the limit (terminus) of a geometric series in his work Opus Geometricum (1647): "The terminus of a progression is the end of the series, which no progression can reach, even if it is continued to infinity, but which it can approach more closely than any given segment."[4]
The modern definition of a limit goes back to Bernard Bolzano who, in 1817, developed the basics of the epsilon-delta technique to define continuous functions.
However, his work remained unknown to other mathematicians until thirty years after his death.
The modern notation of placing the arrow below the limit symbol is due to G. H. Hardy, who introduced it in his book A Course of Pure Mathematics in 1908.
The sequence 0.9, 0.99, 0.999, … can be rigorously shown to have the limit 1, and therefore the repeating decimal 0.999… is meaningfully interpreted as having the value 1.
The notation lim n→∞ aₙ = L is read as "the limit of aₙ as n approaches infinity equals L". The formal definition intuitively means that eventually, all elements of the sequence get arbitrarily close to the limit, since the absolute value |aₙ − L| is the distance between aₙ and L. Not every sequence has a limit.
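The formal condition alluded to above is the standard ε–N definition, which can be stated as:

```latex
\lim_{n\to\infty} a_n = L
\iff
\forall \varepsilon > 0,\ \exists N \in \mathbb{N},\ \forall n > N:\
|a_n - L| < \varepsilon .
```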
On the other hand, if X is the domain of a function f(x) and if the limit as n approaches infinity of f(xₙ) is L for every arbitrary sequence of points {xₙ} in X ∖ {x₀} which converges to x₀, then the limit of the function f(x) as x approaches x₀ is equal to L.[10] One such sequence would be {x₀ + 1/n}.
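As a numerical sketch of this sequential criterion (the function and sequence here are chosen for illustration, not taken from the source), one can check that f(x) = sin(x)/x approaches 1 along the sequence xₙ = 1/n, which lies in the domain minus {0} and converges to 0:

```python
import math

def f(x):
    # f(x) = sin(x)/x, defined for x != 0
    return math.sin(x) / x

# The sequence x_n = 1/n lies in the domain minus {0} and converges to 0.
# The images f(x_n) approach the limit L = 1.
values = [f(1 / n) for n in range(1, 10001)]
print(values[-1])  # close to 1
```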
A sequence is called bounded if its terms all lie within some fixed distance of zero; otherwise it is called unbounded, a definition equally valid for sequences in the complex numbers, or in any metric space (with distance measured from any fixed point).
For example, it is possible to construct a sequence of continuous functions which has a discontinuous pointwise limit.
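A standard example of this phenomenon is fₙ(x) = xⁿ on [0, 1]: each fₙ is continuous, yet the pointwise limit is discontinuous at x = 1:

```latex
f_n(x) = x^n, \qquad
\lim_{n\to\infty} f_n(x) =
\begin{cases}
0, & 0 \le x < 1,\\
1, & x = 1.
\end{cases}
```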
Many different notions of convergence can be defined on function spaces.
Intuitively speaking, the expression lim x→c f(x) = L means that f(x) can be made as close to L as desired by making x sufficiently close to c.[11] In that case, the above equation can be read as "the limit of f of x, as x approaches c, is L".
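In symbols, the (ε, δ)-definition behind this reading is:

```latex
\lim_{x\to c} f(x) = L
\iff
\forall \varepsilon > 0,\ \exists \delta > 0,\ \forall x:\
0 < |x - c| < \delta \implies |f(x) - L| < \varepsilon .
```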
The condition 0 < |x − c| excludes the point c itself from the set of points under consideration, but some authors do not include this restriction in their definition of limits, replacing 0 < |x − c| < δ with simply |x − c| < δ.
A two-sided infinite limit can be defined, but an author would explicitly write ±∞ to avoid ambiguity.
This direct definition is easier to extend to one-sided infinite limits.
In non-standard analysis (which involves a hyperreal enlargement of the number system), the limit of a sequence (aₙ) can be expressed as the standard part of the value a_H of the natural extension of the sequence at an infinite hypernatural index n = H.
A use of this notion is to characterize the "long-term behavior" of oscillatory sequences.
This notion is used in dynamical systems, to study limits of trajectories.
The corresponding limit set for sequences of decreasing time is called the α-limit set.
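As a small numerical sketch of the "long-term behavior" idea (the sequence is chosen for illustration), the oscillatory sequence aₙ = (−1)ⁿ + 1/n has the two-point limit set {−1, 1}: far along the sequence, its terms cluster around these two values.

```python
# a_n = (-1)**n + 1/n oscillates; its set of subsequential limits is {-1, 1}.
def a(n):
    return (-1) ** n + 1 / n

# Far along the sequence, even-indexed terms cluster near 1 ...
even_tail = [a(n) for n in range(10**6, 10**6 + 10, 2)]
# ... and odd-indexed terms cluster near -1.
odd_tail = [a(n) for n in range(10**6 + 1, 10**6 + 11, 2)]

print(max(abs(x - 1) for x in even_tail))   # small
print(max(abs(x + 1) for x in odd_tail))    # small
```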
Limits are used to define a number of important concepts in analysis.
A particular expression of interest that is formalized as the limit of a sequence is the sum of an infinite series: the series is assigned the limit of its sequence of partial sums, when that limit exists.
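For instance (a minimal sketch with an example series not taken from the source), the geometric series Σ (1/2)ⁿ is defined as the limit of its partial sums, which approach 2:

```python
# Partial sums s_k = sum_{n=0}^{k} (1/2)**n converge to the series value 2.
partial_sums = []
s = 0.0
for n in range(60):
    s += 0.5 ** n
    partial_sums.append(s)

print(partial_sums[-1])  # approaches 2
```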
A surprising result for conditionally convergent series is the Riemann series theorem: depending on the ordering, the partial sums can be made to converge to any real number, as well as to diverge to +∞ or −∞.
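A numerical sketch of this theorem (the greedy rearrangement scheme below is the standard proof idea, applied here to the alternating harmonic series and an arbitrary target of 0.5; none of these specifics come from the source):

```python
# Greedily rearrange the terms +-1/n of the alternating harmonic series
# so that the partial sums approach a chosen target value.
target = 0.5
pos = iter(range(1, 10**7, 2))   # denominators of positive terms 1/1, 1/3, 1/5, ...
neg = iter(range(2, 10**7, 2))   # denominators of negative terms -1/2, -1/4, ...

s = 0.0
for _ in range(100000):
    if s <= target:
        s += 1 / next(pos)   # add positive terms while below the target
    else:
        s -= 1 / next(neg)   # add negative terms while above it

print(s)  # close to 0.5
```

Because the terms shrink to zero, the overshoot at each switch shrinks too, so the rearranged partial sums converge to the target.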
In the most general setting of topological spaces, the analogous result still holds.
In real analysis, the same notions apply in the more concrete case of real-valued functions defined on a subset of the real line.
In general metric spaces, it continues to hold that convergent sequences are also Cauchy.
But the converse is not true: not every Cauchy sequence is convergent in a general metric space.
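A sketch of the classic example: in the metric space of rational numbers, Newton's iteration for √2 produces a Cauchy sequence of rationals whose would-be limit, √2, is irrational, so the sequence does not converge within ℚ. (The use of Newton's iteration here is illustrative, not from the source.)

```python
from fractions import Fraction

# Newton iteration x_{k+1} = (x_k + 2/x_k) / 2 stays inside the rationals.
x = Fraction(1)
terms = [x]
for _ in range(6):
    x = (x + 2 / x) / 2
    terms.append(x)

# Consecutive terms get arbitrarily close (Cauchy behaviour) ...
gap = abs(terms[-1] - terms[-2])
# ... while x*x approaches 2, so the only candidate limit is the
# irrational number sqrt(2), which lies outside the space.
err = abs(terms[-1] ** 2 - 2)
print(float(gap), float(err))
```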
There exist limit expressions whose modulus of convergence is undecidable.
Several theorems and tests can be used to establish whether a limit exists; examples include the ratio test and the squeeze theorem.
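As a numerical sketch of the ratio test (the series is chosen for illustration): for the terms aₙ = xⁿ/n!, the ratios aₙ₊₁/aₙ = x/(n + 1) tend to 0 < 1, so the series converges (its sum is eˣ):

```python
import math

x = 3.0
# Terms a_n = x**n / n! and successive ratios a_{n+1}/a_n = x/(n+1).
terms = [x ** n / math.factorial(n) for n in range(50)]
ratios = [terms[n + 1] / terms[n] for n in range(49)]

print(ratios[-1])   # ratio x/(n+1) tends to 0 < 1, so the ratio test gives convergence
print(sum(terms))   # partial sums approach e**x
```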