Differential calculus

[1] Differential calculus is one of the two traditional divisions of calculus, the other being integral calculus, the study of the area beneath a curve.

The derivative of the momentum of a body with respect to time equals the force applied to the body; rearranging this derivative statement leads to the famous F = ma equation associated with Newton's second law of motion.
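Spelled out (assuming, as the statement implies, that the mass m is constant): with momentum p = mv, Newton's second law gives F = dp/dt = d(mv)/dt = m dv/dt = ma, since the derivative of velocity with respect to time is the acceleration a.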

In operations research, derivatives determine the most efficient ways to transport materials and design factories.

[3] In order to gain an intuition for this, one must first be familiar with finding the slope of a linear equation, written in the form y = mx + b, where m is the slope and b is the y-intercept.
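For instance (the numbers are illustrative, not from the original): the line y = 3x + 1 has slope m = 3, because increasing x by any amount Δx increases y by 3Δx, so the ratio Δy/Δx is 3 no matter which two points on the line are chosen.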

The graph of a nonlinear function, however, does not have a single slope; instead, the slope of the graph at a point can be computed by considering the tangent line, a line that 'just touches' the graph at that particular point.
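One standard way to make 'just touches' precise is to take the limit of the slopes of secant lines through nearby points. A minimal worked instance, with f(x) = x² chosen purely for illustration: the derivative at a is f'(a) = lim(h → 0) [f(a + h) − f(a)] / h, and for f(x) = x² this becomes [(a + h)² − a²] / h = (2ah + h²) / h = 2a + h, which tends to 2a as h → 0. So the tangent line to y = x² at the point (a, a²) has slope 2a.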

If x and y are vectors, then the best linear approximation to the graph of f depends on how f changes in several directions at once.

Taking the best linear approximation in a single direction determines a partial derivative, which is usually denoted ⁠∂y/∂x⁠.
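A small illustrative case (the function is chosen here, not taken from the original): for f(x, y) = x²y, holding y fixed and differentiating in the x-direction gives ∂f/∂x = 2xy, while holding x fixed gives ∂f/∂y = x². Each partial derivative is the slope of the best linear approximation to the graph in that single direction.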

[5] Archimedes also made use of indivisibles, although these were primarily used to study areas and volumes rather than derivatives and tangents (see The Method of Mechanical Theorems).

The use of infinitesimals to compute rates of change was developed significantly by Bhāskara II (1114–1185); indeed, it has been argued[6] that many of the key notions of differential calculus can be found in his work, such as "Rolle's theorem".

Rashed's conclusion that Sharaf al-Dīn al-Ṭūsī had used the derivative of cubic polynomials has been contested by other scholars, however, who argue that al-Ṭūsī could have obtained the result by other methods which do not require the derivative of the function to be known.

[8] The modern development of calculus is usually credited to Isaac Newton (1643–1727) and Gottfried Wilhelm Leibniz (1646–1716), who provided independent[e] and unified approaches to differentiation and derivatives.

The key insight that earned them this credit, however, was the fundamental theorem of calculus, which relates differentiation and integration: this rendered obsolete most previous methods for computing areas and volumes.

[f] For their ideas on derivatives, both Newton and Leibniz built on significant earlier work by mathematicians such as Pierre de Fermat (1607–1665), Isaac Barrow (1630–1677), René Descartes (1596–1650), Christiaan Huygens (1629–1695), Blaise Pascal (1623–1662) and John Wallis (1616–1703).

Regarding Fermat's influence, Newton once wrote in a letter that "I had the hint of this method [of fluxions] from Fermat's way of drawing tangents, and by applying it to abstract equations, directly and invertedly, I made it general."[9]

Isaac Barrow is generally given credit for the early development of the derivative.

[10] Nevertheless, Newton and Leibniz remain key figures in the history of differentiation, not least because Newton was the first to apply differentiation to theoretical physics, while Leibniz systematically developed much of the notation still used today.

In the 19th century, calculus was put on a much more rigorous footing by mathematicians such as Augustin Louis Cauchy (1789–1857), Bernhard Riemann (1826–1866), and Karl Weierstrass (1815–1897).

It was also during this period that differentiation was generalized to Euclidean space and the complex plane.

Later, the theory of distributions (due to Laurent Schwartz) extended differentiation to generalized functions (e.g., the Dirac delta function previously introduced in quantum mechanics) and became fundamental to modern applied analysis, especially through the use of weak solutions to partial differential equations.

An alternative approach, called the first derivative test, involves considering the sign of f' on each side of the critical point.

Taking derivatives and solving for critical points is therefore often a simple way to find local minima or maxima, which can be useful in optimization.

If the function is differentiable, the minima and maxima can only occur at critical points or endpoints.

This also has applications in graph sketching: once the local minima and maxima of a differentiable function have been found, a rough plot of the graph can be obtained from the observation that it will be either increasing or decreasing between critical points.
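A short worked example tying these observations together (the function is chosen purely for illustration): for f(x) = x³ − 3x, the derivative is f'(x) = 3x² − 3, which vanishes at the critical points x = −1 and x = 1. Since f' is positive for x < −1, negative between −1 and 1, and positive for x > 1, the first derivative test gives a local maximum at x = −1 (where f = 2) and a local minimum at x = 1 (where f = −2); the graph accordingly rises, falls, and rises again between the critical points.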

Another example is to find the surface of smallest area that spans a given closed curve in space; such a surface is called a minimal surface.

For example, Newton's second law, which describes the relationship between acceleration and force, can be stated as the ordinary differential equation F(t) = m d²x/dt². The heat equation in one space variable, which describes how heat diffuses through a straight rod, is the partial differential equation ∂u/∂t = α ∂²u/∂x². Here u(x, t) is the temperature of the rod at position x and time t and α is a constant that depends on how fast heat diffuses through the rod.
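As a quick check of the heat equation's form (the trial solution and the constant k are chosen here for illustration): u(x, t) = e^(−αk²t) sin(kx) satisfies ∂u/∂t = −αk²u and ∂²u/∂x² = −k²u, so ∂u/∂t = α ∂²u/∂x² holds. Components with higher spatial frequency k decay faster, matching the intuition that sharp temperature variations smooth out quickly.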

In other words, if f is differentiable on an interval [a, b], then there is some c between a and b with f'(c) = (f(b) − f(a)) / (b − a). In practice, what the mean value theorem does is control a function in terms of its derivative.
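A concrete instance (the function and interval are chosen for illustration): for f(x) = x² on [0, 2], the secant slope is (f(2) − f(0)) / (2 − 0) = 4/2 = 2, and f'(x) = 2x equals 2 exactly at c = 1, which lies between 0 and 2 as the theorem promises.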

More complicated conditions on the derivative lead to less precise but still highly useful information about the original function.

Taylor's theorem gives a precise bound on how good the approximation of a function by its Taylor polynomial is.
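For instance (the function is chosen here for illustration): approximating f(x) = e^x near 0 by its degree-one Taylor polynomial 1 + x, Taylor's theorem in the Lagrange form gives e^x − (1 + x) = (e^c / 2) x² for some c between 0 and x, so for small x the error shrinks like x².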

The implicit function theorem states that if f is continuously differentiable, then around most points, the zero set of f looks like graphs of functions pasted together.

The points where this is not true are determined by a condition on the derivative of f. The circle, for instance, can be pasted together from the graphs of the two functions ±√(1 − x²).
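To make the condition concrete for the circle (a standard instance, spelled out here for illustration): writing the circle as the zero set of f(x, y) = x² + y² − 1, the relevant condition is ∂f/∂y = 2y ≠ 0. It fails only at (1, 0) and (−1, 0), and these are exactly the points near which the circle cannot be written as the graph of a single function y of x.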

Figure captions (images not reproduced here):
The graph of a function, drawn in black, and a tangent line to that function, drawn in red; the slope of the tangent line equals the derivative of the function at the marked point.
The derivative at different points of a differentiable function.
A secant line through two nearby points on a curve has nearly the same slope as the tangent line; as the two points become closer together, the error produced by the secant line becomes vanishingly small.
The mean value theorem: for each function differentiable on an interval [a, b], there is a c in (a, b) with f'(c) = (f(b) − f(a)) / (b − a).