Cauchy–Schwarz inequality

The Cauchy–Schwarz inequality (also called the Cauchy–Bunyakovsky–Schwarz inequality) states that for all vectors $u$ and $v$ of an inner product space, $|\langle u, v\rangle|^2 \le \langle u, u\rangle \cdot \langle v, v\rangle$. It is considered one of the most important and widely used inequalities in mathematics.

The inequality for sums was published by Augustin-Louis Cauchy (1821).

The corresponding inequality for integrals was published by Viktor Bunyakovsky (1859)[2] and Hermann Schwarz (1888).

Schwarz gave the modern proof of the integral version.

Every inner product gives rise to a canonical (induced) norm, $\|u\| := \sqrt{\langle u, u\rangle}$; the inner product of a vector with itself, $\langle u, u\rangle$, is always a non-negative real number (even if the inner product is complex-valued).

By taking the square root of both sides of the above inequality, the Cauchy–Schwarz inequality can be written in its more familiar form in terms of the norm:[6][7]
$$|\langle u, v\rangle| \le \|u\|\,\|v\|.$$
Moreover, the two sides are equal if and only if $u$ and $v$ are linearly dependent.[8][9][10]

Sedrakyan's inequality, also known as Bergström's inequality, Engel's form, Titu's lemma (or the T2 lemma), states that for real numbers $u_1, \dots, u_n$ and positive real numbers $v_1, \dots, v_n$,
$$\frac{(u_1 + u_2 + \cdots + u_n)^2}{v_1 + v_2 + \cdots + v_n} \le \frac{u_1^2}{v_1} + \frac{u_2^2}{v_2} + \cdots + \frac{u_n^2}{v_n}.$$

It is a direct consequence of the Cauchy–Schwarz inequality, obtained by using the dot product on $\mathbb{R}^n$ upon substituting $u_i' = u_i/\sqrt{v_i}$ and $v_i' = \sqrt{v_i}$.

This form is especially helpful when the inequality involves fractions where the numerator is a perfect square.
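As a brief illustration (a standard application, with the specific choice of numbers added here rather than taken from the surrounding text), Sedrakyan's form yields Nesbitt's inequality for positive reals $a, b, c$:
$$\frac{a}{b+c} + \frac{b}{c+a} + \frac{c}{a+b} = \frac{a^2}{a(b+c)} + \frac{b^2}{b(c+a)} + \frac{c^2}{c(a+b)} \ge \frac{(a+b+c)^2}{2(ab+bc+ca)} \ge \frac{3}{2},$$
where the last step uses $(a+b+c)^2 \ge 3(ab+bc+ca)$.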

In the Euclidean plane $\mathbb{R}^2$ with the dot product, the inequality becomes
$$\langle u, v\rangle^2 = (\|u\|\,\|v\|\cos\theta)^2 \le \|u\|^2\,\|v\|^2,$$
where $\theta$ is the angle between $u$ and $v$. This form is perhaps the easiest in which to understand the inequality, since the square of the cosine can be at most 1, which occurs when the vectors are in the same or opposite directions.

It can also be restated in terms of the vector coordinates $u = (u_1, u_2)$ and $v = (v_1, v_2)$ as
$$(u_1 v_1 + u_2 v_2)^2 \le (u_1^2 + u_2^2)(v_1^2 + v_2^2),$$
where equality holds if and only if the vector $(u_1, u_2)$ is in the same or opposite direction as $(v_1, v_2)$, or if one of them is the zero vector.
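As a quick numerical check (the specific vectors are chosen here purely for illustration): with $(u_1, u_2) = (1, 2)$ and $(v_1, v_2) = (3, 4)$,
$$(u_1 v_1 + u_2 v_2)^2 = 11^2 = 121 \le (1^2 + 2^2)(3^2 + 4^2) = 5 \cdot 25 = 125,$$
and the gap $125 - 121 = 4$ is exactly $(u_1 v_2 - u_2 v_1)^2 = (-2)^2$.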

In Euclidean space $\mathbb{R}^n$ with the standard inner product (the dot product), the Cauchy–Schwarz inequality becomes
$$\left(\sum_{i=1}^n u_i v_i\right)^2 \le \left(\sum_{i=1}^n u_i^2\right)\left(\sum_{i=1}^n v_i^2\right).$$
The Cauchy–Schwarz inequality can be proved using only elementary algebra in this case by observing that the difference of the right and the left hand side is
$$\sum_{1 \le i < j \le n} (u_i v_j - u_j v_i)^2 \ge 0,$$
or by considering the following quadratic polynomial in $x$:
$$(u_1 x + v_1)^2 + \cdots + (u_n x + v_n)^2 = \left(\sum_{i} u_i^2\right)x^2 + 2\left(\sum_{i} u_i v_i\right)x + \sum_{i} v_i^2.$$

Since the latter polynomial is nonnegative, it has at most one real root, hence its discriminant is less than or equal to zero; that is,
$$\left(\sum_{i} u_i v_i\right)^2 - \left(\sum_{i} u_i^2\right)\left(\sum_{i} v_i^2\right) \le 0.$$

If $u, v \in \mathbb{C}^n$ and the inner product is the canonical complex inner product on $\mathbb{C}^n$, defined by $\langle u, v\rangle := u_1\overline{v_1} + \cdots + u_n\overline{v_n}$ (where the bar notation is used for complex conjugation), then the inequality may be restated more explicitly as follows:
$$\left|\sum_{k=1}^n u_k \overline{v_k}\right|^2 \le \left(\sum_{k=1}^n |u_k|^2\right)\left(\sum_{k=1}^n |v_k|^2\right).$$
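For a small numerical instance (the vectors are an illustrative choice, not from the original text): take $u = (1, i)$ and $v = (1, 1)$ in $\mathbb{C}^2$; then
$$\left|\sum_{k} u_k \overline{v_k}\right|^2 = |1 + i|^2 = 2 \le \left(\sum_{k} |u_k|^2\right)\left(\sum_{k} |v_k|^2\right) = 2 \cdot 2 = 4.$$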

For the inner product space of square-integrable complex-valued functions, the following inequality holds:
$$\left|\int_{\mathbb{R}^n} f(x)\,\overline{g(x)}\,dx\right|^2 \le \int_{\mathbb{R}^n} |f(x)|^2\,dx \int_{\mathbb{R}^n} |g(x)|^2\,dx.$$
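For instance (a simple check added for concreteness), with $f(x) = 1$ and $g(x) = x$ on $[0, 1]$,
$$\left|\int_0^1 f(x)\,\overline{g(x)}\,dx\right|^2 = \left(\int_0^1 x\,dx\right)^2 = \frac{1}{4} \le \int_0^1 |f|^2\,dx \int_0^1 |g|^2\,dx = 1 \cdot \frac{1}{3} = \frac{1}{3}.$$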

In any inner product space, the Cauchy–Schwarz inequality yields
$$\|u + v\|^2 = \langle u + v, u + v\rangle = \|u\|^2 + 2\operatorname{Re}\langle u, v\rangle + \|v\|^2 \le \|u\|^2 + 2\,\|u\|\,\|v\| + \|v\|^2 = (\|u\| + \|v\|)^2.$$
Taking square roots gives the triangle inequality $\|u + v\| \le \|u\| + \|v\|$.[11][12]

The Cauchy–Schwarz inequality allows one to extend the notion of "angle between two vectors" to any real inner-product space by defining[13][14]
$$\cos\theta = \frac{\langle u, v\rangle}{\|u\|\,\|v\|}.$$

The Cauchy–Schwarz inequality proves that this definition is sensible, by showing that the right-hand side lies in the interval [−1, 1], and it justifies the notion that (real) Hilbert spaces are simply generalizations of Euclidean space.
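For a concrete instance (an illustrative choice of vectors, not from the original text), in $\mathbb{R}^3$ with the dot product take $u = (1, 1, 0)$ and $v = (0, 1, 1)$; then
$$\cos\theta = \frac{\langle u, v\rangle}{\|u\|\,\|v\|} = \frac{1}{\sqrt{2}\cdot\sqrt{2}} = \frac{1}{2}, \qquad \theta = \frac{\pi}{3},$$
which matches the usual Euclidean angle between these vectors.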

It can also be used to define an angle in complex inner-product spaces, by taking the absolute value or the real part of the right-hand side,[15][16] as is done when extracting a metric from quantum fidelity.

Let $u$ and $v$ be arbitrary vectors in an inner product space over the scalar field $\mathbb{F}$, where $\mathbb{F}$ is the field of real numbers $\mathbb{R}$ or complex numbers $\mathbb{C}$.

It also includes the easy part of the proof of the Equality Characterization given above; that is, it proves that if $u$ and $v$ are linearly dependent, then $|\langle u, v\rangle| = \|u\|\,\|v\|$.
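That computation can be sketched as follows (reconstructed in standard form, since the displayed step is missing from this excerpt): if $u = c v$ for some scalar $c$, then
$$|\langle u, v\rangle| = |\langle c v, v\rangle| = |c|\,|\langle v, v\rangle| = |c|\,\|v\|^2 = \|c v\|\,\|v\| = \|u\|\,\|v\|,$$
and the case $v = c u$ is handled in the same way.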

In particular, if at least one of $u$ and $v$ is the zero vector, then $u$ and $v$ are necessarily linearly dependent (the zero vector is a scalar multiple of any vector), so the above computation shows that the Cauchy–Schwarz inequality holds, with equality, in this case.

Consequently, the Cauchy–Schwarz inequality needs to be proven only for non-zero vectors, and only the non-trivial direction of the Equality Characterization must be shown.
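One standard way to carry out this remaining step (sketched here, since the displayed proof is missing from this excerpt) is the following. Assume $v \neq 0$ and set
$$z := u - \frac{\langle u, v\rangle}{\langle v, v\rangle}\, v,$$
so that $\langle z, v\rangle = 0$. Then
$$0 \le \|z\|^2 = \langle z, u\rangle = \|u\|^2 - \frac{|\langle u, v\rangle|^2}{\|v\|^2},$$
which rearranges to $|\langle u, v\rangle|^2 \le \|u\|^2\,\|v\|^2$, with equality precisely when $z = 0$.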

Equality therefore forces $u = \frac{\langle u, v\rangle}{\langle v, v\rangle}\, v$, which then establishes a relation of linear dependence between $u$ and $v$.

The converse was proved at the beginning of this section, so the proof is complete.

Further generalizations exist in the context of operator theory, e.g. for operator-convex functions and operator algebras, where the domain and/or range are replaced by a C*-algebra or W*-algebra.

An inner product can be used to define a positive linear functional.

For example, given a Hilbert space $L^2(m)$, with $m$ being a finite measure, the standard inner product gives rise to a positive functional $\varphi$ by $\varphi(g) = \langle g, 1\rangle$.
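Spelling this out (a routine consequence of the inequality, included here for illustration; $X$ denotes the underlying measure space):
$$|\varphi(g)|^2 = |\langle g, 1\rangle|^2 \le \|g\|^2\,\|1\|^2 = m(X)\int_X |g|^2\,dm,$$
so the functional is controlled by the $L^2(m)$ norm.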

Kadison–Schwarz inequality (named after Richard Kadison): If $\varphi$ is a unital positive map, then for every normal element $a$ in its domain, we have $\varphi(a^*a) \ge \varphi(a^*)\varphi(a)$ and $\varphi(a^*a) \ge \varphi(a)\varphi(a^*)$.[29]

There are also non-commutative versions for operators and tensor products of matrices.

Figure: Cauchy–Schwarz inequality in a unit circle of the Euclidean plane.