Gram matrix

In linear algebra, the Gram matrix (or Gramian matrix, Gramian) of a set of vectors $v_1, \dots, v_n$ in an inner product space is the Hermitian matrix of inner products, whose entries are given by $G_{ij} = \langle v_i, v_j \rangle$. If the vectors $v_1, \dots, v_n$ are the columns of a matrix $X$, then the Gram matrix is $X^\dagger X$ in the general case that the vector coordinates are complex numbers, which simplifies to $X^\top X$ for the case that the vector coordinates are real numbers.
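As a concrete illustration (a minimal NumPy sketch, not part of the original article; the matrix $X$ and its entries are arbitrary), the Gram matrix can be computed directly from these formulas:

```python
import numpy as np

# Three complex vectors in C^2, stored as the columns of X.
X = np.array([[1 + 1j, 0 + 0j, 2 - 1j],
              [0 + 2j, 3 + 0j, 1 + 0j]])

# General (complex) case: G = X^dagger X, the conjugate transpose times X.
G_complex = X.conj().T @ X

# Real case: conjugation is a no-op, so G = X^T X.
X_real = X.real
G_real = X_real.T @ X_real

# Entry (i, j) is the inner product <v_i, v_j> of columns i and j.
print(np.allclose(G_complex[0, 1], np.vdot(X[:, 0], X[:, 1])))  # True
```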

An important application is to compute linear independence: a set of vectors is linearly independent if and only if the Gram determinant (the determinant of the Gram matrix) is non-zero.
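For example, independence can be tested numerically by checking whether the Gram determinant is (numerically) nonzero; the helper name `gram_det` below is purely illustrative:

```python
import numpy as np

# Two linearly independent vectors in R^3 and a dependent third one.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = 2 * v1 - v2  # a linear combination of v1 and v2

def gram_det(*vectors):
    """Determinant of the Gram matrix of the given real vectors."""
    V = np.column_stack(vectors)
    return np.linalg.det(V.T @ V)

print(gram_det(v1, v2))      # nonzero: {v1, v2} is linearly independent
print(gram_det(v1, v2, v3))  # ~0: {v1, v2, v3} is linearly dependent
```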

It is named after Jørgen Pedersen Gram.

For finite-dimensional real vectors in $\mathbb{R}^n$ with the usual Euclidean dot product, the Gram matrix is $G = V^\top V$, where $V$ is a matrix whose columns are the vectors $v_k$; for complex vectors in $\mathbb{C}^n$, $G = V^\dagger V$, where $V^\dagger$ is the conjugate transpose of $V$.

Given a bilinear form $B$ on a finite-dimensional vector space over any field, we can define a Gram matrix $G$ attached to a set of vectors $v_1, \dots, v_n$ by $G_{ij} = B(v_i, v_j)$. The matrix will be symmetric if the bilinear form $B$ is symmetric.
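A short sketch of this more general construction, assuming the bilinear form is given by a matrix $A$ via $B(x, y) = x^\top A y$ (the matrix and vectors below are arbitrary):

```python
import numpy as np

# A bilinear form B(x, y) = x^T A y on R^3, given by an arbitrary matrix A.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])

def bilinear_gram(A, vectors):
    """Gram matrix with entries G[i, j] = B(v_i, v_j) = v_i^T A v_j."""
    V = np.column_stack(vectors)
    return V.T @ A @ V

vs = [np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 2.0, 1.0])]
G = bilinear_gram(A, vs)

# If A were symmetric (a symmetric form), G would be symmetric as well;
# the A above is not symmetric, and G comes out asymmetric for these vectors.
print(np.allclose(G, G.T))  # False
```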

The Gram matrix is symmetric in the case the inner product is real-valued; it is Hermitian in the general, complex case by definition of an inner product.

The Gram matrix is positive semidefinite, and every positive semidefinite matrix is the Gramian matrix for some set of vectors.

The fact that the Gramian matrix is positive-semidefinite can be seen from the following simple derivation:

$$x^\dagger G x = \sum_{i,j} \overline{x_i}\, x_j \langle v_i, v_j \rangle = \sum_{i,j} \langle x_i v_i, x_j v_j \rangle = \Big\langle \sum_i x_i v_i, \sum_j x_j v_j \Big\rangle = \Big\| \sum_i x_i v_i \Big\|^2 \ge 0.$$

The first equality follows from the definition of matrix multiplication, the second and third from the bilinearity of the inner product, and the last from its positive definiteness.

Note that this also shows that the Gramian matrix is positive definite if and only if the vectors $v_i$ are linearly independent (that is, $\sum_i x_i v_i \neq 0$ for all nonzero $x$).
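This identity is easy to check numerically; the following sketch uses random complex vectors and coefficients (all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random complex vectors v_1, ..., v_4 in C^3, as the columns of V.
V = rng.normal(size=(3, 4)) + 1j * rng.normal(size=(3, 4))
G = V.conj().T @ V                      # Gram matrix, G[i, j] = <v_i, v_j>

x = rng.normal(size=4) + 1j * rng.normal(size=4)

quadratic_form = np.real_if_close(x.conj() @ G @ x)   # x^dagger G x
combo_norm_sq = np.linalg.norm(V @ x) ** 2            # || sum_i x_i v_i ||^2

print(np.isclose(quadratic_form, combo_norm_sq))  # True: the two sides agree
print(quadratic_form >= 0)                        # hence G is positive semidefinite
```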

Given any positive semidefinite matrix $M$, one can decompose it as $M = B^\dagger B$, where $B^\dagger$ is the conjugate transpose of $B$ (or $M = B^\top B$ in the real case). Various ways to obtain such a decomposition include computing the Cholesky decomposition or taking the non-negative square root of $M$. The columns of $B$ can then be viewed as vectors whose inner products are the entries of $M$.

Combined with the derivation above, this shows that a matrix $M$ is positive semidefinite if and only if it is the Gram matrix of some vectors $v_1, \dots, v_n$.
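A sketch of this vector-realization step, assuming a strictly positive definite $M$ so that NumPy's Cholesky routine applies (for a merely semidefinite $M$, the eigendecomposition-based square root shown second still works):

```python
import numpy as np

# A positive definite matrix M (built as A^T A + I to guarantee definiteness).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])
M = A.T @ A + np.eye(3)

# Cholesky gives M = L L^T; taking B = L^T yields M = B^T B,
# so the columns of B are a vector realization of M.
L = np.linalg.cholesky(M)
B = L.T
print(np.allclose(B.T @ B, M))   # True: M is the Gram matrix of B's columns

# Alternative realization: the symmetric non-negative square root of M.
w, Q = np.linalg.eigh(M)
B_sqrt = Q @ np.diag(np.sqrt(w)) @ Q.T
print(np.allclose(B_sqrt.T @ B_sqrt, M))  # True as well
```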

The infinite-dimensional analog of this statement is Mercer's theorem.

If $M$ is the Gram matrix of vectors $v_1, \dots, v_n$ in $\mathbb{R}^k$, then applying any rotation or reflection of $\mathbb{R}^k$ (any orthogonal transformation, that is, any Euclidean isometry preserving 0) to the sequence of vectors results in the same Gram matrix.

This is the only way in which two real vector realizations of $M$ can differ: the vectors $v_1, \dots, v_n$ are unique up to orthogonal transformations.

The same holds in the complex case, with unitary transformations in place of orthogonal ones.

That is, if the Gram matrix of vectors $v_1, \dots, v_n$ is equal to the Gram matrix of vectors $w_1, \dots, w_n$ in $\mathbb{C}^k$, then there is a unitary $k \times k$ matrix $U$ (meaning $U^\dagger U = I$) such that $v_i = U w_i$ for $i = 1, \dots, n$.
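A small numerical illustration of this invariance (a sketch; the orthogonal matrix here is generated by an arbitrary QR factorization):

```python
import numpy as np

rng = np.random.default_rng(1)

# Vectors v_1, ..., v_4 in R^3 as the columns of V, and their Gram matrix.
V = rng.normal(size=(3, 4))
G = V.T @ V

# Build a random orthogonal matrix Q (Q^T Q = I) via QR factorization.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))

# Rotating/reflecting every vector leaves the Gram matrix unchanged.
W = Q @ V
print(np.allclose(W.T @ W, G))  # True
```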

The Gram determinant or Gramian is the determinant of the Gram matrix, $|G(v_1, \dots, v_n)| = \det\big(\langle v_i, v_j \rangle\big)_{i,j=1}^{n}$. If $v_1, \dots, v_n$ are vectors in $\mathbb{R}^m$, then it is the square of the $n$-dimensional volume of the parallelotope formed by the vectors.

In particular, the vectors are linearly independent if and only if the parallelotope has nonzero $n$-dimensional volume, if and only if the Gram determinant is nonzero, if and only if the Gram matrix is nonsingular.

When $n = m$, this reduces to the standard theorem that the absolute value of the determinant of $n$ $n$-dimensional vectors is the $n$-dimensional volume of the parallelotope they span.
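Both cases can be checked numerically; the vectors below are arbitrary:

```python
import numpy as np

# Case n = m = 3: the columns of V are three vectors in R^3.
V = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])
gram_det = np.linalg.det(V.T @ V)
volume = abs(np.linalg.det(V))            # volume of the parallelepiped they span
print(np.isclose(gram_det, volume ** 2))  # True: Gram determinant = volume squared

# Case n = 2 < m = 3: two vectors in R^3; the Gram determinant is the squared
# area of the parallelogram they span (computed here via the cross product).
a, b = np.array([1.0, 0.0, 2.0]), np.array([0.0, 1.0, 1.0])
A2 = np.column_stack([a, b])
area = np.linalg.norm(np.cross(a, b))
print(np.isclose(np.linalg.det(A2.T @ A2), area ** 2))  # True
```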

The Gram determinant can also be expressed in terms of the exterior product of vectors by

$$|G(v_1, \dots, v_n)| = \| v_1 \wedge \cdots \wedge v_n \|^2.$$

The Gram determinant therefore supplies an inner product for the space $\textstyle\bigwedge^{n}(\mathbb{R}^m)$. If an orthonormal basis $e_i$, $i = 1, \dots, m$, of $\mathbb{R}^m$ is given, the vectors $e_{i_1} \wedge \cdots \wedge e_{i_n}$ (with $i_1 < i_2 < \cdots < i_n$) will constitute an orthonormal basis of $n$-dimensional volumes on the space $\textstyle\bigwedge^{n}(\mathbb{R}^m)$. The Gram determinant then amounts to an $n$-dimensional Pythagorean theorem for the volume of the parallelotope formed by the vectors $v_1, \dots, v_n$ in terms of its projections onto the basis volumes $e_{i_1} \wedge \cdots \wedge e_{i_n}$.
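This Pythagorean reading can be verified numerically via the Cauchy–Binet formula, which expresses the Gram determinant as the sum of the squared $n \times n$ minors of the matrix of vectors, i.e. the squared projections onto the coordinate basis volumes (an illustrative sketch):

```python
import numpy as np
from itertools import combinations

# Two vectors in R^4 (n = 2, m = 4), as the columns of V.
V = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0],
              [1.0, 1.0]])
n = V.shape[1]

gram_det = np.linalg.det(V.T @ V)

# Sum of the squared n x n minors of V: the squared projections of the
# parallelotope onto the basis n-volumes e_{i1} ^ ... ^ e_{in}.
minor_sq_sum = sum(np.linalg.det(V[list(rows), :]) ** 2
                   for rows in combinations(range(V.shape[0]), n))

print(np.isclose(gram_det, minor_sq_sum))  # True
```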

When the vectors $v_1, \dots, v_n \in \mathbb{R}^m$ are defined from the positions of points $p_1, \dots, p_n$ relative to some reference point $p_{n+1}$, so that $v_i = p_i - p_{n+1}$, the Gram determinant can be written as the difference of two Gram determinants,

$$|G(v_1, \dots, v_n)| = |G((p_1, 1), \dots, (p_{n+1}, 1))| - |G(p_1, \dots, p_{n+1})|,$$

where each $(p_j, 1)$ is the corresponding point supplemented with the coordinate value of 1 for an $(m+1)$-st dimension. Note that in the common case that $n = m$, the second term on the right-hand side will be zero.
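A numerical check of this difference formula (an illustrative sketch with random points; the helper `gram_det` is ad hoc):

```python
import numpy as np

rng = np.random.default_rng(2)

def gram_det(V):
    """Gram determinant of the columns of V."""
    return np.linalg.det(V.T @ V)

# n = 2 difference vectors from n + 1 = 3 points in R^4 (m = 4).
P = rng.normal(size=(4, 3))            # columns are p_1, p_2, p_3
V = P[:, :2] - P[:, [2]]               # v_i = p_i - p_3

P1 = np.vstack([P, np.ones((1, 3))])   # each point supplemented with a 1

lhs = gram_det(V)
rhs = gram_det(P1) - gram_det(P)
print(np.isclose(lhs, rhs))  # True

# When n = m, the second Gram determinant vanishes: it involves m + 1
# necessarily dependent vectors in R^m.
```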

Given a set of linearly independent vectors $\{v_i\}$ with Gram matrix $G$ defined by $G_{ij} = \langle v_i, v_j \rangle$, one can construct an orthonormal basis

$$u_i := \sum_j \big(G^{-1/2}\big)_{ji} v_j.$$

In matrix notation, $U = V G^{-1/2}$, where $U$ has the orthonormal basis vectors $\{u_i\}$ as columns and the matrix $V$ is composed of the given column vectors $\{v_i\}$.

The matrix $G^{-1/2}$ is guaranteed to exist. Indeed, $G$ is Hermitian, and so can be decomposed as $G = W D W^\dagger$ with $W$ a unitary matrix and $D$ a real diagonal matrix. Additionally, the $v_i$ are linearly independent if and only if $G$ is positive definite, which implies that the diagonal entries of $D$ are positive. $G^{-1/2}$ is therefore uniquely defined as $W D^{-1/2} W^\dagger$.
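The construction can be sketched numerically, forming $G^{-1/2}$ from an eigendecomposition as described (illustrative code, not from the original article):

```python
import numpy as np

rng = np.random.default_rng(3)

# Linearly independent columns v_1, ..., v_3 in R^5.
V = rng.normal(size=(5, 3))
G = V.T @ V                         # Gram matrix (positive definite here)

# G = W diag(w) W^T with positive eigenvalues w, so G^{-1/2} = W diag(w^{-1/2}) W^T.
w, W = np.linalg.eigh(G)
G_inv_sqrt = W @ np.diag(w ** -0.5) @ W.T

U = V @ G_inv_sqrt                  # columns u_i = sum_j (G^{-1/2})_{ji} v_j
print(np.allclose(U.T @ U, np.eye(3)))  # True: the new vectors are orthonormal
```

The final check reflects the identity $U^\dagger U = G^{-1/2}\, G\, G^{-1/2} = I$, which is what makes the new vectors orthonormal.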