In linear algebra, the characteristic polynomial of a square matrix is a polynomial which is invariant under matrix similarity and has the eigenvalues as roots.
The characteristic polynomial of an endomorphism of a finite-dimensional vector space is the characteristic polynomial of the matrix of that endomorphism over any basis (that is, the characteristic polynomial does not depend on the choice of a basis).
In spectral graph theory, the characteristic polynomial of a graph is the characteristic polynomial of its adjacency matrix.[4]
In linear algebra, eigenvalues and eigenvectors play a fundamental role: given a linear transformation, an eigenvector is a vector whose direction is not changed by the transformation, and the corresponding eigenvalue measures the resulting change in the vector's magnitude.
More precisely, suppose the transformation is represented by a square matrix $A.$ Then an eigenvector $\mathbf{v}$ and the corresponding eigenvalue $\lambda$ must satisfy the equation $A\mathbf{v} = \lambda\mathbf{v},$ or, equivalently, $(\lambda I - A)\mathbf{v} = \mathbf{0},$ where $I$ is the identity matrix and $\mathbf{v} \neq \mathbf{0}$. It follows that the matrix $\lambda I - A$ must be singular, so its determinant $\det(\lambda I - A)$ must be zero. In other words, the eigenvalues of $A$ are the roots of $\det(xI - A),$ which is a monic polynomial in $x$ of degree $n$ if $A$ is an $n \times n$ matrix. This polynomial is the characteristic polynomial of $A$.
Formally, the characteristic polynomial of an $n \times n$ matrix $A$ is defined by $p_A(t) = \det(tI - A),$ where $I$ denotes the $n \times n$ identity matrix. Some authors instead define it as $\det(A - tI)$; that polynomial differs from the one given here by a sign $(-1)^n,$ so it makes no difference for properties like having as roots the eigenvalues of $A$.
The characteristic polynomial of an $n \times n$ matrix is monic (its leading coefficient is $1$) and its degree is $n$.
The most important fact about the characteristic polynomial was already mentioned in the motivational paragraph: the eigenvalues of $A$ are precisely the roots of $p_A(t)$ (this also holds for the minimal polynomial of $A,$ but its degree may be less than $n$). All coefficients of the characteristic polynomial are polynomial expressions in the entries of the matrix: in particular, its constant coefficient is $(-1)^n \det(A),$ the coefficient of $t^n$ is one, and the coefficient of $t^{n-1}$ is $-\operatorname{tr}(A),$ the negative of the trace of $A$.
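For a concrete check of these facts, the following sketch (assuming the SymPy library is available; the example matrix is an arbitrary choice) computes the characteristic polynomial of a small matrix and verifies that its roots are the eigenvalues and that the trace and determinant appear among its coefficients as described:

```python
from sympy import Matrix, symbols, roots

t = symbols("t")
A = Matrix([[2, 1],
            [1, 2]])             # arbitrary 2x2 example matrix

# charpoly computes the monic polynomial det(t*I - A)
p = A.charpoly(t).as_expr()
print(p)                          # t**2 - 4*t + 3

# its roots are exactly the eigenvalues of A (here 1 and 3)
print(roots(p, t))                # {3: 1, 1: 1}
print(A.eigenvals())              # {3: 1, 1: 1}

# the coefficient of t^(n-1) is -tr(A), and the constant
# coefficient is (-1)^n * det(A)
print(p.coeff(t, 1) == -A.trace())            # True
print(p.coeff(t, 0) == (-1)**2 * A.det())     # True
```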
Using the language of exterior algebra, the characteristic polynomial of an $n \times n$ matrix $A$ may be expressed as
$$p_A(t) = \sum_{k=0}^{n} t^{n-k} (-1)^k \operatorname{tr}\left(\textstyle\bigwedge^{k} A\right),$$
where $\operatorname{tr}\bigl(\bigwedge^{k} A\bigr)$ is the trace of the $k$th exterior power of $A,$ which has dimension $\binom{n}{k}$.
This trace may be computed as the sum of all principal minors of $A$ of size $k \times k$.
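The following sketch (again assuming SymPy; the $3 \times 3$ matrix and the helper name principal_minor_sum are illustrative choices, not standard API) checks this description of the coefficients numerically:

```python
from itertools import combinations
from sympy import Matrix, symbols

t = symbols("t")
A = Matrix([[2, 1, 0],
            [1, 3, 1],
            [0, 1, 4]])           # arbitrary 3x3 example
n = A.rows

def principal_minor_sum(M, k):
    # sum of the determinants of all k x k principal submatrices of M
    return sum(M.extract(list(c), list(c)).det()
               for c in combinations(range(M.rows), k))

# all_coeffs lists [1, c_{n-1}, ..., c_0], highest degree first, so the
# coefficient of t^(n-k) should equal (-1)^k times the k x k minor sum
coeffs = A.charpoly(t).all_coeffs()
for k in range(1, n + 1):
    assert coeffs[k] == (-1) ** k * principal_minor_sum(A, k)
print("all coefficients match")
```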
The recursive Faddeev–LeVerrier algorithm computes these coefficients more efficiently.
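A minimal sketch of the Faddeev–LeVerrier recursion is given below (assuming NumPy; the function name faddeev_leverrier is mine, and floating-point arithmetic is used for simplicity, although the recursion is exact over the rationals):

```python
import numpy as np

def faddeev_leverrier(A):
    """Return the coefficients [1, c_{n-1}, ..., c_0] of det(t*I - A).

    The recursion: starting from M_0 = 0 and c_n = 1,
        M_k = A M_{k-1} + c_{n-k+1} I,   c_{n-k} = -tr(A M_k) / k.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    coeffs = [1.0]                           # leading coefficient (monic)
    M = np.zeros_like(A)                     # M_0 = 0
    for k in range(1, n + 1):
        M = A @ M + coeffs[-1] * np.eye(n)   # M_k
        coeffs.append(-np.trace(A @ M) / k)  # c_{n-k}
    return coeffs

# Example: [[2, 1], [1, 2]] has characteristic polynomial t^2 - 4t + 3
print(faddeev_leverrier([[2, 1], [1, 2]]))   # [1.0, -4.0, 3.0]
```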
When the characteristic of the field of the coefficients is $0,$ each such trace may alternatively be computed as a single determinant, that of the $k \times k$ matrix
$$\operatorname{tr}\left(\textstyle\bigwedge^{k} A\right) = \frac{1}{k!} \begin{vmatrix} \operatorname{tr} A & k-1 & 0 & \cdots & 0 \\ \operatorname{tr} A^{2} & \operatorname{tr} A & k-2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ \operatorname{tr} A^{k-1} & \operatorname{tr} A^{k-2} & \cdots & \operatorname{tr} A & 1 \\ \operatorname{tr} A^{k} & \operatorname{tr} A^{k-1} & \cdots & \operatorname{tr} A^{2} & \operatorname{tr} A \end{vmatrix}.$$
The Cayley–Hamilton theorem states that replacing $t$ by $A$ in the characteristic polynomial (interpreting the resulting powers as matrix powers, and the constant term $c$ as $c$ times the identity matrix) yields the zero matrix: $p_A(A) = 0$. Informally speaking, every matrix satisfies its own characteristic equation.
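This can be checked numerically, as in the following sketch (assuming NumPy, whose poly routine returns the characteristic-polynomial coefficients of a square matrix; the example matrix is arbitrary):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])    # arbitrary example, p_A(t) = t^2 - 4t + 3
n = A.shape[0]

# evaluate p_A(A) = A^2 - 4A + 3I; it should be the zero matrix
c = np.poly(A)                # coefficients [1, -4, 3] of det(t*I - A)
P = sum(c[k] * np.linalg.matrix_power(A, n - k) for k in range(n + 1))
print(np.allclose(P, np.zeros((n, n))))   # True
```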
Two similar matrices have the same characteristic polynomial. The converse, however, is not true in general: two matrices with the same characteristic polynomial need not be similar.
The matrix $A$ and its transpose have the same characteristic polynomial. $A$ is similar to a triangular matrix if and only if its characteristic polynomial can be completely factored into linear factors over the base field (the same is true with the minimal polynomial in place of the characteristic polynomial); in this case $A$ is similar to a matrix in Jordan normal form.
If $A$ and $B$ are two square $n \times n$ matrices, then the characteristic polynomials of $AB$ and $BA$ coincide. When $A$ is non-singular this follows from the fact that $AB$ and $BA$ are similar: $BA = A^{-1}(AB)A$. For the case where both $A$ and $B$ are singular, the desired identity is an equality between polynomials in $t$ and the coefficients of the matrices.
Thus, to prove this equality, it suffices to prove that it is verified on a non-empty open subset (for the usual topology, or, more generally, for the Zariski topology) of the space of all the coefficients. As the non-singular matrices form such an open subset of the space of all matrices, this proves the result.
More generally, if $A$ is a matrix of order $m \times n$ and $B$ is a matrix of order $n \times m,$ then $AB$ is an $m \times m$ matrix, $BA$ is an $n \times n$ matrix, and one has $p_{BA}(t) = t^{\,n-m} p_{AB}(t).$ To prove this, one may suppose $n > m$ and complete $A$ and $B$ to square $n \times n$ matrices $A'$ and $B'$ by adding $n - m$ rows of zeros to $A$ and $n - m$ columns of zeros to $B$. The result follows from the case of square matrices, by comparing the characteristic polynomials of $A'B'$ and $AB$.
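The relation $p_{BA}(t) = t^{\,n-m} p_{AB}(t)$ can be tested numerically, as in this sketch (assuming NumPy; the matrix sizes and random seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 2, 4
A = rng.standard_normal((m, n))   # arbitrary rectangular test matrices
B = rng.standard_normal((n, m))

p_AB = np.poly(A @ B)             # coefficients of det(t*I - AB), degree m
p_BA = np.poly(B @ A)             # coefficients of det(t*I - BA), degree n

# p_BA(t) = t^(n-m) * p_AB(t): same coefficients, padded with n-m zeros
padded = np.concatenate([p_AB, np.zeros(n - m)])
print(np.allclose(p_BA, padded))  # True
```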
If $\lambda$ is an eigenvalue of a square matrix $A$ with eigenvector $\mathbf{v},$ then $\lambda^k$ is an eigenvalue of $A^k,$ since $A^k \mathbf{v} = \lambda^k \mathbf{v}.$ The multiplicities can be shown to agree as well, and this generalizes to any polynomial in place of $x^k$: if the characteristic polynomial of $A$ has the factorization $p_A(t) = (t - \lambda_1)(t - \lambda_2) \cdots (t - \lambda_n),$ then for any polynomial $f$ the characteristic polynomial of the matrix $f(A)$ is $p_{f(A)}(t) = \bigl(t - f(\lambda_1)\bigr)\bigl(t - f(\lambda_2)\bigr) \cdots \bigl(t - f(\lambda_n)\bigr).$
The theorem applies to matrices and polynomials over any field or commutative ring.
However, the assumption that $p_A(t)$ has a factorization into linear factors is not always true, unless the matrix is over an algebraically closed field such as the complex numbers.
The proof below only applies to matrices and polynomials over the complex numbers (or any algebraically closed field).
In that case, the characteristic polynomial of any square matrix can always be factorized as $p_A(t) = \left(t - \lambda_1\right)\left(t - \lambda_2\right) \cdots \left(t - \lambda_n\right),$ where $\lambda_1, \lambda_2, \ldots, \lambda_n$ are the eigenvalues of $A,$ possibly repeated.
Moreover, the Jordan decomposition theorem guarantees that any square matrix $A$ can be decomposed as $A = S^{-1} U S,$ where $S$ is an invertible matrix and $U$ is upper triangular with $\lambda_1, \ldots, \lambda_n$ on the diagonal (with each eigenvalue repeated according to its algebraic multiplicity).
(The Jordan normal form has stronger properties, but these are sufficient; alternatively the Schur decomposition can be used, which is less popular but somewhat easier to prove.)
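The content of the theorem can be illustrated numerically, as in the following sketch (assuming NumPy; the matrix and the polynomial $f(t) = t^2 + 3t + 1$ are arbitrary choices):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])    # arbitrary example; eigenvalues are -1 and -2

def f(x):
    # the polynomial f(t) = t^2 + 3t + 1, applied elementwise
    return x**2 + 3*x + 1

# f(A) as a matrix polynomial: A^2 + 3A + 1*I (constant term times I)
fA = A @ A + 3 * A + np.eye(2)

# the eigenvalues of f(A) are f applied to the eigenvalues of A,
# with matching multiplicities
print(np.allclose(np.sort(np.linalg.eigvals(fA)),
                  np.sort(f(np.linalg.eigvals(A)))))   # True
```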
The term secular function has been used for what is now called characteristic polynomial; in some literature the older term is still in use.
The term comes from the fact that the characteristic polynomial was used to calculate secular perturbations (on a time scale of a century, that is, slow compared to annual motion) of planetary orbits, according to Lagrange's theory of oscillations.
Garibaldi (2004) defines the characteristic polynomial for elements of an arbitrary finite-dimensional (associative, but not necessarily commutative) algebra over a field $F$ and proves the standard properties of the characteristic polynomial in this generality.