Schur decomposition

The Schur decomposition expresses a square complex matrix A as A = QUQ*, where Q is unitary and U is upper triangular. Since U is similar to A, it has the same spectrum, and since it is triangular, its eigenvalues are the diagonal entries of U.
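
For concreteness, the following minimal sketch, assuming NumPy and SciPy are available (scipy.linalg.schur does the actual computation), produces a complex Schur decomposition of a random matrix and checks that the diagonal of U does carry the eigenvalues of A.

```python
# A minimal sketch, assuming NumPy and SciPy are available: compute a complex
# Schur decomposition A = Q U Q* and check that the diagonal of the upper
# triangular factor U carries the eigenvalues of A.
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

U, Q = schur(A, output='complex')          # A = Q @ U @ Q.conj().T

print(np.allclose(A, Q @ U @ Q.conj().T))  # True: the factorization holds
print(np.allclose(np.sort_complex(np.diag(U)),
                  np.sort_complex(np.linalg.eigvals(A))))  # True: same spectrum
```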

The Schur decomposition implies that there exists a nested sequence of A-invariant subspaces {0} = V0 ⊂ V1 ⊂ ⋯ ⊂ Vn = Cn, and that there exists an ordered orthonormal basis (for the standard Hermitian form of Cn) such that the first i basis vectors span Vi for each i occurring in the nested sequence.
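
The nested invariant subspaces can be read off the columns of Q: since U is upper triangular, A maps the span of the first k columns of Q into itself for every k. A small numerical check along these lines, again assuming NumPy and SciPy, is sketched below.

```python
# A small check, assuming NumPy and SciPy: because U is upper triangular,
# A @ Q[:, :k] = Q[:, :k] @ U[:k, :k], so the first k columns of Q span an
# A-invariant subspace for every k, giving the nested sequence V1 ⊂ ... ⊂ Vn.
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
U, Q = schur(A, output='complex')

for k in range(1, 6):
    Qk, Uk = Q[:, :k], U[:k, :k]
    assert np.allclose(A @ Qk, Qk @ Uk)    # span of the first k columns is invariant
```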

Phrased somewhat differently, the first part says that a linear operator J on a complex finite-dimensional vector space stabilizes a complete flag (V1, ..., Vn).

There is also a real Schur decomposition: every real square matrix A can be written as A = QHQ^T, where Q is an orthogonal matrix and H is either upper or lower quasi-triangular, that is, block triangular with 1×1 and 2×2 blocks on the diagonal, the 2×2 blocks corresponding to complex conjugate eigenvalue pairs of A.
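
As a hedged illustration, again assuming SciPy, the real Schur form of a small real matrix with a complex conjugate eigenvalue pair exhibits such a 2×2 diagonal block:

```python
# A hedged illustration, assuming SciPy: the real Schur form of a real matrix
# with a complex conjugate eigenvalue pair shows a 2x2 block on the diagonal.
import numpy as np
from scipy.linalg import schur

A = np.array([[0., -2., 1.],
              [1.,  0., 3.],
              [0.,  0., 4.]])          # eigenvalues: +/- i*sqrt(2) and 4

H, Q = schur(A, output='real')         # Q orthogonal, H upper quasi-triangular
print(np.allclose(A, Q @ H @ Q.T))     # True: A = Q H Q^T
print(np.round(H, 3))                  # note the 2x2 block for the complex pair
```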

Just as in the complex case, a family of commuting real matrices {Ai} may be simultaneously brought to quasi-triangular form by an orthogonal matrix.

There exists an orthogonal matrix Q such that, for every Ai in the given family, Q^T Ai Q is upper quasi-triangular.

A constructive proof for the Schur decomposition is as follows: every operator A on a complex finite-dimensional vector space has an eigenvalue λ, corresponding to some eigenspace Vλ. Let Vλ⊥ be its orthogonal complement. With respect to the orthogonal decomposition Cn = Vλ ⊕ Vλ⊥, the matrix of A is block upper triangular: its upper-left block is λIλ (where Iλ is the identity on Vλ), its lower-left block is zero because Vλ is invariant under A, and its lower-right block is some sub-matrix A22 acting on Vλ⊥. Choosing an orthonormal basis adapted to this decomposition amounts to conjugating A by a unitary matrix into this block form.

But exactly the same procedure can be applied to the sub-matrix A22, viewed as an operator on Vλ⊥, and its submatrices.

Continue this way until the resulting matrix is upper triangular.

Since each conjugation increases the dimension of the upper-triangular block by at least one, this process takes at most n steps.

Thus the space Cn is exhausted and the procedure yields the desired result.[5]
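
The procedure can be mimicked numerically. The sketch below is an illustration only (the helper name schur_by_deflation is chosen here, and NumPy's eigensolver is used to supply one eigenvector per step): it completes an eigenvector to a unitary matrix, conjugates, and recurses on the trailing sub-matrix, exactly as in the argument above.

```python
# An illustrative sketch only, not a production algorithm: the helper name
# schur_by_deflation is chosen here, and NumPy's eigensolver supplies one
# eigenvector per step. It mirrors the argument above: complete an eigenvector
# to a unitary matrix, conjugate, and recurse on the trailing sub-matrix A22.
import numpy as np

def schur_by_deflation(A):
    """Return (Q, U) with A = Q @ U @ Q.conj().T and U upper triangular."""
    n = A.shape[0]
    if n == 1:
        return np.eye(1, dtype=complex), A.astype(complex)

    # Pick any eigenvector of A (normalized).
    _, eigvecs = np.linalg.eig(A)
    v = eigvecs[:, 0] / np.linalg.norm(eigvecs[:, 0])

    # Complete v to a unitary matrix Q1 via QR; realign the first column with v
    # (QR only determines it up to a unit-modulus factor).
    M = np.column_stack([v, np.eye(n, dtype=complex)[:, :n - 1]])
    Q1, _ = np.linalg.qr(M)
    Q1[:, 0] = v

    # Conjugation puts A into the block form [[lambda, *], [~0, A22]];
    # recurse on A22.
    B = Q1.conj().T @ A @ Q1
    Q2, _ = schur_by_deflation(B[1:, 1:])

    Q2_full = np.eye(n, dtype=complex)
    Q2_full[1:, 1:] = Q2
    Q = Q1 @ Q2_full
    U = np.triu(Q.conj().T @ A @ Q)    # discard round-off below the diagonal
    return Q, U

A = np.random.default_rng(2).standard_normal((4, 4)) + 0j
Q, U = schur_by_deflation(A)
print(np.allclose(A, Q @ U @ Q.conj().T, atol=1e-8))   # True
```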

The above argument can be slightly restated as follows: let λ be an eigenvalue of A, corresponding to some eigenspace Vλ. Then A induces an operator on the quotient space Cn/Vλ, which in turn has an eigenvalue μ with a corresponding eigenspace Wμ ⊂ Cn/Vλ.

Notice the preimage of Wμ under the quotient map is an invariant subspace of A that contains Vλ.

Continuing in this way until the quotient space is exhausted, the successive preimages of the eigenspaces found at each step form a flag that A stabilizes.

The Schur decomposition extends to a commuting family {Ai} of matrices, which can be simultaneously upper-triangularized by a single unitary matrix. The key step is a common eigenvector: let VA be an eigenspace of one member A of the family. Since every member of the family commutes with A, each maps VA into itself, so the members restrict to a commuting family of operators on VA. Arguing by induction on the dimension, all matrices in {Ai} must share one common eigenvector in VA.

As a corollary, every commuting family of normal matrices can be simultaneously diagonalized.
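
One way to see this numerically, in a sketch assuming NumPy and SciPy, is to take the commuting pair A and B = A² + 3A (a polynomial in A): the unitary factor Q of the Schur decomposition of A triangularizes B as well, because Q*BQ = (Q*AQ)² + 3(Q*AQ) is upper triangular.

```python
# An illustrative sketch, assuming NumPy/SciPy: A and B = A^2 + 3A commute, and
# the unitary factor Q of the Schur decomposition of A triangularizes both,
# since Q* B Q = (Q* A Q)^2 + 3 (Q* A Q) is upper triangular.
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
B = A @ A + 3 * A                      # a polynomial in A, hence commutes with A

U, Q = schur(A, output='complex')
T = Q.conj().T @ B @ Q                 # numerically upper triangular as well

print(np.allclose(A @ B, B @ A))                    # True: A and B commute
print(np.allclose(np.tril(T, -1), 0, atol=1e-10))   # True: strictly lower part vanishes
```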

In the infinite-dimensional setting, not every bounded operator on a Banach space has an invariant subspace.

However, the upper-triangularization of an arbitrary square matrix does generalize to compact operators.

Every compact operator on a complex Banach space has a nest of closed invariant subspaces.

The Schur decomposition of a given matrix is numerically computed by the QR algorithm or its variants.

In other words, the roots of the characteristic polynomial of the matrix need not be computed beforehand in order to obtain its Schur decomposition.

Conversely, the QR algorithm can be used to compute the roots of any given characteristic polynomial by finding the Schur decomposition of its companion matrix.
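
A brief sketch of this usage, assuming SciPy (scipy.linalg.companion builds the companion matrix), recovers the roots of a cubic from the diagonal of the Schur form:

```python
# A brief sketch, assuming SciPy: the roots of
# p(x) = x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3)
# are read off the diagonal of the Schur form of its companion matrix.
import numpy as np
from scipy.linalg import companion, schur

coeffs = [1, -6, 11, -6]                  # x^3 - 6x^2 + 11x - 6
C = companion(coeffs)                     # 3x3 companion matrix of p

U, Q = schur(C, output='complex')
roots = np.sort(np.diag(U).real)          # the roots are real here: 1, 2, 3
print(np.allclose(roots, [1.0, 2.0, 3.0]))   # True
```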

Similarly, the QR algorithm is used to compute the eigenvalues of any given matrix, which are the diagonal entries of the upper triangular matrix of the Schur decomposition.

Although the QR algorithm is formally an infinite sequence of operations, convergence to machine precision is practically achieved in O(n^3) operations.[7]

See the Nonsymmetric Eigenproblems section in the LAPACK Users' Guide.

Given square matrices A and B, the generalized Schur decomposition factorizes both as A = QSZ* and B = QTZ*, where Q and Z are unitary and S and T are upper triangular. The generalized eigenvalues λ that solve the generalized eigenvalue problem Ax = λBx (where x is an unknown nonzero vector) can be calculated as the ratio of the diagonal elements of S to those of T. That is, using subscripts to denote matrix elements, the ith generalized eigenvalue λi satisfies λi = Sii/Tii.
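
The sketch below, assuming SciPy (scipy.linalg.qz computes the decomposition, and scipy.linalg.eigvals serves only as a cross-check), recovers the generalized eigenvalues of a random pencil in exactly this way.

```python
# A sketch, assuming SciPy: scipy.linalg.qz computes the generalized Schur (QZ)
# decomposition A = Q S Z*, B = Q T Z*, and the generalized eigenvalues of
# A x = lambda B x are the ratios S_ii / T_ii.
import numpy as np
from scipy.linalg import qz, eigvals

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))            # a random B is almost surely nonsingular

S, T, Q, Z = qz(A, B, output='complex')    # S, T upper triangular; Q, Z unitary
lam = np.diag(S) / np.diag(T)              # generalized eigenvalues

print(np.allclose(A, Q @ S @ Z.conj().T))                 # True: A = Q S Z*
print(np.allclose(np.sort_complex(lam),
                  np.sort_complex(eigvals(A, B))))        # True: matches eigvals(A, B)
```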