Polynomial matrices are widely studied in the fields of systems theory and control theory and have seen other uses relating to stable polynomials.
In stability theory, spectral factorization has been used to find determinantal matrix representations for bivariate stable polynomials and real zero polynomials.
This result was originally proven by Norbert Wiener in a more general context, which was concerned with integrable matrix-valued functions that also had integrable log determinant.[2]
Because applications are often concerned with the polynomial restriction, simpler proofs and individual analysis exist focusing on this case.[3]
Weaker positivstellensatz conditions have been studied, specifically considering when the polynomial matrix has positive definite image on semi-algebraic subsets of the reals.[4]
Many recent publications have focused on streamlining proofs for these related results.[5][6]
This article roughly follows the recent proof method of Lasha Ephremidze,[7] which relies only on elementary linear algebra and complex analysis.[8]
Some modern algorithms focus on the more general setting originally studied by Wiener, while others have used advances in Toeplitz matrix computations to speed up factor calculations.
Consider a rational function $f(t)$ which is positive for real $t$. Then $f$ admits a factorization $f(t) = g(t)\,\overline{g(\bar{t})}$, where the rational function $g(t)$ has no poles or zeroes in the lower half plane.
This decomposition is unique up to multiplication by complex scalars of norm $1$.
The numerator and denominator have distinct sets of roots, so all real roots which show up in either must have even multiplicity (to prevent a sign change locally).
We can divide out these real roots to reduce to the case where $f$ has no real zeroes or poles. Since $f$ is real-valued on the real axis, its remaining zeroes and poles occur in conjugate pairs.
For each conjugate pair, pick the zero or pole in the upper half plane and accumulate these to obtain $g$.
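As a concrete numerical sketch of this construction (the example polynomial and the helper name are our own, for illustration only), the factor can be computed by collecting one root from each conjugate pair:

```python
import numpy as np

def scalar_spectral_factor(coeffs):
    """Given coefficients (highest degree first) of a polynomial p with
    p(t) > 0 for all real t, return coefficients of a factor q whose
    roots all lie in the upper half plane, so p(t) = |q(t)|^2 on R.
    Illustrative helper, not a library routine."""
    roots = np.roots(coeffs)
    upper = [r for r in roots if r.imag > 0]  # one root from each conjugate pair
    lead = np.sqrt(coeffs[0])                 # leading coefficient is positive
    return lead * np.poly(upper)              # rebuild coefficients from roots

# p(t) = t^4 + 5t^2 + 4 = (t^2 + 1)(t^2 + 4) is positive on the real line
p = np.array([1.0, 0.0, 5.0, 0.0, 4.0])
q = scalar_spectral_factor(p)  # q(t) = (t - i)(t - 2i)
for t in [-2.0, 0.0, 1.5]:
    # for real t, q(t) * conj(q(conj(t))) = |q(t)|^2
    assert np.isclose(np.polyval(p, t), abs(np.polyval(q, t))**2)
```

Note that for real $t$ the product $g(t)\,\overline{g(\bar{t})}$ reduces to $|g(t)|^2$, which is what the assertions check.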
The inspiration for this result is a factorization which characterizes positive definite matrices: a Hermitian matrix $A$ is positive definite if and only if it can be written as $A = LL^*$, where $L$ is an invertible lower triangular matrix.
If we don't restrict to lower triangular matrices we can consider all factorizations of the form $A = VV^*$.
To obtain the lower triangular decomposition we induct by splitting off the first row and first column:
$$\begin{pmatrix} a_{11} & b^* \\ b & \tilde{A} \end{pmatrix} = \begin{pmatrix} \sqrt{a_{11}} & 0 \\ b/\sqrt{a_{11}} & I \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & \tilde{A} - \frac{b b^*}{a_{11}} \end{pmatrix} \begin{pmatrix} \sqrt{a_{11}} & b^*/\sqrt{a_{11}} \\ 0 & I \end{pmatrix}$$
Here $a_{11}$ is a positive real number, so it has a square root.
The last factor can be decomposed by induction, since the block $\tilde{A} - b b^*/a_{11}$ on the right hand side is the Schur complement of $a_{11}$ in $A$, which is again positive definite.
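The inductive split above can be turned into a short recursive procedure (a sketch; `cholesky_by_schur` is our own illustrative name, not a standard routine):

```python
import numpy as np

def cholesky_by_schur(A):
    """Lower triangular L with A = L L^*, computed by peeling off the
    first row/column and recursing on the Schur complement, following
    the 2x2 block identity above."""
    A = np.asarray(A, dtype=complex)
    n = A.shape[0]
    L = np.zeros((n, n), dtype=complex)
    a11 = A[0, 0].real                  # positive real, so it has a square root
    L[0, 0] = np.sqrt(a11)
    if n > 1:
        b = A[1:, 0]
        L[1:, 0] = b / np.sqrt(a11)
        schur = A[1:, 1:] - np.outer(b, b.conj()) / a11  # Schur complement of a11
        L[1:, 1:] = cholesky_by_schur(schur)             # induction step
    return L

A = np.array([[4.0, 2 - 2j], [2 + 2j, 10.0]])  # Hermitian positive definite
L = cholesky_by_schur(A)
assert np.allclose(L @ L.conj().T, A)
assert np.allclose(L, np.linalg.cholesky(A))   # matches the library routine
```

Since the Cholesky factor with positive real diagonal is unique, the recursion necessarily agrees with `numpy.linalg.cholesky`.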
If $P(t)$ is a rational polynomial matrix which is positive definite for real $t$, then by the symmetric Gaussian elimination we performed above, all we need to show is there exists a rational $g(t)$ with $g(t)\,\overline{g(\bar{t})} = P_{11}(t)$, which is precisely the rational spectral factorization above.
Since the Schur complement is positive definite for real $t$ away from the poles and the Schur complement is a rational polynomial matrix, we can induct to find a lower triangular $L(t)$ with $P(t) = L(t)L(t)^*$, where $L(t)$ is a rational polynomial matrix with no poles in the lower half plane.
One way to prove the existence of polynomial matrix spectral factorization is to apply the Cholesky decomposition to a rational polynomial matrix and modify it to remove lower half plane singularities.
Whenever the determinant of the current factor has a zero at a point $a$ in the open lower half plane, a modified factor with no lower half plane poles exists such that its determinant has one less zero (by multiplicity) at $a$, without introducing any poles in the lower half plane of any of the entries.
Iterating this step finitely many times yields a factor which is holomorphic and invertible on the lower half plane.
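The determinant-zero removal step can be seen in a minimal numerical sketch (the toy factor and the Blaschke-type multiplier below are our own choices for illustration): multiplying on the right by a scalar Blaschke factor, which has modulus $1$ on the real line, moves a determinant zero from $a$ in the lower half plane to $\bar{a}$ in the upper half plane without changing $L(t)L(t)^*$ for real $t$:

```python
import numpy as np

a = -1j  # determinant zero of the toy factor, in the lower half plane

def L(t):
    # toy factor: det L(t) = t - a vanishes at t = a
    return np.array([[t - a, 0.0], [0.0, 1.0 + 0j]])

def U(t):
    # Blaschke-type multiplier: unitary for real t; its pole at t = a is
    # cancelled by the zero of L's first column, so L(t) @ U(t) is pole-free
    return np.diag([(t - np.conj(a)) / (t - a), 1.0 + 0j])

for t in [-2.0, 0.3, 5.0]:
    M = L(t) @ U(t)
    assert np.allclose(L(t) @ L(t).conj().T, M @ M.conj().T)  # product on R unchanged
    assert np.isclose(np.linalg.det(M), t - np.conj(a))       # det zero moved to conj(a)
```

Here $L(t)U(t) = \mathrm{diag}(t - \bar{a},\, 1)$, whose determinant vanishes only in the upper half plane.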
To extend analyticity to the upper half plane we need this key observation: if an invertible rational matrix $U(t)$ is unitary for real $t$, then $U(t)\,\overline{U(\bar{t})}^{T} = I$ for all complex $t$, since both sides are rational and agree on the real line.
Given two polynomial matrix decompositions which are invertible on the lower half plane, $P(t) = Q_1(t)Q_1(t)^* = Q_2(t)Q_2(t)^*$, consider $U(t) = Q_1(t)^{-1}Q_2(t)$.
Since $Q_1(t)$ is analytic on the lower half plane and nonsingular, $U(t)$ is a rational polynomial matrix which is analytic and invertible on the lower half plane. Moreover $U(t)$ is unitary for real $t$, and it follows that $U(t)$ is a constant unitary matrix, so the decomposition is unique up to multiplication on the right by a constant unitary factor.
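To see the uniqueness statement in action, here is a small numerical check (the example factors are our own): two factors of the same $P(t)$, both invertible on the lower half plane, differ by a constant unitary matrix:

```python
import numpy as np

W = np.array([[0.6, 0.8], [-0.8, 0.6]])  # a constant unitary (rotation) matrix

def Q1(t):
    # det Q1(t) = (t - i)(t - 2i): zeros in the upper half plane only,
    # so Q1 is invertible on the lower half plane
    return np.array([[t - 1j, 0.0], [1.0, t - 2j]])

def Q2(t):
    return Q1(t) @ W  # a second factor of the same P(t)

for t in [-1.0, 0.5, 3.0]:
    # both factor the same positive definite P(t) on the real line
    assert np.allclose(Q1(t) @ Q1(t).conj().T, Q2(t) @ Q2(t).conj().T)
    # the transition matrix Q1^{-1} Q2 is the same constant unitary W at every t
    assert np.allclose(np.linalg.inv(Q1(t)) @ Q2(t), W)
```

The transition matrix is independent of $t$ precisely because it is rational, unitary on the real line, and free of zeros and poles in the lower half plane.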