In mathematics, a form (i.e. a homogeneous polynomial) h(x) of degree 2m in the real n-dimensional vector x is a sum of squares of forms (SOS) if and only if there exist forms $g_1(x),\ldots,g_k(x)$ of degree m such that

$h(x)=\sum_{i=1}^{k}g_i(x)^2.$
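For instance, with n = 2 and 2m = 2 one has

$2x_1^2+2x_1x_2+x_2^2=x_1^2+(x_1+x_2)^2,$

so this quadratic form is SOS with $g_1(x)=x_1$ and $g_2(x)=x_1+x_2$.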
Every form that is SOS is also a positive polynomial, and although the converse is not always true, Hilbert proved that for n = 2, 2m = 2, or n = 3 and 2m = 4 a form is SOS if and only if it is positive.[1] The same is also valid for the analog problem on positive symmetric forms.[2][3] Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found.[4][5]
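A classical counterexample is the Motzkin form

$x_1^4x_2^2+x_1^2x_2^4+x_3^6-3x_1^2x_2^2x_3^2$

(here n = 3 and 2m = 6), which is nonnegative by the arithmetic-geometric mean inequality but is not SOS.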
Moreover, every real nonnegative form can be approximated as closely as desired (in the $l_1$-norm of its coefficient vector) by a sequence of forms that are SOS.[6]

To establish whether a form h(x) is SOS amounts to solving a convex optimization problem. Indeed, any h(x) can be written as

$h(x)=x^{\{m\}\prime}\left(H+L(\alpha)\right)x^{\{m\}}$

where $x^{\{m\}}$ is a vector containing a base for the forms of degree m in x (such as all monomials of degree m in x), the prime ′ denotes the transpose, H is any symmetric matrix satisfying

$h(x)=x^{\{m\}\prime}Hx^{\{m\}}$

and $L(\alpha)$ is a linear parameterization of the linear space

$\mathcal{L}=\left\{L=L'\colon\ x^{\{m\}\prime}Lx^{\{m\}}=0\right\}.$
The dimension of the vector $x^{\{m\}}$ is given by

$\sigma(n,m)=\binom{n+m-1}{m}$

whereas the dimension of the vector $\alpha$ is given by

$\omega(n,2m)=\frac{1}{2}\sigma(n,m)\left(1+\sigma(n,m)\right)-\sigma(n,2m).$
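For instance, for n = 2 and m = 2 one finds $\sigma(2,2)=3$ and $\omega(2,4)=\tfrac{1}{2}\cdot3\cdot4-\sigma(2,4)=6-5=1$, so $L(\alpha)$ depends on a single scalar $\alpha$, as in the worked example below.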
Then, h(x) is SOS if and only if there exists a vector $\alpha$ such that

$H+L(\alpha)\ge 0,$

meaning that the matrix $H+L(\alpha)$ is positive-semidefinite.
This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem.
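For example, consider the form $h(x)=x_1^4-x_1^2x_2^2+x_2^4$ (n = 2, m = 2) with $x^{\{m\}}=(x_1^2,\,x_1x_2,\,x_2^2)'$. Every admissible matrix in its SMR has the form

$H+L(\alpha)=\begin{pmatrix}1&0&\alpha\\0&-1-2\alpha&0\\\alpha&0&1\end{pmatrix},$

and the choice $\alpha=-1/2$ makes this matrix positive-semidefinite, certifying the decomposition

$h(x)=\left(x_1^2-\tfrac{1}{2}x_2^2\right)^2+\left(\tfrac{\sqrt{3}}{2}x_2^2\right)^2.$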
The representation

$h(x)=x^{\{m\}\prime}\left(H+L(\alpha)\right)x^{\{m\}}$

was introduced in [7] with the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. This representation is also known as Gram matrix.[8]
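In practice, the LMI feasibility test is delegated to a semidefinite-programming solver. The following minimal sketch, assuming the Python library cvxpy (the choice of modeling tool is an assumption of this illustration, not part of the SMR formulation), searches for a positive-semidefinite Gram matrix of the quartic example above:

```python
import cvxpy as cp

# SOS test for h(x) = x1^4 - x1^2*x2^2 + x2^4
# with base x^{m} = (x1^2, x1*x2, x2^2)'; Q plays the role of H + L(alpha).
Q = cp.Variable((3, 3), symmetric=True)

constraints = [
    Q >> 0,                       # the LMI: Q must be positive semidefinite
    Q[0, 0] == 1,                 # coefficient of x1^4
    Q[2, 2] == 1,                 # coefficient of x2^4
    2 * Q[0, 1] == 0,             # coefficient of x1^3*x2
    2 * Q[1, 2] == 0,             # coefficient of x1*x2^3
    2 * Q[0, 2] + Q[1, 1] == -1,  # coefficient of x1^2*x2^2
]

problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()
print(problem.status)  # "optimal" means the LMI is feasible, hence h is SOS
```

If the problem is feasible, factoring the recovered matrix as $Q=\sum_i v_iv_i'$ yields the squares directly, since then $h(x)=\sum_i\left(v_i'x^{\{m\}}\right)^2$.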
A matrix form F(x) (i.e., a matrix whose entries are forms) of dimension r and degree 2m in the real n-dimensional vector x is SOS if and only if there exist matrix forms $G_1(x),\ldots,G_k(x)$ of degree m such that

$F(x)=\sum_{i=1}^{k}G_i(x)'G_i(x).$
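For instance, the matrix form

$F(x)=\begin{pmatrix}x_1^2&x_1x_2\\x_1x_2&x_2^2\end{pmatrix}$

(r = 2, m = 1) is SOS with the single matrix form $G_1(x)=\begin{pmatrix}x_1&x_2\end{pmatrix}$, since $G_1(x)'G_1(x)=F(x)$.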
To establish whether a matrix form F(x) is SOS amounts to solving a convex optimization problem.
Indeed, similarly to the scalar case any F(x) can be written according to the SMR as

$F(x)=\left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right)$

where $\otimes$ is the Kronecker product of matrices, $I_r$ is the r × r identity matrix, H is any symmetric matrix satisfying

$F(x)=\left(x^{\{m\}}\otimes I_r\right)'H\left(x^{\{m\}}\otimes I_r\right)$

and $L(\alpha)$ is a linear parameterization of the linear space

$\mathcal{L}=\left\{L=L'\colon\ \left(x^{\{m\}}\otimes I_r\right)'L\left(x^{\{m\}}\otimes I_r\right)=0\right\}.$
The dimension of the vector $\alpha$ is given by

$\omega(n,2m,r)=\frac{1}{2}r\left(\sigma(n,m)\left(r\,\sigma(n,m)+1\right)-(r+1)\,\sigma(n,2m)\right).$
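Note that for r = 1 this expression reduces to $\omega(n,2m)$ above. For instance, for n = 2, m = 1 and r = 2 one finds $\sigma(2,1)=2$, $\sigma(2,2)=3$ and hence $\omega(2,2,2)=\tfrac{1}{2}\cdot2\left(2(2\cdot2+1)-3\cdot3\right)=1$.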
Then, F(x) is SOS if and only if there exists a vector $\alpha$ such that the following LMI holds:

$H+L(\alpha)\ge 0.$
The representation

$F(x)=\left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right)$

was introduced in [9] in order to establish whether a matrix form is SOS via an LMI.
Consider the free algebra R⟨X⟩ generated by the n noncommuting letters X = (X1, ..., Xn) and equipped with the involution T, such that T fixes R and X1, ..., Xn and reverses words formed by X1, ..., Xn.
By analogy with the commutative case, the symmetric noncommutative polynomials are those noncommutative polynomials f satisfying f = f^T. A symmetric noncommutative polynomial f is said to be matrix-positive when evaluating f at any tuple of real matrices of any dimension r × r results in a positive semi-definite matrix.
A noncommutative polynomial f is SOS if there exist noncommutative polynomials $h_1(X),\ldots,h_k(X)$ such that

$f(X)=\sum_{i=1}^{k}h_i(X)^T h_i(X).$
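For example, $X_1X_2^2X_1=(X_2X_1)^T(X_2X_1)$ is SOS, since the involution fixes the letters and reverses the word $X_2X_1$ into $X_1X_2$.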
Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive.[10] Moreover, there exist algorithms to decompose matrix-positive polynomials into sums of squares of noncommutative polynomials.