In coding theory, the Zyablov bound is a lower bound on the rate $R$ and relative distance $\delta$ that are achievable by concatenated codes.
The bound states that there exists a family of $q$-ary (concatenated, linear) codes with rate

$$R \ge \max_{0 \le r \le 1 - H_q(\delta)} r \left( 1 - \frac{\delta}{H_q^{-1}(1 - r)} \right)$$

and relative distance $\delta$, where $H_q$ is the $q$-ary entropy function

$$H_q(x) = x \log_q(q - 1) - x \log_q x - (1 - x) \log_q(1 - x).$$
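The right-hand side is straightforward to evaluate numerically. Below is a minimal sketch, assuming nothing beyond the definitions above; the helper names H_q, H_q_inv, and zyablov_rate are illustrative, and the inverse entropy function is computed by bisection.

```python
import math

def H_q(x, q):
    """q-ary entropy function H_q(x) = x log_q(q-1) - x log_q x - (1-x) log_q(1-x)."""
    if x == 0:
        return 0.0
    return (x * math.log(q - 1, q)
            - x * math.log(x, q)
            - (1 - x) * math.log(1 - x, q))

def H_q_inv(y, q, tol=1e-12):
    """Inverse of H_q on [0, 1 - 1/q], found by bisection (H_q is increasing there)."""
    lo, hi = 0.0, 1.0 - 1.0 / q
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if H_q(mid, q) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def zyablov_rate(delta, q, steps=10_000):
    """max over 0 <= r <= 1 - H_q(delta) of r * (1 - delta / H_q^{-1}(1 - r))."""
    r_max = 1.0 - H_q(delta, q)
    best = 0.0
    for i in range(1, steps):
        r = r_max * i / steps
        d_in = H_q_inv(1.0 - r, q)      # relative distance of an inner code on the GV bound
        if d_in > delta:
            best = max(best, r * (1.0 - delta / d_in))
    return best

# Example: the binary (q = 2) Zyablov trade-off at relative distance 0.1.
print(zyablov_rate(0.1, q=2))
```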
The bound is obtained by considering the range of parameters that are obtainable by concatenating a "good" outer code $C_{\text{out}}$ with a "good" inner code $C_{\text{in}}$.
Specifically, we suppose that the outer code meets the Singleton bound, i.e. it has rate $r_{\text{out}}$ and relative distance $\delta_{\text{out}} = 1 - r_{\text{out}}$. Reed–Solomon codes are a family of such codes that can be tuned to have any rate $r_{\text{out}} \in (0, 1)$ and relative distance $1 - r_{\text{out}}$ (albeit over an alphabet as large as the codeword length).
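As a concrete (illustrative) instance of an outer code meeting the Singleton bound: an $[n, k]_q$ Reed–Solomon code is MDS, so its minimum distance is $d = n - k + 1$; for example,

$$[n, k]_q = [256, 128]_{256}: \qquad d = 256 - 128 + 1 = 129, \quad r_{\text{out}} = \tfrac{128}{256} = \tfrac{1}{2}, \quad \delta_{\text{out}} = \tfrac{129}{256} \ge 1 - r_{\text{out}}.$$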
We suppose that the inner code meets the Gilbert–Varshamov bound, i.e. it has rate $r$ and relative distance $\delta_{\text{in}} \ge H_q^{-1}(1 - r)$.
Random linear codes are known to satisfy this property with high probability, and an explicit linear code satisfying the property can be found by brute-force search (which requires time polynomial in the size of the message space).
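To make the search concrete, here is a minimal sketch (binary case only, with illustrative helper names) of the sample-and-check approach: draw random generator matrices and keep the first one whose code meets a target relative distance; checking one candidate enumerates all $2^k$ messages, i.e. time polynomial in the size of the message space.

```python
import itertools
import random

def relative_distance(G, k, n):
    """Minimum relative Hamming weight over all nonzero codewords of the binary
    linear code generated by the k x n matrix G."""
    best = n
    for m in itertools.product([0, 1], repeat=k):
        if not any(m):
            continue
        weight = sum(sum(m[i] * G[i][j] for i in range(k)) % 2 for j in range(n))
        best = min(best, weight)
    return best / n

def sample_inner_code(k, n, delta, seed=0):
    """Resample random k x n generator matrices until the code's relative distance
    is at least delta (this succeeds quickly when delta is below the
    Gilbert-Varshamov value H_2^{-1}(1 - k/n))."""
    rng = random.Random(seed)
    while True:
        G = [[rng.randrange(2) for _ in range(n)] for _ in range(k)]
        if relative_distance(G, k, n) >= delta:
            return G

# Toy instance: a [10, 3] binary inner code with relative distance >= 0.15.
G = sample_inner_code(k=3, n=10, delta=0.15)
print(G, relative_distance(G, 3, 10))
```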
The concatenation of $C_{\text{out}}$ and $C_{\text{in}}$ has rate $R = r_{\text{out}} \cdot r$ and relative distance at least $\delta = \delta_{\text{out}} \cdot \delta_{\text{in}} = (1 - r_{\text{out}}) \cdot H_q^{-1}(1 - r)$. Then optimizing over the choice of $r$, we see it is possible for the concatenated code to satisfy

$$R \ge \max_{0 \le r \le 1 - H_q(\delta)} r \left( 1 - \frac{\delta}{H_q^{-1}(1 - r)} \right).$$

See Figure 1 for a plot of this bound.
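For clarity, the substitution behind the displayed bound (a restatement of the step above in symbols): fixing the overall relative distance $\delta$ determines the outer rate, and the overall rate follows:

$$\delta = (1 - r_{\text{out}}) \, H_q^{-1}(1 - r) \;\Longrightarrow\; r_{\text{out}} = 1 - \frac{\delta}{H_q^{-1}(1 - r)}, \qquad R = r_{\text{out}} \cdot r = r \left( 1 - \frac{\delta}{H_q^{-1}(1 - r)} \right),$$

and maximizing over the admissible inner rates $0 \le r \le 1 - H_q(\delta)$ (so that $H_q^{-1}(1 - r) \ge \delta$) gives the stated bound.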
Note that the Zyablov bound implies that for every $\delta < 1 - 1/q$, there exists a (concatenated) code with positive rate and positive relative distance.
We can construct a code that achieves the Zyablov bound in polynomial time.
In particular, we can construct explicit asymptotically good codes (over some alphabets) in polynomial time. Linear codes help complete the proof of this statement, since a linear code has a succinct, polynomial-size representation (its generator matrix).
Let the outer code $C_{\text{out}}$ be an $[N, K]_{q^k}$ Reed–Solomon error correction code where $N = q^k - 1$ (evaluation points being $\mathbb{F}_{q^k}^{*}$, the nonzero elements of $\mathbb{F}_{q^k}$), so that $k = \Theta(\log N)$.
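For instance (illustrative parameters): taking $q = 2$ and $k = 8$ gives

$$N = 2^8 - 1 = 255, \qquad k = 8 = \Theta(\log_2 255),$$

i.e. an outer Reed–Solomon code of length 255 over the alphabet $\mathbb{F}_{2^8}$, with the 255 nonzero field elements as evaluation points.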
We also need to construct an inner code $C_{\text{in}}$ that lies on the Gilbert–Varshamov bound.
This can be done in two ways: by an exhaustive search over all possible generator matrices until one with the required property is found for $C_{\text{in}}$, or by a greedy, derandomized construction of the generator matrix (e.g. via the method of conditional expectations); since the dimension and block length of $C_{\text{in}}$ are only $O(\log N)$, the latter runs in time polynomial in $N$. Thus we can construct a code that achieves the Zyablov bound in polynomial time.
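To illustrate how the two pieces fit together, here is a toy concatenation encoder; the encoder signatures and the tiny codes below are illustrative stand-ins, not the specific $C_{\text{out}}$ and $C_{\text{in}}$ discussed above.

```python
from typing import Callable, List, Sequence

def concatenated_encode(
    outer_encode: Callable[[Sequence[int]], List[int]],   # message -> outer codeword
    inner_encode: Callable[[int], List[int]],              # one outer symbol -> inner codeword
    message: Sequence[int],
) -> List[int]:
    """Encode with the outer code, then re-encode each outer symbol with the inner code."""
    codeword: List[int] = []
    for symbol in outer_encode(message):
        codeword.extend(inner_encode(symbol))
    return codeword

# Toy outer code: a [6, 2] Reed-Solomon code over F_7 (evaluate m0 + m1*x at x = 1..6).
outer = lambda msg: [(msg[0] + msg[1] * x) % 7 for x in range(1, 7)]
# Toy inner code: 3-fold repetition of a symbol's 3-bit binary expansion.
inner = lambda s: [int(b) for b in format(s, "03b")] * 3
print(concatenated_encode(outer, inner, [3, 5]))  # 6 outer symbols x 9 bits = 54 bits
```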