Zyablov bound

In coding theory, the Zyablov bound is a lower bound on the rate $$r$$ and relative distance $$\delta$$ that are achievable by concatenated codes.

Statement of the bound
The bound states that there exists a family of $$q$$-ary (concatenated, linear) codes with rate $$r$$ and relative distance $$\delta$$ whenever

$$r \leqslant \max\limits_{0 \leqslant r' \leqslant 1 - H_q(\delta)} r' \cdot \left (1 - {\delta \over {H_q ^{-1}(1 - r')}} \right )$$,

where $$H_q$$ is the $$q$$-ary entropy function

$$H_q(x) = x \log_q(q-1) - x \log_q(x) - (1 - x) \log_q(1 - x)$$.
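As an illustration (not part of the original article), the entropy function and a numerical inverse can be sketched in Python; the names `H_q` and `H_q_inv` are chosen here for convenience, and the inverse is restricted to $$[0, 1 - 1/q]$$, the interval on which $$H_q$$ is strictly increasing:

```python
import math

def H_q(x, q=2):
    """q-ary entropy H_q(x), with H_q(0) = 0 by convention."""
    if x <= 0:
        return 0.0
    if x >= 1:
        return math.log(q - 1, q)  # equals 0 when q = 2
    return (x * math.log(q - 1, q)
            - x * math.log(x, q)
            - (1 - x) * math.log(1 - x, q))

def H_q_inv(y, q=2, tol=1e-12):
    """Inverse of H_q restricted to [0, 1 - 1/q], computed by bisection."""
    lo, hi = 0.0, 1 - 1 / q
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if H_q(mid, q) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

For example, $$H_2(1/2) = 1$$, so `H_q_inv(1.0, 2)` returns approximately `0.5`.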

Description
The bound is obtained by considering the range of parameters that are obtainable by concatenating a "good" outer code $$C_{out}$$ with a "good" inner code $$C_{in}$$. Specifically, we suppose that the outer code meets the Singleton bound, i.e. it has rate $$r_{out}$$ and relative distance $$\delta_{out}$$ satisfying $$r_{out} + \delta_{out} = 1$$. Reed–Solomon codes are a family of such codes that can be tuned to have any rate $$r_{out} \in (0,1)$$ and relative distance $$1 - r_{out}$$ (albeit over an alphabet as large as the codeword length). We suppose that the inner code meets the Gilbert–Varshamov bound, i.e. it has rate $$r_{in}$$ and relative distance $$\delta_{in}$$ satisfying $$r_{in} + H_q(\delta_{in}) \ge 1$$. Random linear codes are known to satisfy this property with high probability, and an explicit linear code satisfying the property can be found by brute-force search (which requires time polynomial in the size of the message space).

The concatenation of $$C_{out}$$ and $$C_{in}$$, denoted $$C_{out} \circ C_{in}$$, has rate $$r = r_{in} \cdot r_{out}$$ and relative distance $$\delta = \delta_{out} \cdot \delta_{in} \ge (1 - r_{out}) \cdot H_q^{-1}(1 - r_{in}).$$
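For concreteness, here is a small worked example with hypothetical parameters (not from the article): a rate-1/2 outer Reed–Solomon code, so $$\delta_{out} = 1/2$$ by the Singleton bound, concatenated with a rate-1/2 binary inner code on the Gilbert–Varshamov bound, so $$\delta_{in} = H_2^{-1}(1/2) \approx 0.110$$ (a numerical value):

```python
# Hypothetical parameters for illustration only.
r_out, delta_out = 0.5, 0.5      # outer RS code: r_out + delta_out = 1
r_in, delta_in = 0.5, 0.110      # inner code on GV: H_2^{-1}(0.5) ~ 0.110

r = r_in * r_out                 # rate of the concatenated code: 0.25
delta = delta_out * delta_in     # relative distance: ~0.055
```

So the concatenated code has rate 1/4 and relative distance roughly 5.5%, illustrating the multiplicative loss in both parameters.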

Expressing $$r_{out}$$ as a function of $$\delta$$ and $$r_{in}$$,


 * $$r_{out} \ge 1 - \frac{\delta}{H_q^{-1}(1 - r_{in})}$$

Then, optimizing over the choice of $$r_{in}$$, we see that the concatenated code can be made to satisfy


 * $$r \ge \max\limits_{0\le r_{in} \le {1- H_q(\delta)}} r_{in} \cdot \left ( 1 - {\delta \over {H_q ^{-1}(1 - r_{in})}} \right )$$

See Figure 1 for a plot of this bound.

Note that the Zyablov bound implies that for every $$\delta>0$$, there exists a (concatenated) code with positive rate and positive relative distance.
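The optimization above can also be evaluated numerically. The following sketch (the function names are our own, and a grid search over $$r_{in}$$ stands in for exact optimization) computes the Zyablov rate for a given $$\delta$$:

```python
import math

def H_q(x, q=2):
    """q-ary entropy function."""
    if x <= 0:
        return 0.0
    if x >= 1:
        return math.log(q - 1, q)
    return (x * math.log(q - 1, q) - x * math.log(x, q)
            - (1 - x) * math.log(1 - x, q))

def H_q_inv(y, q=2, tol=1e-12):
    """Inverse of H_q on [0, 1 - 1/q], by bisection."""
    lo, hi = 0.0, 1 - 1 / q
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if H_q(mid, q) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def zyablov_rate(delta, q=2, steps=1000):
    """Grid-search approximation of max r_in * (1 - delta / H_q^{-1}(1 - r_in))
    over r_in in (0, 1 - H_q(delta))."""
    upper = 1 - H_q(delta, q)
    best = 0.0
    for i in range(1, steps):
        r_in = upper * i / steps
        d_in = H_q_inv(1 - r_in, q)  # GV relative distance of the inner code
        best = max(best, r_in * (1 - delta / d_in))
    return best
```

Consistent with the note above, `zyablov_rate(delta)` is positive for every $$\delta \in (0, 1 - 1/q)$$, and it lies strictly below the Gilbert–Varshamov rate $$1 - H_q(\delta)$$.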

Remarks
We can construct a code that achieves the Zyablov bound in polynomial time. In particular, explicit asymptotically good codes (over some alphabets) can be constructed in polynomial time.

Linear codes will help us complete the proof of the above statement, since linear codes have a polynomial-size representation (a generator matrix). Let $$C_{out}$$ be an $$[N, K]_{Q}$$ Reed–Solomon error correction code with $$N = Q - 1$$ (the evaluation points being $$\mathbb{F}_{Q}^*$$), where $$Q = q^k$$; then $$k = \Theta(\log N)$$.

We need to construct an inner code that lies on the Gilbert–Varshamov bound. This can be done in two ways:


 * 1) Perform an exhaustive search over all generator matrices until one satisfying the required property for $$C_{in}$$ is found. Varshamov's bound guarantees that a linear code lying on the Gilbert–Varshamov bound exists, but this search takes $$q^{O(kn)}$$ time. Using $$k = rn$$ we get $$q^{O(kn)} = q^{O(k^{2})} = N^{O(\log N)}$$, which is upper bounded by $$nN^{O(\log nN)}$$, a quasi-polynomial time bound.
 * 2) Construct $$C_{in}$$ in $$q^{O(n)}$$ time, giving $$(nN)^{O(1)}$$ time overall. This can be achieved by derandomizing, via the method of conditional expectations, the proof that a random linear code lies on the bound with high probability.

Thus we can construct a code that achieves the Zyablov bound in polynomial time.
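A toy version of the exhaustive search in approach (1) can be sketched as follows. The parameters here ($$n = 6$$, $$k = 2$$, target distance 3, which the Varshamov bound guarantees to exist for these parameters) are illustrative, and the search enumerates all binary $$k \times n$$ generator matrices:

```python
from itertools import product

n, k, d_target = 6, 2, 3  # tiny illustrative parameters

def min_distance(rows):
    """Minimum Hamming weight over nonzero F_2-combinations of the rows.
    Returns 0 if the rows are linearly dependent, so such matrices are
    rejected automatically."""
    best = n + 1
    for coeffs in product([0, 1], repeat=k):
        if not any(coeffs):
            continue  # skip the zero codeword
        cw = [0] * n
        for c, row in zip(coeffs, rows):
            if c:
                cw = [a ^ b for a, b in zip(cw, row)]
        best = min(best, sum(cw))
    return best

# Brute-force search over all 2^(k*n) candidate generator matrices.
found = None
for bits in product([0, 1], repeat=n * k):
    rows = [list(bits[i * n:(i + 1) * n]) for i in range(k)]
    if min_distance(rows) >= d_target:
        found = rows
        break
```

At these toy sizes the search examines at most $$2^{12} = 4096$$ matrices, but the exponent $$kn$$ is what drives the $$q^{O(kn)}$$ cost discussed above.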

References and external links

 * MIT Lecture Notes on Essential Coding Theory – Dr. Madhu Sudan
 * University at Buffalo Lecture Notes on Coding Theory – Dr. Atri Rudra
 * University of Washington Lecture Notes on Coding Theory – Dr. Venkatesan Guruswami