Sipser–Lautemann theorem

In computational complexity theory, the Sipser–Lautemann theorem or Sipser–Gács–Lautemann theorem states that bounded-error probabilistic polynomial (BPP) time is contained in the polynomial time hierarchy, and more specifically Σ2 ∩ Π2.

In 1983, Michael Sipser showed that BPP is contained in the polynomial time hierarchy. Péter Gács showed that BPP is actually contained in Σ2 ∩ Π2. Clemens Lautemann contributed by giving a simple proof of BPP’s membership in Σ2 ∩ Π2, also in 1983. It is conjectured that in fact BPP=P, which is a much stronger statement than the Sipser–Lautemann theorem.

Proof
Here we present Lautemann's proof. Without loss of generality, a machine M ∈ BPP with error at most $$2^{-|x|}$$ on inputs of length $$|x|$$ can be chosen. (Any BPP algorithm can be amplified, for example by majority vote over independent runs, to reduce the error probability exponentially.) The basic idea of the proof is to define a Σ2 sentence that is equivalent to stating that x is in the language, L, decided by M, using a set of translations of the random input strings.
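The amplification step mentioned above is standard majority voting; a minimal sketch in Python, assuming a toy "machine" that answers correctly with probability 2/3 (the parameters and helper names here are illustrative, not from the article):

```python
import random

# Hedged sketch of error amplification by majority vote: running a
# machine with error 1/3 independently k times and taking the majority
# answer drives the error below 2^{-Omega(k)} (by a Chernoff bound).
def amplified(run_once, k):
    """Majority vote over k independent runs of a noisy decision procedure."""
    votes = sum(run_once() for _ in range(k))
    return votes > k // 2

# Toy "machine": answers correctly (True) with probability 2/3.
noisy = lambda: random.random() < 2.0 / 3.0

k = 301
trials = 200
correct = sum(amplified(noisy, k) for _ in range(trials))
print(f"{correct}/{trials} correct after majority of {k} runs")
```

With k = 301 runs per decision, the amplified procedure is essentially never wrong in this simulation, matching the exponentially small error the proof assumes.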

Since the output of M depends on the random input r as well as the input x, it is useful to define the set of random strings that produce the correct output: A(x) = {r | M(x,r) accepts}. The key to the proof is to note that when x ∈ L, A(x) is very large, and when x ∉ L, A(x) is very small. Using bitwise parity, ⊕, a set of transforms can be defined as A(x) ⊕ t = {r ⊕ t | r ∈ A(x)}. The first main lemma of the proof shows that the union of a small finite number of these transforms contains the entire space of random input strings. Using this fact, a Σ2 sentence and a Π2 sentence can be generated that are true if and only if x ∈ L (see conclusion).

Lemma 1
The general idea of lemma 1 is to prove that if A(x) covers a large part of the random space $$R = \{0,1\}^{|r|}$$, then there exists a small set of translations that covers the entire random space. In more mathematical language:


 * If $$\frac{|A(x)|}{|R|} \ge 1 - \frac{1}{2^{|x|}}$$, then $$\exists t_1,t_2,\ldots,t_{|r|}$$, where $$t_i \in \{0,1\}^{|r|}$$, such that $$ \bigcup_i A(x) \oplus t_i = R.$$

Proof. Pick $$t_1, t_2, \ldots, t_{|r|}$$ uniformly at random. Let $$S=\bigcup_i A(x)\oplus t_i$$ (the union of all transforms of A(x)).

Since the $$t_i$$ are independent and $$r \oplus t_i$$ is uniformly distributed whenever $$t_i$$ is, each factor satisfies $$\Pr[r \notin A(x) \oplus t_i] = \Pr[r \oplus t_i \notin A(x)] \le 2^{-|x|}$$. So, for all r in R,
 * $$\Pr [r \notin S] = \Pr [r \notin A(x) \oplus t_1] \cdot \Pr [r \notin A(x) \oplus t_2] \cdots \Pr [r \notin A(x) \oplus t_{|r|}] \le { \frac{1}{2^{|x| \cdot |r|}} }.$$

By the union bound, the probability that at least one element of R is not in S is at most


 * $$\Pr \Bigl[ \bigvee_i (r_i \notin S)\Bigr] \le \sum_i \frac{1}{2^{|x| \cdot |r|}} = \frac{2^{|r|}}{2^{|x| \cdot |r|}} < 1.$$

Therefore


 * $$\Pr [S = R] \ge 1 - \frac{2^{|r|}}{2^{|x| \cdot |r|}} > 0.$$

Thus there exists a choice of $$t_1,t_2,\ldots,t_{|r|}$$ such that


 * $$ \bigcup_i A(x) \oplus t_i = R.$$
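Lemma 1 can be checked empirically on a small instance. The following sketch (parameters and encoding are ours, not from the article) takes A to be all of $$\{0,1\}^n$$ except one point, so $$|A|/|R| = 1 - 2^{-n}$$, and counts how often n random translations cover R:

```python
import random

# Illustrative simulation of Lemma 1: bit strings of length n are
# encoded as integers, so bitwise parity (XOR) is the ^ operator.
n = 6
R = list(range(2 ** n))
A = set(R) - {0}                 # A misses a single point: |A|/|R| = 1 - 2^{-n}

def covers(ts):
    """Does the union of A xor t_i over the chosen translations equal R?"""
    S = set()
    for t in ts:
        S |= {a ^ t for a in A}
    return S == set(R)

trials = 1000
hits = sum(covers([random.randrange(2 ** n) for _ in range(n)])
           for _ in range(trials))
print(f"covered R in {hits}/{trials} trials")
```

Here A ⊕ t misses exactly the point t, so the union fails to cover R only if all n random translations coincide, an event of probability $$2^{-n(n-1)}$$; the simulation covers R in essentially every trial, matching the lemma's positive-probability argument.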

Lemma 2
The previous lemma shows that when x ∈ L, A(x) can cover every point of the random space using a small set of translations. Complementary to this, when x ∉ L, only a small fraction of the space is covered by $$S=\bigcup_i A(x)\oplus t_i$$. We have:


 * $$\frac{|S|}{|R|} \le |r| \cdot \frac{|A(x)|}{|R|} \le |r| \cdot 2^{-|x|} < 1 $$

because $$|r|$$ is polynomial in $$|x|$$, so $$|r| \cdot 2^{-|x|} < 1$$ for all sufficiently large $$|x|$$.
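The counting bound in Lemma 2 can likewise be verified exhaustively on a toy instance (our own small parameters): when $$|A|/|R| \le 2^{-n}$$, no choice of n translations can cover R, since the union has at most $$n \cdot |A| < |R|$$ elements.

```python
from itertools import product

# Exhaustive check of Lemma 2's bound: with A a single point of
# {0,1}^n, even the best possible n translations cover only n points.
n = 4
R = set(range(2 ** n))
A = {0}                               # |A|/|R| = 2^{-n}

def union_size(ts):
    """Size of the union of A xor t_i over the chosen translations."""
    S = set()
    for t in ts:
        S |= {a ^ t for a in A}
    return len(S)

best = max(union_size(ts) for ts in product(range(2 ** n), repeat=n))
print(best, len(R))
```

The maximum over all $$16^4$$ choices of translations is n = 4 covered points, far below |R| = 16, so the Σ2 sentence below is false for such a small A(x).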

Conclusion
The lemmas show that membership of a language in BPP can be expressed as a Σ2 expression, as follows.


 * $$x \in L \iff \exists t_1,t_2,\dots,t_{|r|}\, \forall r \in R \bigvee_{ 1 \le i \le |r|} (M(x, r \oplus t_i) \text{ accepts}).$$

That is, x is in the language L if and only if there exist $$|r|$$ binary vectors $$t_i$$ such that, for every random bit vector r, the machine M accepts $$r \oplus t_i$$ for at least one i.
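On a small instance, this exists-forall sentence can be evaluated by brute force, distinguishing the two cases of the lemmas. The construction below is our own toy example: a "large" A(x) (missing one point, as when x ∈ L) makes the sentence true, while a "small" A(x) (a single point, as when x ∉ L) makes it false.

```python
from itertools import product

# Brute-force evaluation of the Sigma_2 sentence:
#   exists t_1..t_n  forall r  OR_i [ r xor t_i in A ].
n = 3
R = range(2 ** n)

def sigma2_holds(A):
    return any(all(any((r ^ t) in A for t in ts) for r in R)
               for ts in product(R, repeat=n))

big_A = set(R) - {0}      # the x in L case: A(x) misses only one point
small_A = {0}             # the x not in L case: |A(x)| / |R| = 2^{-n}
print(sigma2_holds(big_A), sigma2_holds(small_A))
```

The existential quantifier over translations followed by the universal quantifier over r is exactly the quantifier structure that places the predicate in Σ2.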

The above expression is in Σ2 because it is first existentially and then universally quantified. Therefore BPP ⊆ Σ2. Because BPP is closed under complement, it follows that BPP ⊆ Σ2 ∩ Π2.

Stronger version
The theorem can be strengthened to $$\mathsf{BPP} \subseteq \mathsf{MA} \subseteq \mathsf{S}_2^P \subseteq \Sigma_2 \cap \Pi_2$$ (see MA and $$\mathsf{S}_2^P$$).