
In computational complexity theory, the Sipser-Lautemann theorem (or Sipser-Gács-Lautemann theorem) states that BPP (bounded-error probabilistic polynomial time) is contained in the polynomial time hierarchy, and more specifically in &Sigma;2 &cap; &Pi;2.

In 1983, Michael Sipser showed that BPP is contained in the polynomial time hierarchy. Péter Gács showed that BPP is in fact contained in &Sigma;2 &cap; &Pi;2. Clemens Lautemann contributed by giving a simple proof of BPP's membership in &Sigma;2 &cap; &Pi;2, also in 1983.

Proof
Michael Sipser's version of the proof works as follows. Without loss of generality, a machine M deciding a language L &isin; BPP with error &le; $$2^{-|x|}$$ can be chosen. (All BPP problems can be amplified to reduce the error probability exponentially.) The basic idea of the proof is to define a &Sigma;2 sentence that is equivalent to stating that x is in L, by using a set of transforms of the random inputs.

Since the output of M depends on its random input as well as on the input x, it is useful to define the set of random strings that produce the correct output: A(x) = {r | M(x,r) accepts}. The key to the proof is to note that when x &isin; L, A(x) is very large, and when x &notin; L, A(x) is very small. Using bitwise parity, &oplus;, a set of transforms can be defined as A(x) &oplus; t = {r &oplus; t | r &isin; A(x)}. The first main lemma of the proof shows that the union of a small finite number of these transforms will contain the entire space of random input strings. Using this fact, a &Sigma;2 sentence that is true if and only if x &isin; L can be generated (see the corollary below).

Lemma 1
The general idea of Lemma 1 is to prove that if A(x) covers a large part of the random space $$R = \{ 0,1 \} ^ {|r|}$$, then there exists a small set of translations that covers the entire random space. In more mathematical language:

If $$\frac{|A(x)|}{|R|} \ge 1 - \frac{1}{2^{|x|}}$$, then there exist $$t_1,t_2,\ldots,t_{|r|}$$ with each $$t_i \in \{ 0,1 \} ^{|r|}$$ such that $$\bigcup_i A(x) \oplus t_i = R$$.

Proof. Randomly pick $$t_1,t_2,\ldots,t_{|r|}$$. Let $$S = \bigcup_i A(x) \oplus t_i$$ (the union of all transforms of A(x)).

So, for every $$r \in R$$, using the independence of the $$t_i$$ and the fact that $$r \notin A(x) \oplus t_i$$ exactly when $$r \oplus t_i \notin A(x)$$:

$$\Pr [r \notin S] = \Pr [r \notin A(x) \oplus t_1] \cdot \Pr [r \notin A(x) \oplus t_2] \cdots \Pr [r \notin A(x) \oplus t_{|r|}] \le { \frac{1}{2^{|x| \cdot |r|}} } $$

The probability that there exists at least one element of R not in S satisfies, by the union bound:

$$\Pr [ \bigvee_{r \in R} (r \notin S)] \le \sum_{r \in R} \frac{1}{2^{|x| \cdot |r|}} = \frac{2^{|r|}}{2^{|x| \cdot |r|}} = \frac{1}{2^{(|x|-1) \cdot |r|}} < 1$$

Therefore $$\Pr [S = R] > 0$$, so there is some choice of $$t_1,t_2,\ldots,t_{|r|}$$ such that $$\bigcup_i A(x) \oplus t_i = R$$.
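The covering argument of Lemma 1 can be simulated for toy parameters. The sketch below (all names and parameter choices are ours, not part of the proof) represents $$R = \{0,1\}^m$$ as the integers 0..2^m-1, with XOR playing the role of bitwise parity, and takes A(x) to be all of R except a single point:

```python
import random

def lemma1_demo(m=6, seed=0):
    """Toy check of Lemma 1: random translations of a dense set cover R."""
    rng = random.Random(seed)
    R = set(range(2 ** m))
    # A(x) covers all of R except one point: density 1 - 2^{-m},
    # well above the 1 - 2^{-|x|} threshold of the lemma.
    missing = rng.randrange(2 ** m)
    A = R - {missing}
    # Pick m random translations t_1,...,t_m, as in the proof.
    ts = [rng.randrange(2 ** m) for _ in range(m)]
    # S = union of the translated copies A(x) XOR t_i.
    S = {a ^ t for t in ts for a in A}
    return S == R

# Each translate misses exactly one point (missing XOR t_i), so the union
# fails to cover R only if all m random translations coincide.
print(lemma1_demo())
```

Here the failure event is so unlikely (all m translations equal) that a single random trial already exhibits the covering the lemma guarantees.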

Lemma 2
The previous lemma shows that when x &isin; L, a small set of translations of A(x) can cover the entire random space. Complementary to this, when x &notin; L only a small fraction of the space is covered by A(x), and then no small set of translations of A(x) can cover R: each translation A(x) &oplus; ti has the same size as A(x), so the union of |r| of them has size at most |r|&middot;|A(x)|, which is smaller than |R| because |r| is polynomial in |x| while $$\frac{|R|}{|A(x)|} \ge 2^{|x|}$$. In symbols:

$$\frac{|A(x)|}{|R|} \le \frac{1}{2^{|x|}} \implies \neg \exists t_1,t_2,\ldots,t_{|r|} \text{ such that } \bigcup_i A(x) \oplus t_i = R$$
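The counting behind Lemma 2 can be checked numerically for small parameters. In this sketch (parameter names m and x_len are ours, chosen for illustration), A(x) sits exactly at the 2^{-|x|} density bound:

```python
def translates_cannot_cover(m=6, x_len=3):
    """Counting argument behind Lemma 2 for toy parameters.

    If |A(x)|/|R| <= 2^{-|x|}, then |r| translated copies of A(x)
    have total size at most |r| * |A(x)| < |R|, so no choice of
    t_1,...,t_{|r|} can cover R once |r| < 2^{|x|}.
    """
    R_size = 2 ** m                   # |R| = 2^{|r|}
    A_size = R_size // (2 ** x_len)   # |A(x)| at the 2^{-|x|} density bound
    union_bound = m * A_size          # translation preserves set size
    return union_bound < R_size

print(translates_cannot_cover())  # 6 * 8 = 48 < 64, so prints True
```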

Corollary
An important corollary of the lemmas shows that the result of the proof can be expressed as a &Sigma;2 expression, as follows.

$$x \in L \iff \exists t_1,t_2,\ldots,t_{|r|}\ \forall r \in R\ \bigvee_{ 1 \le i \le |r|} \big(M(x, r \oplus t_i) \text{ accepts}\big).$$

That is, x is in the language L if and only if there exist |r| binary vectors $$t_1,\ldots,t_{|r|}$$ such that for every random bit vector r, the machine M accepts $$r \oplus t_i$$ for at least one i.

The above expression is in &Sigma;2 because it is first existentially and then universally quantified. Therefore BPP &sube; &Sigma;2. Because BPP is closed under complement, it follows that BPP &sube; &Sigma;2 &cap; &Pi;2.
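For intuition, the corollary's &Sigma;2 sentence can be evaluated by brute force on a toy example. In this sketch (all names are ours), the set A(x) is supplied directly in place of running an actual machine M:

```python
from itertools import product

def sigma2_predicate(accepting, m):
    """Evaluate  exists t_1..t_m, forall r, OR_i M(x, r XOR t_i)  by brute force.

    `accepting` is the set A(x) of random strings (as m-bit integers)
    on which the machine accepts; it stands in for running M itself.
    """
    R = range(2 ** m)
    return any(
        all(any((r ^ t) in accepting for t in ts) for r in R)
        for ts in product(R, repeat=m)
    )

m = 3
# x in L: A(x) misses only one random string, so translations cover R.
print(sigma2_predicate(set(range(2 ** m)) - {5}, m))  # True
# x not in L: A(x) is a single string, far too small to cover R.
print(sigma2_predicate({5}, m))  # False
```

With m = 3 the search is over all 512 tuples of translations, so the exhaustive evaluation is instant; the point is only that the predicate separates the two cases, exactly as the lemmas predict.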

Lautemann's Proof
Here we present the proof (due to Lautemann) that BPP &sube; &Sigma;2. See Trevisan's notes for more information.

Lemma 3
Based on the definition of BPP we define the following:

If L is in BPP then there is an algorithm A such that for every x,

$$\Pr_r(A(x,r) = \mbox{right answer}) \ge 1 - \frac{1}{3m},$$

where m is the number of random bits, $$|r| = m = |x|^{O(1)}$$, and A runs in time $$|x|^{O(1)}$$.

Proof: Let A' be a BPP algorithm for L. For every x, $$\Pr_r(A'(x,r) = \mbox{wrong answer}) \le 1/3$$. A' uses m'(n) random bits, where n = |x|. Do k(n) repetitions of A' and accept if and only if at least $$\frac{k(n)}{2}$$ executions of A' accept. Define this new algorithm to be A. So A uses k(n)m'(n) random bits and, by a Chernoff bound, $$\Pr_r(A(x,r) = \mbox{wrong answer}) \le 2^{-ck(n)}$$ for some constant c &gt; 0. We can then find k(n) with $$k(n) = \Theta(\log m'(n))$$ such that $$\frac{1}{2^{ck(n)}} \le \frac{1}{3k(n)m'(n)}$$.
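The amplification step in Lemma 3 can be illustrated by computing the exact error of a k-fold majority vote, assuming independent runs each wrong with probability 1/3 (a sketch; the function name and parameters are ours):

```python
from math import comb

def majority_error(p_wrong, k):
    """Exact probability that a k-fold majority vote is wrong,
    assuming independent runs each wrong with probability p_wrong.

    This is the amplification step of Lemma 3: repeating A' and
    taking a majority drives the error down exponentially in k.
    """
    # The majority errs when at least ceil(k/2) of the k runs are wrong.
    return sum(
        comb(k, j) * p_wrong ** j * (1 - p_wrong) ** (k - j)
        for j in range((k + 1) // 2, k + 1)
    )

for k in (1, 11, 41):
    print(k, majority_error(1 / 3, k))
```

Running this shows the error dropping from 1/3 toward zero as k grows, which is what lets k(n) = &Theta;(log m'(n)) repetitions push the error below 1/(3m).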

Theorem 1
BPP &sube; &Sigma;2.

Proof: Let L be in BPP and let A be as in Lemma 3. We want to show that

$$x \in L \iff \exists y_1,\ldots,y_m \in \{0,1\}^m\ \forall z \in \{0,1\}^m\ \bigvee_{i=1}^m A(x,y_i \oplus z)=1,$$

where m is the number of random bits used by A on input x. Suppose $$x \in L$$. Then, by the union bound over z and the independence of the $$y_i$$,

$$\Pr_{y_1,\ldots,y_m}(\exists z\ A(x,y_1 \oplus z)=\cdots=A(x,y_m \oplus z)=0) \le \sum_{z \in \{0,1\}^m} \Pr_{y_1,\ldots,y_m}(A(x,y_1 \oplus z) = \cdots = A( x, y_m \oplus z) = 0) \le 2^m \frac{1}{(3m)^m} < 1.$$

So

$$\Pr_{y_1,\ldots,y_m}\left( \forall z\ \bigvee_i A(x,y_i \oplus z)\right) = 1 - \Pr_{y_1,\ldots,y_m}(\exists z\ A(x,y_1 \oplus z)=\cdots=A(x,y_m \oplus z)=0) > 0,$$

so a suitable tuple $$(y_1,\ldots,y_m)$$ exists.

Conversely, suppose $$x \notin L$$. Then for any fixed $$y_1,\ldots,y_m \in \{0,1\}^m$$,

$$\Pr_z \left( \bigvee_i A(x,y_i \oplus z) \right) \le \sum_i \Pr_z (A(x,y_i \oplus z)=1) \le m \cdot \frac{1}{3m} = \frac{1}{3}.$$

So

$$\Pr_z(A(x,y_1 \oplus z)=\cdots=A(x,y_m \oplus z)=0) = 1 - \Pr_z \left( \bigvee_i A(x,y_i \oplus z) \right) \ge \frac{2}{3} > 0.$$

So for every choice of $$y_1,\ldots,y_m \in \{0,1\}^m$$ there is a z such that $$\bigvee_i A(x,y_i \oplus z)=0$$.