Rouché's theorem

Rouché's theorem, named after Eugène Rouché, states that for any two complex-valued functions $f$ and $g$ holomorphic inside some region $$K$$ with closed contour $$\partial K$$, if $|g(z)| < |f(z)|$ on $$\partial K$$, then $f$ and $f + g$ have the same number of zeros inside $$K$$, where each zero is counted as many times as its multiplicity. The theorem assumes that the contour $$\partial K$$ is simple, that is, without self-intersections. Rouché's theorem is an easy consequence of the stronger, symmetric version of the theorem described below.

Usage
The theorem is usually used to simplify the problem of locating zeros, as follows. Given an analytic function, we write it as the sum of two parts, one of which is simpler and grows faster than (thus dominates) the other part. We can then locate the zeros by looking at only the dominating part. For example, the polynomial $$z^5 + 3z^3 + 7$$ has exactly 5 zeros in the disk $$|z| < 2$$ since $$|3z^3 + 7| \le 31 < 32 = |z^5|$$ for every $$|z| = 2$$, and $$z^5$$, the dominating part, has five zeros in the disk.
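The zero count in this example can be checked numerically. The sketch below (using numpy's root finder, not the theorem itself) confirms that all five roots of $$z^5 + 3z^3 + 7$$ lie in the disk $$|z| < 2$$, and that the dominance condition holds on the circle $$|z| = 2$$:

```python
import numpy as np

# Coefficients of z^5 + 3z^3 + 7, highest degree first (np.roots convention).
coeffs = [1, 0, 3, 0, 0, 7]
roots = np.roots(coeffs)
print(len(roots))                 # 5 roots (degree 5)
print(np.all(np.abs(roots) < 2))  # True: all lie in |z| < 2

# The dominance condition on |z| = 2: |3z^3 + 7| <= 31 < 32 = |z^5|.
z = 2 * np.exp(2j * np.pi * np.linspace(0, 1, 1000, endpoint=False))
print(np.all(np.abs(3*z**3 + 7) < np.abs(z**5)))  # True
```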

Geometric explanation


It is possible to provide an informal explanation of Rouché's theorem.

Let C be a closed, simple curve (i.e., not self-intersecting). Let h(z) = f(z) + g(z). If f and g are both holomorphic on the interior of C, then h must also be holomorphic on the interior of C. Then, with the conditions imposed above, Rouché's theorem in its original (and not symmetric) form says that if |f(z)| > |h(z) − f(z)| for every z on C, then f and h have the same number of zeros in the interior of C.

Notice that the condition |f(z)| > |h(z) − f(z)| means that for any z on C, the distance from f(z) to the origin is larger than the distance from h(z) to f(z), which in the following picture means that for each point on the blue curve, the segment joining it to the origin is longer than the green segment associated with it. Informally we can say that the blue curve f(z) is always closer to the red curve h(z) than it is to the origin.

The previous paragraph shows that h(z) must wind around the origin exactly as many times as f(z). The index of both curves around zero is therefore the same, so by the argument principle, h(z) and f(z) must have the same number of zeros inside C.

One popular, informal way to summarize this argument is as follows: If a person were to walk a dog on a leash around and around a tree, such that the distance between the person and the tree is always greater than the length of the leash, then the person and the dog go around the tree the same number of times.

Bounding roots
Consider the polynomial $$z^2 + 2az + b^2$$ with $$a > b > 0$$. By the quadratic formula it has two zeros at $$-a \pm \sqrt{a^2 - b^2}$$. Rouché's theorem can be used to obtain some hint about their positions. Since $$|z^2 + b^2| \le 2b^2 < 2a|z| \text{ for all } |z| = b,$$

Rouché's theorem says that the polynomial has exactly one zero inside the disk $$|z| < b$$. Since $$-a - \sqrt{a^2 - b^2}$$ is clearly outside the disk, we conclude that the zero is $$-a + \sqrt{a^2 - b^2}$$.
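This prediction is easy to check for concrete values. The sketch below uses hypothetical parameters a = 2, b = 1 (so the zeros are $$-2 \pm \sqrt{3}$$) and verifies both the root location and the boundary inequality:

```python
import numpy as np

# Quadratic z^2 + 2az + b^2 with (hypothetical) a = 2, b = 1; its zeros
# are -a ± sqrt(a^2 - b^2) ≈ -0.268 and -3.732.
a, b = 2.0, 1.0
roots = np.roots([1, 2*a, b**2])
inside = np.abs(roots) < b
print(inside.sum())  # 1: exactly one zero inside the disk |z| < b

# Rouché's inequality on |z| = b: |z^2 + b^2| <= 2b^2 < 2a|z|.
z = b * np.exp(2j * np.pi * np.linspace(0, 1, 1000, endpoint=False))
print(np.all(np.abs(z**2 + b**2) < np.abs(2*a*z)))  # True
```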

In general, consider a polynomial $$f(z) = a_n z^n + \cdots + a_0$$. If $$|a_k| r^k > \sum_{j\neq k}|a_j| r^j$$ for some $$r > 0$$ and some $$k \in \{0, 1, \ldots, n\}$$, then by Rouché's theorem the polynomial has exactly $$k$$ roots inside the disk $$B(0, r)$$.
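This criterion can be packaged as a small helper. The function below (an illustrative sketch, not a standard library routine) finds a dominating index k if one exists, and the example polynomial $$z^3 + 10z + 1$$ is a hypothetical choice whose linear term dominates on $$|z| = 1$$:

```python
import numpy as np

def rouche_root_count(coeffs, r):
    """If the k-th term dominates on |z| = r, return k: the number of
    roots guaranteed to lie in |z| < r. Otherwise return None.
    coeffs[j] is the coefficient of z^j."""
    terms = [abs(c) * r**j for j, c in enumerate(coeffs)]
    for k, t in enumerate(terms):
        if t > sum(terms) - t:  # |a_k| r^k > sum_{j != k} |a_j| r^j
            return k
    return None

# Hypothetical example: z^3 + 10z + 1 on |z| = 1; the z-term dominates
# (10 > 1 + 1), so exactly one root lies in the unit disk.
coeffs = [1, 10, 0, 1]          # a_0 + a_1 z + a_3 z^3
k = rouche_root_count(coeffs, 1.0)
roots = np.roots(coeffs[::-1])  # np.roots wants highest degree first
print(k)                            # 1
print(np.sum(np.abs(roots) < 1.0))  # 1: agrees with the numerical roots
```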

This sort of argument can be useful in locating residues when one applies Cauchy's residue theorem.

Fundamental theorem of algebra
Rouché's theorem can also be used to give a short proof of the fundamental theorem of algebra. Let $$p(z) = a_0 + a_1z + a_2 z^2 + \cdots + a_n z^n, \quad a_n \ne 0$$ and choose $$R > 0$$ so large that: $$|a_0 + a_1z + \cdots + a_{n-1} z^{n-1}| \le \sum_{j=0}^{n - 1} |a_j| R^j < |a_n| R^n = |a_n z^n| \text{ for } |z| = R.$$ Since $$a_n z^n$$ has $$n$$ zeros inside the disk $$|z| < R$$ (because $$R>0$$), it follows from Rouché's theorem that $$p$$ also has the same number of zeros inside the disk.
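One explicit choice of R that makes the displayed inequality work is $$R = 1 + \left(|a_0| + \cdots + |a_{n-1}|\right)/|a_n|$$: since $$R \ge 1$$, each $$R^j \le R^{n-1}$$ for $$j < n$$, and $$|a_n| R > \sum_{j<n} |a_j|$$ by construction. A sketch of this (hypothetical helper name, illustrative polynomial):

```python
import numpy as np

def fta_radius(coeffs):
    """Radius R = 1 + (|a_0| + ... + |a_{n-1}|) / |a_n|, which satisfies
    sum_{j<n} |a_j| R^j <= R^{n-1} sum_{j<n} |a_j| < |a_n| R^n.
    coeffs[j] is the coefficient of z^j, coeffs[-1] != 0."""
    *low, an = coeffs
    return 1 + sum(abs(c) for c in low) / abs(an)

coeffs = [7, 0, 3, 0, 0, 1]     # p(z) = z^5 + 3z^3 + 7 again
R = fta_radius(coeffs)          # here R = 11
roots = np.roots(coeffs[::-1])
print(np.all(np.abs(roots) < R))  # True: all 5 roots lie in |z| < R
```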

One advantage of this proof over the others is that it shows not only that a polynomial must have a zero but also that the number of its zeros equals its degree (counting, as usual, multiplicity).

Another use of Rouché's theorem is to prove the open mapping theorem for analytic functions. We refer to the article for the proof.

Symmetric version
A stronger version of Rouché's theorem was published by Theodor Estermann in 1962. It states: let $$K\subset G$$ be a bounded region with continuous boundary $$\partial K$$. Two holomorphic functions $$f,\,g\in\mathcal H(G)$$ have the same number of roots (counting multiplicity) in $$K$$, if the strict inequality $$|f(z)-g(z)|<|f(z)|+|g(z)| \qquad \left(z\in \partial K\right)$$ holds on the boundary $$\partial K.$$

The original version of Rouché's theorem then follows from this symmetric version applied to the functions $$f+g,f$$ together with the trivial inequality $$|f(z)+g(z)| \ge 0$$ (in fact this inequality is strict, since $$f(z)+g(z) = 0$$ for some $$z\in\partial K$$ would imply $$|g(z)| = |f(z)|$$, contradicting the hypothesis $$|g(z)| < |f(z)|$$).

The statement can be understood intuitively as follows. By considering $$-g$$ in place of $$g$$, the condition can be rewritten as $$|f(z) + g(z)|<|f(z)|+|g(z)|$$ for $$z\in \partial K$$. Since $$|f(z) + g(z)| \leq |f(z)|+|g(z)|$$ always holds by the triangle inequality, this is equivalent to saying that $$|f(z) + g(z)| \neq |f(z)|+|g(z)|$$ on $$\partial K$$, which in turn means that for $$z\in\partial K$$ the functions $$f(z)$$ and $$g(z)$$ are non-vanishing and $$\arg{f(z)} \neq \arg{g(z)}$$.

Intuitively, if the values of $$f$$ and $$g$$ never pass through the origin and never point in the same direction as $$z$$ circles along $$\partial K$$, then $$f(z)$$ and $$g(z)$$ must wind around the origin the same number of times.

Proof of the symmetric form of Rouché's theorem
Let $$C\colon[0,1]\to\mathbb C$$ be a simple closed curve whose image is the boundary $$\partial K$$. The hypothesis implies that $$f$$ has no roots on $$\partial K$$, hence by the argument principle, the number $$N_f(K)$$ of zeros of $$f$$ in $$K$$ is $$\frac1{2\pi i}\oint_C\frac{f'(z)}{f(z)}\,dz=\frac1{2\pi i}\oint_{f\circ C} \frac{dz}z =\mathrm{Ind}_{f\circ C}(0),$$ i.e., the winding number of the closed curve $$f\circ C$$ around the origin; similarly for $$g$$. The hypothesis ensures that $$g(z)$$ is not a negative real multiple of $$f(z)$$ for any $$z = C(x)$$, thus $$0$$ does not lie on the line segment joining $$f(C(x))$$ to $$g(C(x))$$, and $$H_t(x) = (1-t)f(C(x)) + t g(C(x))$$ is a homotopy between the curves $$f\circ C$$ and $$g\circ C$$ avoiding the origin. The winding number is homotopy-invariant: the function $$I(t)=\mathrm{Ind}_{H_t}(0)=\frac1{2\pi i}\oint_{H_t}\frac{dz}z$$ is continuous and integer-valued, hence constant. This shows $$N_f(K)=\mathrm{Ind}_{f\circ C}(0)=\mathrm{Ind}_{g\circ C}(0)=N_g(K).$$
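The winding-number step of the proof can be illustrated numerically: the index of the image curve $$f\circ C$$ around the origin equals the zero count of $$f$$ inside $$K$$. A sketch for $$f(z) = z^5 + 3z^3 + 7$$ on the circle $$|z| = 2$$, tracking the continuous argument of $$f(C(t))$$:

```python
import numpy as np

# Winding number of f∘C around 0, where f(z) = z^5 + 3z^3 + 7 and
# C is the circle |z| = 2 (f has no zeros on C, so the argument is
# well defined everywhere along the curve).
f = lambda z: z**5 + 3*z**3 + 7

t = np.linspace(0, 2*np.pi, 4000)
w = f(2 * np.exp(1j * t))              # image curve f∘C
arg = np.unwrap(np.angle(w))           # continuous argument along the curve
winding = round((arg[-1] - arg[0]) / (2*np.pi))
print(winding)  # 5: matches the number of zeros of f in |z| < 2
```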