User:Quietbritishjim/Sokhotsky's formula

I started writing the following when I thought that there was no Wikipedia article on the subject. I later discovered that there already was, but I hadn't found it earlier because it was under the incorrect title of Sokhatsky–Weierstrass theorem. It is now correctly titled as Sokhotski–Plemelj theorem. Perhaps one day I'll move some of the material there.

In mathematics, Sokhotsky's formula (also known as the Plemelj formula or the Plemelj–Sokhotsky formula) relates two ways of integrating over the singularity of the reciprocal function $$1/x$$ on the real numbers. The formula says, in terms of distributions, that


 * $$\operatorname{v.\!p.}\frac{1}{x}\mp i\pi \delta(x)=\frac{1}{x\pm i0}.$$

Up to a constant factor, the distribution on each side of this equality is the Fourier transform of the Heaviside step function.

Breakdown
Throughout this section φ is an arbitrary function in the Schwartz space.

The formula involves three separate distributions:


 * The Dirac delta function, defined as
 * $$\langle \delta, \varphi\rangle := \varphi(0).\!$$


 * The Cauchy principal value, defined as
 * $$\left\langle \operatorname{v.\!p.}\frac{1}{x}, \varphi\right\rangle := \lim_{\varepsilon\rightarrow 0+} \left[\int_{-\infty}^{-\varepsilon}\frac{\varphi(x)}{x}\,dx +\int_{\varepsilon}^{\infty} \frac{\varphi(x)}{x}\,dx\right], \!$$
 * which is often written more compactly as
 * $$\lim_{\varepsilon\rightarrow 0+} \left[\left(\int_{-\infty}^{-\varepsilon}+\int_{\varepsilon}^{\infty}\right)\frac{\varphi(x)}{x}\,dx \right].\!$$


 * The distribution defined as
 * $$\left\langle \frac{1}{x+i0}, \varphi\right\rangle := \lim_{\varepsilon\rightarrow 0+}\int_{-\infty}^{\infty}\frac{\varphi(x)}{x+i\varepsilon}\,dx.\!$$
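For intuition, the principal value can be computed numerically and compared with the symmetric-ε limit in its definition. A minimal sketch, assuming NumPy and SciPy are available; the shifted Gaussian test function and the cutoff ±30 (beyond which the Gaussian is negligible) are arbitrary choices:

```python
# Numerical illustration of the Cauchy principal value definition:
# the symmetric-epsilon limit is compared against SciPy's built-in
# Cauchy-weight quadrature.  phi is a shifted Gaussian, chosen so
# that the principal value is nonzero (an even phi would give 0).
import numpy as np
from scipy.integrate import quad

def phi(x):
    return np.exp(-(x - 1.0) ** 2)

def pv_symmetric(eps, cutoff=30.0):
    """(int_{-cutoff}^{-eps} + int_{eps}^{cutoff}) phi(x)/x dx."""
    left, _ = quad(lambda x: phi(x) / x, -cutoff, -eps, limit=200)
    right, _ = quad(lambda x: phi(x) / x, eps, cutoff, limit=200)
    return left + right

# quad with weight='cauchy' computes v.p. int f(x)/(x - wvar) dx directly.
reference, _ = quad(phi, -30.0, 30.0, weight='cauchy', wvar=0.0)

for eps in (1e-1, 1e-3, 1e-6):
    print(eps, pv_symmetric(eps))   # approaches the reference value
print("v.p. reference:", reference)
```

The symmetric truncation is essential: replacing the two one-sided integrals by, say, $$\int_{-\infty}^{-\varepsilon}+\int_{2\varepsilon}^{\infty}$$ would change the limit by $$\varphi(0)\log 2$$.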

Substituting in these definitions we obtain, explicitly in terms of integrals and limits, that


 * $$\lim_{\varepsilon\rightarrow 0+} \left[\left(\int_{-\infty}^{-\varepsilon}+\int_{\varepsilon}^{\infty}\right)\frac{\varphi(x)}{x}\,dx \right]\mp i\pi \varphi(0) = \lim_{\varepsilon\rightarrow 0+}\int_{-\infty}^{\infty}\frac{\varphi(x)}{x\pm i\varepsilon}\,dx.\!$$
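This explicit form can be sanity-checked numerically for the top choice of signs. A sketch assuming NumPy and SciPy; the shifted Gaussian φ and the cutoff ±30 are arbitrary choices, and the agreement improves as ε decreases:

```python
# Numerical check of Sokhotsky's formula (top choice of signs):
#   v.p. int phi(x)/x dx - i*pi*phi(0)  ~  int phi(x)/(x + i*eps) dx
import numpy as np
from scipy.integrate import quad

def phi(x):
    return np.exp(-(x - 1.0) ** 2)

# Left-hand side: principal value via Cauchy-weight quadrature.
pv, _ = quad(phi, -30.0, 30.0, weight='cauchy', wvar=0.0)
lhs = pv - 1j * np.pi * phi(0.0)

def rhs(eps, cutoff=30.0):
    """int phi(x)/(x + i*eps) dx, real and imaginary parts separately,
    split at 0 where the integrand is nearly singular for small eps."""
    total = 0.0 + 0.0j
    for a, b in ((-cutoff, 0.0), (0.0, cutoff)):
        re, _ = quad(lambda x: (phi(x) / (x + 1j * eps)).real, a, b, limit=200)
        im, _ = quad(lambda x: (phi(x) / (x + 1j * eps)).imag, a, b, limit=200)
        total += re + 1j * im
    return total

for eps in (1e-1, 1e-2, 1e-3):
    print(eps, abs(rhs(eps) - lhs))   # difference shrinks with eps
```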

Proof
This proof is from Choquet-Bruhat, DeWitt-Morette & Dillard-Bleick (1982). We show the top choice of signs; the other choice is similar.

Summary
The proof is as follows, where limits and derivatives are in the distributional sense, and H is the Heaviside step function.
 * $$\begin{align}
\frac{1}{x+ i0} &= \lim_{\varepsilon\rightarrow 0+}\frac{1}{x+i\varepsilon} & & \text{(1)} \\
&=\lim_{\varepsilon\rightarrow 0+}\frac{\mathrm{d}}{\mathrm{d}x}\log(x+i\varepsilon) & & \text{(2)} \\
&=\frac{\mathrm{d}}{\mathrm{d}x}\lim_{\varepsilon\rightarrow 0+}\log(x+i\varepsilon) & & \text{(3)} \\
&=\frac{\mathrm{d}}{\mathrm{d}x}(\log\vert x\vert +i\pi(1- H(x))) & & \text{(4)} \\
&=\operatorname{v.\!p.}\frac{1}{x}- i\pi \delta(x) & & \text{(5)}
\end{align}$$

The explanation of each equality, and the choice of logarithm branch, is given in the following subsection.

Details
Step 1 By definition of the limit of a distribution,
 * $$ \left\langle \lim_{\varepsilon\rightarrow 0+}\frac{1}{x+i\varepsilon}, \varphi\right\rangle = \lim_{\varepsilon\rightarrow 0+} \left\langle \frac{1}{x+i\varepsilon}, \varphi\right\rangle.$$

The right hand side of this is the definition of the distribution $$1/(x+i0)$$.

Step 2 We need to show that, in the distributional sense,
 * $$\frac{\mathrm{d}}{\mathrm{d}x}\log(x+i\varepsilon)=\frac{1}{x+i\varepsilon}.$$

We take $$\log$$ to be the branch of the natural logarithm such that, for $$x\in\mathbb{R}$$ and $$\varepsilon>0$$,
 * $$ \log(x+i\varepsilon) = \log\vert x+i\varepsilon\vert + i\arg(x+i\varepsilon),\quad\text{where }0\leq\arg(x+i\varepsilon)\leq\pi.\!$$

This is a continuous choice of logarithm for these values of x and ε; indeed it is holomorphic on the open upper half-plane, and in particular smooth in x for each fixed ε > 0. The above derivative therefore holds pointwise, and so also holds in the distributional sense.

Step 3 Swapping the distributional derivative and limit is permitted because the distributional derivative is continuous.

Step 4 We need to show that, in the distributional sense,
 * $$\lim_{\varepsilon\rightarrow 0+}\log(x+i\varepsilon)=\log\vert x\vert +i\pi(1- H(x)).$$

Pointwise we have
 * $$\lim_{\varepsilon\rightarrow 0+}\log(x+i\varepsilon)=\begin{cases} \log\vert x\vert & \text{if }x>0, \\ \log\vert x\vert +i\pi & \text{if }x<0, \end{cases}$$

so the desired equation holds pointwise for x ≠ 0. Moreover, for 0 < ε ≤ 1 the functions $$\log(x+i\varepsilon)$$ are locally dominated by the integrable function $$\vert\log\vert x\vert\vert+\log(\vert x\vert+1)+\pi$$, so by dominated convergence the limit also holds in the distributional sense.
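This pointwise limit is easy to check numerically with the principal branch of the complex logarithm, which agrees with the branch chosen above when ε > 0 (a minimal sketch using only the standard library):

```python
# Pointwise limit of log(x + i*eps) as eps -> 0+:
#   x > 0 gives log|x|;  x < 0 gives log|x| + i*pi.
import cmath
import math

def limit_log(x, eps=1e-12):
    # cmath.log uses the principal branch, arg in (-pi, pi], which for
    # eps > 0 coincides with the branch arg in (0, pi) chosen in the proof.
    return cmath.log(x + 1j * eps)

print(limit_log(2.0))    # approximately log 2
print(limit_log(-2.0))   # approximately log 2 + i*pi
```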

Step 5 Note that the derivative of a constant is zero, and the derivative of the Heaviside step function is the Dirac delta function. It remains only to show that for any test function $$\varphi\in\mathcal{S}(\mathbb{R})$$
 * $$ \left\langle\frac{\mathrm{d}}{\mathrm{d}x}\log\vert x\vert, \varphi\right\rangle = \left\langle\operatorname{v.\!p.}\frac{1}{x},\varphi\right\rangle.$$

The left hand side of this is, by the definition of the distributional derivative,
 * $$ -\int_{-\infty}^\infty \log \vert x\vert \varphi'(x)\,dx.$$

But
 * $$ \left\vert\lim_{\delta\rightarrow 0+}\int_{-\delta}^{\delta}\log \vert x\vert\,\varphi'(x)\,dx\right\vert \leq \lim_{\delta\rightarrow 0+} \Vert \varphi'\Vert_\infty \int_{-\delta}^{\delta}\vert\log \vert x\vert\vert\,dx=0$$

since φ′ is bounded (φ being a Schwartz function) and $$\int_{-\delta}^{\delta}\vert\log \vert x\vert\vert\,dx = 2\delta(1-\log\delta)$$ for δ < 1. Thus the expression equals
 * $$ -\lim_{\delta\rightarrow 0+}\left[\left(\int_{-\infty}^{-\delta} +\int_\delta^\infty \right) \log \vert x\vert \varphi'(x)\,dx\right].$$

Integrating by parts this equals
 * $$ \lim_{\delta\rightarrow 0+}\left[-\left(\varphi(-\delta)\log\delta-\varphi(\delta)\log\delta\right)+\left(\int_{-\infty}^{-\delta} +\int_\delta^\infty \right) \frac{\varphi(x)}{x}\,dx\right].$$

But
 * $$ \lim_{\delta\rightarrow 0+}\left(\varphi(-\delta)\log\delta-\varphi(\delta)\log\delta\right)=\lim_{\delta\rightarrow 0+}\left(-2\delta\,\varphi'(\eta_\delta)\log\delta\right)=0$$

by the mean value theorem (where $$\eta_\delta\in[-\delta,\delta]$$ for each δ). Thus the expression equals
 * $$\left\langle\operatorname{v.\!p.}\frac{1}{x},\varphi\right\rangle,$$

as required.
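
The conclusion of Step 5, that the distributional derivative of $$\log\vert x\vert$$ is the principal value of $$1/x$$, can also be confirmed numerically. A sketch assuming NumPy and SciPy, with a shifted Gaussian as the test function (its derivative written out by hand):

```python
# Numerical check of Step 5:  -int log|x| phi'(x) dx  =  <v.p. 1/x, phi>.
import numpy as np
from scipy.integrate import quad

def phi(x):
    return np.exp(-(x - 1.0) ** 2)

def dphi(x):
    return -2.0 * (x - 1.0) * phi(x)

# Left-hand side: split at 0, where log|x| has an integrable singularity.
lhs = -sum(quad(lambda x: np.log(abs(x)) * dphi(x), a, b, limit=200)[0]
           for a, b in ((-30.0, 0.0), (0.0, 30.0)))

# Right-hand side: Cauchy-weight quadrature gives v.p. int phi(x)/x dx.
rhs, _ = quad(phi, -30.0, 30.0, weight='cauchy', wvar=0.0)

print(lhs, rhs)   # the two values agree
```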