Minkowski inequality

In mathematical analysis, the Minkowski inequality establishes that the Lp spaces are normed vector spaces. Let $$S$$ be a measure space, let $$1 \leq p < \infty$$ and let $$f$$ and $$g$$ be elements of $$L^p(S).$$ Then $$f + g$$ is in $$L^p(S),$$ and we have the triangle inequality $$\|f+g\|_p \leq \|f\|_p + \|g\|_p$$ with equality for $$1 < p < \infty$$ if and only if $$f$$ and $$g$$ are positively linearly dependent; that is, $$f = \lambda g$$ for some $$\lambda \geq 0$$ or $$g = 0.$$ Here, the norm is given by: $$\|f\|_p = \left(\int |f|^p d\mu\right)^{\frac{1}{p}}$$ if $$p < \infty,$$ or in the case $$p = \infty$$ by the essential supremum $$\|f\|_\infty = \operatorname{ess\ sup}_{x\in S}|f(x)|.$$
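The triangle inequality can be checked numerically by approximating the integrals with Riemann sums; the sketch below uses arbitrarily chosen sample functions on $$[0, 1]$$ with Lebesgue measure (the functions and grid size are illustrative assumptions, not part of the statement):

```python
import math

def lp_norm(values, p, weight):
    # Discrete approximation of (integral |f|^p dmu)^(1/p) on a uniform grid
    return (weight * sum(abs(v) ** p for v in values)) ** (1.0 / p)

# Riemann sums on [0, 1] with hypothetical sample functions f(x) = x^2, g(x) = sin x
n, p = 1000, 3
xs = [(i + 0.5) / n for i in range(n)]
f = [x ** 2 for x in xs]
g = [math.sin(x) for x in xs]
w = 1.0 / n

lhs = lp_norm([a + b for a, b in zip(f, g)], p, w)   # ||f + g||_p
rhs = lp_norm(f, p, w) + lp_norm(g, p, w)            # ||f||_p + ||g||_p
assert lhs <= rhs + 1e-12
```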

The Minkowski inequality is the triangle inequality in $$L^p(S).$$ In fact, it is a special case of the more general fact $$\|f\|_p = \sup_{\|g\|_q = 1} \int |fg| d\mu, \qquad \tfrac{1}{p} + \tfrac{1}{q} = 1$$ where it is easy to see that the right-hand side satisfies the triangle inequality.
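For $$1 < p < \infty$$ the supremum is attained at $$g$$ proportional to $$|f|^{p-1}.$$ A discrete sketch of this (with an arbitrary positive sample function, approximating the integrals by Riemann sums):

```python
import math

def integ(values, w):
    # Uniform-weight Riemann sum approximating an integral over [0, 1]
    return w * sum(values)

n, p = 2000, 3
q = p / (p - 1)          # conjugate exponent: 1/p + 1/q = 1
w = 1.0 / n
xs = [(i + 0.5) / n for i in range(n)]
f = [1 + math.cos(x) for x in xs]   # hypothetical sample function

norm_f = integ([abs(v) ** p for v in f], w) ** (1 / p)
# Extremal dual element: g = |f|^(p-1) / ||f||_p^(p-1), so that ||g||_q = 1
g = [abs(v) ** (p - 1) / norm_f ** (p - 1) for v in f]

norm_g_q = integ([abs(v) ** q for v in g], w) ** (1 / q)
pairing = integ([abs(a * b) for a, b in zip(f, g)], w)

assert abs(norm_g_q - 1) < 1e-9       # ||g||_q = 1
assert abs(pairing - norm_f) < 1e-9   # the supremum is attained: int |fg| = ||f||_p
```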

Like Hölder's inequality, the Minkowski inequality can be specialized to sequences and vectors by using the counting measure: $$\biggl(\sum_{k=1}^n |x_k + y_k|^p\biggr)^{1/p} \leq \biggl(\sum_{k=1}^n |x_k|^p\biggr)^{1/p} + \biggl(\sum_{k=1}^n |y_k|^p\biggr)^{1/p}$$ for all real (or complex) numbers $$x_1, \dots, x_n, y_1, \dots, y_n$$ and where $$n$$ is the cardinality of $$S$$ (the number of elements in $$S$$).

The inequality is named after the German mathematician Hermann Minkowski.

Proof
First, we prove that $$f + g$$ has finite $$p$$-norm if $$f$$ and $$g$$ both do, which follows from $$|f + g|^p \leq 2^{p-1}(|f|^p + |g|^p).$$ Indeed, here we use the fact that $$h(x) = |x|^p$$ is convex over $$\Reals^+$$ (for $$p > 1$$) and so, by the definition of convexity, $$\left|\tfrac{1}{2} f + \tfrac{1}{2} g\right|^p \leq \left(\tfrac{1}{2} |f| + \tfrac{1}{2} |g|\right)^p \leq \tfrac{1}{2}|f|^p + \tfrac{1}{2} |g|^p.$$ This means that $$|f+g|^p \leq \tfrac{1}{2}|2f|^p + \tfrac{1}{2}|2g|^p = 2^{p-1}|f|^p + 2^{p-1}|g|^p.$$
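The pointwise bound $$|a + b|^p \leq 2^{p-1}(|a|^p + |b|^p)$$ can be spot-checked over random real inputs (the exponent and sampling range are arbitrary choices for illustration):

```python
import random

random.seed(0)
p = 2.5
for _ in range(10000):
    a = random.uniform(-10, 10)
    b = random.uniform(-10, 10)
    # Convexity bound used to show f + g has finite p-norm
    assert abs(a + b) ** p <= 2 ** (p - 1) * (abs(a) ** p + abs(b) ** p) + 1e-9
```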

Now, we can legitimately talk about $$\|f + g\|_p.$$ If it is zero, then Minkowski's inequality holds. We now assume that $$\|f + g\|_p$$ is not zero. Using the triangle inequality and then Hölder's inequality, we find that $$\begin{align} \|f + g\|_p^p &= \int |f + g|^p \, \mathrm{d}\mu \\ &= \int |f + g| \cdot |f + g|^{p-1} \, \mathrm{d}\mu \\ &\leq \int (|f| + |g|)|f + g|^{p-1} \, \mathrm{d}\mu \\ &=\int |f||f + g|^{p-1} \, \mathrm{d}\mu+\int |g||f + g|^{p-1} \, \mathrm{d}\mu \\ &\leq \left(\left(\int |f|^p \, \mathrm{d}\mu\right)^{\frac{1}{p}} + \left(\int |g|^p \,\mathrm{d}\mu\right)^{\frac{1}{p}}\right)\left(\int |f + g|^{(p-1)\left(\frac{p}{p-1}\right)} \, \mathrm{d}\mu\right)^{1-\frac{1}{p}} && \text{ Hölder's inequality} \\ &= \left(\|f\|_p + \|g\|_p \right )\frac{\|f + g\|_p^p}{\|f + g\|_p} \end{align}$$

We obtain Minkowski's inequality by dividing both sides by $$\|f + g\|_p^{p-1},$$ that is, multiplying by $$\frac{\|f + g\|_p}{\|f + g\|_p^p}.$$
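The key Hölder step in the derivation, $$\int |f|\,|f+g|^{p-1}\,\mathrm{d}\mu \leq \|f\|_p \left(\int |f+g|^p\,\mathrm{d}\mu\right)^{1-\frac{1}{p}},$$ can be verified on a discrete example (sample functions chosen arbitrarily, integrals approximated by Riemann sums):

```python
import math

n, p = 1000, 3
w = 1.0 / n
xs = [(i + 0.5) / n for i in range(n)]
f = [math.exp(-x) for x in xs]   # hypothetical sample functions
g = [x for x in xs]

def integ(vals):
    # Uniform-weight Riemann sum over [0, 1]
    return w * sum(vals)

norm_f = integ([abs(v) ** p for v in f]) ** (1 / p)
s = [a + b for a, b in zip(f, g)]

# Hölder with exponents p and q = p/(p-1) applied to |f| and |f+g|^(p-1)
lhs = integ([abs(a) * abs(t) ** (p - 1) for a, t in zip(f, s)])
rhs = norm_f * integ([abs(t) ** p for t in s]) ** (1 - 1 / p)
assert lhs <= rhs + 1e-12
```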

Minkowski's integral inequality
Suppose that $$(S_1, \mu_1)$$ and $$(S_2, \mu_2)$$ are two $$\sigma$$-finite measure spaces and $$F : S_1 \times S_2 \to \Reals$$ is measurable. Then Minkowski's integral inequality is: $$\left[\int_{S_2}\left|\int_{S_1}F(x,y)\, \mu_1(\mathrm{d}x)\right|^p \mu_2(\mathrm{d}y)\right]^{\frac{1}{p}} ~\leq~ \int_{S_1}\left(\int_{S_2}|F(x,y)|^p\,\mu_2(\mathrm{d}y)\right)^{\frac{1}{p}}\mu_1(\mathrm{d}x),$$ with obvious modifications in the case $$p = \infty.$$ If $$p > 1,$$ and both sides are finite, then equality holds only if $$|F(x, y)| = \varphi(x) \, \psi(y)$$ a.e. for some non-negative measurable functions $$\varphi$$ and $$\psi.$$
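A discrete sanity check of the integral inequality, approximating both integrals by Riemann sums on $$[0,1]^2$$ (the kernel and grid sizes below are arbitrary illustrative choices):

```python
import math

n1, n2, p = 40, 60, 2.5
w1, w2 = 1.0 / n1, 1.0 / n2
X = [(i + 0.5) / n1 for i in range(n1)]
Y = [(j + 0.5) / n2 for j in range(n2)]
F = [[math.sin(3 * x) + x * y for y in Y] for x in X]  # hypothetical kernel

# Left side: L^p norm (in y) of the integral of F over x
inner = [sum(w1 * F[i][j] for i in range(n1)) for j in range(n2)]
lhs = sum(w2 * abs(v) ** p for v in inner) ** (1 / p)

# Right side: integral over x of the L^p norm (in y) of F(x, .)
rhs = sum(
    w1 * sum(w2 * abs(F[i][j]) ** p for j in range(n2)) ** (1 / p)
    for i in range(n1)
)
assert lhs <= rhs + 1e-12
```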

If $$\mu_1$$ is the counting measure on a two-point set $$S_1 = \{1, 2\},$$ then Minkowski's integral inequality gives the usual Minkowski inequality as a special case: putting $$f_i(y) = F(i, y)$$ for $$i = 1, 2,$$ the integral inequality gives $$\|f_1 + f_2\|_p = \left(\int_{S_2}\left|\int_{S_1}F(x,y)\,\mu_1(\mathrm{d}x)\right|^p \mu_2(\mathrm{d}y)\right)^{\frac{1}{p}} \leq \int_{S_1}\left(\int_{S_2}|F(x,y)|^p\,\mu_2(\mathrm{d}y)\right)^{\frac{1}{p}} \mu_1(\mathrm{d}x) = \|f_1\|_p + \|f_2\|_p.$$

If the measurable function $$F : S_1 \times S_2 \to \Reals$$ is non-negative then for all $$1 \leq p \leq q \leq \infty,$$ $$\left\|\left\|F(\,\cdot, s_2)\right\|_{L^p(S_1, \mu_1)}\right\|_{L^q(S_2, \mu_2)} ~\leq~ \left\|\left\|F(s_1, \cdot)\right\|_{L^q(S_2, \mu_2)}\right\|_{L^p(S_1, \mu_1)} \ .$$

This notation has been generalized to $$\|f\|_{p,q} = \left(\int_{\R^m} \left[\int_{\R^n}|f(x,y)|^q\mathrm{d}y\right]^{\frac{p}{q}} \mathrm{d}x\right)^{\frac{1}{p}}$$ for $$f : \R^{m+n} \to E,$$ with $$\mathcal{L}_{p,q}(\R^{m+n},E) = \{f \in E^{\R^{m+n}} : \|f\|_{p,q} < \infty\}.$$ Using this notation, manipulation of the exponents reveals that, if $$p < q,$$ then $$\|f\|_{q,p} \leq \|f\|_{p,q}.$$

Reverse inequality
When $$p < 1$$ the reverse inequality holds: $$\|f+g\|_p \ge \|f\|_p + \|g\|_p.$$

We further need the restriction that both $$f$$ and $$g$$ are non-negative, as we can see from the example $$f=-1, g=1$$ and $$p = 1:$$ $$\|f+g\|_1 = 0 < 2 = \|f\|_1 + \|g\|_1.$$

The reverse inequality follows from the same argument as for the standard Minkowski inequality, but uses the fact that Hölder's inequality is also reversed in this range.
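A quick numeric check of the reverse inequality with $$0 < p < 1$$ and non-negative entries (the vectors are arbitrary illustrative choices):

```python
p = 0.5
f = [1.0, 4.0, 0.25, 9.0]   # non-negative, as required
g = [2.0, 0.5, 3.0, 1.0]

def p_sum(v):
    # (sum |v_i|^p)^(1/p); not a norm for p < 1, but the reverse inequality holds
    return sum(t ** p for t in v) ** (1.0 / p)

s = [a + b for a, b in zip(f, g)]
assert p_sum(s) >= p_sum(f) + p_sum(g) - 1e-12
```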

Using the reverse Minkowski inequality, we may prove that power means with $$p \leq 1,$$ such as the harmonic mean and the geometric mean, are concave.

Generalizations to other functions
The Minkowski inequality can be generalized to other functions $$\phi(x)$$ beyond the power function $$x^p.$$ The generalized inequality has the form $$\phi^{-1}\left(\textstyle\sum\limits_{i=1}^n \phi(x_i + y_i)\right) \leq \phi^{-1}\left(\textstyle\sum\limits_{i=1}^n \phi(x_i)\right) + \phi^{-1}\left(\textstyle\sum\limits_{i=1}^n \phi(y_i)\right).$$

Various sufficient conditions on $$\phi$$ have been found by Mulholland and others. For example, for $$x \geq 0$$ one set of sufficient conditions from Mulholland is
 * $$\phi(x)$$ is continuous and strictly increasing with $$\phi(0) = 0.$$
 * $$\phi(x)$$ is a convex function of $$x.$$
 * $$\log\phi(x)$$ is a convex function of $$\log(x).$$
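For instance, $$\phi(x) = x e^x$$ satisfies all three conditions: it is continuous and strictly increasing with $$\phi(0) = 0,$$ convex on $$x \geq 0,$$ and $$\log\phi(x) = \log x + x$$ is convex in $$\log x.$$ A numeric sketch of the generalized inequality for this $$\phi,$$ inverting $$\phi$$ by bisection (the vectors and ranges are arbitrary illustrative choices):

```python
import math
import random

def phi(x):
    return x * math.exp(x)

def phi_inv(y, lo=0.0, hi=50.0):
    # phi is strictly increasing on [0, inf); invert by bisection
    for _ in range(200):
        mid = (lo + hi) / 2
        if phi(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def phi_functional(v):
    # phi^{-1}(sum phi(v_i)), the generalized Minkowski functional
    return phi_inv(sum(phi(t) for t in v))

random.seed(3)
for _ in range(200):
    x = [random.uniform(0, 2) for _ in range(4)]
    y = [random.uniform(0, 2) for _ in range(4)]
    s = [a + b for a, b in zip(x, y)]
    assert phi_functional(s) <= phi_functional(x) + phi_functional(y) + 1e-6
```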