Darmois–Skitovich theorem

In mathematical statistics, the Darmois–Skitovich theorem characterizes the normal distribution (the Gaussian distribution) by the independence of two linear forms in independent random variables. The theorem was proved independently by G. Darmois and V. P. Skitovich in 1953.

Formulation
Let $$\xi_j, j = 1, 2, \ldots, n, n \ge 2$$ be independent random variables, and let $$\alpha_j, \beta_j$$ be nonzero constants. If the linear forms $$L_1 = \alpha_1\xi_1 + \cdots + \alpha_n\xi_n$$ and $$L_2 = \beta_1\xi_1 + \cdots + \beta_n\xi_n$$ are independent, then every random variable $$\xi_j$$ has a normal distribution (Gaussian distribution).
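The theorem can be illustrated numerically in the special case $$n = 2$$, $$\alpha = (1, 1)$$, $$\beta = (1, -1)$$, i.e. the sum and the difference of two i.i.d. variables. The sketch below (a simulation, not part of the original theorem statement; the function name `dependence_gap` and the sample size are the author's choices for illustration) estimates $$E[|L_1||L_2|] - E[|L_1|]\,E[|L_2|]$$, a quantity that is zero when $$L_1$$ and $$L_2$$ are independent. For standard normal inputs the gap vanishes up to sampling noise, while for uniform inputs the sum and difference are uncorrelated yet visibly dependent (the exact gap is $$1/3 - 4/9 = -1/9$$), consistent with the theorem's claim that only Gaussian inputs yield independence:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000  # sample size, chosen so sampling noise is well below the uniform-case gap


def dependence_gap(xi1, xi2):
    """Estimate E[|L1||L2|] - E[|L1|] E[|L2|] for L1 = xi1 + xi2, L2 = xi1 - xi2.

    This is zero (up to sampling noise) whenever L1 and L2 are independent;
    a nonzero value certifies dependence even when L1 and L2 are uncorrelated.
    """
    L1, L2 = xi1 + xi2, xi1 - xi2
    return np.mean(np.abs(L1) * np.abs(L2)) - np.mean(np.abs(L1)) * np.mean(np.abs(L2))


# Gaussian inputs: sum and difference are independent, so the gap is ~0.
gap_normal = dependence_gap(rng.standard_normal(n), rng.standard_normal(n))

# Uniform inputs: sum and difference are uncorrelated but NOT independent
# (their joint support is the diamond |L1| + |L2| <= 2), so the gap is ~ -1/9.
gap_uniform = dependence_gap(rng.uniform(-1, 1, n), rng.uniform(-1, 1, n))

print(f"gap (normal):  {gap_normal:+.4f}")
print(f"gap (uniform): {gap_uniform:+.4f}")
```

Uncorrelatedness of $$L_1$$ and $$L_2$$ holds for any i.i.d. inputs with finite variance; the theorem's content is that full independence forces normality, which is what the nonzero gap in the uniform case reflects.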

History
The Darmois–Skitovich theorem is a generalization of the Kac–Bernstein theorem, in which the normal distribution (the Gaussian distribution) is characterized by the independence of the sum and the difference of two independent random variables. For the history of V. P. Skitovich's proof of the theorem, see the article.