User:Mumtaziah/Moment (mathematics)

Problem of moments
Main article: Moment problem

Problems of determining a probability distribution from its sequence of moments are called problems of moments. Such problems were first discussed by P. L. Chebyshev (1874) in connection with research on limit theorems. For the probability distribution of a random variable $$X$$ to be uniquely defined by its moments $$\alpha_k = EX^k$$ it is sufficient, for example, that Carleman's condition be satisfied:

$\sum_{k=1}^{\infty} \frac{1}{\alpha_{2k}^{1/(2k)}} = \infty$

A similar result holds even for moments of random vectors. The problem of moments seeks characterizations of sequences $$\{\mu_n : n = 1, 2, 3, \ldots\}$$ that are sequences of moments of some function $$f$$.

A related convergence result holds. Suppose $$\mu_1, \mu_2, \ldots$$ is a sequence of distribution functions, all moments $$\alpha_k(n)$$ of which are finite, and for each integer $$k \geq 1$$,

$\alpha_k(n) \rightarrow \alpha_k$ as $n \rightarrow \infty$,

where $$\alpha_k$$ is finite. Then there is a subsequence $$\mu_{n'}$$ that converges weakly to a distribution function $$\mu$$ having $$\alpha_k$$ as its moments. If the moments determine $$\mu$$ uniquely, then the full sequence $$\mu_n$$ converges weakly to $$\mu$$.
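As a concrete illustration (a sketch, not part of the source text): for the standard normal distribution the even moments are $$\alpha_{2k} = (2k-1)!!$$, and the Carleman series diverges, so the normal law is uniquely determined by its moments. The Python sketch below (function names are my own) evaluates the moments in log space via `math.lgamma` to avoid overflow, and shows that the partial sums of the Carleman series keep growing.

```python
import math

def log_normal_even_moment(k):
    """Log of the 2k-th raw moment of the standard normal distribution.

    E[X^(2k)] = (2k-1)!! = (2k)! / (2^k * k!), computed in log space
    with lgamma so large k does not overflow a float.
    """
    return math.lgamma(2 * k + 1) - k * math.log(2) - math.lgamma(k + 1)

def carleman_partial_sum(n_terms):
    """Partial sum of Carleman's series  sum_{k>=1} alpha_{2k}^(-1/(2k)).

    Each term is exp(-log(alpha_{2k}) / (2k)); for the normal law the
    terms decay only like k^(-1/2), so the series diverges.
    """
    return sum(math.exp(-log_normal_even_moment(k) / (2 * k))
               for k in range(1, n_terms + 1))

# Sanity check against the known small moments: E[X^2]=1, E[X^4]=3, E[X^6]=15
for k, expected in [(1, 1.0), (2, 3.0), (3, 15.0)]:
    assert abs(math.exp(log_normal_even_moment(k)) - expected) < 1e-9

print(carleman_partial_sum(100), carleman_partial_sum(400))
```

The partial sum roughly doubles when the number of terms is quadrupled, consistent with $$\sqrt{N}$$ growth, i.e. divergence; so Carleman's condition holds and the normal distribution is moment-determinate.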