User:Durdenclub~enwiki

I'm just another postdoc researcher lost somewhere between mathematics and signal processing.

= My infamous but useful mathematical formula list =

Some equations useful in my research. Most of them can be collected from Wikipedia.

Some come from specific articles.

Don't trust me if you use them!

Use them at your own risk and double-check them against another (official/peer-reviewed) source.

Varia

 * $$\sum_{k=n+1}^\infty {1 \over k^2}\leq {1 \over n}$$
 * $$\ln(1-x) \leq -x -x^2/2,\quad \forall x\in [0, 1)$$
 * $$\ln(1+x) \leq x -x^2/2 + x^3/3,\quad \forall x \geq 0$$
 * $${x \over 1 - x} + \ln(1-x) \geq x^2/2,\quad \forall x \in [0, 1)$$ (see Math 710: Measure Concentration, A. Barvinok)
 * $$x + \ln(1-x) \leq -x^2/2,\quad \forall x \in [0, 1)$$ (see Math 710: Measure Concentration, A. Barvinok)
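In the spirit of double-checking, here is a minimal Python spot-check of the inequalities above; the sample grid and the tail-truncation point are my own arbitrary choices, not part of the statements.

```python
import math

# Spot-check the "Varia" inequalities on a small grid.
for n in [1, 5, 50]:
    # sum_{k=n+1}^infty 1/k^2 <= 1/n (tail truncated far out)
    tail = sum(1.0 / k**2 for k in range(n + 1, 100000))
    assert tail <= 1.0 / n

for x in [0.01, 0.3, 0.7, 0.99]:
    assert math.log(1 - x) <= -x - x**2 / 2
    assert math.log(1 + x) <= x - x**2 / 2 + x**3 / 3
    assert x / (1 - x) + math.log(1 - x) >= x**2 / 2
    assert x + math.log(1 - x) <= -x**2 / 2
print("all Varia inequalities hold on the sample grid")
```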

Combinatorics

 * $$ {n \choose k} \le \frac{n^k}{k!} $$
 * $$ {n \choose k} \le \left(\frac{n\cdot e}{k}\right)^k $$ (useful for combinatorial union bounds)
 * $$ {n \choose k} \le e^{\,n\,H(k/n)} $$, where $$H(q)=-q\ln q - (1-q)\ln(1-q)$$ is the binary entropy function in nats (for $$0<q<1$$).
 * $$ {n \choose k} \ge \left(\frac{n}{k}\right)^k$$

From Sondow (2005) [1] and Sondow and Zudilin (2006) [2], also discussed on the Nuit Blanche blog:
 * $$ {n \choose k} \le \left(\frac{n}{k}\right)^k\left(\frac{n}{n-k}\right)^{n-k} $$ (useful for combinatorial union bounds)
 * $$ {n \choose k} \ge \frac{1}{4(n-k)}\,\left(\frac{n}{k}\right)^k\left(\frac{n}{n-k}\right)^{n-k} $$ (useful for combinatorial union bounds)
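A quick numeric check of the binomial-coefficient bounds above (natural log throughout, so $$e^{nH(k/n)}$$ matches the entropy written in nats); the $$(n, k)$$ pairs are arbitrary test cases of my own.

```python
import math

def H(q):
    # Binary entropy in nats, 0 < q < 1.
    return -q * math.log(q) - (1 - q) * math.log(1 - q)

for n, k in [(10, 3), (100, 7), (60, 30)]:
    c = math.comb(n, k)
    assert c <= n**k / math.factorial(k)
    assert c <= (n * math.e / k) ** k
    assert c <= math.exp(n * H(k / n))
    assert c >= (n / k) ** k
    # Sondow-type bounds; note (n/k)^k (n/(n-k))^(n-k) = e^{n H(k/n)}.
    stirling = (n / k) ** k * (n / (n - k)) ** (n - k)
    assert c <= stirling
    assert c >= stirling / (4 * (n - k))
print("all binomial bounds hold on the sample pairs")
```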

[1] Sondow, J. "Problem 11132." Amer. Math. Monthly 112, 180, 2005.

[2] Sondow, J. and Zudilin, W. "Euler's Constant, q-Logarithms, and Formulas of Ramanujan and Gosper." Ramanujan J. 12, 225-244, 2006.

Misc
(see "Database-friendly random projections" by Achlioptas, or "An elementary proof of the Johnson-Lindenstrauss lemma" by S. Dasgupta and A. Gupta)
 * If $$X \thicksim N(0,1)$$, then $$E[e^{tX^2}]= 1/\sqrt{1-2t}$$ for $$t\in(-\infty,{1\over 2})$$.
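The identity can be checked numerically by integrating $$e^{tx^2}$$ against the standard normal density; the integration limits and step count below are my own choices, chosen so truncation and trapezoid error stay well under the tolerance.

```python
import math

def mgf_chi2(t, lo=-15.0, hi=15.0, steps=100000):
    # Trapezoidal estimate of E[exp(t X^2)] for X ~ N(0,1).
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        x = lo + i * h
        w = 0.5 if i in (0, steps) else 1.0
        total += w * math.exp(t * x * x) * math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
    return total * h

# Compare against 1/sqrt(1 - 2t) for a few t < 1/2.
for t in [-1.0, 0.0, 0.25, 0.4]:
    assert abs(mgf_chi2(t) - 1 / math.sqrt(1 - 2 * t)) < 1e-6
print("MGF identity verified numerically")
```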

Markov Inequality
In the language of measure theory, Markov's inequality states that if (X, Σ, μ) is a measure space, f is a measurable extended real-valued function, and t > 0, then


 * $$ \mu(\{x\in X|\,|f(x)|\geq t\}) \leq {1\over t}\int_X |f|\,d\mu.$$

For the special case where the space has measure 1 (i.e., it is a probability space), it can be restated as follows: if X is any random variable and a > 0, then


 * $$\textrm{Pr}(|X| \geq a) \leq \frac{\textrm{E}(|X|)}{a}.$$
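A Monte Carlo illustration of the probabilistic form on a concrete distribution; the choice of Exp(1), the seed, and the sample size are arbitrary on my part.

```python
import random

random.seed(0)
xs = [random.expovariate(1.0) for _ in range(100000)]
mean_abs = sum(abs(x) for x in xs) / len(xs)  # empirical E|X|, about 1 here
for a in [0.5, 1.0, 2.0, 5.0]:
    # Empirical Pr(|X| >= a) versus the Markov bound E|X|/a.
    p = sum(1 for x in xs if abs(x) >= a) / len(xs)
    assert p <= mean_abs / a
    print(f"a={a}: Pr(|X|>=a)={p:.4f} <= E|X|/a={mean_abs / a:.4f}")
```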

Laplace transform
In the language of measure theory (see also Math 710: Measure Concentration, A. Barvinok), if (X, Σ, μ) is a measure space, f is a measurable extended real-valued function, and t > 0, then


 * $$ \mu(\{x\in X|\, f(x)\geq a\}) \leq e^{-t a} \int_X e^{tf}\,d\mu.$$
 * $$ \mu(\{x\in X|\, f(x)\leq a\}) \leq e^{t a} \int_X e^{-tf}\,d\mu.$$
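A closed-form check of both bounds for the probability measure of an Exp(1) random variable, where everything is computable by hand: $$\mu(\{f \geq a\}) = e^{-a}$$, $$\int e^{tf}\,d\mu = 1/(1-t)$$ for $$t \in (0,1)$$, and $$\int e^{-tf}\,d\mu = 1/(1+t)$$. The grids of a and t values are my own test cases.

```python
import math

for a in [1.0, 2.0, 5.0]:
    tail = math.exp(-a)          # mu({f >= a}) for f ~ Exp(1)
    for t in [0.1, 0.5, 0.9]:
        # Upper tail: e^{-ta} * E[e^{tf}] with E[e^{tf}] = 1/(1-t).
        bound = math.exp(-t * a) / (1 - t)
        assert tail <= bound
        # Lower tail: e^{ta} * E[e^{-tf}] with E[e^{-tf}] = 1/(1+t).
        lower = 1 - math.exp(-a)  # mu({f <= a})
        assert lower <= math.exp(t * a) / (1 + t)
print("Laplace-transform bounds hold for Exp(1) on the sample grid")
```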