User:Quietbritishjim/Maths scratch

Differentiation under integral sign
Suppose $$f: \mathbb{R} \times \mathbb{R}^{d}\rightarrow \mathbb{R} $$ satisfies the following conditions:


 * (1) $$f(t,x)$$ is a Lebesgue-integrable function of $$x$$ for each $$t \in \mathbb{R}$$


 * (2) For almost all $$x \in \mathbb{R}^{d}$$, the partial derivative $$f_t = \partial f/\partial t$$ exists for all $$t \in \mathbb{R}$$


 * (3) There is an integrable function $$ \theta: \mathbb{R}^{d}\rightarrow \mathbb{R}$$ such that $$|f_t(t,x)| \leq \theta (x)$$ for all $$t \in \mathbb{R}$$ and almost every $$x \in \mathbb{R}^{d}$$

Then for all $$t \in \mathbb{R}$$
 * $$ \frac{\mathrm{d}}{\mathrm{d} t} \int_{\mathbb{R}^{d}} \, f(t, x) \mathrm{d} x = \int_{\mathbb{R}^{d}} \, f_t (t, x) \mathrm{d} x $$
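A quick numerical sanity check of this in $$d=1$$ (the integrand $$f(t,x) = e^{-x^2}\cos(tx)$$ is an illustrative choice, not from the statement; it satisfies the hypotheses with $$\theta(x) = |x|e^{-x^2}$$):

```python
import math

def trapezoid(g, a, b, n=2000):
    """Composite trapezoid rule for the integral of g over [a, b]."""
    h = (b - a) / n
    return h * (0.5 * (g(a) + g(b)) + sum(g(a + i * h) for i in range(1, n)))

# f(t, x) = exp(-x^2) cos(t x), with f_t its t-derivative;
# the domain is truncated to [-6, 6], where exp(-x^2) is negligible outside.
f  = lambda t, x: math.exp(-x * x) * math.cos(t * x)
ft = lambda t, x: -x * math.exp(-x * x) * math.sin(t * x)

F = lambda t: trapezoid(lambda x: f(t, x), -6.0, 6.0)

t0, eps = 1.0, 1e-5
lhs = (F(t0 + eps) - F(t0 - eps)) / (2 * eps)      # d/dt of the integral
rhs = trapezoid(lambda x: ft(t0, x), -6.0, 6.0)    # integral of f_t

assert abs(lhs - rhs) < 1e-6
```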

Integration by parts lead
In calculus, and more generally in mathematical analysis, integration by parts is a theorem that relates the integral of a product of functions to the integral of their derivative and antiderivative. It is frequently used to transform the antiderivative of a product of functions into an ideally simpler antiderivative. The rule can be derived in one line by simply integrating the product rule of differentiation.

The theorem states that if u and v are continuously differentiable functions then


 * $$\int u(x) v'(x) \, dx = u(x) v(x) - \int u'(x) v(x) \, dx.$$

It can be stated more compactly using the differentials du = u′(x) dx and dv = v′(x) dx as


 * $$\int u \, dv=uv-\int v \, du.\!$$

More general formulations of integration by parts exist for the Riemann–Stieltjes integral and Lebesgue–Stieltjes integral. A discrete analogue holds for sequences, called summation by parts.
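The one-variable formula is easy to check numerically; the choices $$u(x)=x$$, $$v(x)=\sin x$$ on $$[0,\pi]$$ below are illustrative, not from the text (here $$\int_0^\pi x\cos x\,dx = [x\sin x]_0^\pi - \int_0^\pi \sin x\,dx = -2$$):

```python
import math

def trapezoid(g, a, b, n=10000):
    """Composite trapezoid rule for the integral of g over [a, b]."""
    h = (b - a) / n
    return h * (0.5 * (g(a) + g(b)) + sum(g(a + i * h) for i in range(1, n)))

# u(x) = x, v(x) = sin x on [0, pi] -- an illustrative choice
a, b = 0.0, math.pi
u,  du = (lambda x: x),           (lambda x: 1.0)
v,  dv = (lambda x: math.sin(x)), (lambda x: math.cos(x))

lhs = trapezoid(lambda x: u(x) * dv(x), a, b)                      # ∫ u v' dx
rhs = u(b) * v(b) - u(a) * v(a) - trapezoid(lambda x: du(x) * v(x), a, b)

assert abs(lhs - rhs) < 1e-6   # both sides are approximately -2
```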

Integration by parts
The formula for integration by parts can be extended to functions of several variables. Instead of an interval one needs to integrate over an n-dimensional set. Also, one replaces the derivative with a partial derivative.

More specifically, suppose Ω is an open bounded subset of $$\mathbb{R}^n$$ with a piecewise smooth boundary $$\Gamma$$. If u and v are two continuously differentiable functions on the closure of Ω, then the formula for integration by parts is
 * $$\int_{\Omega} \frac{\partial u}{\partial x_i} v \,d\Omega = \int_{\Gamma} u v \, \nu_i \,d\Gamma - \int_{\Omega} u \frac{\partial v}{\partial x_i} \, d\Omega$$

where $$\hat{\mathbf{\nu}}$$ is the outward unit surface normal to $$\Gamma$$, $$\mathbf{\nu}_i$$ is its i-th component, and i ranges from 1 to n.

A more general form of integration by parts is obtained by replacing v in the above formula with vi and summing over i, which gives the vector formula
 * $$ \int_{\Omega} \nabla u \cdot \mathbf{v}\, d\Omega = \int_{\Gamma} (u\, \mathbf{v})\cdot \hat{\nu}\, d\Gamma -  \int_\Omega u\, \nabla\cdot\mathbf{v}\, d\Omega$$

where v is a vector-valued function with components v1, ..., vn.

Setting u equal to the constant function 1 in the above formula gives the divergence theorem
 * $$ \int_{\Gamma} \mathbf{v} \cdot \hat{\nu}\, d\Gamma =  \int_\Omega \nabla\cdot\mathbf{v}\, d\Omega$$.

For $$\mathbf{v}=\nabla v$$ where $$v\in C^2(\bar{\Omega})$$, one gets
 * $$ \int_{\Omega} \nabla u \cdot \nabla v\, d\Omega = \int_{\Gamma} u\, \nabla v\cdot\hat{\nu}\, d\Gamma - \int_\Omega u\, \Delta v\, d\Omega$$

which is the first Green's identity.

The regularity requirements of the theorem can be relaxed. For instance, the boundary $$\Gamma$$ need only be Lipschitz continuous. In the first formula above, only $$u,v\in H^1(\Omega)$$ is necessary (where H1 is a Sobolev space); the other formulas have similarly relaxed requirements.
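A numerical sanity check of the first Green's identity on the unit square $$\Omega=(0,1)^2$$; the functions $$u = x$$ and $$v = x^2+y^2$$ (so $$\Delta v = 4$$) are an illustrative choice, not from the text:

```python
# Midpoint-rule check of ∫ ∇u·∇v dΩ = ∫_Γ u ∇v·ν dΓ - ∫ u Δv dΩ
# on Omega = (0,1)^2 with u = x and v = x^2 + y^2 (assumed examples).
N = 200                        # grid resolution
h = 1.0 / N
mid = [(i + 0.5) * h for i in range(N)]

u      = lambda x, y: x
u_grad = lambda x, y: (1.0, 0.0)           # gradient of u = x
v_grad = lambda x, y: (2.0 * x, 2.0 * y)   # gradient of v = x^2 + y^2
v_lap  = lambda x, y: 4.0                  # Laplacian of v

# Volume integrals over Omega by the midpoint rule
dot  = sum(u_grad(x, y)[0] * v_grad(x, y)[0] + u_grad(x, y)[1] * v_grad(x, y)[1]
           for x in mid for y in mid) * h * h          # ∫ ∇u·∇v dΩ
ulap = sum(u(x, y) * v_lap(x, y)
           for x in mid for y in mid) * h * h          # ∫ u Δv dΩ

# Boundary integral ∫_Γ u ∇v·ν dΓ, one term per edge of the square
def edge(point, normal):
    gx, gy = normal
    return sum(u(*point(s)) * (v_grad(*point(s))[0] * gx +
                               v_grad(*point(s))[1] * gy) for s in mid) * h

bdry = (edge(lambda s: (1.0, s), (1.0, 0.0)) + edge(lambda s: (0.0, s), (-1.0, 0.0))
        + edge(lambda s: (s, 1.0), (0.0, 1.0)) + edge(lambda s: (s, 0.0), (0.0, -1.0)))

assert abs(dot - (bdry - ulap)) < 1e-9     # here 1 = 3 - 2
```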

multi-index notation
$$ \begin{align} (x+y)^{\alpha} & =(x_{1}+y_{1})^{\alpha_{1}}\dotsm(x_{d}+y_{d})^{\alpha_{d}}\\ & =\bigg(\sum_{k_{1}=0}^{\alpha_{1}}\binom{\alpha_{1}}{k_{1}}\, x_{1}^{k_{1}}y_{1}^{\alpha_{1}-k_{1}}\bigg)\;\dotsm\;\bigg(\sum_{k_{d}=0}^{\alpha_{d}}\binom{\alpha_{d}}{k_{d}}\, x_{d}^{k_{d}}y_{d}^{\alpha_{d}-k_{d}}\bigg)\\ & =\sum_{k_{1}=0}^{\alpha_{1}}\dotsm\sum_{k_{d}=0}^{\alpha_{d}} \binom{\alpha_{1}}{k_{1}}\, x_{1}^{k_{1}}y_{1}^{\alpha_{1}-k_{1}}\;\dotsm\;\binom{\alpha_{d}}{k_{d}}\, x_{d}^{k_{d}}y_{d}^{\alpha_{d}-k_{d}} \\ & =\sum_{\nu\leq\alpha}\binom{\alpha}{\nu}\, x^{\nu}y^{\alpha-\nu} \end{align} $$

$$\frac{d^m}{dt^m} f(x+ty) = \sum_{|\alpha|=m} \frac{|\alpha|!}{\alpha!}(\partial^\alpha f(x + ty))y^\alpha$$

i.e. $$\frac{d^m}{dt^m} f(x+ty) = \sum_{|\alpha|=m} \binom{m}{\alpha}(\partial^\alpha f(x + ty))y^\alpha$$

from Partial differential equations by Fritz John
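A numerical check of the $$m=2$$, $$d=2$$ case, where the formula reads $$f_{11}y_1^2 + 2f_{12}y_1y_2 + f_{22}y_2^2$$; the polynomial $$f(x_1,x_2)=x_1^3x_2$$ and the points below are illustrative choices:

```python
# Check d^2/dt^2 f(x + t y) = sum over |alpha|=2 of (2!/alpha!) d^alpha f * y^alpha
# for the (assumed, illustrative) polynomial f(x1, x2) = x1^3 * x2.
f   = lambda x1, x2: x1**3 * x2
f11 = lambda x1, x2: 6.0 * x1 * x2     # d^2 f / dx1^2
f12 = lambda x1, x2: 3.0 * x1**2       # d^2 f / dx1 dx2
f22 = lambda x1, x2: 0.0               # d^2 f / dx2^2

x, y, t = (0.7, -1.2), (0.3, 0.5), 0.4
p = (x[0] + t * y[0], x[1] + t * y[1])

# Right-hand side: multinomial coefficients 2!/alpha! are 1, 2, 1
rhs = f11(*p) * y[0]**2 + 2.0 * f12(*p) * y[0] * y[1] + f22(*p) * y[1]**2

# Left-hand side via a second central difference of g(t) = f(x + t y)
g = lambda s: f(x[0] + s * y[0], x[1] + s * y[1])
eps = 1e-4
lhs = (g(t + eps) - 2.0 * g(t) + g(t - eps)) / eps**2

assert abs(lhs - rhs) < 1e-5
```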

directional derivative

$$ \begin{align} f(\mathbf x+\mathbf h) & =\sum_{\left\vert\alpha\right\vert\leq n} \frac{1}{\alpha!}\mathbf h^{\alpha}\partial^{\alpha}f(\mathbf x)+\sum_{\left\vert\alpha\right\vert=n+1}\frac{n+1}{\alpha!}\mathbf h^{\alpha}\int_{0}^{1}(1-t)^{n}\partial^{\alpha}f(\mathbf x+t\mathbf h)\, d t \\ & =\sum_{r=0}^{n}\frac{1}{r!}\bigl[(\mathbf h\cdot\nabla)^{r}f\bigr](\mathbf x)+\frac{1}{n!}\int_{0}^{1}(1-t)^{n}\bigl[(\mathbf h\cdot\nabla)^{n+1}f\bigr](\mathbf x+t\mathbf h)\, d t. \end{align} $$
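The second form can be checked numerically in the one-dimensional specialization $$d=1$$, where $$(\mathbf h\cdot\nabla)^r f$$ reduces to $$h^r f^{(r)}$$; the choices $$f=\exp$$, $$x=0$$, $$h=0.5$$, $$n=3$$ are illustrative:

```python
import math

def trapezoid(g, a, b, n=4000):
    """Composite trapezoid rule for the integral of g over [a, b]."""
    step = (b - a) / n
    return step * (0.5 * (g(a) + g(b)) + sum(g(a + i * step) for i in range(1, n)))

# d = 1 case with f = exp at x = 0: every derivative of exp is exp,
# so (h.grad)^r f(x) = h^r and the remainder integrand is (1-t)^n h^(n+1) e^(t h).
h, n = 0.5, 3
taylor = sum(h**r / math.factorial(r) for r in range(n + 1))
remainder = (1.0 / math.factorial(n)) * trapezoid(
    lambda t: (1 - t)**n * h**(n + 1) * math.exp(t * h), 0.0, 1.0)

assert abs(math.exp(h) - (taylor + remainder)) < 1e-8
```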

Chain rule
$$\mathbf{h}\colon\mathbb{C}\to\mathbb{C}^n$$ (or $$\mathbb{R}\to\mathbb{R}^n$$).


 * $$ \frac{\mathrm{d}^n}{\mathrm{d}t^n}f(\mathbf{h}(t)) = \sum_{|\alpha|=n} \binom{n}{\alpha} \partial^\alpha f(\mathbf{h}(t)) \,\bigl(\mathbf{h}'(t)\bigr)^{\alpha}. $$

But this is only true if h is linear! Otherwise we'd need terms with higher derivatives of h (and lower-order derivatives of f), as in Faà di Bruno's formula.

Product of Hermite polynomials
From http://www.math.niu.edu/~rusin/known-math/99/prod_hermite:


 * $$H_m(x)H_n(x) =\sum_{i=0}^{\min(m,n)} \left(\frac{2^i}{i!} \frac{m!}{(m-i)!} \frac{n!}{(n-i)!} H_{m+n-2i}(x)\right)$$
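A numerical check of this linearization formula, using the three-term recurrence $$H_{n+1}(x) = 2xH_n(x) - 2nH_{n-1}(x)$$ for the physicists' Hermite polynomials (the sample points are arbitrary):

```python
from math import factorial

def hermite(n, x):
    """Physicists' Hermite polynomial H_n(x) via the recurrence
    H_{n+1}(x) = 2x H_n(x) - 2n H_{n-1}(x)."""
    h0, h1 = 1.0, 2.0 * x
    if n == 0:
        return h0
    for k in range(1, n):
        h0, h1 = h1, 2.0 * x * h1 - 2.0 * k * h0
    return h1

def product_expansion(m, n, x):
    """Right-hand side of the linearization formula above."""
    return sum((2**i / factorial(i))
               * (factorial(m) / factorial(m - i))
               * (factorial(n) / factorial(n - i))
               * hermite(m + n - 2 * i, x)
               for i in range(min(m, n) + 1))

# Compare H_m(x) H_n(x) against the expansion at a few sample points
ok = all(abs(hermite(m, x) * hermite(n, x) - product_expansion(m, n, x)) < 1e-6
         for m in range(5) for n in range(5) for x in (-1.3, 0.2, 0.8))
assert ok
```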