User:Dmcq/COV

Temporary page to try out some formulae

Brachistochrone 2

 * $$v=\sqrt{-2gy}$$
 * $$\frac{\sin{\phi}}{v}=K$$
 * $$\frac{\sin{\phi}}{\sqrt{-2gy}}=\frac{1}{\sqrt{2gD}}$$
 * $$\sin{\phi}=\frac{dx}{\sqrt{dx^2+dy^2}}$$
 * $$\left(\frac{dy}{dx}\right)^2=-\left(\frac{D+y}{y}\right)$$
 * $$\left(\frac{dy}{dx}\right)^2 = \frac{2r-y}{y}.$$
 * $$\frac{\cos(\theta/2)}{\sqrt {2g \cdot 2r \cos^2(\theta/2)}} = \frac{1}{\sqrt {4gr}}$$
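As a sanity check, the slope relation $$\left(\frac{dy}{dx}\right)^2 = \frac{2r-y}{y}$$ can be verified numerically against the parametric cycloid given later on this page. A minimal Python sketch (the value of r is arbitrary):

```python
import math

# Parametric cycloid (as in the "Cycloid equations" section):
#   x = r*(t - sin t),  y = r*(1 - cos t)
# Check that (dy/dx)^2 == (2r - y)/y at several parameter values.
r = 1.5
for t in [0.5, 1.0, 2.0, 3.0]:
    y = r * (1 - math.cos(t))
    dydx = math.sin(t) / (1 - math.cos(t))   # (dy/dt)/(dx/dt)
    lhs = dydx ** 2
    rhs = (2 * r - y) / y
    assert abs(lhs - rhs) < 1e-12
```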

Beltrami
Beltrami identity, which applies when $$\partial L/\partial x=0$$:


 * $$L-f'\frac{\partial L}{\partial f'}=C$$

Brachistochrone

 * $$v=\sqrt{2gh} \,$$
 * $$h=2r \cos^2(\theta / 2) \,$$


 * $$\frac{(d+x)\cos(\theta / 2)}{\sqrt{2 g x \cos^2(\theta / 2)}} \cdot \frac {d\theta}{2}\,$$
 * $$\frac{d+x}{\sqrt x} \,$$
 * $$\frac{r}{\sqrt{x}} + \sqrt{x} \,$$
 * $$x = d \,$$
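The last two bullets assert that $$(d+x)/\sqrt{x} = \sqrt{x} + d/\sqrt{x}$$ attains its minimum at $$x = d$$. A quick grid scan (with an arbitrary value of d) confirms this:

```python
# The travel-time factor (d + x)/sqrt(x) splits as sqrt(x) + d/sqrt(x),
# which (by AM-GM or differentiation) is minimised at x = d.
# Scan a grid and confirm the minimum sits at x = d.
d = 2.0
xs = [0.01 * i for i in range(1, 1000)]   # 0.01 .. 9.99
f = lambda x: (d + x) / x ** 0.5
best = min(xs, key=f)
assert abs(best - d) < 0.01
```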

Euler–Lagrange

 * $$ J = \int_a^b L(x,f(x),f'(x))\, dx \,\!$$
 * $$f(a)=c, f(b)=d \,$$


 * $$g_\varepsilon(x)=f(x)+\varepsilon\eta(x) \,$$
 * $$\eta(a)=\eta(b)=0 \,$$


 * $$ J(\varepsilon) = \int_a^b L(x,g_\varepsilon(x), g_\varepsilon'(x) )\, dx \,\!$$

We now wish to calculate the total derivative of $$J$$ with respect to $$\varepsilon$$, known as the first variation of $$J$$.


 * $$ \frac{\mathrm{d} J}{\mathrm{d} \varepsilon} = \int_a^b \frac{\mathrm{d}L}{\mathrm{d}\varepsilon}(x,g_\varepsilon(x), g_\varepsilon'(x) )\, dx $$


 * $$ \frac{\mathrm{d} J}{\mathrm{d} \varepsilon} = \int_a^b \left[\eta(x) \frac{\partial L}{\partial g_\varepsilon} + \eta'(x) \frac{\partial L}{\partial g_\varepsilon'} \, \right]\,dx $$

When $$\varepsilon = 0$$ we have $$g_\varepsilon = f$$, and since $$f$$ is an extremum of $$J$$,


 * $$ J'(0) = \int_a^b \left[ \eta(x) \frac{\partial L}{\partial f} + \eta'(x) \frac{\partial L}{\partial f'} \,\right]\,dx = 0$$

The next crucial step is to use integration by parts on the second term, yielding


 * $$ 0 = \int_a^b \left[ \frac{\partial L}{\partial f} - \frac{d}{dx} \frac{\partial L}{\partial f'} \right] \eta(x)\,dx + \left[ \eta(x) \frac{\partial L}{\partial f'} \right]_a^b $$

Using the boundary conditions on $$\eta$$, we get that


 * $$ 0 = \int_a^b \left[ \frac{\partial L}{\partial f} - \frac{d}{dx} \frac{\partial L}{\partial f'} \right] \eta(x)\,dx \,\!$$

Applying the fundamental lemma of calculus of variations now yields the Euler–Lagrange equation


 * $$ 0 = \frac{\partial L}{\partial f} - \frac{d}{dx} \frac{\partial L}{\partial f'} $$


 * $$\frac{\delta J}{\delta y} = L_y - \frac{d}{dx}L_{y'}$$
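The vanishing of the first variation at an extremal can be checked numerically. For the simple Lagrangian $$L = \tfrac{1}{2}f'^{\,2}$$ on $$[0,1]$$ the extremal is a straight line, and a central difference in $$\varepsilon$$ should give $$J'(0) \approx 0$$. A minimal sketch (the Lagrangian and the perturbation $$\eta$$ are illustrative choices, not taken from the derivation above):

```python
# Discretised check that J'(0) = 0 at an extremal, for L = (1/2) f'^2
# on [0, 1] with f(0)=0, f(1)=1 (extremal: the straight line f(x)=x).
# eta vanishes at both endpoints, as required.
import math

n = 1000
h = 1.0 / n
xs = [i * h for i in range(n + 1)]
f = lambda x: x                          # the extremal
eta = lambda x: math.sin(math.pi * x)    # admissible perturbation

def J(eps):
    # Riemann sum of (1/2) * g'(x)^2 with g = f + eps*eta
    total = 0.0
    for i in range(n):
        g0 = f(xs[i]) + eps * eta(xs[i])
        g1 = f(xs[i + 1]) + eps * eta(xs[i + 1])
        total += 0.5 * ((g1 - g0) / h) ** 2 * h
    return total

eps = 1e-4
dJ = (J(eps) - J(-eps)) / (2 * eps)      # central difference in epsilon
assert abs(dJ) < 1e-6                    # first variation vanishes
```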

EL Brach

 * $$ \int_{p_1}^{p_2}{\frac{\sqrt{1 + y'^{\,2}}}{\sqrt{2gy}}\,dx} \,$$


 * $$\frac{1}{\sqrt{2gy}\,\sqrt{1 + y'^{\,2}}} = C \,$$
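This constant can be checked along the cycloid solution: since $$y\left(1+y'^{\,2}\right) = 2r$$ on the cycloid, the expression should equal $$1/\sqrt{4gr}$$ at every point. A small numerical check (g and r are arbitrary):

```python
import math

# Evaluate the Beltrami constant 1/(sqrt(2 g y) * sqrt(1 + y'^2))
# along the cycloid y = r(1 - cos t), y' = sin t / (1 - cos t);
# it should equal 1/sqrt(4 g r) at every point.
g, r = 9.81, 0.7
expected = 1 / math.sqrt(4 * g * r)
for t in [0.3, 1.0, 2.0, 3.0]:
    y = r * (1 - math.cos(t))
    yp = math.sin(t) / (1 - math.cos(t))
    val = 1 / (math.sqrt(2 * g * y) * math.sqrt(1 + yp ** 2))
    assert abs(val - expected) < 1e-12
```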

Fisher Information
Van Trees (1968) and Frieden (2004) provide the following method of deriving the Fisher information informally:

Consider an unbiased estimator $$\hat\theta(X)$$. Mathematically, we write



$$ \mathrm{E}\left[ \hat\theta(X) - \theta \right] = \int \left[ \hat\theta(X) - \theta \right] \cdot f(X ;\theta) \, dx = 0. $$

The likelihood function $$f(X ;\theta)$$ describes the probability that we observe a given sample $$x$$ given a known value of $$\theta$$. If $$f$$ is sharply peaked, it is easy to intuit the "correct" value of $$\theta$$ given the data, and hence the data contain a lot of information about the parameter. If the likelihood $$f$$ is flat and spread out, then it would take many, many samples of $$X$$ to estimate the actual "true" value of $$\theta$$. Therefore, we would intuit that the data contain much less information about the parameter.

Now, differentiating the unbiasedness condition above with respect to $$\theta$$ gives



$$ \frac{\partial}{\partial\theta} \int \left[ \hat\theta(X) - \theta \right] \cdot f(X ;\theta) \, dx = \int \left(\hat\theta-\theta\right) \frac{\partial f}{\partial\theta} \, dx - \int f \, dx = 0. $$

We now make use of two facts. The first is that the likelihood $$f$$ is just the probability of the data given the parameter. Since it is a probability, it must be normalized, implying that


 * $$\int f \, dx = 1.$$

Second, we know from basic calculus that


 * $$\frac{\partial f}{\partial\theta} = f \, \frac{\partial \ln f}{\partial\theta}.$$

Using these two facts, the expression above becomes



$$ \int \left(\hat\theta-\theta\right) f \, \frac{\partial \ln f}{\partial\theta} \, dx = 1. $$

Factoring the integrand gives



$$ \int \left(\left(\hat\theta-\theta\right) \sqrt{f} \right) \left( \sqrt{f} \, \frac{\partial \ln f}{\partial\theta} \right) \, dx = 1. $$

Squaring the equation and applying the Cauchy–Schwarz inequality, we may write



$$ \left[ \int \left(\hat\theta - \theta\right)^2 f \, dx \right] \cdot \left[ \int \left( \frac{\partial \ln f}{\partial\theta} \right)^2 f \, dx \right] \geq 1. $$

The rightmost factor is defined to be the Fisher information



$$ \mathcal{I}\left(\theta\right) = \int \left( \frac{\partial \ln f}{\partial\theta} \right)^2 f \, dx. $$

The leftmost factor is the expected mean-squared error of the estimator $$\hat\theta$$, since



$$ \mathrm{E}\left[ \left( \hat\theta\left(X\right) - \theta \right)^2 \right] = \int \left(\hat\theta - \theta\right)^2 f \, dx. $$

Notice that the inequality tells us that, fundamentally,



$$ \mbox{Var}\left[\hat\theta\right] \, \geq \, {1} / {\mathcal{I}\left(\theta\right)}. $$

In other words, the precision to which we can estimate $$\theta$$ is fundamentally limited by the Fisher information of the likelihood function.
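A Monte-Carlo sketch of the bound for the Gaussian case (an illustrative example, not from the text above): for $$X \sim N(\theta, \sigma^2)$$ the per-sample Fisher information is $$1/\sigma^2$$, and the sample mean is an unbiased estimator that attains the bound.

```python
import random, statistics

# Monte-Carlo illustration of Var[theta_hat] >= 1/I(theta)
# for X ~ N(theta, sigma^2): the per-sample Fisher information is
# 1/sigma^2, so with n samples I = n/sigma^2, and the sample mean
# attains the bound with Var = sigma^2/n.
random.seed(0)
theta, sigma, n, trials = 3.0, 2.0, 50, 20000
info = n / sigma ** 2
estimates = []
for _ in range(trials):
    xs = [random.gauss(theta, sigma) for _ in range(n)]
    estimates.append(sum(xs) / n)          # unbiased estimator of theta
var = statistics.pvariance(estimates)
assert var >= 0.9 / info                   # bound holds (up to MC noise)
assert abs(var - 1 / info) < 0.01          # sample mean attains it
```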


 * $$\frac{\delta \mathcal{S}}{\delta \varphi_i} = 0\,.$$

where the action, $$\mathcal{S}$$, is a functional of the dependent variables $$\varphi_i(s)$$, their derivatives, and $$s$$ itself


 * $$\mathcal{S}\left[\varphi_i, \frac{\partial \varphi_i} {\partial s}\right] = \int{ \mathcal{L} \left[\varphi_i [s], \frac{\partial \varphi_i [s]}{\partial s^\alpha}, s^\alpha\right] \, \mathrm{d}^n s }$$

Tautochrone solution via 'virtual gravity'

 * $$s'' = - k^2s \,$$
 * $$s = A \sin kt \,$$
 * $$T = 2 \pi / k \,$$
 * $$\textstyle g \cos \frac{\theta}{2} = k^2 s \,$$
 * $$\textstyle \frac{g}{2} \sin \frac{\theta}{2} d\theta = -k^2 ds \,$$
 * $$\textstyle ds = - \frac{g}{2 k^2} \sin {\frac \theta 2} d \theta \,$$
 * $$\textstyle ds = - 4 r \sin {\frac \theta 2} \, d \frac{\theta}{2} \,$$
 * $$\textstyle k = \sqrt { \frac{g}{4 r}} \,$$
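The tautochrone property behind these formulae can be checked by direct quadrature: the descent time from any release angle to the bottom of the cycloid should equal the quarter period $$\tfrac{\pi}{2k} = \pi\sqrt{r/g}$$, independently of the release angle. A numerical sketch:

```python
import math

# Descent time from release angle t0 to the bottom (t = pi) of the
# cycloid, by quadrature:  dt = ds / v  with  ds = 2 r sin(t/2) dt
# and v = sqrt(2 g r (cos t0 - cos t)).  The tautochrone property
# predicts the same time, (pi/2)/k = pi*sqrt(r/g), for every t0.
g, r = 9.81, 1.0
k = math.sqrt(g / (4 * r))

def descent_time(t0, steps=200000):
    a, b = t0, math.pi
    h = (b - a) / steps
    total = 0.0
    for i in range(steps):
        t = a + (i + 0.5) * h            # midpoint rule avoids endpoints
        v = math.sqrt(2 * g * r * (math.cos(t0) - math.cos(t)))
        total += 2 * r * math.sin(t / 2) / v * h
    return total

expected = (math.pi / 2) / k             # quarter period of the SHM
for t0 in [0.5, 1.0, 2.0]:
    assert abs(descent_time(t0) - expected) < 2e-3
```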

Cycloid equations


$$ \begin{matrix} x & = & r ( \theta - \sin \theta ) \\ y & = & r ( 1 - \cos \theta ) \end{matrix} $$

Tractrix
$$p = e^{i\theta} \,$$

$$q = p(1+ a e^{i\phi})\,$$

$$q - p = a e^{i(\theta + \phi)}\,$$

$$p^\prime = i e^{i\theta} \,$$

$$q^\prime = i e^{i\theta} (1+ a e^{i\phi} + a \phi^\prime  e^{i\phi} )\,$$

$$q^\prime \varpropto q-p $$

$$\frac {\operatorname{Re} (q^\prime)} {\operatorname{Im} (q^\prime)} = \frac {\operatorname{Re} (q-p)} {\operatorname{Im} (q-p)} $$

$$\operatorname{Re} (q^\prime) \operatorname{Im} (q-p) - \operatorname{Im} (q^\prime) \operatorname{Re} (q-p) = 0 $$

$$-\operatorname{Re} (q^\prime) \operatorname{Im} (\overline {q-p}) - \operatorname{Im} (q^\prime) \operatorname{Re} (\overline {q-p} ) = 0 $$

$$ \operatorname{Im} \left\{ (i e^{i\theta} (1+ a e^{i\phi} + a \phi^\prime e^{i\phi} )) ( \overline { a e^{i(\theta + \phi) } }) \right\} = 0$$

$$ \operatorname{Im} \left\{ (i e^{i\theta} (1+ a e^{i\phi} + a \phi^\prime e^{i\phi} )) (  a e^{-i(\theta + \phi) }) \right\} = 0$$

$$ \operatorname{Re} \left\{ a e^{-i\phi} (1+ a e^{i\phi} + a \phi^\prime e^{i\phi} ) \right\} = 0$$

$$ \operatorname{Re} \left\{ e^{-i\phi} + a + a \phi^\prime \right\} = 0$$

$$ \cos \phi + a + a \phi^\prime = 0$$

$$\phi^\prime = - \frac {\cos \phi + a} {a} $$

$$\frac {\phi^\prime} {\cos \phi + a} = -\frac{1}{a} $$

$$\int {\frac {\mathrm{d} \phi} {\cos \phi + a}} = -\frac{\theta}{a} + C $$

$$2 \tan^{-1} \left[ \frac {\sqrt {1-a^2}\, \tanh \left( \frac {-x \sqrt {1-a^2}} {2a} \right) } {1-a} \right] $$

$$2 \tan^{-1}(-x) \,$$
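This borderline case is easy to verify: with $$a = 1$$ the differential equation for $$\phi$$ reduces to $$\phi^\prime = -(\cos \phi + 1)$$, which the closed form $$\phi = 2\tan^{-1}(-x)$$ satisfies. A pointwise numerical check:

```python
import math

# With a = 1 the ODE reduces to phi' = -(cos(phi) + 1), and the
# closed form phi = 2*atan(-x) should satisfy it.  Check the
# derivative identity at several points.
for x in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    phi = 2 * math.atan(-x)
    lhs = -2 / (1 + x * x)               # d/dx [2*atan(-x)]
    rhs = -(math.cos(phi) + 1)
    assert abs(lhs - rhs) < 1e-12
```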

$$2 \tan^{-1} \left[ \frac {\sqrt {a^2-1}\, \tan \left( \frac {-x \sqrt {a^2-1}}{2a} \right) } {a-1} \right] $$

Get a nice figure if

$$a = \sqrt { \frac {1} {1 - r^2}}$$