
Combining unbiased estimators
Let $${{\hat{\phi }}_{1}}$$ and $${{\hat{\phi }}_{2}}$$ be unbiased estimators of $$\phi \in {{\mathbb{R}}^{k}}$$ with non-singular variances $${{V}_{1}}$$ and $${{V}_{2}}$$ respectively.

Then the minimum variance linear unbiased estimator of $$\phi $$ is obtained by combining $${{\hat{\phi }}_{1}}$$ and $${{\hat{\phi }}_{2}}$$ using weights that are proportional to the inverses of their variances. The result can be expressed in a variety of ways:

$$\begin{align} \hat{\phi } &= \left( V_{1}^{-1}+V_{2}^{-1} \right)^{-1}\left( V_{1}^{-1}{\hat{\phi }}_{1}+V_{2}^{-1}{\hat{\phi }}_{2} \right) \\ & = \left( V_{1}^{-1}+V_{2}^{-1} \right)^{-1}\left( V_{1}^{-1}{\hat{\phi }}_{1}+V_{2}^{-1}{\hat{\phi }}_{2} \right)+\left[ \left( V_{1}^{-1}+V_{2}^{-1} \right)^{-1}V_{2}^{-1}{\hat{\phi }}_{1}-\left( V_{1}^{-1}+V_{2}^{-1} \right)^{-1}V_{2}^{-1}{\hat{\phi }}_{1} \right] \\ & = {\hat{\phi }}_{1}+ \left( V_{1}^{-1}+V_{2}^{-1} \right)^{-1}V_{2}^{-1}\left( {\hat{\phi }}_{2}-{\hat{\phi }}_{1} \right) \\ & = {\hat{\phi }}_{1}+ \left( I+{{V}_{2}}V_{1}^{-1} \right)^{-1}\left( {\hat{\phi }}_{2}-{\hat{\phi }}_{1} \right) \\ & = \left( I+{{V}_{1}}V_{2}^{-1} \right)^{-1}\left( {\hat{\phi }}_{1}+{{V}_{1}}V_{2}^{-1}{\hat{\phi }}_{2} \right) \end{align}$$

The proof is an application of the principle of Generalized Least Squares. The problem can be formulated as a GLS problem by noting that:

$$\left[ \begin{matrix} {{{\hat{\phi }}}_{1}} \\ {{{\hat{\phi }}}_{2}} \\ \end{matrix} \right]=\left[ \begin{matrix} I \\ I \\ \end{matrix} \right]\phi +\left[ \begin{matrix} {{\varepsilon }_{1}} \\ {{\varepsilon }_{2}} \\ \end{matrix} \right]$$ with $$\operatorname{Var}\left( \left[ \begin{matrix} {{\varepsilon }_{1}} \\ {{\varepsilon }_{2}} \\ \end{matrix} \right] \right)=\left[ \begin{matrix} {{V}_{1}} & 0 \\ 0 & {{V}_{2}} \\ \end{matrix} \right]$$

Applying the GLS formula yields: $$\begin{align} \hat{\phi } & ={{\left( {{\left[ \begin{matrix}  I  \\   I  \\ \end{matrix} \right]}^{\prime }}{{\left[ \begin{matrix}   {{V}_{1}} & 0  \\   0 & {{V}_{2}}  \\ \end{matrix} \right]}^{-1}}\left[ \begin{matrix}   I  \\   I  \\ \end{matrix} \right] \right)}^{-1}}{{\left[ \begin{matrix} I \\ I \\ \end{matrix} \right]}^{\prime }}{{\left[ \begin{matrix} {{V}_{1}} & 0 \\ 0 & {{V}_{2}} \\ \end{matrix} \right]}^{-1}}\left[ \begin{matrix} {{{\hat{\phi }}}_{1}} \\ {{{\hat{\phi }}}_{2}} \\ \end{matrix} \right] \\ & ={{\left( V_{1}^{-1}+V_{2}^{-1} \right)}^{-1}}\left( V_{1}^{-1}{{{\hat{\phi }}}_{1}}+V_{2}^{-1}{{{\hat{\phi }}}_{2}} \right) \end{align}$$
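The equivalence of the four expressions above can be checked numerically. A minimal sketch, using made-up symmetric positive-definite variances `V1`, `V2` and arbitrary estimates `phi1`, `phi2` (all illustrative, not from the text):

```python
# Check that the equivalent forms of the combined estimator agree numerically.
import numpy as np

rng = np.random.default_rng(0)
k = 3
A = rng.standard_normal((k, k)); V1 = A @ A.T + k * np.eye(k)  # SPD variance
B = rng.standard_normal((k, k)); V2 = B @ B.T + k * np.eye(k)  # SPD variance
phi1 = rng.standard_normal(k)
phi2 = rng.standard_normal(k)

# W = (V1^{-1} + V2^{-1})^{-1}, the variance of the combined estimator
W = np.linalg.inv(np.linalg.inv(V1) + np.linalg.inv(V2))
form1 = W @ (np.linalg.solve(V1, phi1) + np.linalg.solve(V2, phi2))
form2 = phi1 + W @ np.linalg.solve(V2, phi2 - phi1)
form3 = phi1 + np.linalg.solve(np.eye(k) + V2 @ np.linalg.inv(V1), phi2 - phi1)
form4 = np.linalg.solve(np.eye(k) + V1 @ np.linalg.inv(V2),
                        phi1 + V1 @ np.linalg.solve(V2, phi2))

assert np.allclose(form1, form2)
assert np.allclose(form1, form3)
assert np.allclose(form1, form4)
```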


Expected value of SSH
Consider one-way MANOVA with $$G$$ groups, each with $$n_g$$ observations. Let $$N = \sum_{g=1}^G n_g\!$$ and let
 * $$ D = \begin{bmatrix} 1_{n_1} & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & 1_{n_G} \end{bmatrix} $$ be the design matrix.

Let $$Q$$ be the $$N \times N$$ residual projection matrix defined by
 * $$ Q = I - 1_N(1_N'1_N)^{-1}1_N'=I-\frac{1}{N}U$$

where $$U = 1_N 1_N'\!$$ is the $$N \times N$$ matrix of ones.
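A short sketch, with assumed group sizes $$n = (2, 3, 4)$$ chosen only for illustration, constructing $$D$$ and $$Q$$ and checking the projection properties used below:

```python
# Build the design matrix D and residual projection Q for assumed group sizes.
import numpy as np

n = [2, 3, 4]                       # n_g, g = 1..G (illustrative)
N, G = sum(n), len(n)
D = np.zeros((N, G))
row = 0
for g, ng in enumerate(n):          # block of ones 1_{n_g} in column g
    D[row:row + ng, g] = 1.0
    row += ng

ones = np.ones((N, 1))
U = ones @ ones.T                   # U = 1_N 1_N'
Q = np.eye(N) - U / N

assert np.allclose(Q @ Q, Q)        # Q is idempotent (a projection)
assert np.allclose(Q @ ones, 0)     # Q annihilates the constant vector
assert np.isclose(np.trace(Q), N - 1)
```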

Analyzing SSH
We can express SSH in terms of the data and find its expected value under either a fixed-effects or a random-effects model.

The following formula is used repeatedly to find the expected value of a quadratic form. If $$Y$$ is a random vector with $$\operatorname{E}(Y)=\mu$$ and $$\operatorname{Var}(Y) = \Psi\!$$, and $$Q\!$$ is symmetric, then


 * $$ \operatorname{E}( Y'QY) = \mu'Q\mu + \operatorname{tr}(Q \Psi) \!$$
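This identity can be verified exactly on a small finite distribution (an illustrative construction, not from the text): if $$Y$$ takes one of four fixed vectors with given probabilities, every expectation is a finite sum.

```python
# Deterministic check of E(Y'QY) = mu'Q mu + tr(Q Psi) on a finite distribution.
import numpy as np

vals = np.array([[1., 0., 2.], [0., 1., 1.], [3., 1., 0.], [1., 2., 2.]])
p = np.array([0.1, 0.2, 0.3, 0.4])                       # probabilities
Q = np.array([[2., 1., 0.], [1., 3., 1.], [0., 1., 2.]])  # symmetric

mu = p @ vals                                   # E(Y)
centered = vals - mu
Psi = (centered * p[:, None]).T @ centered      # Var(Y)

lhs = sum(pi * yi @ Q @ yi for pi, yi in zip(p, vals))   # E(Y'QY) directly
rhs = mu @ Q @ mu + np.trace(Q @ Psi)
assert np.isclose(lhs, rhs)
```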

We can model:


 * $$ \mathbf{Y} = D \mathbf{\mu} + \epsilon\! $$

where


 * $$\mu \sim N( 1\psi, \phi^2 I) \!$$

and


 * $$ \epsilon \sim N(0, \sigma^2 I )\!$$

and $$ \mu $$ is independent of $$ \epsilon $$.

Thus


 * $$ \operatorname{E}(Y) = 1 \psi $$ and $$ \operatorname{Var}(Y) = \phi^2 DD' + \sigma^2 I\! $$
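The covariance structure can be seen concretely: $$\phi^2 DD' + \sigma^2 I$$ has variance $$\phi^2 + \sigma^2$$ on the diagonal, covariance $$\phi^2$$ within a group, and $$0$$ across groups. A sketch with assumed group sizes and parameter values:

```python
# Inspect the block structure of Var(Y) = phi^2 DD' + sigma^2 I.
import numpy as np

n, phi2, sig2 = [2, 3], 1.5, 0.7    # illustrative values
N, G = sum(n), len(n)
D = np.zeros((N, G))
row = 0
for g, ng in enumerate(n):
    D[row:row + ng, g] = 1.0
    row += ng

V = phi2 * D @ D.T + sig2 * np.eye(N)
assert np.allclose(np.diag(V), phi2 + sig2)
assert np.isclose(V[0, 1], phi2)   # obs 1 and 2 share group 1
assert np.isclose(V[0, 2], 0.0)    # obs 1 (group 1) vs obs 3 (group 2)
```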

Consequently



 * $$\begin{align} \operatorname{E}(SSTO) = \operatorname{E}( Y'QY ) &= \psi 1' Q 1 \psi + \operatorname{tr}\left[ (\phi^2 DD' + \sigma^2 I )\left(I - \frac{1}{N}U\right) \right] \\ &= 0 + \operatorname{tr}\left[ \phi^2 DD' - \frac{\phi^2}{N}DD'U + \sigma^2 Q \right] \\ &= \phi^2 N - \frac{\phi^2}{N}\operatorname{tr}(DD'U) + \sigma^2 \operatorname{tr}( Q ) \\ &= \phi^2 N - \frac{\phi^2}{N}\, 1'DD'1 + \sigma^2 (N-1) \\ &= \phi^2 N - \frac{\phi^2}{N}\sum_{g=1}^G n_g^2 + \sigma^2 (N-1) \\ &= \phi^2 (N - \tilde{n}) + \sigma^2 (N-1) \end{align}$$

since $$Q1 = 0$$, $$\operatorname{tr}(DD') = N$$, $$\operatorname{tr}(DD'U) = 1'DD'1 = \sum_{g=1}^G n_g^2$$, and $$\operatorname{tr}(Q) = N-1$$.

where $$\tilde{n}= \sum_{g=1}^G \frac{n_g}{N} n_g$$ is the group-size weighted mean of group sizes. With equal groups $$ \tilde{n} = N / G $$ and


 * $$ E(SSTO) = \phi^2 N \frac{G-1}{G} + \sigma^2 (N-1) =\phi^2 (N-n) + \sigma^2 (N-1) $$

where $$n = N/G\!$$ is the common group size.
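The closed form for $$E(SSTO)$$ can be checked against a direct matrix computation of the quadratic-form expectation. A numerical sketch with assumed group sizes and parameter values:

```python
# Evaluate E(SSTO) = psi 1'Q1 psi + tr(Q Var(Y)) with matrices and compare
# to the closed form phi^2 (N - n_tilde) + sigma^2 (N - 1).
import numpy as np

n, phi2, sig2, psi = [2, 3, 4], 1.5, 0.7, 2.0   # illustrative values
N, G = sum(n), len(n)
D = np.zeros((N, G))
row = 0
for g, ng in enumerate(n):
    D[row:row + ng, g] = 1.0
    row += ng

ones = np.ones((N, 1))
Q = np.eye(N) - ones @ ones.T / N
mu = psi * ones                                 # E(Y) = 1 psi
V = phi2 * D @ D.T + sig2 * np.eye(N)           # Var(Y)

e_ssto = float(mu.T @ Q @ mu) + np.trace(Q @ V)  # quadratic-form formula
n_tilde = sum(ng**2 for ng in n) / N             # group-size weighted mean
assert np.isclose(e_ssto, phi2 * (N - n_tilde) + sig2 * (N - 1))
```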

Thus




 * $$\begin{align} \operatorname{E}(SSH) &= \operatorname{E}(SSTO) - \operatorname{E}(SSE) \\ &= \phi^2 (N - \tilde{n}) + \sigma^2 (N-1) - \sigma^2 (N-G) \\ &= \phi^2 (N - \tilde{n}) + \sigma^2 (G-1) \end{align}$$
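This result can also be obtained directly from the standard representation $$SSH = Y'(P - \tfrac{1}{N}U)Y$$, where $$P = D(D'D)^{-1}D'$$ projects onto the group-mean space. A sketch with assumed sizes and parameters (the representation via $$P$$ is a standard one-way ANOVA identity, not stated in the text above):

```python
# E(SSH) = tr(H Var(Y)) with H = P - U/N; the mean term vanishes since H 1 = 0.
import numpy as np

n, phi2, sig2 = [2, 3, 4], 1.5, 0.7   # illustrative values
N, G = sum(n), len(n)
D = np.zeros((N, G))
row = 0
for g, ng in enumerate(n):
    D[row:row + ng, g] = 1.0
    row += ng

ones = np.ones((N, 1))
P = D @ np.linalg.inv(D.T @ D) @ D.T     # projection onto group-mean space
H = P - ones @ ones.T / N                # hypothesis projection (1 lies in col(D))
V = phi2 * D @ D.T + sig2 * np.eye(N)

e_ssh = np.trace(H @ V)
n_tilde = sum(ng**2 for ng in n) / N
assert np.isclose(e_ssh, phi2 * (N - n_tilde) + sig2 * (G - 1))
```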

Multivariate response
If we are sampling from a p-variate distribution in which


 * $$ \mathbf{Y}_{ig} \sim \mbox{i.i.d.} N(\mathbf{\mu}_g, \Sigma)$$

and


 * $$ \mathbf{\mu}_1,..., \mathbf{\mu}_G \sim \mbox{ i.i.d. } N(\mathbf{\psi}, \Phi), \mbox{ independently of } \mathbf{Y}_{ig} $$

then the analogous results are:


 * $$ E(SSE) = (N-G) \Sigma $$

and


 * $$ E(SSH) = (N - \tilde{n}) \Phi + (G-1) \Sigma $$

Note that


 * $$ Var( \bar{\mathbf{Y}}_{\cdot g} ) = \Phi + \frac{1}{n_g} \Sigma $$

and that the group-size weighted average of these variances is:


 * $$ \sum_{g=1}^G \frac{n_g}{N} Var( \bar{\mathbf{Y}}_{\cdot g} ) = \sum_{g=1}^G \frac{n_g}{N} \left[ \Phi + \frac{1}{n_g} \Sigma \right] = \Phi + \frac{G}{N} \Sigma $$
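The collapse of the weighted average to $$\Phi + \frac{G}{N}\Sigma$$ can be checked exactly with matrices; the group sizes and the matrices $$\Phi$$, $$\Sigma$$ below are illustrative:

```python
# Weighted average of Var(Ybar_g) = Phi + Sigma/n_g collapses to Phi + (G/N) Sigma.
import numpy as np

n = [2, 3, 4]                                  # illustrative group sizes
N, G = sum(n), len(n)
Phi = np.array([[2.0, 0.5], [0.5, 1.0]])       # between-group covariance
Sigma = np.array([[1.0, 0.2], [0.2, 1.5]])     # within-group covariance

avg = sum((ng / N) * (Phi + Sigma / ng) for ng in n)
assert np.allclose(avg, Phi + (G / N) * Sigma)
```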

The expectation of combinations of $$SSH$$ and $$SSE$$ of the form $$ k_H SSH + k_E SSE$$: