Studentized range distribution

In probability and statistics, the studentized range distribution is the continuous probability distribution of the studentized range of an i.i.d. sample from a normally distributed population.

Suppose that we take a sample of size n from each of k populations with the same normal distribution N(μ, σ²), that $$\bar{y}_{\min}$$ is the smallest and $$\bar{y}_{\max}$$ the largest of the resulting sample means, and that s² is the pooled sample variance of these samples. Then the following statistic has a studentized range distribution:


 * $$q = \frac{\overline{y}_{\max} - \overline{y}_{\min}}{s/\sqrt{n\,}}$$
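In code, the statistic can be computed directly from the k samples. A short sketch using NumPy (the group count, sample size, and random seed are illustrative assumptions):

```python
import numpy as np

# Sketch: compute the studentized range statistic q from k samples of size n
# drawn from a common normal population. k, n, and the seed are illustrative.
rng = np.random.default_rng(0)
k, n = 4, 10
data = rng.normal(loc=5.0, scale=2.0, size=(k, n))

means = data.mean(axis=1)
# Pooled sample variance: with equal group sizes this is the average of the
# k within-group variances, each computed with n - 1 degrees of freedom.
s2 = data.var(axis=1, ddof=1).mean()

q = (means.max() - means.min()) / np.sqrt(s2 / n)
print(q)
```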

Probability density function
Differentiating the cumulative distribution function with respect to q gives the probability density function.
 * $$f_\text{R}(q;k,\nu) = \frac{\sqrt{2\pi\,}\,k\,(k-1)\,\nu^{\nu/2}}{\Gamma(\nu /2)\,2^{\left(\nu/2-1\right)}}\int_0^\infty s^\nu \, \varphi(\sqrt{\nu\,} \,s)\,\left[\int_{-\infty}^\infty \varphi(z+q\,s)\,\varphi(z)\, \left[\Phi(z+q\,s)-\Phi(z)\right]^{k-2} \, \mathrm{d}z\right] \, \mathrm{d}s$$
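The double integral has no closed form in general, but it can be evaluated numerically. A sketch using SciPy's nested quadrature (the chosen values of q, k, and ν are illustrative):

```python
import numpy as np
from scipy import integrate, special
from scipy.stats import norm

def studentized_range_pdf(q, k, nu):
    """Evaluate f_R(q; k, nu) by nested quadrature of the double integral."""
    const = (np.sqrt(2 * np.pi) * k * (k - 1) * nu ** (nu / 2)
             / (special.gamma(nu / 2) * 2 ** (nu / 2 - 1)))

    def inner(s):
        g = lambda z: (norm.pdf(z + q * s) * norm.pdf(z)
                       * (norm.cdf(z + q * s) - norm.cdf(z)) ** (k - 2))
        val, _ = integrate.quad(g, -np.inf, np.inf)
        return s ** nu * norm.pdf(np.sqrt(nu) * s) * val

    outer, _ = integrate.quad(inner, 0, np.inf)
    return const * outer

print(studentized_range_pdf(2.0, 3, 10))
```

Where SciPy ≥ 1.7 is available, the result can be cross-checked against `scipy.stats.studentized_range.pdf`.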

Note that in the outer part of the integral, the equation
 * $$\varphi(\sqrt{\nu\,}\,s) \, \sqrt{2\pi\,} = e^{-\left(\nu\, s^2/2\right)} $$

was used to replace an exponential factor.

Cumulative distribution function
The cumulative distribution function is given by
 * $$F_\text{R}(q;k,\nu) = \frac{\sqrt{2\pi\,}\,k\,\nu^{\nu/2}}{\,\Gamma(\nu/2)\,2^{(\nu/2-1)}\,} \int_0^\infty s^{\nu-1} \varphi(\sqrt{\nu\,}\,s) \left[\int_{-\infty}^\infty \varphi(z) \left[\Phi(z+q\,s)-\Phi(z)\right]^{k-1} \, \mathrm{d}z \right] \, \mathrm{d}s$$
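The cumulative distribution function can likewise be evaluated by nested numerical quadrature. A sketch (k = 3 and ν = 10 are illustrative values):

```python
import numpy as np
from scipy import integrate, special
from scipy.stats import norm

def studentized_range_cdf(q, k, nu):
    """Evaluate F_R(q; k, nu) by nested quadrature of the double integral."""
    const = (np.sqrt(2 * np.pi) * k * nu ** (nu / 2)
             / (special.gamma(nu / 2) * 2 ** (nu / 2 - 1)))

    def inner(s):
        g = lambda z: norm.pdf(z) * (norm.cdf(z + q * s) - norm.cdf(z)) ** (k - 1)
        val, _ = integrate.quad(g, -np.inf, np.inf)
        return s ** (nu - 1) * norm.pdf(np.sqrt(nu) * s) * val

    outer, _ = integrate.quad(inner, 0, np.inf)
    return const * outer

# Evaluated near the commonly tabulated 5% critical value for k = 3, nu = 10
# (about 3.88), so the result should be near 0.95.
print(studentized_range_cdf(3.88, 3, 10))
```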

Special cases
If k is 2 or 3, the studentized range probability density function can be evaluated directly, where $$\varphi(z)$$ is the standard normal probability density function and $$\Phi(z)$$ is the standard normal cumulative distribution function:
 * $$f_R(q;k=2) = \sqrt{2\,}\,\varphi\left(\,q/\sqrt{2\,}\right)$$


 * $$f_R(q;k=3) = 6 \sqrt{2\,}\, \varphi\left(\,q/\sqrt{2\,}\right)\left[\Phi\left( q / \sqrt{6\,} \right)-\tfrac{1}{2} \right]$$
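Both closed forms can be verified numerically: since the range statistic is nonnegative, each density should integrate to 1 over q ∈ [0, ∞). A quick SciPy check:

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

# The closed-form densities for k = 2 and k = 3 from above.
f2 = lambda q: np.sqrt(2) * norm.pdf(q / np.sqrt(2))
f3 = lambda q: (6 * np.sqrt(2) * norm.pdf(q / np.sqrt(2))
                * (norm.cdf(q / np.sqrt(6)) - 0.5))

for f in (f2, f3):
    total, _ = integrate.quad(f, 0, np.inf)
    print(round(total, 6))  # each density integrates to 1
```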

As the number of degrees of freedom ν approaches infinity, the studentized range cumulative distribution function can be calculated for any k using the standard normal distribution:
 * $$F_R(q;k) = k\, \int_{-\infty}^\infty \varphi(z)\,\Bigl[\Phi(z+q)-\Phi(z)\Bigr]^{k-1} \, \mathrm{d}z = k\, \int_{-\infty}^\infty \,\Bigl[\Phi(z+q)-\Phi(z)\Bigr]^{k-1} \, \mathrm{d}\Phi(z)$$
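In this limit the expression reduces to a single integral that is easy to evaluate numerically. A sketch, with a closed-form cross-check for k = 2:

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

def range_cdf_normal(q, k):
    """F_R(q; k) in the nu -> infinity limit: a single integral over z."""
    g = lambda z: norm.pdf(z) * (norm.cdf(z + q) - norm.cdf(z)) ** (k - 1)
    val, _ = integrate.quad(g, -np.inf, np.inf)
    return k * val

# Sanity check for k = 2: the range of two standard normals is |X1 - X2|,
# which is N(0, 2) folded at zero, so its cdf is 2*Phi(q/sqrt(2)) - 1.
q = 2.0
print(range_cdf_normal(q, 2))
print(2 * norm.cdf(q / np.sqrt(2)) - 1)
```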

Applications
Critical values of the studentized range distribution are used in Tukey's range test.
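For example, where SciPy ≥ 1.7 is available (an assumption about the environment), a critical value can be read off the distribution directly via `scipy.stats.studentized_range`; k = 3 groups and ν = 10 error degrees of freedom are illustrative:

```python
# Assumes SciPy >= 1.7, which provides scipy.stats.studentized_range.
from scipy.stats import studentized_range

k, nu, alpha = 3, 10, 0.05        # illustrative values
q_crit = studentized_range.ppf(1 - alpha, k, nu)
print(round(q_crit, 2))  # tables give about 3.88 for these values
```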

The studentized range is used to calculate significance levels for results obtained by data mining, where one selectively seeks extreme differences in sample data, rather than only sampling randomly.

The studentized range distribution has applications to hypothesis testing and multiple comparisons procedures. For example, Tukey's range test and Duncan's new multiple range test (MRT), in which the sample x1, ..., xn is a sample of means and q is the basic test statistic, can be used as post-hoc analyses to determine between which pairs of group means there is a significant difference (pairwise comparisons), after the null hypothesis that all group means are equal has been rejected by a standard analysis of variance.

Related distributions
When only the equality of two group means is in question (i.e. whether μ1 = μ2), the studentized range distribution is similar to Student's t distribution, differing in that it takes into account the number of means under consideration, with the critical value adjusted accordingly. The more means under consideration, the larger the critical value. This makes sense, since the more means there are, the greater the probability that at least some differences between pairs of means will be large due to chance alone.

Derivation
The studentized range distribution function arises from re-scaling the sample range R by the sample standard deviation s, since the studentized range is customarily tabulated in units of standard deviations, with the variable q = R/s. The derivation begins with a perfectly general form of the distribution function of the sample range, which applies to any sample data distribution.

In order to obtain the distribution in terms of the studentized range q, we change variables from R to s and q = R/s. Assuming the sample data are normally distributed, the standard deviation s is chi distributed. Further integrating over s removes s as a parameter and yields the re-scaled distribution in terms of q alone.

General form
For any probability density function $$f_X$$, the range probability density $$f_R$$ is:
 * $$f_R(r;k) = k\,(k-1)\int_{-\infty}^\infty f_X\left(t+\tfrac{1}{2} r\right)f_X \left(t - \tfrac{1}{2} r\right) \left[\int_{t-\tfrac{1}{2}r}^{t+\tfrac{1}{2} r} f_X(x) \, \mathrm{d}x\right]^{k-2} \, \mathrm{d}\,t$$

What this means is that we are adding up the probabilities that, given k draws from a distribution, two of them differ by r, and the remaining k − 2 draws all fall between the two extreme values. If we change variables to u, where $$u=t-\tfrac{1}{2} r$$ is the low end of the range, and define $$F_X$$ as the cumulative distribution function of $$f_X$$, then the equation can be simplified:
 * $$f_R(r;k) = k\,(k-1)\int_{-\infty}^\infty f_X(u+r)\, f_X(u)\, \left[\, F_X(u+r)-F_X(u)\, \right]^{k-2} \, \mathrm{d}\,u$$

We introduce a similar integral, and notice that differentiating under the integral sign gives

 * $$\begin{align} \frac{\partial}{\partial r} & \left[ k\,\int_{-\infty}^\infty f_X(u)\, \Bigl[\, F_X(u+r)-F_X(u)\, \Bigr]^{k-1} \, \mathrm{d}u \right] \\[5pt] = {} & k\,(k-1)\int_{-\infty}^\infty f_X(u+r)\, f_X(u)\, \Bigl[\, F_X(u+r)-F_X(u)\, \Bigr]^{k-2} \, \mathrm{d}u \end{align}$$

which recovers the integral above, so the relation confirms that

 * $$\begin{align} F_R(r;k) & = k \int_{-\infty}^\infty f_X(u) \Bigl[\, F_X(u+r)-F_X(u)\, \Bigr]^{k-1} \, \mathrm{d}u \\ & = k \int_{-\infty}^\infty \Bigl[\, F_X(u+r)-F_X(u)\, \Bigr]^{k-1} \, \mathrm{d}F_X(u) \end{align}$$

because, for any continuous distribution, the density is the derivative of the cumulative distribution function:
 * $$\frac{\partial F_R(r;k)}{\partial r} = f_R(r;k)$$
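These general-form expressions can be sanity-checked against a case with a known answer: for k i.i.d. draws from Uniform(0, 1), the range has density k(k − 1)r^(k−2)(1 − r) on [0, 1]. A numerical sketch:

```python
import numpy as np
from scipy import integrate

# Check the general range density against a case with a known answer:
# for k i.i.d. draws from Uniform(0, 1), the range R has density
# k*(k-1)*r**(k-2)*(1-r) on [0, 1].
def range_pdf(r, k, pdf, cdf):
    g = lambda u: pdf(u + r) * pdf(u) * (cdf(u + r) - cdf(u)) ** (k - 2)
    # The integrand vanishes unless both u and u + r lie in [0, 1].
    val, _ = integrate.quad(g, 0.0, 1.0 - r)
    return k * (k - 1) * val

pdf = lambda x: 1.0 if 0.0 <= x <= 1.0 else 0.0
cdf = lambda x: min(max(x, 0.0), 1.0)

k, r = 5, 0.6
print(range_pdf(r, k, pdf, cdf))             # numerical general form
print(k * (k - 1) * r ** (k - 2) * (1 - r))  # known closed form
```

The two printed values agree, which is a quick check that the change of variables to u preserved the density.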

Special form for normal data
The range distribution is most often used for confidence intervals around sample averages, which are asymptotically normally distributed by the central limit theorem.

In order to create the studentized range distribution for normal data, we first switch from the generic $$f_X$$ and $$F_X$$ to the density and distribution functions φ and Φ of the standard normal distribution, and then change the variable from r to s·q, where s is a scaling factor to be chosen and q measures the range in units of s:


 * $$f_R(q;k) = s\,k\,(k-1)\int_{-\infty}^\infty \varphi(u+sq) \varphi(u)\, \left[\, \Phi(u+sq) - \Phi(u) \right]^{k-2} \, \mathrm{d}u$$

Choose the scaling factor s to be the sample standard deviation, so that q becomes the width of the range measured in standard deviations. For normal data, s is chi distributed, and its density $$f_S$$ is given by:



 * $$f_S(s;\nu)\,\mathrm{d}s = \begin{cases} \dfrac{\nu^{\nu/2}\,s^{\nu-1} e^{-\nu\,s^2/2}\,}{2^{\left(\nu/2 - 1\right)} \Gamma(\nu/2)} \, \mathrm{d}s & \text{for }\, 0 < s < \infty, \\[4pt] 0 & \text{otherwise}. \end{cases}$$
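A Monte Carlo sketch can confirm that the sample standard deviation of standard normal data follows this density (the sample size, seed, and histogram bin are illustrative assumptions):

```python
import numpy as np
from scipy import special

# Monte Carlo sketch: check that the sample standard deviation of normal
# data (with sigma = 1) follows f_S(s; nu). Sample size, seed, and the
# histogram bin are illustrative assumptions.
rng = np.random.default_rng(1)
n = 5                       # sample size, so nu = n - 1 = 4
nu = n - 1
draws = rng.normal(size=(200_000, n))
s = draws.std(axis=1, ddof=1)

def f_S(s, nu):
    return (nu ** (nu / 2) * s ** (nu - 1) * np.exp(-nu * s ** 2 / 2)
            / (2 ** (nu / 2 - 1) * special.gamma(nu / 2)))

lo, hi = 0.9, 1.1
empirical = np.mean((s >= lo) & (s < hi)) / (hi - lo)   # density estimate
print(round(empirical, 2), round(float(f_S(1.0, nu)), 2))
```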

Multiplying the densities $$f_R$$ and $$f_S$$ together and integrating to remove the dependence on the standard deviation s gives the studentized range distribution function for normal data:
 * $$f_R(q;k,\nu) = \frac{\nu^{\nu/2}\,k\,(k-1)}{2^{\left( \nu/2 - 1\right)} \Gamma(\nu/2)} \int_0^\infty s^\nu e^{-\nu s^2/2} \int_{-\infty}^\infty \varphi(u+sq)\, \varphi(u)\, \left[\, \Phi(u+sq) - \Phi(u) \right]^{k-2} \, \mathrm{d}u \,\mathrm{d}s$$

where
 * q is the width of the data range measured in standard deviations,
 * ν is the number of degrees of freedom for determining the sample standard deviation, and
 * k is the number of separate averages that form the points within the range.

The equation for the pdf shown in the sections above comes from using
 * $$e^{-\nu\,s^2/2} = \sqrt{2\pi\,}\,\varphi(\sqrt{\nu\,}\,s)$$

to replace the exponential expression in the outer integral.
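As a final sanity check, the derived density should integrate to 1. A numerical sketch with small illustrative values of k and ν (the nested quadrature gets slow for larger ones):

```python
import numpy as np
from scipy import integrate, special

SQRT2PI = np.sqrt(2 * np.pi)
phi = lambda x: np.exp(-x * x / 2) / SQRT2PI   # standard normal pdf
Phi = special.ndtr                              # standard normal cdf

def f_R(q, k, nu):
    """The derived studentized range density, by nested quadrature."""
    const = (nu ** (nu / 2) * k * (k - 1)
             / (2 ** (nu / 2 - 1) * special.gamma(nu / 2)))

    def inner(s):
        g = lambda u: (phi(u + s * q) * phi(u)
                       * (Phi(u + s * q) - Phi(u)) ** (k - 2))
        val, _ = integrate.quad(g, -np.inf, np.inf)
        return s ** nu * np.exp(-nu * s ** 2 / 2) * val

    outer, _ = integrate.quad(inner, 0, np.inf)
    return const * outer

# k = 3, nu = 4; the probability mass beyond q = 10 is negligible here.
total, _ = integrate.quad(lambda q: f_R(q, 3, 4), 0, 10)
print(round(total, 3))  # should be close to 1
```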