Stochastic equicontinuity

In estimation theory in statistics, stochastic equicontinuity is a property of estimators (estimation procedures) that is useful in dealing with their asymptotic behaviour as the amount of data increases. It is a version of equicontinuity used in the context of functions of random variables: that is, random functions. The property relates to the rate of convergence of sequences of random variables and requires that this rate is essentially the same within a region of the parameter space being considered.

For instance, stochastic equicontinuity, along with other conditions, can be used to show uniform weak convergence, which can be used to prove the convergence of extremum estimators.

Definition
Let $$ \{ H_n(\theta): n \geq 1 \} $$ be a family of random functions defined from $$\Theta \rightarrow \mathbb{R}$$, where $$\Theta$$ is any metric space. Here $$\{ H_n(\theta) \}$$ might represent a sequence of estimators applied to datasets of size n, given that the data arise from a population for which the parameter indexing the statistical model for the data is $$\theta$$. The randomness of the functions arises from the data generating process under which a set of observed data is considered to be a realisation of a probabilistic or statistical model. However, in $$\{ H_n(\theta) \}$$, $$\theta$$ relates to the model currently being postulated or fitted rather than to an underlying model which is supposed to represent the mechanism generating the data. Then $$\{ H_n \}$$ is stochastically equicontinuous if, for every $$ \epsilon > 0 $$ and $$ \eta > 0$$, there is a $$\delta > 0 $$ such that:

 * $$ \limsup_{n \rightarrow \infty} \Pr\left( \sup_{\theta \in \Theta} \sup_{\theta' \in B(\theta, \delta)} |H_n(\theta') - H_n(\theta)| > \epsilon \right) < \eta .$$

Here $$B(\theta, \delta)$$ represents the ball in the parameter space centred at $$\theta$$ with radius $$\delta$$.
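The definition can be illustrated numerically. The sketch below is a hypothetical example, not drawn from the text: it takes the family $$H_n(\theta) = \tfrac{1}{n}\sum_i \cos(\theta X_i)$$ with i.i.d. standard normal data and $$\Theta = [0, 1]$$, and uses Monte Carlo simulation to estimate the probability inside the limsup, i.e. the chance that the supremum of $$|H_n(\theta') - H_n(\theta)|$$ over pairs with $$\theta' \in B(\theta, \delta)$$ exceeds $$\epsilon$$. The grid approximation of the suprema and all parameter values are choices made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def H_n(thetas, x):
    # Hypothetical random function family: H_n(theta) = (1/n) * sum_i cos(theta * X_i)
    return np.cos(np.outer(thetas, x)).mean(axis=1)

def sup_modulus(x, grid, delta):
    # Approximate sup over theta in the grid, and theta' within delta of theta,
    # of |H_n(theta') - H_n(theta)|, using grid points as both theta and theta'.
    h = H_n(grid, x)
    step = grid[1] - grid[0]
    k = max(1, int(round(delta / step)))  # number of grid points in the delta-ball
    return max(np.abs(h[i + 1 : i + k + 1] - h[i]).max()
               for i in range(len(grid) - 1))

grid = np.linspace(0.0, 1.0, 201)   # discretised Theta = [0, 1]
eps, delta, reps = 0.05, 0.05, 200  # illustrative choices of epsilon, delta

# Monte Carlo estimate of Pr( sup ... > eps ) for increasing sample sizes n
probs = {}
for n in (100, 1000):
    exceed = sum(sup_modulus(rng.standard_normal(n), grid, delta) > eps
                 for _ in range(reps))
    probs[n] = exceed / reps
print(probs)
```

For this particular family the summands are Lipschitz in $$\theta$$, so the estimated exceedance probability stays small once $$\delta$$ is small, consistent with stochastic equicontinuity; a family failing the property would show probabilities that do not shrink as $$\delta$$ decreases, no matter how large n becomes.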