User:Yen851115/CHEME5740(II)

Introductory Paragraph

 * In probability theory and statistics, the exponential distribution is the probability distribution of the time between events in a Poisson point process, i.e., a process in which events occur continuously and independently at a constant average rate. It is a particular case of the gamma distribution. It is the continuous analogue of the geometric distribution, and it has the key property of being memoryless. In addition to being used for the analysis of Poisson point processes, it is found in various other contexts.

>>> ''' The sum of independent random variables with exponential distribution follows a gamma distribution. '''

>>> (insert a paragraph)''' To summarize briefly, taking a one-dimensional process as an instance: a sequence of random variables $$X_1$$, $$X_2$$, $$X_3$$, ... forming a Poisson process with intensity λ (each point is uniformly distributed on the total interval) has the property that each of the interarrival times $$X_1$$, $$X_2 - X_1$$, $$X_3 - X_2$$, ... is an independent random variable with exponential distribution. Additionally, for the total process with i incidents and intensity λ, the random variable $$X_i$$ has a gamma distribution, since it is the sum of exponentially distributed variables. '''
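The claim above can be checked numerically. A minimal sketch using only the Python standard library (the intensity λ = 2 and the choice of the 3rd arrival are arbitrary illustration values): the i-th arrival time, as a sum of i independent Exp(λ) interarrival times, should have the Gamma(i, 1/λ) mean i/λ and variance i/λ².

```python
import random
from statistics import fmean, pvariance

random.seed(0)
lam = 2.0  # intensity λ (arbitrary choice for illustration)
i = 3      # examine the 3rd arrival time X_3
n = 100_000

# Interarrival times of a Poisson process are i.i.d. Exp(λ);
# the i-th arrival time X_i is their sum, i.e. Gamma(i, 1/λ).
samples = [sum(random.expovariate(lam) for _ in range(i)) for _ in range(n)]

print(fmean(samples))      # ≈ i/λ  = 1.5
print(pvariance(samples))  # ≈ i/λ² = 0.75
```

With 100,000 realizations the Monte Carlo estimates land within a few hundredths of the gamma-distribution values.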

Memorylessness

 * An exponentially distributed random variable T obeys the relation

$$\Pr(T > s + t \mid T > s) = \Pr(T > t), \qquad \forall s, t \geq 0.$$

>>> An exponentially distributed random variable T obeys the memoryless relation


 * The exponential distribution and the geometric distribution are the only memoryless probability distributions.

>>> (add citation) The exponential distribution and the geometric distribution are the only memoryless probability distributions.
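The memoryless relation follows directly from the exponential survival function, since P(T > s+t)/P(T > s) = e^{−λ(s+t)}/e^{−λs} = e^{−λt}. A small sketch verifying this identity numerically (the values λ = 0.5, s = 1, t = 2 are arbitrary):

```python
import math

lam = 0.5     # rate λ (arbitrary)
s, t = 1.0, 2.0

def survival(x: float) -> float:
    """P(T > x) for T ~ Exp(λ)."""
    return math.exp(-lam * x)

# Memorylessness: P(T > s+t | T > s) = P(T > s+t) / P(T > s) = P(T > t)
conditional = survival(s + t) / survival(s)
print(math.isclose(conditional, survival(t)))  # True
```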

Related distributions

 * Exponential distribution is closed under scaling by a positive factor. If X ~ Exp(λ) then kX ~ Exp(λ/k).

>>> (this is actually correct: scaling an Exp(λ) variable by k > 0 stretches the waiting time by k, which rescales the rate to λ/k, so it should stay.)


 * If X ~ Exp(λ) then X ~ Gamma(1, λ⁻¹) (in (k, θ) parametrization) or Gamma(1, λ) (in (α, β) parametrization).
 * If X_i ~ Exp(λ) then the sum $$X_1 + \cdots + X_k = \sum_i X_i \sim$$ Erlang(k, λ), which is just a Gamma(k, λ⁻¹) (in (k, θ) parametrization) or Gamma(k, λ) (in (α, β) parametrization) with an integer shape parameter k.

>>> (add citation of the textbook)


 * If X ~ Pareto(1, λ) then log(X) ~ Exp(λ).
 * If X ~ Exp(λ) then $$ke^X$$ ~ Pareto(k, λ).
 * If X ~ Exp(λ) and Y ~ Erlang(n, λ) then $$\tfrac{X}{Y}+1 \sim \operatorname{Pareto}(1,n)$$
 * If X ~ Exp(λ) and $$Y \sim \Gamma(n,\tfrac{1}{\lambda})$$ then $$\tfrac{X}{Y}+1 \sim \operatorname{Pareto}(1,n)$$

>>> (put them together; items 3 and 4 are basically the same thing, so we can get rid of one.) Jung-Un's edit
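The sum-of-exponentials relation in the list above can be checked against the closed-form Erlang CDF, $$P(S \leq x) = 1 - e^{-\lambda x}\sum_{n=0}^{k-1} \tfrac{(\lambda x)^n}{n!}$$. A minimal Monte Carlo sketch (λ = 1.5, k = 4, and x = 2 are arbitrary illustration values):

```python
import math
import random

random.seed(1)
lam, k = 1.5, 4  # rate λ and integer shape k (arbitrary choices)
x = 2.0
n = 100_000

# Sum of k i.i.d. Exp(λ) variables ~ Erlang(k, λ)
sums = [sum(random.expovariate(lam) for _ in range(k)) for _ in range(n)]
empirical = sum(s <= x for s in sums) / n

# Closed-form Erlang CDF: P(S <= x) = 1 - e^{-λx} Σ_{m=0}^{k-1} (λx)^m / m!
cdf = 1 - math.exp(-lam * x) * sum((lam * x) ** m / math.factorial(m) for m in range(k))

print(abs(empirical - cdf) < 0.01)  # True
```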

= Properties =

Mean, variance, moments and median

The mean or expected value of an exponentially distributed random variable X with rate parameter λ is given by


 * $$\operatorname{E}[X] = \int_{0}^{\infty} t\,\lambda e^{-\lambda t}\,dt = \frac{1}{\lambda}.$$

In light of the examples given above, this makes sense: if you receive phone calls at an average rate of 2 per hour, then you can expect to wait half an hour for every call.

The variance of X is given by


 * $$\operatorname{Var}[X] = \int_{0}^{\infty} t^2\,\lambda e^{-\lambda t}\,dt - \operatorname{E}[X]^2 = \frac{1}{\lambda^2}.$$
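Both integrals can be checked numerically against the phone-call example, with λ = 2 calls per hour. A sketch using only the standard library and a simple trapezoidal rule (truncating the integrals at t = 20, where the exponential tail is negligible):

```python
import math

lam = 2.0  # e.g. phone calls at an average rate of 2 per hour

def trapezoid(f, a, b, n=100_000):
    """Simple trapezoidal rule on [a, b] with n subintervals."""
    h = (b - a) / n
    return h * (f(a) / 2 + sum(f(a + i * h) for i in range(1, n)) + f(b) / 2)

# E[X] = ∫ t λ e^{-λt} dt, truncated at t = 20 (tail is negligible)
mean = trapezoid(lambda t: t * lam * math.exp(-lam * t), 0.0, 20.0)
# Var[X] = ∫ t² λ e^{-λt} dt − E[X]²
var = trapezoid(lambda t: t**2 * lam * math.exp(-lam * t), 0.0, 20.0) - mean**2

print(mean)  # ≈ 1/λ  = 0.5 (wait half an hour for every call)
print(var)   # ≈ 1/λ² = 0.25
```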

= Related distributions =


 * The exponential distribution arises in continuous time, while the geometric distribution arises in discrete trials. When a positive integer random variable X follows the geometric distribution with parameter $$p \in (0,1]$$,

$$P(X=n) = p(1-p)^{n-1}, {\forall}n\geq 1$$, and equivalently, $$P(X>n) = (1-p)^{n}, {\forall}n \in \mathbb{N}.$$ Thus, if $$p$$ represents the probability of winning the lottery, X gives the distribution of the number of attempts it takes to win. When $$p$$ is low enough, the geometric distribution is close to the exponential distribution. Intuitively, this can be understood as infinitely many trials of a lottery with an extremely small winning probability. Formally, consider $$X^{(\tau)}$$, a geometric random variable with parameter $$p^{(\tau)} = \lambda{\tau}$$, where $$\lambda$$ is a fixed positive parameter and $$\tau$$ is a sufficiently small time step. As $$\tau$$ goes to zero, the geometric distribution converges to the exponential distribution:

$$P(X^{(\tau)}>{t \over \tau}) = (1-p^{(\tau)})^{\lfloor t/\tau \rfloor} \rightarrow e^{-\lambda{t}}, {\forall}t\in \mathbb{R}_+.$$
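The convergence in this limit can be observed numerically. A short sketch (λ = 1 and t = 2 are arbitrary choices) comparing the geometric survival probability with the exponential one as the time step τ shrinks:

```python
import math

lam, t = 1.0, 2.0            # fixed rate λ and time t (arbitrary choices)
target = math.exp(-lam * t)  # exponential survival probability P(T > t)

# Geometric survival P(X^(τ) > t/τ) = (1 − λτ)^⌊t/τ⌋ approaches e^{−λt} as τ → 0
errors = []
for tau in (0.1, 0.01, 0.001):
    p = (1 - lam * tau) ** math.floor(t / tau)
    errors.append(abs(p - target))

print(errors)  # the gap shrinks toward 0 as τ decreases
```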

>>> Included the integral calculation formulas for the mean and variance

>>> Described the relationship between geometric distribution and exponential distribution