
More complicated distributions

Interarrival times of a Poisson process: Exponential distribution

\begin{eqnarray}
f(x) = \lambda e^{-\lambda x}, \qquad x \geq 0
\end{eqnarray}
Its moment generating function is
\begin{eqnarray}
M(s) = \frac{1}{1-\frac{s}{\lambda}}
\end{eqnarray}
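As a quick check, the moment generating function follows from a single integral, which converges for $s<\lambda$:
\begin{eqnarray}
M(s) = E[e^{sX}] = \int_0^{\infty} e^{sx} \lambda e^{-\lambda x} dx = \frac{\lambda}{\lambda-s} = \frac{1}{1-\frac{s}{\lambda}}
\end{eqnarray}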

Higher-order interarrival times of a Poisson process: Erlang distribution

\begin{eqnarray}
f(x) = \frac{\lambda^r x^{r-1} e^{-\lambda x}}{(r-1)!}, \qquad x \geq 0
\end{eqnarray}
The moment generating function is
\begin{eqnarray}
M(s) = \left[\frac{1}{1-\frac{s}{\lambda}}\right]^r
\end{eqnarray}
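This is immediate once we note that an Erlang random variable with parameter $r$ is the sum of $r$ independent exponential interarrival times, so its moment generating function is the $r$-th power of the exponential one:
\begin{eqnarray}
M(s) = E[e^{s(X_1+\ldots+X_r)}] = \prod_{i=1}^r E[e^{sX_i}] = \left[\frac{1}{1-\frac{s}{\lambda}}\right]^r
\end{eqnarray}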

Gamma Distribution

The gamma distribution is obtained by a subtle change to the Erlang distribution. The gamma function is defined as
\begin{eqnarray}
\varGamma(t) = \int_0^{\infty} z^{t-1}e^{-z} dz
\end{eqnarray}
Using integration by parts, we can show that
\begin{eqnarray}
\varGamma(t)=(t-1)\varGamma(t-1)
\end{eqnarray}
Also, it is easy to show that $\varGamma(1)=1$, and it is not easy to show that $\varGamma(\frac{1}{2})=\sqrt{\pi}$.
Therefore, if $k$ is a positive integer, we can write
\begin{eqnarray}
\varGamma(k)=(k-1)!
\end{eqnarray}
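For example, the recursion unwinds the factorial, and also gives half-integer values:
\begin{eqnarray}
\varGamma(4)=3\cdot\varGamma(3)=3\cdot 2\cdot\varGamma(2)=3\cdot 2\cdot 1\cdot\varGamma(1)=3!, \qquad \varGamma(\tfrac{3}{2})=\tfrac{1}{2}\varGamma(\tfrac{1}{2})=\tfrac{\sqrt{\pi}}{2}
\end{eqnarray}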
This allows for a generalization of the Erlang distribution as:
\begin{eqnarray}
f(x) = \frac{\lambda^r x^{r-1} e^{-\lambda x}}{(r-1)!} = \frac{\lambda^r x^{r-1} e^{-\lambda x}}{\varGamma(r)}
\end{eqnarray}
Note that $f(x)$ integrates to 1, hence we still have a pdf.\\
As $\varGamma(r)$ is well-defined for noninteger $r$, what we have done amounts to extending the Erlang distribution to noninteger $r$. In other words,
the Erlang distribution is a special case of the gamma distribution (and the exponential distribution is a special case of the Erlang distribution). \\
The moment generating function of the gamma distribution is the same as that of the Erlang distribution.
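For instance, differentiating at $s=0$ gives the mean, now valid for any real $r>0$:
\begin{eqnarray}
M'(s)=\frac{r}{\lambda}\left[\frac{1}{1-\frac{s}{\lambda}}\right]^{r+1} \quad\Rightarrow\quad E[X]=M'(0)=\frac{r}{\lambda}
\end{eqnarray}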

$\chi^2$ distribution

A special case of the gamma distribution with $r=\frac{\alpha}{2}$, $\alpha \in \mathbb{I}$, and $\lambda=\frac{1}{2}$ is known as the $\chi^2$-distribution
with $\alpha$ degrees of freedom.
\begin{eqnarray}
f(x) = \frac{x^{\frac{\alpha}{2}-1} e^{-\frac{x}{2}}}{2^{\frac{\alpha}{2}}\varGamma(\frac{\alpha}{2})}
\end{eqnarray}
Its moment generating function is
\begin{eqnarray}
M(t)=\frac{1}{(1-2t)^{\frac{\alpha}{2}}}
\end{eqnarray}
which can be found from the moment generating function of the gamma distribution
with $r=\frac{\alpha}{2}$, $\alpha \in \mathbb{I}$ and $\lambda=\frac{1}{2}$. \\
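Indeed, substituting $r=\frac{\alpha}{2}$ and $\lambda=\frac{1}{2}$ into the gamma moment generating function:
\begin{eqnarray}
M(t)=\left[\frac{1}{1-\frac{t}{1/2}}\right]^{\frac{\alpha}{2}}=\frac{1}{(1-2t)^{\frac{\alpha}{2}}}
\end{eqnarray}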
We can think of an Erlang random variable as the waiting time until the $r$-th bus arrives. A gamma random variable can model quantities such as
the amount of rainfall accumulated in a given amount of time.

Normal=Gaussian Distribution

\begin{eqnarray}
f_X(x)=\frac{1}{\sqrt{2\pi} \sigma}e^{-\frac{1}{2} \left[\frac{x-\mu}{\sigma}\right]^2}
\end{eqnarray}
The standard normal distribution is the special case $\mu=0$, $\sigma=1$.\\
To derive the standard normal table, note that the cdf of the Gaussian is:
\begin{eqnarray}
F_X(x) = \int_{-\infty}^x \frac{1}{\sqrt{2\pi} \sigma}e^{-\frac{1}{2} \left[\frac{t-\mu}{\sigma}\right]^2}dt
\end{eqnarray}
If we do a change of variables $z=\frac{t-\mu}{\sigma} $, $dz=\frac{dt}{\sigma} \Rightarrow dt = \sigma dz$, we get
\begin{eqnarray}
F_X(x) = \int_{-\infty}^{\frac{x-\mu}{\sigma}} \frac{1}{\sqrt{2\pi} }e^{-\frac{z^2}{2} }dz
\end{eqnarray}
This means that it is possible to obtain the cdf of any normal distribution from the cdf of standard normal distribution. Hence the standard
normal table. \\
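Writing $\varPhi$ for the standard normal cdf (the function the table lists), the relation above reads $F_X(x)=\varPhi(\frac{x-\mu}{\sigma})$. For example, for any normal random variable,
\begin{eqnarray}
P(X \leq \mu+\sigma)=\varPhi(1)\approx 0.8413
\end{eqnarray}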
The moment generating function of the Gaussian is:
\begin{eqnarray}
M(t)=e^{\mu t + \frac{\sigma^2 t^2}{2}}
\end{eqnarray}
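As a sanity check, the first two derivatives at $t=0$ recover the mean and the variance:
\begin{eqnarray}
M'(0)=\mu, \qquad M''(0)=\mu^2+\sigma^2 \quad\Rightarrow\quad M''(0)-M'(0)^2=\sigma^2
\end{eqnarray}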
Generally, stock price changes, i.e.,
\begin{eqnarray}
P_{n+1}-P_n
\end{eqnarray}
are not normally distributed (if they were, say i.i.d. $N(0,1)$, then $P_n$ would be normally distributed as $N(0,n)$, i.e., with standard deviation $\sqrt{n}$).
But relative stock changes, i.e.,
\begin{eqnarray}
\frac{P_{n+1}-P_n}{P_n}
\end{eqnarray}
are commonly modeled as normally distributed. In this case, $P_n$ will be log-normally distributed. The log-normal distribution is also a member of the exponential family.
It does not have a moment generating function.
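Nevertheless, all of its moments exist: if $Y=e^X$ with $X\sim N(\mu,\sigma^2)$, then $E[Y^n]=E[e^{nX}]$ is just the Gaussian moment generating function evaluated at $t=n$:
\begin{eqnarray}
E[Y^n]=e^{\mu n + \frac{\sigma^2 n^2}{2}}
\end{eqnarray}
These moments grow like $e^{\frac{\sigma^2 n^2}{2}}$, too fast for $E[e^{tY}]$ to be finite for any $t>0$, which is why the moment generating function fails to exist.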

Square of a standard normal random variable is $\chi^2(1)$ distributed

Theorem: If $X$ and $Y$ are random variables with $Y=X^2$, and if $X$ is standard normally distributed, then $Y$ is $\chi^2(1)$-distributed.\\
Proof: We will use the machinery developed in the section on functions of a random variable.
Note that the Gaussian is supported on $-\infty < x < \infty$, and $Y=X^2$ is not one-to-one on this range. Hence the cdf is
\begin{eqnarray}
F_Y(y) = \int_{-\sqrt{y}}^{\sqrt{y}} \frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}dx
\end{eqnarray}
and the corresponding pdf is, by the Leibniz rule,
\begin{eqnarray}
f_Y(y)&=& \frac{d}{dy} F_Y(y) = \frac{d}{dy} \int_{-\sqrt{y}}^{\sqrt{y}} \frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}dx \\
&=& \frac{1}{\sqrt{2\pi}}e^{-\frac{y}{2}} \frac{1}{2 \sqrt{y}}-\left[\frac{1}{\sqrt{2\pi}}e^{-\frac{y}{2}} \left(-\frac{1}{2 \sqrt{y}} \right) \right] \\
&=& \frac{1}{\sqrt{2\pi y}}e^{-\frac{y}{2}}
\end{eqnarray}
Rearranging the terms, and remembering that $\varGamma(\frac{1}{2})=\sqrt{\pi}$ (so that $\varGamma(\frac{1}{2})\, 2^{\frac{1}{2}}=\sqrt{2\pi}$), we arrive at
\begin{eqnarray}
f_Y(y)=\frac{1}{\varGamma(\frac{1}{2}) 2^{\frac{1}{2}} }e^{-\frac{y}{2}} y^{-\frac{1}{2}}
\end{eqnarray}
which is indeed a chi-square distribution with 1 degree of freedom. Its moment generating function is
\begin{eqnarray}
M_Y(t)=\frac{1}{(1-2t)^{\frac{1}{2}}}
\end{eqnarray}
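As a consistency check, differentiating at $t=0$ gives $E[Y]=E[X^2]=1$, the variance of a standard normal:
\begin{eqnarray}
M_Y'(t)=\frac{1}{(1-2t)^{\frac{3}{2}}} \quad\Rightarrow\quad E[Y]=M_Y'(0)=1
\end{eqnarray}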

Sum of Squares of $n$ independent standard normal random variables is $\chi^2(n)$ distributed

Theorem: If the standard normal random variables $z_1, z_2, \ldots, z_n$ are independent, then the random variable
\begin{eqnarray}
Y=z_1^2+z_2^2+\ldots+z_n^2
\end{eqnarray}
is $\chi^2(n)$-distributed.\\
Proof: Note that $Y$ is the sum of $n$ independent random variables, each of which is $\chi^2(1)$-distributed. Hence the moment generating
function of $Y$ is the product of $n$ moment generating functions of $\chi^2(1)$:
\begin{eqnarray}
M_Y(t)&=&\left[ \frac{1}{(1-2t)^{\frac{1}{2}}} \right]^n\\
&=&\frac{1}{(1-2t)^{\frac{n}{2}}}
\end{eqnarray}
which is the moment generating function of $\chi^2(n)$. Hence the random variable $Y$ is $\chi^2(n)$-distributed.
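A quick corollary: since each $z_i^2$ has mean 1 (computed above), the mean of a $\chi^2(n)$ random variable is
\begin{eqnarray}
E[Y]=E[z_1^2]+\ldots+E[z_n^2]=n
\end{eqnarray}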

Linear combination of independent Gaussians

{\bf Theorem:} If $X_1,\ldots,X_n$ are mutually independent Gaussian random variables with means $\mu_i$ and variances $\sigma_i^2$, then the linear combination
\begin{eqnarray}
Y=a_1 X_1 + \ldots + a_n X_n
\end{eqnarray}
follows the normal distribution
\begin{eqnarray}
N(\sum_{i=1}^n a_i \mu_i, \sum_{i=1}^n a_i^2 \sigma_i^2)
\end{eqnarray}
{\bf Proof:} By independence, the moment generating function factorizes:
\begin{eqnarray}
M_Y(t)&=&\prod_{i=1}^n M_{X_i}(a_i t) = \prod_{i=1}^n e^{\mu_i (a_i t) + \frac{\sigma_i^2 (a_i t)^2}{2}}\\
&=& e^{t \sum \mu_i a_i + \frac{t^2}{2} \sum \sigma_i^2 a_i^2 }
\end{eqnarray}
which is the moment generating function of $N(\sum_{i=1}^n a_i \mu_i, \sum_{i=1}^n a_i^2 \sigma_i^2)$.
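As a concrete (made-up) numerical example: if $X_1 \sim N(1,4)$ and $X_2 \sim N(2,9)$ are independent, then their difference ($a_1=1$, $a_2=-1$) is
\begin{eqnarray}
Y=X_1-X_2 \sim N(1-2,\; 1^2\cdot 4+(-1)^2\cdot 9)=N(-1,13)
\end{eqnarray}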