Moment generating functions


If a random variable $X$ has pdf $f_X(x)$, its moment generating function is defined as
\begin{eqnarray}
M_X(t)=E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f_X(x) dx
\end{eqnarray}
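For example, for an exponential random variable with pdf $f_X(x) = \lambda e^{-\lambda x}$ for $x \geq 0$, the integral converges for $t < \lambda$ and gives
\begin{eqnarray}
M_X(t) = \int_0^{\infty} e^{tx} \lambda e^{-\lambda x} dx = \frac{\lambda}{\lambda - t}
\end{eqnarray}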
On the other hand, if we have a pmf $p_X(x)$, the moment generating function is defined as
\begin{eqnarray}
M_X(t) = E[e^{tX}] = \sum_x e^{tx} p_X(x)
\end{eqnarray}
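For instance, for a Poisson random variable with pmf $p_X(x) = e^{-\lambda} \lambda^x / x!$ for $x = 0, 1, 2, \ldots$,
\begin{eqnarray}
M_X(t) = \sum_{x=0}^{\infty} e^{tx} e^{-\lambda} \frac{\lambda^x}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{(\lambda e^t)^x}{x!} = e^{\lambda(e^t - 1)}
\end{eqnarray}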
Note that in the pmf case we take the values of the random variable to be nonnegative integers. Further examples of such calculations appear later in these notes.
Moment generating functions have two main uses:

  1. Finding the pdf/pmf of the sum of random variables (especially useful in the iid case).
  2. Calculating the moments of random variables.

Calculating the pdf of a weighted sum of independent random variables

This will save us from the tedious convolution integral.

Theorem: Let $X_1, X_2, \ldots, X_n$ be independent continuous random variables, where $X_i$ has pdf $f_{X_i}(x)$ and moment generating function $M_{X_i}(t)$. Then the linear combination of these random variables, $Y = a_1 X_1 + \ldots + a_n X_n$, has moment generating function
\begin{eqnarray}
M_Y(t)=\prod_{i=1}^n M_{X_i}(a_i t)
\end{eqnarray}
Proof:
\begin{eqnarray}
M_Y(t)=E[e^{tY}] &=& E[e^{t(a_1 X_1 +\ldots+a_n X_n)}] \\
&=& E[e^{ta_1 X_1} e^{ta_2 X_2} \ldots e^{ta_n X_n}]
\end{eqnarray}
Since the $X_i$ are independent,
\begin{eqnarray}
E[e^{ta_1 X_1} e^{ta_2 X_2} \ldots e^{ta_n X_n}]&=&E[e^{ta_1 X_1}] E[e^{ta_2 X_2}] \ldots E[e^{ta_n X_n}] \nonumber\\
&=&\prod_{i=1}^n M_{X_i}(a_i t)
\end{eqnarray}
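For example, if $X_1, \ldots, X_n$ are independent Poisson random variables with parameters $\lambda_1, \ldots, \lambda_n$ and we take $a_i = 1$, then using $M_{X_i}(t) = e^{\lambda_i(e^t - 1)}$ from above,
\begin{eqnarray}
M_Y(t) = \prod_{i=1}^n e^{\lambda_i (e^t - 1)} = e^{(\lambda_1 + \ldots + \lambda_n)(e^t - 1)}
\end{eqnarray}
Since the moment generating function, when it exists in a neighborhood of $t = 0$, determines the distribution uniquely, this shows that $Y$ is again Poisson, with parameter $\lambda_1 + \ldots + \lambda_n$.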

Calculating the moments of pdfs

It is easy to see that
\begin{eqnarray}
E(X^n) = \left[ \frac{d^n M_X(t)}{dt^n} \right]_{t=0}
\end{eqnarray}
since differentiating $M_X(t) = E[e^{tX}]$ $n$ times with respect to $t$ gives $E[X^n e^{tX}]$, which reduces to $E(X^n)$ at $t = 0$.
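Continuing the exponential example, $M_X(t) = \lambda / (\lambda - t)$ gives
\begin{eqnarray}
E(X) = \left[ \frac{\lambda}{(\lambda - t)^2} \right]_{t=0} = \frac{1}{\lambda}
\end{eqnarray}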
Similarly, the variance can be written as
\begin{eqnarray}
\sigma_X^2 = E(X^2) - E(X)^2 = \left[ \frac{d^2 M_X(t)}{dt^2} \right]_{t=0} - \left[ \frac{d M_X(t)}{dt} \right]_{t=0}^2
\end{eqnarray}
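For the exponential example, the second derivative gives $E(X^2) = \left[ 2\lambda / (\lambda - t)^3 \right]_{t=0} = 2/\lambda^2$, so
\begin{eqnarray}
\sigma_X^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}
\end{eqnarray}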
Examples will be provided in the coming chapters.