Snedecor’s F-distribution and Student’s t-distribution

Snedecor’s F-distribution

These two distributions are extremely important in statistics, as they underlie the t-test and the F-test. Here we simply define them without touching on the role they play in statistics.

Definition: Snedecor’s $F(n,m)$-distribution of $(n,m)$ degrees of freedom is the probability density function given by
\begin{eqnarray}
F_{n,m}(x)=\frac {\Gamma(\frac{n+m}{2}) n^{\frac{n}{2}} m^{\frac{m}{2}} } {\Gamma(\frac{n}{2}) \Gamma(\frac{m}{2})}
\frac{x^{(n/2-1)}}{(m+n x)^{(n+m)/2}}
\end{eqnarray}
where $x \geqslant 0$.
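Though we will not need it below, the density is easy to sanity-check numerically. The following sketch (plain Python, standard library only; the function name `f_pdf` and the chosen degrees of freedom are ours) codes the formula directly and verifies that it integrates to approximately 1:

```python
import math

def f_pdf(x, n, m):
    """Snedecor F(n, m) density, coded directly from the definition above."""
    const = (math.gamma((n + m) / 2) * n ** (n / 2) * m ** (m / 2)
             / (math.gamma(n / 2) * math.gamma(m / 2)))
    return const * x ** (n / 2 - 1) / (m + n * x) ** ((n + m) / 2)

# Riemann sum of the density over (0, 100]; the tail beyond is negligible
n, m = 5, 8
h = 0.001
total = sum(f_pdf(i * h, n, m) for i in range(1, 100_001)) * h  # close to 1
```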

Theorem: Let the random variable $ V \sim \chi_m^2$ be chi-square distributed with dof $m$, and let the random variable $ W \sim \chi_n^2$ be chi-square distributed with dof $n$, with $V$ and $W$ independent. Then the random variable $F$
\begin{eqnarray}
F = \frac{\frac{V}{m}}{\frac{W}{n}}
\end{eqnarray}
is $F_{m,n}$ distributed with dof $m,n$.
\begin{eqnarray}
F \sim F_{m,n}(x)
\end{eqnarray}
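Before proving this, here is a quick Monte Carlo sanity check (a Python sketch, standard library only; the names and sample size are ours). Building each chi-square as a sum of squared standard normals, the sample mean of the ratio should approach $n/(n-2)$, the known mean of an F variable with denominator dof $n > 2$ (a standard fact we take for granted here):

```python
import random

random.seed(0)
m, n = 4, 10  # numerator dof m, denominator dof n

def chi2(k):
    """One chi-square draw with k dof: a sum of k squared standard normals."""
    return sum(random.gauss(0, 1) ** 2 for _ in range(k))

# F = (V/m)/(W/n) with independent V ~ chi2_m and W ~ chi2_n
samples = [(chi2(m) / m) / (chi2(n) / n) for _ in range(100_000)]
mean = sum(samples) / len(samples)  # should be near n/(n-2) = 1.25
```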

Proof: Note that the random variables $V$ and $W$ are independent, hence
\begin{eqnarray}
P(v<V<v+dv,w<W<w+dw) &=& \chi_m^2(v)\chi_n^2(w)dvdw\\
&=&\left( \frac{v^{\frac{m}{2}-1} e^{-\frac{v}{2}}}{2^{\frac{m}{2}}\varGamma(\frac{m}{2})} \right)
\left(\frac{w^{\frac{n}{2}-1} e^{-\frac{w}{2}}}{2^{\frac{n}{2}}\varGamma(\frac{n}{2})}\right)dvdw \nonumber\\
&=& \frac{v^{\frac{m}{2}-1} w^{\frac{n}{2}-1} e^{-\frac{v+w}{2}}}{2^{\frac{m+n}{2}}\varGamma(\frac{n}{2})\varGamma(\frac{m}{2})} dvdw
\end{eqnarray}
Let us change the variables
\begin{eqnarray}
f&=&\frac{\frac{v}{m}}{\frac{w}{n}}\\
x&=&w
\end{eqnarray}
The Jacobian determinant is
\begin{eqnarray}
\det\left[
\begin{array}{cc}
\frac{\partial v}{\partial f} & \frac{\partial w}{\partial f}\\
\frac{\partial v}{\partial x} & \frac{\partial w}{\partial x}
\end{array}
\right] = \det\left[
\begin{array}{cc}
\frac{mx}{n} & 0\\
\frac{mf}{n} & 1
\end{array}
\right] = \frac{mx}{n}
\end{eqnarray}
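This change of variables can also be double-checked symbolically (a sketch assuming the sympy library is installed): invert the transformation as $v = mfx/n$, $w = x$, and take the determinant of the matrix of partial derivatives.

```python
import sympy as sp

f, x, m, n = sp.symbols('f x m n', positive=True)
v = m * f * x / n  # inverting f = (v/m)/(w/n) with w = x
w = x
J = sp.Matrix([[sp.diff(v, f), sp.diff(w, f)],
               [sp.diff(v, x), sp.diff(w, x)]])
det = sp.simplify(J.det())  # equals m*x/n, as in the text
```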
Hence
\begin{eqnarray}
P(f<F<f+df,x<X<x+dx) &=& \chi_m^2\left(\frac{mfx}{n}\right)\chi_n^2(x) \frac{mx}{n}dfdx\\
&=&\frac{\left(\frac{mfx}{n}\right)^{\frac{m}{2}-1} x^{\frac{n}{2}-1} \exp[{-\frac{x}{2}}(1+\frac{mf}{n})]}{2^{\frac{m+n}{2}}\varGamma(\frac{n}{2})\varGamma(\frac{m}{2})}\frac{mx}{n} dfdx \nonumber\\
&=&\frac{\left(\frac{m}{n}\right)^{\frac{m}{2}} f^{\frac{m}{2}-1} x^{\frac{m+n}{2}-1}
\exp[{-\frac{x}{2}}(1+\frac{mf}{n})]}{2^{\frac{m+n}{2}}\varGamma(\frac{n}{2})\varGamma(\frac{m}{2})} dfdx \nonumber
\end{eqnarray}
To find the pdf of the random variable $F$, we must compute the marginal distribution
\begin{eqnarray}
F_{m,n}(f)df &=& P(f<F<f+df)\\
&=&\frac{\left(\frac{m}{n}\right)^{\frac{m}{2}} f^{\frac{m}{2}-1} }{2^{\frac{m+n}{2}}\varGamma(\frac{n}{2})\varGamma(\frac{m}{2})}df
\int_0^{\infty} x^{\frac{m+n}{2}-1} \exp[{-\frac{x}{2}}(1+\frac{mf}{n})]dx \nonumber
\end{eqnarray}
Changing variables again
\begin{eqnarray}
q&=&\frac{x}{2}(1+\frac{mf}{n})\\
x&=&\frac{2q}{1+\frac{mf}{n} }\\
dx&=&\frac{2dq}{1+\frac{mf}{n} }
\end{eqnarray}
Substituting,
\begin{eqnarray}
&=& \frac{\left(\frac{m}{n}\right)^{\frac{m}{2}} f^{\frac{m}{2}-1} }{2^{\frac{m+n}{2}}\varGamma(\frac{n}{2})\varGamma(\frac{m}{2})}df
\int_0^{\infty} \left( \frac{2q}{1+\frac{mf}{n} }\right)^{\frac{m+n}{2}-1} e^{-q}\frac{2dq}{1+\frac{mf}{n} } \nonumber\\
&=& \frac{\left(\frac{m}{n}\right)^{\frac{m}{2}} f^{\frac{m}{2}-1} }{\varGamma(\frac{n}{2})\varGamma(\frac{m}{2})}df
\left( 1+\frac{mf}{n} \right)^{-\frac{m+n}{2}}\int_0^{\infty}q^{\frac{m+n}{2}-1} e^{-q}dq \nonumber
\end{eqnarray}
Recall the definition of gamma function:
\begin{eqnarray}
\Gamma(n) = \int_0^{\infty}x^{n-1}e^{-x}dx
\end{eqnarray}
Therefore (cancelling the $df$’s on both sides), we arrive at
\begin{eqnarray}
F_{m,n}(f) = \frac{\left(\frac{m}{n}\right)^{\frac{m}{2}} f^{\frac{m}{2}-1} \varGamma(\frac{m+n}{2}) }{\varGamma(\frac{n}{2})\varGamma(\frac{m}{2})}
\left( 1+\frac{mf}{n} \right)^{-\frac{m+n}{2}}
\end{eqnarray}
which is the desired result.

Student’s t-distribution

Definition: Student’s t-distribution (or simply t-distribution) of $n$ degrees of freedom is a probability density function given by
\begin{eqnarray}
t_{n}(x)=\frac{\Gamma(\frac{n+1}{2})}{\sqrt{n \pi} \Gamma(\frac{n}{2})}\left( 1+ \frac{x^2}{n}\right)^{-\frac{n+1}{2}}
\end{eqnarray}
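As with the F density, the formula is easy to check numerically (a standard-library sketch; function names are ours). For large $n$ the t density should also approach the standard normal density:

```python
import math

def t_pdf(x, n):
    """Student t_n density, coded directly from the definition above."""
    return (math.gamma((n + 1) / 2)
            / (math.sqrt(n * math.pi) * math.gamma(n / 2))
            * (1 + x * x / n) ** (-(n + 1) / 2))

def normal_pdf(x):
    """Standard normal density, the large-n limit of t_n."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# For n = 200 the two densities already agree to about three decimals
diffs = [abs(t_pdf(x, 200) - normal_pdf(x)) for x in (0.0, 1.0, 2.0)]
```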

Theorem: Consider two independent random variables $Z$ and $W$. Let $Z \sim \mathrm{N}(0,1)$ be standard normal, and let $ W \sim \chi^2(n)$ be chi-square with $n$ degrees of freedom. Define the random variable $T$ as
\begin{eqnarray}
T = \frac{Z}{\sqrt{\frac{W}{n}}}
\end{eqnarray}
Then $T$ will have a t-distribution with dof $n$
\begin{eqnarray}
T \sim t_{n}(x)
\end{eqnarray}
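Here too a Monte Carlo sketch is a useful sanity check (standard library only; names and sample size are ours). The sample variance of $T$ should approach $n/(n-2)$, the known variance of $t_n$ for $n > 2$ (a standard fact we take for granted here):

```python
import random

random.seed(0)
n = 8  # degrees of freedom

def sample_t():
    """One draw of T = Z / sqrt(W/n), with W built from n squared normals."""
    z = random.gauss(0, 1)
    w = sum(random.gauss(0, 1) ** 2 for _ in range(n))
    return z / (w / n) ** 0.5

draws = [sample_t() for _ in range(100_000)]
var = sum(t * t for t in draws) / len(draws)  # near n/(n-2) = 4/3
```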

We will give two different proofs of this theorem. The first proof is by direct computation; the second establishes a connection between Snedecor’s F-distribution and Student’s t-distribution.

PROOF 1: Note that the random variables $Z$ and $W$ are independent, hence
\begin{eqnarray}
P(z<Z<z+dz,w<W<w+dw) &=& N(z)\chi_n^2(w)dzdw\\
&=&\frac{1}{\sqrt{2\pi}}e^{-\frac{z^2}{2} } \frac{w^{\frac{n}{2}-1} e^{-\frac{w}{2}}}{2^{\frac{n}{2}}\varGamma(\frac{n}{2})}dzdw \nonumber
\end{eqnarray}
Let us change the variables
\begin{eqnarray}
z&=&t\sqrt{\frac{w}{n}}\\
w&=&w
\end{eqnarray}
The Jacobian determinant is
\begin{eqnarray}
\det\left[
\begin{array}{cc}
\frac{\partial z}{\partial t} & \frac{\partial w}{\partial t}\\
\frac{\partial z}{\partial w} & \frac{\partial w}{\partial w}
\end{array}
\right] = \det\left[
\begin{array}{cc}
\sqrt{\frac{w}{n}} & 0\\
\frac{t}{2\sqrt{wn}} & 1
\end{array}
\right] = \sqrt{\frac{w}{n}}
\end{eqnarray}
Hence
\begin{eqnarray}
P(t<T<t+dt,w<W<w+dw) &=& N\left(t\sqrt{\frac{w}{n}}\right)\chi_n^2(w)\sqrt{\frac{w}{n}}dtdw\\
&=&\frac{1}{\sqrt{2\pi}}e^{-\frac{t^2w}{2n} } \frac{w^{\frac{n}{2}-1} e^{-\frac{w}{2}}}{2^{\frac{n}{2}}\varGamma(\frac{n}{2})}\sqrt{\frac{w}{n}}dtdw \nonumber\\
&=& \frac{1}{\sqrt{2 \pi n} 2^{n/2}\Gamma\left( \frac{n}{2} \right)} w^{\frac{n-1}{2}} \mathrm{exp}\left(-\frac{wt^2}{2n} -\frac{w}{2} \right)dtdw \nonumber
\end{eqnarray}
Hence the pdf $f(t,w)$ for the random variables $T$ and $W$ is
\begin{eqnarray}
f(t,w)= \frac{1}{\sqrt{2 \pi n} 2^{n/2}\Gamma\left( \frac{n}{2} \right)} w^{\frac{n-1}{2}} \mathrm{exp}\left(-\frac{wt^2}{2n} -\frac{w}{2} \right)
\end{eqnarray}
To find the pdf of the random variable $T$, we have to find the marginal distribution of $f(t,w)$ over $w$
\begin{eqnarray}
f(t)= \frac{1}{\sqrt{2 \pi n} 2^{n/2}\Gamma\left( \frac{n}{2} \right)}\int_0^{\infty} w^{\frac{n-1}{2}} \mathrm{exp}\left(-\frac{wt^2}{2n} -\frac{w}{2} \right)dw
\end{eqnarray}
Now, let us do a change of variables:
\begin{eqnarray}
x&=&\left(\frac{w}{2}+\frac{wt^2}{2n} \right)\\
w &=& \left(\frac{1}{2}+\frac{t^2}{2n} \right)^{-1}x \\
dw &=& \left(\frac{1}{2}+\frac{t^2}{2n} \right)^{-1}dx
\end{eqnarray}
Substituting
\begin{eqnarray}
&=&\frac{1}{\sqrt{2 \pi n} 2^{n/2}\Gamma\left( \frac{n}{2} \right)} \int_0^{\infty} \left[ \left(\frac{1}{2}+\frac{t^2}{2n} \right)^{-1}x \right]^{\frac{n-1}{2}}
e^{-x}\left(\frac{1}{2}+\frac{t^2}{2n} \right)^{-1}dx\\
&=&\frac{1}{\sqrt{2 \pi n} 2^{n/2}\Gamma\left( \frac{n}{2} \right)} \left(\frac{1}{2}+\frac{t^2}{2n} \right)^{-(n+1)/2} \int_0^{\infty}x^{(n-1)/2}e^{-x}dx\\
&=&\frac{2^{(n+1)/2}}{\sqrt{2 \pi n} 2^{n/2}\Gamma\left( \frac{n}{2} \right)} \left(1+\frac{t^2}{n} \right)^{-(n+1)/2} \int_0^{\infty} x^{(n-1)/2}e^{-x}dx\\
&=&\frac{1}{\sqrt{\pi n} \Gamma\left( \frac{n}{2} \right)} \left(1+\frac{t^2}{n} \right)^{-(n+1)/2} \int_0^{\infty} x^{(n-1)/2}e^{-x}dx
\end{eqnarray}
Recall the definition of the gamma function:
\begin{eqnarray}
\Gamma(n) = \int_0^{\infty}x^{n-1}e^{-x}dx
\end{eqnarray}
Therefore
\begin{eqnarray}
\int_0^{\infty} x^{(n-1)/2}e^{-x}dx = \int_0^{\infty} x^{\frac{n+1}{2}-1}e^{-x}dx = \Gamma\left(\frac{n+1}{2} \right)
\end{eqnarray}
Substituting again, we get the $t_n$ distribution
\begin{eqnarray}
t_n(t)&=&\frac{\Gamma\left(\frac{n+1}{2} \right)}{\sqrt{\pi n} \Gamma\left( \frac{n}{2} \right)} \left(1+\frac{t^2}{n} \right)^{-(n+1)/2}
\end{eqnarray}

PROOF 2: Consider the random variable $R$ which is $t_n$-distributed, i.e., $R \sim t_n$. By the theorem above, we can write $R$ as
\begin{eqnarray}
R = \frac{Z}{\sqrt{\frac{W}{n}}}
\end{eqnarray}
where the random variables $Z$ and $W$ are independent, $Z \sim N(0,1)$ and $W\sim \chi _n^2$. Now, let us consider the random variable $R^2$
\begin{eqnarray}
R^2 = \frac{Z^2}{\frac{W}{n}}
\end{eqnarray}
As $Z \sim N(0,1)$, $Z^2 \sim \chi^2_1$. Also, $W\sim \chi _n^2$. Recalling the definition of F-distribution, we can conclude that
\begin{eqnarray}
R^2 \sim F_{1,n}
\end{eqnarray}
Or, expressed in words: If a random variable is $t_n$ distributed, its square is $F_{1,n}$ distributed.

If we define the random variable $P$ as $P=R^2$, then matching the probabilities of corresponding intervals, $P(r<|R|<r+dr) = P(p<P<p+dp)$, gives the pdfs
\begin{eqnarray}
[t_n(-r)+t_n(r)]dr &=& F_{1,n}(p)dp \\
&=& F_{1,n}(r^2)2rdr
\end{eqnarray}
As $t_n(r)$ is symmetric, i.e., $t_n(r)=t_n(-r)$,
\begin{eqnarray}
2t_n(r) = F_{1,n}(r^2)2r
\end{eqnarray}
Using the expression for $F_{m,n}$ derived above with $m=1$,
\begin{eqnarray}
t_n(r) = \frac{\left(\frac{1}{n}\right)^{\frac{1}{2}} r^{-1} \varGamma(\frac{1+n}{2}) }{\varGamma(\frac{n}{2})\varGamma(\frac{1}{2})}
\left( 1+\frac{r^2}{n} \right)^{-\frac{1+n}{2}} r
\end{eqnarray}
Recall that
\begin{eqnarray}
\varGamma\left( \frac{1}{2} \right) = \sqrt{\pi}
\end{eqnarray}
Since the two factors of $r$ cancel, this gives us the result: $t_n(r) = \frac{\Gamma(\frac{n+1}{2})}{\sqrt{n \pi}\, \Gamma(\frac{n}{2})} \left( 1+ \frac{r^2}{n}\right)^{-\frac{n+1}{2}}$, matching the definition of the t-distribution.
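The pointwise relation behind Proof 2 can also be confirmed numerically (a standard-library sketch; function names are ours): coding both densities from their definitions, $t_n(r)$ should equal $r\,F_{1,n}(r^2)$ at every $r>0$.

```python
import math

def f_pdf(x, n, m):
    """Snedecor F(n, m) density from the definition."""
    const = (math.gamma((n + m) / 2) * n ** (n / 2) * m ** (m / 2)
             / (math.gamma(n / 2) * math.gamma(m / 2)))
    return const * x ** (n / 2 - 1) / (m + n * x) ** ((n + m) / 2)

def t_pdf(x, n):
    """Student t_n density from the definition."""
    return (math.gamma((n + 1) / 2)
            / (math.sqrt(n * math.pi) * math.gamma(n / 2))
            * (1 + x * x / n) ** (-(n + 1) / 2))

# Proof 2 in numbers: t_n(r) = r * F_{1,n}(r^2) pointwise
errs = [abs(t_pdf(r, 7) - r * f_pdf(r * r, 1, 7)) for r in (0.5, 1.5, 3.0)]
```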