
Random Processes

Let us recall the basic setup of probability theory:

“A random experiment is an action for which all possible outcomes can be listed, but for which the outcome that will occur cannot be predicted with certainty. A random experiment may be performed many times, each time yielding a different outcome.”

Random processes are a special case of this definition where the outcome is not a number but a signal, i.e., a function of time.

Example: Assume that we are managing a telephone network which carries millions of speech signals at any given instant of time. The experiment is to pick one of these calls arbitrarily. Because we pick at random, each time we will pick a different call. The outcome of the experiment is a speech signal $x(t)$.

Notation: Random processes are denoted by $x(t,\xi)$. This is also called “an ensemble of time functions,” where $\xi$ is the ensemble index.
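As a minimal sketch of this notation (the random-phase sinusoid is a toy process chosen purely for illustration), the snippet below builds an ensemble in which each row is one realization $x(t,\xi)$ and the row index plays the role of $\xi$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy ensemble x(t, xi): sinusoids with one random phase per realization.
t = np.linspace(0.0, 1.0, 500)                    # time axis
n_realizations = 1000                             # ensemble size
phases = rng.uniform(0, 2 * np.pi, size=(n_realizations, 1))
ensemble = np.sin(2 * np.pi * 5 * t + phases)     # one row per outcome xi

print(ensemble.shape)  # (1000, 500): 1000 time functions, 500 samples each
```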

Defining a pdf for a random process

For a fixed time instant $t_1$, $x(t_1,\xi)$ is an ordinary random variable, so it has a pdf $f(x;t_1)$. More generally, the process is characterized by the joint pdfs of its samples taken at arbitrary sets of time instants.
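A minimal sketch of this idea: fix a time instant $t_1$ and histogram the ensemble values $x(t_1,\xi)$ across realizations to estimate the first-order pdf $f(x;t_1)$. The random-phase sinusoid is again an illustrative choice (its first-order pdf happens to be the arcsine density):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy ensemble as before: random-phase sinusoid (illustrative choice).
t = np.linspace(0.0, 1.0, 500)
phases = rng.uniform(0, 2 * np.pi, size=(10_000, 1))
ensemble = np.sin(2 * np.pi * 5 * t + phases)

# Fix a time instant t1 and histogram the ensemble values x(t1, xi)
# to estimate the first-order pdf f(x; t1).
i1 = 100                                   # index of t1 on the time axis
samples_at_t1 = ensemble[:, i1]
hist, edges = np.histogram(samples_at_t1, bins=50, density=True)
print(hist.max(), edges[0], edges[-1])     # crude view of the estimated pdf
```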

Power spectrum: the Fourier transform of the autocorrelation (or cross-correlation) function of a WSS process. The PSD is similar to a pdf: it tells you how much power lies between $\omega$ and $\omega + d\omega$.
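A minimal numerical sketch of this relationship: the PSD of a WSS test signal is estimated once as the Fourier transform of its sample autocorrelation, and once with Welch's method for comparison. The sampling rate and the first-order lowpass filter used to generate the test signal are assumptions made purely for illustration:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
fs = 100.0                                 # sampling rate in Hz (assumed)

# A WSS test signal: white noise through a first-order lowpass filter.
white = rng.standard_normal(200_000)
x = signal.lfilter([1.0], [1.0, -0.9], white)

# Route 1: estimate the autocorrelation, then Fourier-transform it.
nlags = 256
acf = np.array([np.mean(x[: len(x) - k] * x[k:]) for k in range(nlags)])
acf_sym = np.concatenate([acf[::-1], acf[1:]])   # even extension in the lag
psd_acf = np.abs(np.fft.rfft(acf_sym)) / fs

# Route 2: Welch's periodogram-averaging estimate, for comparison.
# The two estimates agree in shape (up to one-sided vs two-sided scaling).
f, psd_welch = signal.welch(x, fs=fs, nperseg=512)

print(psd_acf[:5])      # a few low-frequency values of each estimate
print(psd_welch[:5])
```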

Whiteness characterizes the joint properties of two random variables taken at two different instants of time. It says nothing about the pdfs of the individual variables, i.e., whether they are Gaussian or uniform.
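A short sketch of this point: Gaussian white noise and uniform white noise have very different pdfs but essentially identical autocorrelations:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Two white sequences with very different first-order pdfs:
gauss = rng.standard_normal(n)                  # Gaussian, unit variance
unif = rng.uniform(-np.sqrt(3), np.sqrt(3), n)  # uniform, unit variance

def acf(x, nlags=5):
    x = x - x.mean()
    return np.array([np.mean(x[: len(x) - k] * x[k:]) for k in range(nlags)])

# Both normalized autocorrelations are approximately [1, 0, 0, ...]:
print(acf(gauss) / acf(gauss)[0])
print(acf(unif) / acf(unif)[0])
```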

Ensemble average means that you have to carry out a large number of experiments, get a large number of outcomes, and average them. Finding ensemble averages is very inconvenient. So if the time averages equal the ensemble averages, the process is ergodic. Note that both averages, when computed from finitely many samples or experiments, are themselves random variables; they converge to deterministic values only in the limit. Note also that ergodicity is only defined for stationary processes.
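A minimal sketch, assuming an i.i.d.-plus-constant process (which is stationary and ergodic in the mean): the time average of a single realization and the ensemble average at a single instant both converge to the same value:

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed ergodic example: i.i.d. noise plus a known constant mean.
mean_true = 2.0
ensemble = mean_true + rng.standard_normal((500, 10_000))  # rows = realizations

time_avg = ensemble[0].mean()          # average one realization over time
ensemble_avg = ensemble[:, 0].mean()   # average all realizations at one instant

print(time_avg, ensemble_avg)          # both close to 2.0
```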

Not all stationary processes are ergodic. For example, the process $x(t) = A$, where $A$ is a random variable drawn once per experiment and held constant, is stationary but not ergodic: the time average of any single realization is that realization's value of $A$, not $E[A]$.
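The same counterexample, sketched in code: each realization is a constant level drawn once per experiment, so the time average never converges to the ensemble average $E[A]$:

```python
import numpy as np

rng = np.random.default_rng(5)

# x(t) = A: each realization is a constant level A, drawn once per experiment.
A = rng.standard_normal(500)                      # one A per realization
ensemble = np.repeat(A[:, None], 10_000, axis=1)  # constant in time

time_avg = ensemble[0].mean()          # equals A[0], NOT E[A] = 0
ensemble_avg = ensemble[:, 0].mean()   # close to E[A] = 0

print(time_avg, ensemble_avg)          # they disagree: not ergodic
```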

Random signals cannot be described like deterministic signals because they do not have definite values. Instead, we characterize some property of the signal, such as the autocorrelation or the PSD. This is called a signal model.
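A minimal sketch of a signal model (the AR(1) form is an assumed, illustrative choice): the signal is described not by its values but by a single model parameter, recovered here from its autocorrelation:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(6)

# An illustrative signal model: AR(1), x[n] = a * x[n-1] + w[n].
a_true = 0.8
w = rng.standard_normal(100_000)
x = signal.lfilter([1.0], [1.0, -a_true], w)

# The model is fitted from second-order statistics alone: for an AR(1)
# process, a = r(1) / r(0), where r(k) is the autocorrelation at lag k.
r0 = np.mean(x * x)
r1 = np.mean(x[:-1] * x[1:])
print(r1 / r0)   # close to 0.8: the model parameter, recovered from the ACF
```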