Random Process and Noise



10.1.5 Gaussian Random Processes

Definition of a Random Process. Assume that we have a random experiment with outcomes $w$ belonging to the sample space $S$. To each $w \in S$, we assign a time function $X(t,w)$, $t \in I$, where $I$ is a time index set: discrete or continuous. $X(t,w)$ is called a random process. If $w$ is fixed, $X(t,w)$ is a deterministic time function, called a realization, a sample path, or a sample function. Random signals, also called stochastic signals, contain uncertainty in the parameters that describe them. Because of this uncertainty, mathematical functions cannot be used to precisely describe random signals. Instead, random signals are most often analyzed using statistical techniques that treat the random parameters of the signal with probability distributions. Noise is most meaningful if it is put in perspective by comparing it to signal strength; in X-ray radiography, for example, the relevant comparison is to contrast. It is also important to distinguish random noise effects from image artifacts, which can appear random even though they are not.
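To make the realization idea concrete, here is a minimal Python sketch using a hypothetical process $X(t,w)=A(w)\cos t$, where $A$ is a standard normal random variable. Once the outcome $w$ is fixed, the remaining function of $t$ is deterministic; each loop iteration prints one sample path.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 2.0 * np.pi, 5)      # coarse time grid, for printing
for _ in range(3):                         # three outcomes w -> three sample paths
    A = rng.normal()                       # all randomness enters through A(w)
    print(np.round(A * np.cos(t), 3))      # X(t, w) = A(w) cos(t), with w fixed
```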

Here, we will briefly introduce normal (Gaussian) random processes. We will discuss some examples of Gaussian processes in more detail later on. Many important practical random processes are subclasses of normal random processes.

First, let us remember a few facts about Gaussian random vectors. As we saw before, random variables $X_1$, $X_2$, ..., $X_n$ are said to be jointly normal if, for all $a_1, a_2, \dots, a_n \in \mathbb{R}$, the random variable

\begin{align}
a_1X_1+a_2X_2+\cdots+a_nX_n
\end{align}
is a normal random variable. Also, a random vector
\begin{equation}
\nonumber \textbf{X} = \begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_n \end{bmatrix}
\end{equation}
is said to be normal or Gaussian if the random variables $X_1$, $X_2$, ..., $X_n$ are jointly normal. An important property of jointly normal random variables is that their joint PDF is completely determined by their mean vector and covariance matrix. More specifically, for a normal random vector $\textbf{X}$ with mean $\mathbf{m}$ and covariance matrix $\mathbf{C}$, the PDF is given by
\begin{align*}
f_{\mathbf{X}}(\mathbf{x})=\frac{1}{(2\pi)^{\frac{n}{2}} \sqrt{\det \textbf{C}}} \exp \left\{-\frac{1}{2} (\textbf{x}-\textbf{m})^T \mathbf{C}^{-1}(\textbf{x}-\textbf{m}) \right\}.
\end{align*}
Now, let us define Gaussian random processes.
A random process $\big\{X(t), t \in J \big\}$ is said to be a Gaussian (normal) random process if, for all
\begin{align}
t_1,t_2, \dots, t_n \in J,
\end{align}
the random variables $X(t_1)$, $X(t_2)$, ..., $X(t_n)$ are jointly normal.
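This definition suggests a direct way to sample a Gaussian process on a finite grid: fill in a covariance matrix from the autocovariance and draw one jointly normal vector. Below is a minimal sketch, assuming a zero-mean process with the hypothetical autocovariance $C_X(t_1,t_2)=e^{-(t_1-t_2)^2}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed zero-mean Gaussian process with C_X(t1, t2) = exp(-(t1 - t2)^2)
def C_X(t1, t2):
    return np.exp(-(t1 - t2) ** 2)

t = np.linspace(0.0, 5.0, 50)                 # finite grid t_1, ..., t_n
C = C_X(t[:, None], t[None, :])               # C[i, j] = C_X(t_i, t_j)
C += 1e-10 * np.eye(len(t))                   # tiny jitter for numerical stability
# One draw of (X(t_1), ..., X(t_n)) is a single jointly normal vector:
x = rng.multivariate_normal(np.zeros(len(t)), C)
print(np.round(x[:5], 3))                     # start of one sample path
```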

Example
Let $X(t)$ be a zero-mean WSS Gaussian process with $R_X(\tau)=e^{-\tau^2}$, for all $\tau \in \mathbb{R}$.
  1. Find $P\big(X(1) < 1\big)$.
  2. Find $P\big(X(1)+X(2) < 1\big)$.
  • Solution
      1. $X(1)$ is a normal random variable with mean $E[X(1)]=0$ and variance
\begin{align*}
\textrm{Var}\big(X(1)\big) &= E[X(1)^2] \\
&= R_X(0)=1.
\end{align*}
Thus,
\begin{align*}
P\big(X(1) < 1\big) &= \Phi \left(\frac{1-0}{1} \right) \\
&= \Phi(1) \approx 0.84.
\end{align*}
      2. Let $Y=X(1)+X(2)$. Then, $Y$ is a normal random variable. We have
\begin{align*}
EY &= E[X(1)]+E[X(2)] \\
&= 0;
\end{align*}
\begin{align*}
\textrm{Var}(Y) &= \textrm{Var}\big(X(1)\big)+\textrm{Var}\big(X(2)\big)+2 \textrm{Cov}\big(X(1),X(2)\big).
\end{align*}
Note that
\begin{align*}
\textrm{Var}\big(X(1)\big) &= E[X(1)^2]-E[X(1)]^2 \\
&= R_X(0)- \mu_X^2 \\
&= 1-0=1=\textrm{Var}\big(X(2)\big);
\end{align*}
\begin{align*}
\textrm{Cov}\big(X(1),X(2)\big) &= E[X(1)X(2)]-E[X(1)]E[X(2)] \\
&= R_X(-1)-\mu_X^2 \\
&= e^{-1} -0=\frac{1}{e}.
\end{align*}
Therefore,
\begin{align*}
\textrm{Var}(Y) = 2+\frac{2}{e}.
\end{align*}
We conclude $Y \sim N(0,2+\frac{2}{e})$. Thus,
\begin{align*}
P\big(Y < 1\big) &= \Phi \left(\frac{1-0}{\sqrt{2+\frac{2}{e}}} \right) \\
&= \Phi(0.6046) \approx 0.73.
\end{align*}
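As a sanity check (not part of the original solution), both probabilities can be verified by simulation, since $\big(X(1), X(2)\big)$ is a bivariate normal vector with unit variances and covariance $1/e$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
cov = [[1.0, np.exp(-1)], [np.exp(-1), 1.0]]   # Cov(X(1), X(2)) = R_X(-1) = 1/e
x = rng.multivariate_normal([0.0, 0.0], cov, size=n)

print(np.mean(x[:, 0] < 1))                    # ~ Phi(1)      ~ 0.84
print(np.mean(x[:, 0] + x[:, 1] < 1))          # ~ Phi(0.6046) ~ 0.73
```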

An important property of normal random processes is that wide-sense stationarity and strict-sense stationarity are equivalent for these processes. More specifically, we can state the following theorem.
Theorem. Consider the Gaussian random process $\big\{X(t), t \in \mathbb{R}\big\}$. If $X(t)$ is WSS, then $X(t)$ is a stationary process.
  • Proof
    • We need to show that, for all $t_1,t_2,\cdots, t_r \in \mathbb{R}$ and all $\Delta \in \mathbb{R}$, the joint CDF of
\begin{align}
X(t_1), X(t_2), \cdots, X(t_r)
\end{align}
is the same as the joint CDF of
\begin{align}
X(t_1+\Delta), X(t_2+\Delta), \cdots, X(t_r+\Delta).
\end{align}
Since these random variables are jointly Gaussian, it suffices to show that the mean vectors and the covariance matrices are the same. To see this, note that $X(t)$ is a WSS process, so
\begin{align}
\mu_X(t_i)=\mu_X(t_j)=\mu_X, \quad \textrm{for all } i,j,
\end{align}
and
\begin{align}
C_X(t_i+\Delta,t_j+\Delta)=C_X(t_i,t_j)=C_X(t_i-t_j), \quad \textrm{for all } i,j.
\end{align}
From the above, we conclude that the mean vector and the covariance matrix of
\begin{align}
X(t_1), X(t_2), \cdots, X(t_r)
\end{align}
are the same as the mean vector and the covariance matrix of
\begin{align}
X(t_1+\Delta), X(t_2+\Delta), \cdots, X(t_r+\Delta).
\end{align}



Similarly, we can define jointly Gaussian random processes.
Two random processes $\big\{X(t), t \in J \big\}$ and $\big\{Y(t), t \in J' \big\}$ are said to be jointly Gaussian (normal) if, for all
\begin{align}
t_1,t_2, \dots, t_m \in J \quad \textrm{and} \quad t'_1,t'_2, \dots, t'_n \in J',
\end{align}
the random variables
\begin{align}
X(t_1), X(t_2), \cdots, X(t_m), Y(t'_1), Y(t'_2), \cdots, Y(t'_n)
\end{align}
are jointly normal.
Note that from the properties of jointly normal random variables, we can conclude that if two jointly Gaussian random processes $X(t)$ and $Y(t)$ are uncorrelated, i.e.,
\begin{align*}
C_{XY}(t_1,t_2)=0, \quad \textrm{for all } t_1,t_2,
\end{align*}
then $X(t)$ and $Y(t)$ are two independent random processes.

10.2.5 Solved Problems

Problem

Consider a WSS random process $X(t)$ with
\begin{align}
\nonumber R_X(\tau) =\left\{ \begin{array}{l l} 1-|\tau| & \quad -1 \leq \tau \leq 1 \\ 0 & \quad \text{otherwise} \end{array} \right.
\end{align}
Find the PSD of $X(t)$, and $E[X(t)^2]$.


  • Solution
    • First, we have
\begin{align*}
E[X(t)^2] = R_X(0)=1.
\end{align*}
We can write the triangular function $R_X(\tau)=\Lambda(\tau)$ as
\begin{equation*}
R_X(\tau)=\Pi(\tau) \ast \Pi(\tau),
\end{equation*}
where
\begin{align}
\nonumber \Pi(\tau) =\left\{ \begin{array}{l l} 1 & \quad -\frac{1}{2} \leq \tau \leq \frac{1}{2} \\ 0 & \quad \text{otherwise} \end{array} \right.
\end{align}
Thus, we conclude
\begin{align*}
S_X(f) &= \mathcal{F} \{R_X(\tau)\} \\
&= \mathcal{F} \{\Pi(\tau) \ast \Pi(\tau)\} \\
&= \mathcal{F} \{\Pi(\tau)\} \cdot \mathcal{F} \{\Pi(\tau)\} \\
&= \big[\textrm{sinc}(f)\big]^2.
\end{align*}
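As a quick numerical cross-check (not part of the original solution), the Fourier transform of the triangular autocorrelation can be approximated by a Riemann sum and compared against $\textrm{sinc}^2(f)$, using the convention $\textrm{sinc}(f)=\sin(\pi f)/(\pi f)$, which is also NumPy's `np.sinc`:

```python
import numpy as np

dt = 1e-3
tau = np.arange(-1.0, 1.0, dt)
R = 1.0 - np.abs(tau)                      # triangular autocorrelation Lambda(tau)
f = np.array([0.0, 0.5, 1.0, 1.5, 2.0])    # a few test frequencies
# Riemann-sum approximation of S_X(f) = integral R(tau) e^{-j 2 pi f tau} dtau
S = np.array([(R * np.exp(-2j * np.pi * fk * tau)).sum() * dt for fk in f]).real
print(np.round(S, 4))
print(np.round(np.sinc(f) ** 2, 4))        # should closely match
```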

Problem

Let $X(t)$ be a random process with mean function $\mu_X(t)$ and autocorrelation function $R_X(s,t)$ ($X(t)$ is not necessarily a WSS process). Let $Y(t)$ be given by
\begin{align*}
Y(t) = h(t)\ast X(t),
\end{align*}
where $h(t)$ is the impulse response of the system. Show that


  1. $\mu_Y(t)=\mu_X(t) \ast h(t)$.
  2. $R_{XY}(t_1,t_2)=h(t_2) \ast R_X(t_1,t_2)=\int_{-\infty}^{\infty} h(\alpha) R_X(t_1,t_2-\alpha) \; d\alpha$.
  • Solution
      1. We have
\begin{align*}
\mu_Y(t)=E[Y(t)] &= E\left[\int_{-\infty}^{\infty} h(\alpha)X(t-\alpha) \; d\alpha\right] \\
&= \int_{-\infty}^{\infty} h(\alpha)E[X(t-\alpha)] \; d\alpha \\
&= \int_{-\infty}^{\infty} h(\alpha) \mu_X(t-\alpha) \; d\alpha \\
&= \mu_X(t) \ast h(t).
\end{align*}
      2. We have
\begin{align*}
R_{XY}(t_1,t_2)=E[X(t_1)Y(t_2)] &= E\left[X(t_1) \int_{-\infty}^{\infty} h(\alpha)X(t_2-\alpha) \; d\alpha\right] \\
&= E\left[ \int_{-\infty}^{\infty} h(\alpha)X(t_1)X(t_2-\alpha) \; d\alpha\right] \\
&= \int_{-\infty}^{\infty} h(\alpha)E[X(t_1)X(t_2-\alpha)] \; d\alpha \\
&= \int_{-\infty}^{\infty} h(\alpha) R_X(t_1,t_2-\alpha) \; d\alpha.
\end{align*}
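A discrete-time analogue of part 1 is easy to check numerically: for a constant mean $\mu_X$, the convolution $\mu_X \ast h$ reduces to $\mu_X \sum_n h[n]$, the DC gain of the filter. The sketch below uses a hypothetical FIR filter and is only an illustration of the identity:

```python
import numpy as np

rng = np.random.default_rng(0)
mu_X = 3.0
h = np.array([0.5, 0.3, 0.2])                  # hypothetical FIR impulse response
X = mu_X + rng.normal(0.0, 1.0, size=200_000)  # process with constant mean mu_X
Y = np.convolve(X, h, mode='valid')            # Y = h * X (discrete convolution)

print(Y.mean())                                # ~ mu_X * sum(h) = 3.0
print(mu_X * h.sum())
```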



Problem

Prove the third part of Theorem 10.2: Let $X(t)$ be a WSS random process and $Y(t)$ be given by
\begin{align*}
Y(t) = h(t)\ast X(t),
\end{align*}
where $h(t)$ is the impulse response of the system. Show that
\begin{align*}
R_{Y}(s,t)=R_Y(s-t) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\alpha) h(\beta) R_X(s-t-\alpha+\beta) \; d\alpha \, d\beta.
\end{align*}
Also, show that we can rewrite the above integral as $R_{Y}(\tau)=h(\tau) \ast h(-\tau) \ast R_X(\tau)$.

  • Solution
    • \begin{align*}
R_{Y}(s,t) &= E[Y(s)Y(t)] \\
&= E\left[ \int_{-\infty}^{\infty} h(\alpha)X(s-\alpha) \; d\alpha \int_{-\infty}^{\infty} h(\beta)X(t-\beta) \; d\beta\right] \\
&= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\alpha) h(\beta) E[X(s-\alpha)X(t-\beta)] \; d\alpha \; d\beta \\
&= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\alpha) h(\beta) R_X(s-t-\alpha+\beta) \; d\alpha \; d\beta.
\end{align*}
We now compute $h(\tau) \ast h(-\tau) \ast R_X(\tau)$. First, let $g(\tau)=h(\tau) \ast h(-\tau)$. Note that
\begin{align*}
g(\tau) &= h(\tau) \ast h(-\tau) \\
&= \int_{-\infty}^{\infty} h(\alpha)h(\alpha-\tau) \; d\alpha.
\end{align*}
Thus, we have
\begin{align*}
g(\tau) \ast R_X(\tau) &= \int_{-\infty}^{\infty} g(\theta) R_X(\theta-\tau) \; d\theta \\
&= \int_{-\infty}^{\infty} \left[\int_{-\infty}^{\infty} h(\alpha)h(\alpha-\theta) \; d\alpha \right] R_X(\theta-\tau) \; d\theta \\
&= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\alpha)h(\alpha-\theta) R_X(\theta-\tau) \; d\alpha \; d\theta \\
&= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\alpha)h(\beta) R_X(\alpha-\beta-\tau) \; d\alpha \; d\beta \quad \big(\textrm{letting } \beta=\alpha-\theta\big) \\
&= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\alpha)h(\beta) R_X(\tau-\alpha+\beta) \; d\alpha \; d\beta \quad \big(\textrm{since } R_X(-\tau)=R_X(\tau)\big).
\end{align*}
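The identity can also be sanity-checked in discrete time, where all three convolutions become finite sums. The filter and autocorrelation sequence below are hypothetical toy choices:

```python
import numpy as np

# Toy discrete-time check of R_Y = h * h(-) * R_X (all sequences hypothetical).
h = np.array([1.0, 0.5, 0.25])         # causal FIR impulse response
K = 10
k = np.arange(-K, K + 1)
RX = 0.5 ** np.abs(k)                  # a valid even autocorrelation sequence

g = np.convolve(h, h[::-1])            # g = h * h(-), lags -(len(h)-1)..(len(h)-1)
RY = np.convolve(g, RX)                # R_Y = g * R_X

# Direct double sum at lag 0:  sum_a sum_b h[a] h[b] R_X(b - a)
direct = sum(h[a] * h[b] * 0.5 ** abs(b - a)
             for a in range(len(h)) for b in range(len(h)))
print(RY[len(h) - 1 + K], direct)      # both 2.0625
```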

Problem

Let $X(t)$ be a WSS random process. Assuming that $S_X(f)$ is continuous at $f_1$, show that $S_X(f_1) \geq 0$.

  • Solution
    • Let $f_1 \in \mathbb{R}$. Suppose that $X(t)$ goes through an LTI system with the following transfer function:
\begin{align*}
H(f)=\left\{ \begin{array}{l l} 1 & \quad f_1 < |f| < f_1+\Delta \\ 0 & \quad \text{otherwise} \end{array} \right.
\end{align*}
where $\Delta$ is chosen to be very small. The PSD of $Y(t)$ is given by
\begin{align*}
S_Y(f)=S_X(f) |H(f)|^2=\left\{ \begin{array}{l l} S_X(f) & \quad f_1 < |f| < f_1+\Delta \\ 0 & \quad \text{otherwise} \end{array} \right.
\end{align*}
Thus, the power in $Y(t)$ is
\begin{align*}
E[Y(t)^2] &= \int_{-\infty}^{\infty} S_Y(f) \; df \\
&= 2\int_{f_1}^{f_1+\Delta} S_X(f) \; df \\
&\approx 2 \Delta S_X(f_1).
\end{align*}
Since $E[Y(t)^2] \geq 0$, we conclude that $S_X(f_1) \geq 0$.


Problem


Let $X(t)$ be a white Gaussian noise with $S_X(f)=\frac{N_0}{2}$. Assume that $X(t)$ is input to an LTI system with
\begin{align*}
h(t)=e^{-t}u(t).
\end{align*}
Let $Y(t)$ be the output.

  1. Find $S_Y(f)$.
  2. Find $R_Y(\tau)$.
  3. Find $E[Y(t)^2]$.
  • Solution
    • First, note that
\begin{align*}
H(f) &= \mathcal{F} \{h(t)\} \\
&= \frac{1}{1+j2 \pi f}.
\end{align*}
      1. To find $S_Y(f)$, we can write
\begin{align*}
S_Y(f) &= S_X(f) |H(f)|^2 \\
&= \frac{N_0/2}{1+(2 \pi f)^2}.
\end{align*}
      2. To find $R_Y(\tau)$, we can write
\begin{align*}
R_Y(\tau) &= \mathcal{F}^{-1}\{S_Y(f)\} \\
&= \frac{N_0}{4}e^{-|\tau|}.
\end{align*}
      3. We have
\begin{align*}
E[Y(t)^2] &= R_Y(0) \\
&= \frac{N_0}{4}.
\end{align*}
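These results can be checked approximately by simulation: a discrete stand-in for white noise on a fine grid (i.i.d. samples with per-sample variance $(N_0/2)/\Delta t$), filtered by a discretized $h(t)=e^{-t}u(t)$, should have output power near $N_0/4$. A rough sketch, where the grid step, truncation length, and seed are arbitrary choices:

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)
N0, dt = 2.0, 1e-3
# Discrete approximation of white noise with PSD N0/2: i.i.d. samples
# with variance (N0/2)/dt on a grid of step dt.
X = rng.normal(0.0, np.sqrt(N0 / (2 * dt)), size=500_000)
t = np.arange(0.0, 10.0, dt)              # truncate h after 10 time constants
h = np.exp(-t)                             # h(t) = e^{-t} u(t)
Y = dt * fftconvolve(X, h, mode='valid')   # Y(t) = integral h(a) X(t-a) da

print(Y.var(), N0 / 4)                     # both should be close to 0.5
```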
