Random Processes MCQ || Random Processes Questions and Answers

1. A Gaussian process is a

  1. Wide sense stationary process
  2. Strict sense stationary process
  3. Both 1 and 2
  4. None of the mentioned

Answer.3. Both 1 and 2

Explanation

  • In probability theory and statistics, a Gaussian process is a stochastic process (a collection of random variables indexed by time or space), such that every finite collection of those random variables has a multivariate normal distribution, i.e. every finite linear combination of them is normally distributed.
  • A Gaussian stochastic process is strict-sense stationary if, and only if, it is wide-sense stationary.
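
As an illustrative sketch (not part of the original explanation), the defining property can be checked numerically: samples of a Gaussian process taken at any finite set of time instants are jointly normal, so any linear combination of them is again Gaussian. The squared-exponential covariance and the time points below are arbitrary choices.

```python
import numpy as np

# Hypothetical stationary (squared-exponential) covariance of a zero-mean Gaussian process
def cov(t1, t2, length_scale=1.0):
    return np.exp(-0.5 * ((t1 - t2) / length_scale) ** 2)

t = np.array([0.0, 0.5, 1.3, 2.0])        # any finite collection of time instants
K = cov(t[:, None], t[None, :])           # covariance matrix of those samples

rng = np.random.default_rng(0)
samples = rng.multivariate_normal(np.zeros(len(t)), K, size=5000)

# A linear combination of the samples should again be (approximately) Gaussian
y = samples @ np.array([1.0, -2.0, 0.5, 3.0])
z = (y - y.mean()) / y.std()
print("mean:", round(y.mean(), 3), " excess kurtosis:", round(np.mean(z**4) - 3, 3))
```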

 

2. The random variable

$Y = \mathop \smallint \limits_{ - \infty }^\infty W\left( t \right)\phi \left( t \right)dt,$ where

$\phi \left( t \right)=\;\left\{ {\begin{array}{*{20}{c}} {1;}&{5 \le t \le 7}\\ {0;}&{Otherwise} \end{array}} \right\}$

And W(t) is a real white Gaussian noise process with two-sided power spectral density SW (f) = 3 W/Hz, for all f. The variance of Y is:

  1. 6
  2. 10
  3. 3
  4. 14

Answer.1. 6

Explanation

The white Gaussian noise process W(t) has zero mean.

For a zero-mean random variable ‘X’, the variance reduces to:

$E\left[ {{X^2}} \right] = E\left[ {X \cdot X} \right]$

Also, the autocorrelation function and the power spectral density form a Fourier transform pair, i.e.

${R_w}\left( \tau \right)\mathop \leftrightarrow \limits^{FT} {S_w}\left( f \right)$

Application:

Sw(f) = 3 Watt/Hz

Since the power spectral density is constant, the autocorrelation function will be an impulse function, i.e.

${S_w}\left( f \right)\;\mathop \leftrightarrow \limits^{IFT} {R_w}\left( \tau \right)$

∴ Rw(τ) = 3 δ (τ)

Also, Rw(t2 − t1) = 3δ(t2 − t1)      … (1)

Now,

$E\left[ Y \right] = E\left[ {\mathop \smallint \nolimits_{ - \infty }^\infty w\left( t \right) \cdot \phi \left( t \right)dt} \right]$

$E\left[ Y \right] = \mathop \smallint \nolimits_{ - \infty }^\infty E\left[ {w\left( t \right)} \right]\phi \left( t \right)\;dt$

Since W(t) is a real white Gaussian noise process:

E[w(t)] = 0

∴ E[Y] = 0

Now, the variance of Y can be written as:

$E\left[ {{Y^2}} \right] = E\left[ {Y \cdot Y} \right]$

Substituting the respective expressions for Y, we get:

$ = E\left[ {\mathop \smallint \nolimits_{ - \infty }^\infty w\left( {{t_1}} \right)\phi \left( {{t_1}} \right)d{t_1}\mathop \smallint \nolimits_{ - \infty }^\infty w\left( {{t_2}} \right)\phi \left( {{t_2}} \right)d{t_2}} \right]$

$ = E\left[ {\mathop \smallint \nolimits_{ - \infty }^\infty \mathop \smallint \nolimits_{ - \infty }^\infty w\left( {{t_1}} \right)w\left( {{t_2}} \right)\phi \left( {{t_1}} \right)\phi \left( {{t_2}} \right)d{t_1}d{t_2}} \right]$

$ = \mathop \smallint \nolimits_{ - \infty }^\infty \mathop \smallint \nolimits_{ - \infty }^\infty E\left[ {w\left( {{t_1}} \right)w\left( {{t_2}} \right)} \right] \cdot \phi \left( {{t_1}} \right)\phi \left( {{t_2}} \right)d{t_1}d{t_2}$

$ = \mathop \smallint \nolimits_{ - \infty }^\infty \mathop \smallint \nolimits_{ - \infty }^\infty {R_w}\left( {{t_2} - {t_1}} \right) \cdot \phi \left( {{t_1}} \right)\phi \left( {{t_2}} \right)d{t_1}d{t_2}$

From Equation (1), we can write:

$E\left[ {{Y^2}} \right] = \mathop \smallint \nolimits_{ - \infty }^\infty \mathop \smallint \nolimits_{ - \infty }^\infty 3\delta \left( {{t_2} - {t_1}} \right)\phi \left( {{t_1}} \right)\phi \left( {{t_2}} \right)d{t_1}d{t_2}$

By the sifting property of the delta function, the integral over t2 picks out the value t2 = t1 = t, so the double integral collapses to a single integral:

$E\left[ {{Y^2}} \right] = 3\mathop \smallint \nolimits_{ - \infty }^\infty {\phi ^2}\left( t \right)dt = 3\mathop \smallint \nolimits_5^7 1 \cdot dt$

= 3 × 2 = 6
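
As a hedged cross-check of this result (my own sketch, with a hypothetical sampling step dt), continuous white noise of two-sided PSD 3 W/Hz can be approximated by independent Gaussian samples of variance 3/dt; a Monte Carlo estimate of Var[Y] then comes out close to 6.

```python
import numpy as np

S0, dt = 3.0, 0.01                        # two-sided PSD (W/Hz) and a hypothetical time step
t = np.arange(0.0, 10.0, dt)              # time grid covering the window 5 <= t <= 7
phi = ((t >= 5) & (t <= 7)).astype(float)

rng = np.random.default_rng(1)
n_trials = 5000
# Discrete approximation of white noise: independent samples with variance S0/dt
W = rng.normal(0.0, np.sqrt(S0 / dt), size=(n_trials, t.size))
Y = (W @ phi) * dt                        # Y = ∫ W(t) φ(t) dt as a Riemann sum

print("Var[Y] ≈", Y.var())                # expected value: S0 × (7 − 5) = 6
```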

 

3. Consider a random process X(t) = √2 sin(2πt + φ), where the random phase φ is uniformly distributed in the interval [0, 2π]. The auto-correlation R[t1, t2] is

  1. cos (2π (t1 + t2))
  2. sin (2π (t1 – t2))
  3. sin (2π (t1 + t2))
  4. cos (2π (t1 – t2))

Answer.4. cos (2π (t1 – t2))

Explanation

We have the random process as:

$X\left( t \right) = \sqrt 2 \sin \left( {2\pi t + \phi } \right)$

Where random phase ϕ is uniformly distributed in the interval [0, 2π]. So, we obtain the probability density function as

${f_\phi }\left( \phi \right) = \frac{1}{{2\pi }},\;\;0 \le \phi \le 2\pi $

Therefore, the auto-correlation is given as

$E\left[ {X\left( {{t_1}} \right)X\left( {{t_2}} \right)} \right] = \mathop \smallint \limits_0^{2\pi } X\left( {{t_1}} \right)X\left( {{t_2}} \right){f_\phi }\left( \phi \right)d\phi $

$ = \mathop \smallint \limits_0^{2\pi } \left[ {\sqrt 2 \sin \left( {2\pi {t_1} + \phi } \right)} \right]\left[ {\sqrt 2 \sin \left( {2\pi {t_2} + \phi } \right)} \right]\frac{1}{{2\pi }}d\phi $

$= \frac{2}{{2\pi }}\mathop \smallint \limits_0^{2\pi } \sin \left( {2\pi {t_1} + \phi } \right)\sin \left( {2\pi {t_2} + \phi } \right)d\phi $

Using the trigonometric relation,

sin A sin B = 0.5[cos(A − B) − cos(A + B)]

We get E[X(t1)X(t2)] as:

$ = \frac{1}{{2\pi }}\mathop \smallint \limits_0^{2\pi } \left\{ {\cos \left[ {2\pi \left( {{t_1} - {t_2}} \right)} \right] - \cos \left[ {2\pi \left( {{t_1} + {t_2}} \right) + 2\phi } \right]} \right\}d\phi $

The second cosine term integrates to zero over the full period, leaving

$ = \frac{1}{{2\pi }}\cos \left[ {2\pi \left( {{t_1} - {t_2}} \right)} \right] \times 2\pi $

= cos [2π (t1 – t2)]
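
A quick numerical check of this answer (t1 and t2 are arbitrary choices): averaging X(t1)X(t2) over a uniformly distributed phase reproduces cos(2π(t1 − t2)).

```python
import numpy as np

rng = np.random.default_rng(2)
phi = rng.uniform(0.0, 2 * np.pi, size=1_000_000)   # uniformly distributed random phase

t1, t2 = 0.30, 0.85                                  # arbitrary time instants
X = lambda t: np.sqrt(2) * np.sin(2 * np.pi * t + phi)

R_est = np.mean(X(t1) * X(t2))                       # Monte Carlo estimate of E[X(t1)X(t2)]
print(R_est, "vs", np.cos(2 * np.pi * (t1 - t2)))    # the two values should agree closely
```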

 

4. Stochastic processes are

  1. Strict sense stationary process
  2. Wide sense stationary process
  3. Both 1 and 2
  4. None of the mentioned

Answer.2. Wide sense stationary process

Explanation

A stochastic process, also known as a random process, is a collection of random variables indexed by some mathematical set; each random variable of the process is uniquely associated with an element of that index set.

A stochastic process is a wide-sense stationary process. A random process is called weak-sense stationary or wide-sense stationary (WSS) if its mean function and its correlation function do not change with shifts in time.
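
In symbols, the two WSS conditions quoted above are a constant mean and an autocorrelation that depends only on the time difference:

$E\left[ {X\left( t \right)} \right] = {m_X}\;\;{\rm{for\;all\;}}t$

$E\left[ {X\left( {{t_1}} \right)X\left( {{t_2}} \right)} \right] = {R_X}\left( {{t_1} - {t_2}} \right)$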

 

5. One of the following types of noise assumes importance at high frequencies:

  1. Thermal noise
  2. Shot noise
  3. Transit-time noise
  4. Flicker noise

Answer.3. Transit-time noise

Explanation

High-frequency noise is also known as transit-time noise. It is observed in semiconductor devices when the transit time of a charge carrier crossing a junction becomes comparable to the time period of the signal.

 

6. A random process X(t) is called ‘white noise’ if the power spectral density is equal to

  1. η/8
  2. η/2
  3. 3η/4
  4. η

Answer.2. η/2

Explanation

The power spectral density is basically the Fourier transform of the autocorrelation function of the power signal, i.e.

Sx(f) = F.T.{Rx(τ)}

Also, the inverse Fourier transform of a constant function is a unit impulse.

Application:

The power spectral density of white noise is given by:

Sx(f) = η/2 for all frequencies ‘f’.

Now, the auto-correlation is the inverse Fourier transform (IFT) of the power spectral density function:

$\rm{S_X\left(f\right)\mathop \to \limits^{IFT}R_x\left(\tau\right)}$

The inverse Fourier transform of the power spectrum of white noise will be an impulse as shown:

$\rm{R_x\left(\tau\right)=\frac{\eta}{2}\delta\left(\tau\right)}$
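
As an illustrative sketch (the noise level η and sampling rate below are arbitrary choices), the sample autocorrelation of discretely sampled white Gaussian noise is concentrated at lag zero, consistent with Rx(τ) = (η/2)δ(τ):

```python
import numpy as np

rng = np.random.default_rng(3)
eta = 4.0                        # hypothetical noise level, so the two-sided PSD is eta/2 = 2
fs = 1000.0                      # sampling rate (Hz); sampled "white" noise is band-limited to fs/2
n = 200_000

# Sampled white noise: per-sample variance = (eta/2) * fs
x = rng.normal(0.0, np.sqrt(eta / 2 * fs), size=n)

# Sample autocorrelation at a few lags (in samples)
for lag in range(4):
    r = np.mean(x[: n - lag] * x[lag:])
    print(f"lag {lag}: {r:.1f}")   # large at lag 0 (≈ eta/2 × fs), ≈ 0 elsewhere
```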

 

7. In a Poisson distribution, the mean is ______ the variance.

  1. Greater than
  2. Lesser than
  3. Equal to
  4. Does not depend on

Answer.3. Equal to

Explanation

In statistics, a Poisson distribution is a probability distribution that is used to show how many times an event is likely to occur over a specified period. In other words, it is a count distribution. In a Poisson distribution, the mean is equal to the variance: the distribution has a particularly simple mean, E(X) = λ, and variance, V(X) = λ.
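
A one-line numerical check (the rate λ below is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(4)
lam = 3.7                                   # arbitrary rate parameter λ
samples = rng.poisson(lam, size=1_000_000)

print(samples.mean(), samples.var())        # both should be close to λ = 3.7
```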

 

8. The response of a Gaussian random process applied to a stable linear system is:

1. A Gaussian random process

2. Not a Gaussian random process

3. Completely specified by its mean and auto-covariance functions

Which of the above statements is/are correct?

  1. 1 only
  2. 2 only
  3. 2 and 3
  4. 1 and 3

Answer.4. 1 and 3

Explanation

An important property of Gaussian random processes is that their probability density function is completely determined by their mean and covariance, i.e.

${f_X}\left( x \right) = \frac{1}{{\sqrt {2\pi {\sigma ^2}} }}{e^{ - \frac{{{{\left( {x - \mu } \right)}^2}}}{{2{\sigma ^2}}}}}$

Where μ = mean

σ = Standard deviation.

Passing a random process through a stable linear system is a linear operation: the output is the convolution of the input with the system's impulse response,

$y\left( t \right) = \mathop \smallint \nolimits_{ - \infty }^\infty h\left( \lambda \right)x\left( {t - \lambda } \right)d\lambda $

i.e. a weighted combination of samples of the input. Since every finite linear combination of samples of a Gaussian process is jointly Gaussian, the output is also a Gaussian process.

In the frequency domain, if the input x(t) is Gaussian with spectrum X(ω), the output spectrum is:

Y(ω) = X(ω) × H(ω)

The system only scales the magnitude and shifts the phase of each spectral component, so the Gaussian nature of the input is preserved at the output.

So, the output will be Gaussian

Observations:

1) The output phase and magnitude will be varied, but nature will remain the same as that of the input.

2) So, if the input is Gaussian, the output will also be Gaussian.
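
A minimal sketch of this property (the FIR filter coefficients are arbitrary): white Gaussian noise passed through a stable linear filter still shows near-zero excess kurtosis, as expected for a Gaussian output.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(0.0, 1.0, size=500_000)          # Gaussian input process

h = np.array([0.25, 0.5, -0.3, 0.1])            # arbitrary stable FIR impulse response
y = np.convolve(x, h, mode="valid")             # output of the linear system

def excess_kurtosis(z):
    z = (z - z.mean()) / z.std()
    return np.mean(z ** 4) - 3.0                # ≈ 0 for a Gaussian distribution

print("input :", excess_kurtosis(x))
print("output:", excess_kurtosis(y))            # also ≈ 0: the output stays Gaussian
```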

 

9. The autocorrelation function of the sinusoidal signal x(t) = A cos (ω0 t + ϕ) is:

  1. R(τ) = A2 × cos (ω0 τ)
  2. $R\left( \tau \right) = \frac{{{A^2}\; \times \;\cos \left( {{\omega _0}\tau } \right)}}{2}$
  3. $R\left( \tau \right) = \frac{{{A^2}\; \times \;\cos \left( {{\omega _0}\tau } \right)}}{4}$
  4. $R\left( \tau \right) = \frac{{{A^2}\; \times \;\cosh \left( {{\omega _0}\tau } \right)}}{2}$

Answer.2. $R\left( \tau \right) = \frac{{{A^2}\; \times \;\cos \left( {{\omega _0}\tau } \right)}}{2}$

Explanation

The autocorrelation function R(τ) of a power signal is given as:

$R\left( \tau \right) = \mathop {\lim }\limits_{T \to \infty } \frac{1}{T}\mathop \smallint \nolimits_{ - T/2}^{T/2} x\left( t \right)x\left( {t - \tau } \right)dt$

Calculation:

Given:

x(t) = A cos (ω0t + ϕ)

$R\left( \tau \right) = \mathop {\lim }\limits_{T \to \infty } \frac{1}{T}\mathop \smallint \nolimits_{ - T/2}^{T/2} x\left( t \right) \cdot x\left( {t - \tau } \right)dt$

$ = \mathop {\lim }\limits_{T \to \infty } \frac{1}{T}\mathop \smallint \nolimits_{ - T/2}^{T/2} \left[ {A\cos \left( {{\omega _0}t + \phi } \right) \cdot A\cos \left( {{\omega _0}\left( {t - \tau } \right) + \phi } \right)} \right]dt$

$ = \mathop {\lim }\limits_{T \to \infty } \frac{{{A^2}}}{{2T}}\mathop \smallint \nolimits_{ - T/2}^{T/2} \left[ {\cos \left( {2{\omega _0}t - {\omega _0}\tau + 2\phi } \right) + \cos {\omega _0}\tau } \right]dt$

$ = \mathop {\lim }\limits_{T \to \infty } \frac{{{A^2}}}{{2T}}\cos {\omega _0}\tau \mathop \smallint \nolimits_{ - T/2}^{T/2} dt\;\;\;\;\left( {\because \mathop \smallint \nolimits_{ - T/2}^{T/2} \cos \left( {2{\omega _0}t - {\omega _0}\tau + 2\phi } \right)dt \to 0} \right)$

$ = \mathop {\lim }\limits_{T \to \infty } \frac{{{A^2}}}{{2T}}\cos \left( {{\omega _0}\tau } \right) \cdot T$

$R\left( \tau \right) = \frac{{{A^2}}}{2}\cos {\omega _0}\tau $
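
A numerical spot-check of this result (A, ω0, ϕ and the lag τ are arbitrary choices): the time average of x(t)x(t − τ) over a long window approaches (A²/2)cos(ω0τ).

```python
import numpy as np

A, w0, ph = 2.0, 2 * np.pi * 5.0, 0.7       # arbitrary amplitude, angular frequency, phase
tau = 0.03                                  # arbitrary lag (s)

dt = 1e-4
t = np.arange(0.0, 200.0, dt)               # long averaging window approximates T → ∞
x = lambda u: A * np.cos(w0 * u + ph)

R_est = np.mean(x(t) * x(t - tau))          # time-average autocorrelation estimate
print(R_est, "vs", A**2 / 2 * np.cos(w0 * tau))
```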

 

10. Binomial distribution deals with

  1. Continuous random variable
  2. Discrete random variable
  3. Continuous & Discrete random variable
  4. None of the mentioned

Answer.2. Discrete random variable

Explanation

The binomial distribution summarizes the number of successes in a fixed number of independent trials or observations, where each trial has the same probability of success.

The binomial distribution determines the probability of observing a specified number of successful outcomes in a specified number of trials.

The binomial distribution is a special discrete distribution where there are two distinct complementary outcomes, a “success” and a “failure”.
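
For illustration (n and p below are arbitrary), the binomial distribution takes only the discrete values 0, 1, …, n:

```python
import numpy as np

rng = np.random.default_rng(6)
n, p = 10, 0.3                              # arbitrary number of trials and success probability
samples = rng.binomial(n, p, size=100_000)

print(np.unique(samples))                   # only integer counts between 0 and n occur
print(samples.mean(), "≈ n·p =", n * p)     # mean of the binomial distribution
```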

 

11. The noise caused by random variations in the arrival of electrons or holes at the output electrode of an amplifying device is called:

  1. White noise
  2. Flicker
  3. Shot Noise
  4. Transit-time noise

Answer.3. Shot Noise

Explanation

Shot noise:

  • It is a type of noise that appears in all amplifying & active devices.
  • It is associated with a current flow
  • The noise caused by random variations in the arrival of electrons or holes at the output electrode of an amplifying device is called shot noise.
  • In semiconductor devices, shot noise is due to the random variation in the diffusion of minority carriers.
  • Current is a flow of carriers each of which carries a finite amount of charge.
