Probability and Random Variable MCQ || Random Variables Questions and Answers

11. A power signal x(t) with a constant power spectral density K is applied to a low-pass RC filter. The mean square value of the output will be:

  1. 2RC/K
  2. K/RC
  3. RC/K
  4. K/2RC

Answer.4. K/2RC

Explanation

The frequency response of the RC filter is:

$\rm H\left( ω \right) = \frac{1}{{1 + jω RC}}$

The power spectral density is given as:

SXX(ω) = K

The output power spectral density is related to the input power spectral density by the following relation:

$ {S_{YY}}\left( ω \right) = {\left| {H\left( ω \right)} \right|^2}{S_{XX}}\left( ω \right)$

${S_{YY}}\left( ω \right) = \frac{1}{{1 + {{\left( {ω RC} \right)}^2}}}.K$

${S_{YY}}\left( ω \right) = K.\frac{1}{{2RC}}.\frac{{2\left[ {1/RC} \right]}}{{{ω ^2} + {{\left[ {1/\left( {RC} \right)} \right]}^2}}} $

Also, the autocorrelation and the power spectral density form a Fourier transform pair, i.e.

$\rm {R_{YY}}\left( \tau \right)\mathop \leftrightarrow \limits^{FT} {S_{YY}}\left( ω \right)$

Thus, using the relation:

$\rm {e^{ - a\left| t \right|}}\mathop \leftrightarrow \limits^{FT} \frac{{2a}}{{{a^2} + {ω ^2}}}$, we have the inverse Fourier transform of $\rm {S_{YY}}\left( ω \right)$ as:

$\rm {R_{YY}}\left( \tau \right) =K.\frac{1}{{2RC}}.{e^{ - \frac{{\left| \tau \right|}}{{RC}}}}$

Now, the average power $\rm E\left[ {{Y^2}\left( t \right)} \right]$ will be:

$\rm E\left[ {{Y^2}\left( t \right)} \right] = {R_{YY}}\left( 0 \right) = K.\frac{1}{{2RC}}$

$\therefore \rm E\left[ {{Y^2}\left( t \right)} \right] = \frac{K}{{2RC}}$
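As a numerical sanity check, the mean-square value can be recovered by integrating the output PSD directly (a minimal sketch with hypothetical values K = 2 and RC = 0.5; the problem keeps both symbolic):

```python
import numpy as np

# Hypothetical values (the problem keeps K and RC symbolic).
K, RC = 2.0, 0.5

# Output PSD: S_YY(w) = |H(w)|^2 * K = K / (1 + (w*RC)^2)
w = np.linspace(-2000.0, 2000.0, 2_000_001)
dw = w[1] - w[0]
S_yy = K / (1.0 + (w * RC) ** 2)

# Mean square value = average power = (1/2*pi) * integral of S_YY(w) dw
power = float(S_yy.sum() * dw / (2.0 * np.pi))

print(round(power, 2))  # 2.0, i.e. K/(2*RC)
```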

 

12. Let X be a real-valued random variable with E[X] and E[X2] denoting the mean values of X and X2, respectively. The relation which always holds is

  1. (E[X])2 > E(X2)
  2. E(X2) ≥ (E[X])2
  3. E[X2] = (E[X])2
  4. E[X2] > (E[X])2

Answer.2. E(X2) ≥ (E[X])2

Explanation

E[X] denotes the mean value of ‘X’

E[X2] denotes the mean value of X2

Also, the variance of the random variable X is given as:

Var(X) = E(X2) – [E(X)]2

Note: Variance is always non-negative, i.e.

E[X2] – [E(X)]2 ≥ 0

∴ E[X2] ≥ (E[X])2
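The inequality is easy to verify empirically for any sample data (a small sketch using Python's standard library; the distribution chosen is arbitrary):

```python
import random
import statistics

# Empirical check on an arbitrary sample (any distribution will do).
random.seed(0)
xs = [random.uniform(-5, 5) for _ in range(10_000)]

mean_x  = statistics.fmean(xs)                  # estimate of E[X]
mean_x2 = statistics.fmean(x * x for x in xs)   # estimate of E[X^2]

# E[X^2] - (E[X])^2 is the variance, which is never negative.
print(mean_x2 >= mean_x ** 2)  # True
```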

 

13. A digital communication system transmits a block of N bits. The probability of error in decoding a bit is α. The error event of each bit is independent of the error events of the other bits. The received block is declared erroneous if at least one of its bits is decoded wrongly. The probability that the received block is erroneous is

  1. N(1 – α)
  2. αN
  3. 1 – αN
  4. 1 – (1 – α)N

Answer.4. 1 – (1 – α)N

Explanation

Since all the bits are independent of each other, the probability that all the bits are received correctly (PC) will be:

PC = (1 − α) × (1 − α) × … × (1 − α)  (N factors)

PC = (1 − α)N

where α is the probability of error in detection.

Now, the probability that the received block is erroneous will be:

Pe = 1 – PC

Pe = 1 – (1 – α)N
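The result can be checked by direct computation and a quick Monte Carlo simulation (illustrative values α = 0.05 and N = 10 are assumed; the problem keeps both symbolic):

```python
import random

# Illustrative values (the problem keeps alpha and N symbolic).
alpha, N = 0.05, 10

analytic = 1 - (1 - alpha) ** N  # probability of at least one bit error

random.seed(1)
trials = 200_000
errors = sum(
    any(random.random() < alpha for _ in range(N))  # block has >= 1 bad bit
    for _ in range(trials)
)
empirical = errors / trials

print(round(analytic, 3))  # 0.401
```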

 

14. Find the median and mode of the messages received on 9 consecutive days: 15, 11, 9, 5, 18, 4, 14, 13, 18.

  1. 13, 6
  2. 13, 18
  3. 18, 15
  4. 15, 16

Answer.2. 13, 18

Explanation

Arranging the terms in ascending order 4, 5, 9, 11, 13, 14, 15, 18, 18.

Since n = 9 is odd, the median is the (n + 1)/2 = 5th term, which is 13.

Mode = 18, the value that occurs most often (twice).
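The same values follow from Python's statistics module, using the sorted data from the explanation above:

```python
import statistics

# Messages in ascending order, as in the explanation above.
msgs = [4, 5, 9, 11, 13, 14, 15, 18, 18]

print(statistics.median(msgs))  # 13
print(statistics.mode(msgs))    # 18
```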

 

15. Bits 1 and 0 are transmitted with equal probability. At the receiver, the probability density function of the respective received signals for both bits is as shown below. If the detection threshold is 1, the probability of bit error is

[Figure: probability density functions of the received signals for bits 0 and 1]

  1. 1/2
  2. 1/4
  3. 1/16
  4. 1/32

Answer.3. 1/16

Explanation

The received signal is in error if we receive 1 when 0 was transmitted, or receive 0 when 1 was transmitted.

$\therefore P\left( e \right) = P\left( 0 \right)P\left( {\frac{1}{0}} \right) + P\left( 1 \right)P\left( {\frac{0}{1}} \right)$

where,

P(e) = Probability of error

P(0) = Probability of transmission of ‘0’

P(1) = Probability of transmission of ‘1’

P(0/1) = Probability of reception of ‘0’ on the transmission of ‘1’

P(1/0) = Probability of reception of ‘1’ on the transmission of ‘0’

The area under the curve of the probability density function gives the probability.

[Figure: probability density functions of the received signals]

There is no error in the reception of ‘0’.

But there is an error in the reception of ‘1’:

[Figure: error region under the pdf of the received signal for bit ‘1’]

$\begin{array}{l} \therefore P\left( e \right) = P\left( 0 \right)P\left( {\frac{1}{0}} \right) + P\left( 1 \right)P\left( {\frac{0}{1}} \right)\\ \\ = \frac{1}{2} \times 0 + \frac{1}{2} \times \left( {\frac{1}{2} \times 1 \times \frac{1}{4}} \right)\\ \\ = \frac{1}{2} \times \frac{1}{8} \end{array}$

P(e) = 1/16
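The arithmetic can be checked exactly with fractions, assuming (as read from the figure) that the error region under the pdf for bit ‘1’ is a triangle of base 1 and height 1/4:

```python
from fractions import Fraction

# Assumed from the figure: with threshold 1, the pdf of the received signal
# for bit '1' encloses a triangle of base 1 and height 1/4 below the
# threshold, while the pdf for bit '0' lies entirely below it.
p0 = p1 = Fraction(1, 2)                           # equiprobable bits
p_1_given_0 = Fraction(0)                          # '0' never misread
p_0_given_1 = Fraction(1, 2) * 1 * Fraction(1, 4)  # triangle area = 1/8

p_e = p0 * p_1_given_0 + p1 * p_0_given_1
print(p_e)  # 1/16
```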

 

16. What is the probability of getting seven heads in 12 flips of an unbiased coin?

  1. 0.19
  2. 0.49
  3. 0.29
  4. 0.91

Answer.1. 0.19

Explanation

The formula for the binomial distribution is:

$P\left( x \right) = {}^n{C_x}\,{p^x}{q^{n - x}}$

where,

p = probability of success of a trial

q = (1 − p) = probability of failure of a trial

n = total number of trials

x = number of successful trials for the given condition

${}^n{C_x} = \frac{{n!}}{{\left( {n - x} \right)!\,x!}}$ = number of combinations of n things taken x at a time

P(x) = probability of exactly x successes

Given:

number of trials n = 12, probability of success p = 1/2, probability of failure q = 1 − 1/2 = 1/2, and x = 7

$P\left( x \right) = {}^n{C_x}\,{p^x}{q^{n - x}}$

$\begin{array}{l} P(x) = \frac{{12!}}{{\left( {12 – 7} \right)!{\rm{\;}} \times 7!}} \times {\left( {\frac{1}{2}} \right)^7} \times {\left( {\frac{1}{2}} \right)^5}\\ \\ P(x) = \frac{{12 \times 11 \times 10 \times 9 \times 8 \times 7!}}{{\left( {5 \times 4 \times 3 \times 2 \times 1} \right) \times 7!}} \times {\left( {\frac{1}{2}} \right)^7} \times {\left( {\frac{1}{2}} \right)^5} \end{array}$


P(x) = 0.19
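The computation can be verified directly with math.comb:

```python
from math import comb

# P(exactly 7 heads in 12 fair flips) = C(12,7) * (1/2)^12
p = comb(12, 7) * 0.5 ** 12
print(round(p, 2))  # 0.19
```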

 

17. A coin is tossed 4 times. The probability that tails turn up exactly 3 times is ______

  1. 1/2
  2. 1/3
  3. 1/4
  4. 1/6

Answer.3. 1/4

Explanation

p = 0.5 (probability of a tail)

q = 1 − 0.5 = 0.5

n = 4 and X is a binomial variate.

$P\left( {X = x} \right) = {}^n{C_x}\,{p^x}{q^{n - x}}$

$P\left( {X = 3} \right) = {}^4{C_3}{\left( {0.5} \right)^3}{\left( {0.5} \right)^1} = 4 \times \frac{1}{{16}} = \frac{1}{4}$
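Evaluating the binomial formula directly (a quick check):

```python
from math import comb

# P(exactly 3 tails in 4 fair tosses) = C(4,3) * (1/2)^3 * (1/2)^1
p = comb(4, 3) * 0.5 ** 3 * 0.5 ** 1
print(p)  # 0.25
```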

 

18. Which of the following statistical method can be used for a single sample data?

  1. Frequency distribution
  2. Uncertainty distribution
  3. Standard deviation
  4. None of the above

Answer.1. Frequency distribution

Explanation

In statistics, a frequency distribution is a list, table, or graph that displays the frequency of various outcomes in a sample.

The frequency of observation tells you the number of times the observation occurs in the data.

  • Each entry in the table contains the frequency or count of the occurrences of values within a particular group or interval.
  • Discrete data is generated by counting each and every observation.
  • When an observation is repeated, it is counted; the number of times an observation is repeated is called the frequency of that observation.
  • The class limits in discrete data are true class limits; there are no class boundaries in discrete data.
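In code, a frequency distribution of a single sample is simply a value-to-count table (the sample values below are illustrative):

```python
from collections import Counter

# Illustrative single sample; Counter builds the value -> frequency table.
sample = [2, 3, 3, 5, 2, 3, 7, 5, 3]
freq = Counter(sample)

for value, count in sorted(freq.items()):
    print(value, count)
# 2 2
# 3 4
# 5 2
# 7 1
```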

 

19. If E denotes the expectation, the variance of a random variable X is given by?

  1. (E(X))2
  2. E(X2) − (E(X))2
  3. E(X2)
  4. 2E(X)

Answer.2. E(X2) − (E(X))2

Explanation

The variance of a random variable X measures the spread of its values about the mean E(X). It is defined as Var(X) = E[(X − E(X))²], which expands to E(X²) − (E(X))², provided all the expectations exist.
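A quick check of the identity against the library's population variance (the data set is illustrative):

```python
import statistics

# Illustrative data; Var(X) = E(X^2) - (E(X))^2 matches the library's
# population variance.
xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

e_x  = statistics.fmean(xs)                  # E(X)   = 5.0
e_x2 = statistics.fmean(x * x for x in xs)   # E(X^2) = 29.0
var  = e_x2 - e_x ** 2

print(var)                       # 4.0
print(statistics.pvariance(xs))  # 4.0
```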

 

20. Let X ∈ {0,1} and Y ∈ {0,1} be two independent binary random variables. If P(X = 0) = p and P(Y = 0) = q, then P(X + Y ≥ 1) is equal to

  1. pq + (1 – p)(1 – q)
  2. pq
  3. p(1 – q)
  4. 1 – pq

Answer.4. 1 – pq

Explanation

1) P(X = 0) + P(X = 1) = 1

2) If Z = X + Y with X and Y independent, then the pmf of Z is the convolution of the pmfs of X and Y.

Application:

Given:

P(X = 0) = p

∴ P(X = 1) = 1 – p

Also Given: P(Y = 0) = q

∴ P (Y = 1) = 1 – q

P(X + Y ≥ 1) = P (X = 0) . P(Y = 1) + P(X = 1) . P(Y = 0) + P(X = 1) . P(Y = 1)

= p(1 – q) + (1 – p)q + (1 – p)(1 – q) = 1 – pq

Equivalently, P(X + Y ≥ 1) = 1 − P(X = 0, Y = 0) = 1 − pq.
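Enumerating the four outcomes confirms the result (hypothetical values p = 0.3, q = 0.7):

```python
from itertools import product

# Hypothetical values for p = P(X=0) and q = P(Y=0).
p, q = 0.3, 0.7
px = {0: p, 1: 1 - p}
py = {0: q, 1: 1 - q}

# Sum the probabilities of every (x, y) outcome with x + y >= 1.
total = sum(px[x] * py[y]
            for x, y in product((0, 1), repeat=2)
            if x + y >= 1)

print(round(total, 2))      # 0.79
print(round(1 - p * q, 2))  # 0.79
```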

 

21. The spectral density and autocorrelation function of white noise is, respectively;

  1. Delta and Uniform
  2. Uniform and Delta
  3. Gaussian and Uniform
  4. Gaussian and Delta

Answer.2. Uniform and Delta

Explanation

The spectral density of white noise is Uniform and the autocorrelation function of White noise is the Delta function. A white noise process is a random process of random variables that are uncorrelated, have mean zero, and a finite variance.
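This can be seen empirically: the sample autocorrelation of generated white noise is approximately 1 at lag 0 and near zero at every other lag, a discrete approximation of a delta function (a small NumPy sketch):

```python
import numpy as np

# Generate white (uncorrelated, zero-mean) noise and estimate its
# normalized autocorrelation at a few lags.
rng = np.random.default_rng(42)
x = rng.standard_normal(100_000)

def autocorr(x, lag):
    n = len(x) - lag
    return float(np.dot(x[:n], x[lag:]) / (n * x.var()))

print(round(autocorr(x, 0), 2))    # 1.0  (delta-like peak at lag 0)
print(abs(autocorr(x, 5)) < 0.05)  # True (essentially zero elsewhere)
```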

 

22. The random variables X and Y have variances 0.2 and 0.5 respectively. Let Z= 5X − 2Y. The variance of Z is?

  1. 3
  2. 4
  3. 5
  4. 7

Answer.4. 7

Explanation

Var(X) = 0.2, Var(Y) = 0.5

Z = 5X – 2Y

Var(Z) = Var(5X – 2Y)

= Var(5X) + Var(2Y)    (assuming X and Y are uncorrelated)

= 25Var(X) + 4Var(Y) = 25(0.2) + 4(0.5) = 5 + 2

∴ Var(Z) = 7
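A Monte Carlo check with independent X and Y of the given variances (the Gaussian choice is an assumption made purely for illustration):

```python
import random
import statistics

# Simulate independent X with Var 0.2 and Y with Var 0.5 (Gaussian assumed
# for illustration) and check Var(5X - 2Y) = 25*0.2 + 4*0.5 = 7.
random.seed(7)
n = 200_000
xs = [random.gauss(0, 0.2 ** 0.5) for _ in range(n)]
ys = [random.gauss(0, 0.5 ** 0.5) for _ in range(n)]
zs = [5 * x - 2 * y for x, y in zip(xs, ys)]

print(round(statistics.pvariance(zs), 1))  # ~7.0
```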
