Information Theory and Coding MCQ [Free PDF] – Objective Question Answer for Information Theory and Coding Quiz

1. Self-information should be

A. Positive
B. Negative
C. Positive & Negative
D. None of the mentioned

Answer: A

Self-information I(x) = −log p(x) is always non-negative, since 0 < p(x) ≤ 1 makes −log p(x) ≥ 0.
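A minimal sketch of this fact (the function name `self_information` is just an illustrative choice):

```python
import math

def self_information(p: float) -> float:
    """Self-information I(x) = -log2 p(x), measured in bits."""
    return -math.log2(p)

# Since probabilities satisfy 0 < p <= 1, -log2(p) is never negative.
for p in (1.0, 0.5, 0.1, 0.001):
    assert self_information(p) >= 0

print(self_information(0.5))  # 1.0 bit for a fair coin flip
```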

 

2. The unit of average mutual information is

A. Bits
B. Bytes
C. Bits per symbol
D. Bytes per symbol

Answer: A

The unit of average mutual information is bits.

 

3. When the probability of error during transmission is 0.5, it indicates that

A. Channel is very noisy
B. No information is received
C. Channel is very noisy & No information is received
D. None of the mentioned

Answer: C

When the probability of error is 0.5, the output is statistically independent of the input: the channel is maximally noisy, its capacity drops to zero, and no information is received.
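This can be checked numerically with the standard capacity formula for a binary symmetric channel, C = 1 − H₂(p) (a sketch; function names are illustrative):

```python
import math

def binary_entropy(p: float) -> float:
    """H2(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

# At p = 0.5 the capacity is exactly 0 bits: nothing gets through.
print(bsc_capacity(0.5))  # 0.0
```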

 

4. Binary Huffman coding is a

A. Prefix condition code
B. Suffix condition code
C. Prefix & Suffix condition code
D. None of the mentioned

Answer: A

Binary Huffman coding is a prefix condition code.

 

5. The event with minimum probability has the least number of bits.

A. True
B. False

Answer: B

In binary Huffman coding the event with the maximum probability is assigned the fewest bits; less probable events receive longer codewords.
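A small sketch of binary Huffman coding illustrates both answers above: the result is a prefix condition code, and the most probable symbol gets the shortest codeword (the `huffman` helper here is an illustrative implementation, not a library API):

```python
import heapq
from itertools import count

def huffman(probs: dict) -> dict:
    """Build a binary Huffman code; returns {symbol: codeword}."""
    tiebreak = count()  # unique counter so the heap never compares dicts
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least probable subtrees.
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

code = huffman({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10})
# "a" (most probable) gets the shortest codeword, and no codeword
# is a prefix of another -- the prefix condition.
print(code)
```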


 

6. The method of converting a word to a stream of bits is called as

A. Binary coding
B. Source coding
C. Bit coding
D. Cipher coding

Answer: B

Source coding is the method of converting a source word into a stream of bits, that is, 0’s and 1’s.

 

7. When the base of the logarithm is 2, then the unit of measure of information is

A. Bits
B. Bytes
C. Nats
D. None of the mentioned

Answer: A

When the base of the logarithm is 2, the unit of measure of information is the bit; with the natural logarithm (base e) the unit is the nat.
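A quick sketch of the bit/nat relationship (1 bit = ln 2 ≈ 0.693 nats):

```python
import math

p = 0.5
bits = -math.log2(p)  # base-2 logarithm -> information in bits
nats = -math.log(p)   # natural logarithm -> information in nats

# Converting bits to nats just multiplies by ln 2.
assert math.isclose(nats, bits * math.log(2))
print(bits, nats)  # 1.0 0.6931...
```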

 

8. When X and Y are statistically independent, then I(x,y) is

A. 1
B. 0
C. Ln 2
D. Cannot be determined

Answer: B

When X and Y are statistically independent, p(x,y) = p(x)p(y), so the mutual information I(x,y) = log[p(x,y)/(p(x)p(y))] = log 1 = 0.
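A minimal numerical check of this, using the definition of average mutual information over a joint distribution (the `mutual_information` helper is an illustrative sketch):

```python
import math

def mutual_information(joint, px, py):
    """I(X;Y) = sum over (x,y) of p(x,y) * log2(p(x,y) / (p(x)*p(y)))."""
    total = 0.0
    for (x, y), pxy in joint.items():
        if pxy > 0:
            total += pxy * math.log2(pxy / (px[x] * py[y]))
    return total

px = {0: 0.3, 1: 0.7}
py = {0: 0.6, 1: 0.4}
# Independence: the joint distribution factors as p(x)*p(y).
joint = {(x, y): px[x] * py[y] for x in px for y in py}

# Every log term is log2(1) = 0, so the mutual information is 0.
print(mutual_information(joint, px, py))  # 0.0
```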

 

9. The self-information of a random variable is

A. 0
B. 1
C. Infinite
D. Cannot be determined

Answer: C

For a continuous random variable, any individual value has probability approaching zero, so its self-information −log p(x) is infinite.

 

10. Entropy of a random variable is

A. 0
B. 1
C. Infinite
D. Cannot be determined

Answer: C

Likewise, the absolute entropy of a continuous random variable is infinite, since it ranges over infinitely many values; only its differential entropy is finite.
