1. The self-information of an event is always
A. Positive
B. Negative
C. Positive & Negative
D. None of the mentioned
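Self-information is defined as I(x) = -log2 p(x); since probabilities lie in (0, 1], the value is never negative. A minimal Python check (my own sketch, not part of the quiz):

```python
import math

def self_information(p):
    """Self-information I(x) = -log2(p) of an event with probability p, in bits."""
    return -math.log2(p)

# Probabilities lie in (0, 1], so -log2(p) >= 0: self-information is never negative.
for p in (1.0, 0.5, 0.125):
    print(f"p = {p}: I = {self_information(p)} bits")
```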
2. The unit of average mutual information is
A. Bits
B. Bytes
C. Bits per symbol
D. Bytes per symbol
3. When the probability of error during transmission is 0.5, it indicates that
A. Channel is very noisy
B. No information is received
C. Channel is very noisy & No information is received
D. None of the mentioned
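For a binary symmetric channel, capacity is C = 1 - H(p), where H is the binary entropy function; at crossover probability p = 0.5 the capacity drops to zero, so the channel is maximally noisy and no information gets through. A small sketch (function names are my own):

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p, in bits per use."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))  # noiseless channel: full 1 bit per use
print(bsc_capacity(0.5))  # p = 0.5: capacity is zero, no information is received
```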
4. Binary Huffman coding is a
A. Prefix condition code
B. Suffix condition code
C. Prefix & Suffix condition code
D. None of the mentioned
5. The event with the minimum probability is assigned the least number of bits.
A. True
B. False
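Questions 4 and 5 both concern Huffman coding: the codewords satisfy the prefix condition (no codeword is a prefix of another), and the least probable symbol receives the longest codeword, not the shortest. A sketch of a binary Huffman coder (my own illustrative implementation, not from the quiz):

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code for {symbol: probability}; returns {symbol: bitstring}."""
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)  # least probable subtree gets prefix "0"
        hi = heapq.heappop(heap)  # next least probable gets prefix "1"
        merged = {s: "0" + c for s, c in lo[2].items()}
        merged.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], i, merged])
        i += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10})
# Prefix condition: no codeword is a prefix of another codeword.
words = list(code.values())
assert not any(u != v and v.startswith(u) for u in words for v in words)
print(code)  # the rarest symbol "d" ends up with the longest codeword
```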
6. The method of converting a word to a stream of bits is called
A. Binary coding
B. Source coding
C. Bit coding
D. Cipher coding
7. When the base of the logarithm is 2, then the unit of measure of information is
A. Bits
B. Bytes
C. Nats
D. None of the mentioned
8. When X and Y are statistically independent, then I(x; y) is
A. 1
B. 0
C. Ln 2
D. Cannot be determined
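Mutual information I(X; Y) measures how much knowing one variable tells you about the other; under statistical independence p(x, y) = p(x)p(y), so every log term in the sum is log2(1) = 0. A numerical check (my own sketch):

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum over (x,y) of p(x,y) * log2( p(x,y) / (p(x)*p(y)) ), in bits.
    `joint` is a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Independent X and Y: every cell factorises as p(x)*p(y), so I(X;Y) = 0.
indep = {(x, y): 0.5 * q for x in (0, 1) for y, q in ((0, 0.3), (1, 0.7))}
print(mutual_information(indep))
```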
9. The self-information of a random variable is
A. 0
B. 1
C. Infinite
D. Cannot be determined
10. The entropy of a random variable is
A. 0
B. 1
C. Infinite
D. Cannot be determined
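Entropy depends entirely on the distribution: a deterministic variable has H = 0 and a fair coin has H = 1 bit, so without knowing the pmf the value cannot be determined. A short sketch of the computation (my own code):

```python
import math

def entropy(pmf):
    """Shannon entropy H(X) = -sum p*log2(p) over a probability list, in bits."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# Same formula, very different values depending on the distribution:
print(entropy([1.0]))        # deterministic outcome
print(entropy([0.5, 0.5]))   # fair coin
print(entropy([0.25] * 4))   # uniform over four outcomes
```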