Amount of Information & Average Information, Entropy - MCQs

Q1. The expected information contained in a message is called

a) Entropy
b) Efficiency
c) Coded signal
d) None of the above

ANSWER: a) Entropy



Q2. The information I contained in a message with probability of occurrence P is given by (where k is a constant)

a) I = k log₂(1/P)
b) I = k log₂(P)
c) I = k log₂(1/2P)
d) I = k log₂(1/P²)

ANSWER: a) I = k log₂(1/P)
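
As a quick numerical check of this formula, here is a minimal Python sketch (the function name and the example probability are illustrative assumptions, not from the source), taking k = 1 so the result is in bits:

import math

def self_information(p, k=1.0):
    # Self-information I = k * log2(1/p) of a message with probability of occurrence p.
    return k * math.log2(1.0 / p)

# A message with probability 1/8 carries 3 bits of information.
print(self_information(1 / 8))   # 3.0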



Q3. A memoryless source refers to

a) No previous information
b) No message storage
c) Emitted message is independent of previous message
d) None of the above

ANSWER: c) Emitted message is independent of previous message
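
A minimal Python sketch of this idea (the alphabet and probabilities are assumed, purely for illustration): a discrete memoryless source draws every emitted message from the same fixed probabilities, independently of whatever it emitted before.

import random

messages = ['A', 'B', 'C']   # assumed source alphabet
probs = [0.5, 0.3, 0.2]      # assumed fixed message probabilities

def emit(n):
    # random.choices draws each symbol independently, so the output carries no memory of past symbols.
    return random.choices(messages, weights=probs, k=n)

print(emit(10))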



Q4. Entropy is

a) Average information per message
b) Information in a signal
c) Amplitude of signal
d) All of the above

ANSWER: a) Average information per message
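
The definition can be made concrete with a short Python sketch (the example source is an assumption) that computes entropy as the probability-weighted average of the information per message:

import math

def entropy(probs):
    # H = sum of p * log2(1/p) over all messages: the average information per message, in bits.
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Four equally likely messages: each carries 2 bits, so the average is also 2 bits per message.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0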



Q5. The relation between entropy and mutual information is

a) I(X;Y) = H(X) - H(X/Y)
b) I(X;Y) = H(X/Y) - H(Y/X)
c) I(X;Y) = H(X) - H(Y)
d) I(X;Y) = H(Y) - H(X)

ANSWER: a) I(X;Y) = H(X) - H(X/Y)
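
The relation can be checked numerically. The Python sketch below uses an assumed joint distribution (a binary symmetric channel with equiprobable input and crossover probability 0.1, chosen only for illustration) and computes I(X;Y) as H(X) - H(X/Y):

import math

def H(probs):
    # Entropy of a probability distribution, in bits.
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Assumed joint distribution p(x, y).
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

px = {x: sum(p for (xi, y), p in joint.items() if xi == x) for x in (0, 1)}
py = {y: sum(p for (x, yi), p in joint.items() if yi == y) for y in (0, 1)}

# H(X/Y) = sum over y of p(y) * H(X given Y = y)
HX_given_Y = sum(py[y] * H([joint[(x, y)] / py[y] for x in (0, 1)]) for y in (0, 1))

I_XY = H(list(px.values())) - HX_given_Y
print(round(I_XY, 4))   # about 0.531 bits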



Q6. The mutual information

a) Is symmetric
b) Is always non-negative
c) Both a) and b) are correct
d) None of the above

ANSWER: c) Both a) and b) are correct
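
Both properties can be verified on the same kind of assumed joint distribution. The Python sketch below computes I(X;Y) directly from p(x,y), p(x) and p(y); swapping the roles of X and Y leaves the value unchanged, and the result is non-negative:

import math

def mutual_information(joint):
    # I(X;Y) = sum of p(x,y) * log2( p(x,y) / (p(x) * p(y)) ), in bits.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in joint.items() if p > 0)

# Assumed joint distribution, and the same distribution with X and Y swapped.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
swapped = {(y, x): p for (x, y), p in joint.items()}

print(round(mutual_information(joint), 4))     # 0.531  (non-negative)
print(round(mutual_information(swapped), 4))   # 0.531  (same value: symmetric)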

