Relation between entropy & mutual information

Q.  The relation between entropy and mutual information is
- Published on 04 Nov 15

a. I(X;Y) = H(X) - H(X|Y)
b. I(X;Y) = H(X|Y) - H(Y|X)
c. I(X;Y) = H(X) - H(Y)
d. I(X;Y) = H(Y) - H(X)

ANSWER: a. I(X;Y) = H(X) - H(X|Y)

Explanation: Mutual information is the reduction in uncertainty about X obtained by observing Y, so it equals the entropy of X minus the conditional entropy of X given Y.
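As a quick check, the same quantity can be written in several equivalent forms. The following is a minimal LaTeX sketch of the standard identities; the steps use only the chain rule H(X,Y) = H(X) + H(Y|X), which also shows that mutual information is symmetric in X and Y.

\begin{align*}
I(X;Y) &= H(X) - H(X \mid Y)   \\ % reduction in uncertainty about X from observing Y
       &= H(Y) - H(Y \mid X)   \\ % symmetry: the same reduction viewed from Y's side
       &= H(X) + H(Y) - H(X,Y)    % via the chain rule H(X,Y) = H(X) + H(Y \mid X)
\end{align*}

Note that options (c) and (d) give only a difference of marginal entropies, which is not the mutual information in general.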
