The '''conditional entropy''' measures how much [[entropy]] a [[random variable]] <math>X</math> has remaining if we have already learned the value of a second random variable <math>Y</math>. It is referred to as ''the entropy of <math>X</math> conditional on <math>Y</math>'', and is written <math>H(X|Y)</math>.

[[Image:classinfo.png]]
If the probability that <math>X=x</math> is denoted by <math>p(x)</math>, then we denote by <math>p(x|y)</math> the probability that <math>X=x</math>, given that we already know that <math>Y=y</math>. <math>p(x|y)</math> is a [[conditional probability]]. In [[Bayesian]] language, <math>p(x|y)</math> represents our [[prior information]] about <math>X</math> once the value of <math>Y</math> is known.
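Explicitly, the conditional probability is obtained from the joint distribution <math>p(x,y)</math> of <math>X</math> and <math>Y</math> by

:<math>p(x|y) = \frac{p(x,y)}{p(y)} \qquad (p(y) > 0),</math>

or equivalently <math>p(x,y) = p(x|y)\,p(y)</math>.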
The conditional entropy is just the [[Shannon entropy]] of <math>X</math> computed with <math>p(x|y)</math> replacing <math>p(x)</math>, averaged over all possible values <math>y</math> of <math>Y</math>:

:<math>H(X|Y) := -\sum_{x,y} p(y)\, p(x|y) \log p(x|y).</math>
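As a numerical illustration (a minimal sketch only; the joint distribution below is made up for the example), the conditional entropy can be computed directly from a table of joint probabilities:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical joint distribution p(x, y), chosen only for illustration:
# rows index the values of X, columns index the values of Y.
p_xy = np.array([[0.30, 0.10],
                 [0.05, 0.25],
                 [0.15, 0.15]])

p_y = p_xy.sum(axis=0)        # marginal p(y) = sum_x p(x, y)
p_x_given_y = p_xy / p_y      # conditional p(x|y) = p(x, y) / p(y)

# H(X|Y) = -sum_{x,y} p(y) p(x|y) log2 p(x|y), measured in bits
H_X_given_Y = -np.sum(p_xy * np.log2(p_x_given_y))
print(H_X_given_Y)            # ≈ 1.39 bits for this distribution
</syntaxhighlight>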
Using the [[Bayesian sum rule]], one finds that the conditional entropy is equal to
:<math>H(X|Y) = H(X,Y) - H(Y),</math>
with "H(XY)" the [[joint entropy]] of "X" and "Y".
==See also==
*[[Quantum conditional entropy]]
*[[Mutual information]]
[[Category:Handbook of Quantum Information]]
[[Category:Classical Information Theory]]
[[Category:Entropy]]