conditional entropy

English


Noun

1. (information theory) The portion of a random variable's own Shannon entropy that is independent of another, given random variable; the expected uncertainty remaining in the first variable once the second is known.
The conditional entropy of a random variable $Y$ given $X$ (i.e., conditioned on $X$), denoted $H(Y|X)$, is equal to $H(Y)-I(Y;X)$, where $I(Y;X)$ is the mutual information between $Y$ and $X$.
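The identity relating conditional entropy to mutual information can be checked numerically. The sketch below uses a made-up joint distribution over two binary variables (the distribution and variable names are illustrative, not from the entry) and verifies that $H(Y|X) = H(Y) - I(Y;X)$:

```python
import math

def H(dist):
    """Shannon entropy in bits of a probability distribution (dict of probabilities)."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical joint distribution p(x, y) over binary X and Y
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and Y
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# Conditional entropy via the chain rule: H(Y|X) = H(X,Y) - H(X)
H_Y_given_X = H(joint) - H(px)

# Mutual information: I(Y;X) = H(Y) + H(X) - H(X,Y)
I_YX = H(py) + H(px) - H(joint)

# The identity from the definition: H(Y|X) = H(Y) - I(Y;X)
print(abs(H_Y_given_X - (H(py) - I_YX)) < 1e-12)
```

Here knowing $X$ reduces the uncertainty about $Y$ from $H(Y) = 1$ bit to roughly 0.72 bits, the difference being the mutual information.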