## English

### Noun

**conditional entropy** (*plural* **conditional entropies**)

- (information theory) The portion of a random variable's own Shannon entropy that is independent of another, given random variable.
  *The* **conditional entropy** *of random variable Y given X (i.e., conditioned by X), denoted as H(Y|X), is equal to H(Y) − I(X;Y), where I(X;Y) is the mutual information between X and Y.*
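The identity in the usage example can be checked numerically. The sketch below (a hypothetical example, not part of the entry) computes the conditional entropy H(Y|X) via the chain rule H(Y|X) = H(X,Y) − H(X), computes the mutual information I(X;Y) directly from its own definition, and confirms that H(Y) − I(X;Y) gives the same value; the joint distribution `joint` is an arbitrary illustration.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y).
p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

h_x = entropy(p_x.values())        # H(X)
h_y = entropy(p_y.values())        # H(Y)
h_xy = entropy(joint.values())     # H(X,Y)

# Conditional entropy via the chain rule: H(Y|X) = H(X,Y) - H(X).
h_y_given_x = h_xy - h_x

# Mutual information computed directly from its definition:
# I(X;Y) = sum over x,y of p(x,y) * log2( p(x,y) / (p(x) * p(y)) ).
mi = sum(p * math.log2(p / (p_x[x] * p_y[y]))
         for (x, y), p in joint.items() if p > 0)

# The sense defined above: H(Y|X) equals H(Y) minus I(X;Y).
assert abs(h_y_given_x - (h_y - mi)) < 1e-9
```

With this joint distribution, knowing X removes I(X;Y) ≈ 0.278 bits of the 1 bit of uncertainty in Y, leaving H(Y|X) ≈ 0.722 bits, which matches the chain-rule computation.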