conditional entropy

English


Noun

conditional entropy (plural conditional entropies)

  1. (information theory) The portion of a random variable's Shannon entropy that remains uncertain when the value of another, given, random variable is known.
    The conditional entropy of a random variable Y given X (i.e., conditioned on X), denoted H(Y|X), equals H(Y) − I(Y;X), where I(Y;X) is the mutual information between Y and X (see the sketch below).
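    A minimal Python sketch (not part of the entry, with a made-up joint distribution) illustrating the definition: it computes H(Y|X) via the standard chain rule H(Y|X) = H(X,Y) − H(X) and checks that this matches H(Y) − I(Y;X).

      import math

      # Hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}.
      p_xy = {(0, 0): 0.4, (0, 1): 0.1,
              (1, 0): 0.2, (1, 1): 0.3}

      def H(dist):
          """Shannon entropy, in bits, of a probability mapping."""
          return -sum(p * math.log2(p) for p in dist.values() if p > 0)

      # Marginal distributions p(x) and p(y).
      p_x, p_y = {}, {}
      for (x, y), p in p_xy.items():
          p_x[x] = p_x.get(x, 0) + p
          p_y[y] = p_y.get(y, 0) + p

      # Chain rule: H(Y|X) = H(X, Y) - H(X).
      h_y_given_x = H(p_xy) - H(p_x)

      # Mutual information: I(Y;X) = H(Y) + H(X) - H(X, Y).
      i_yx = H(p_y) + H(p_x) - H(p_xy)

      # The identity from the definition: H(Y|X) = H(Y) - I(Y;X).
      assert abs(h_y_given_x - (H(p_y) - i_yx)) < 1e-12
      print(f"H(Y|X) = {h_y_given_x:.4f} bits")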

