conditional entropy

English

 
Wikipedia has an article on: conditional entropy

Noun

conditional entropy (plural conditional entropies)

  1. (information theory) The portion of a random variable's own Shannon entropy that is independent of another, given random variable.
    The conditional entropy of a random variable $Y$ given $X$ (i.e., conditioned by $X$), denoted as $H(Y \mid X)$, is equal to $H(Y) - I(Y;X)$, where $I(Y;X)$ is the mutual information between $Y$ and $X$.
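    For reference, the identity above can be expanded via the standard sum-over-outcomes definition of conditional entropy and the chain rule; the explicit forms below are standard information-theoretic definitions, not stated in the entry itself:
    $$H(Y \mid X) \;=\; -\sum_{x,\,y} p(x,y)\,\log_2 p(y \mid x) \;=\; H(X,Y) - H(X) \;=\; H(Y) - I(Y;X)$$
    where $p(x,y)$ is the joint distribution of $X$ and $Y$, and the last step uses $I(Y;X) = H(X) + H(Y) - H(X,Y)$.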

Related terms