joint entropy

English

 
Wikipedia has an article on: joint entropy

Noun

joint entropy (plural joint entropies)

  1. (information theory) The Shannon entropy of a "script" whose "characters" are elements of the Cartesian product of the sets of characters of the component scripts.
    If random variables $X$ and $Y$ are mutually independent, then their joint entropy $H(X,Y)$ is just the sum $H(X) + H(Y)$ of its component entropies. If they are not mutually independent, then their joint entropy will be $H(X) + H(Y) - I(X;Y)$, where $I(X;Y)$ is the mutual information of $X$ and $Y$.
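
The identity in the usage note can be checked with a worked example. For discrete random variables the joint entropy is $H(X,Y) = -\sum_{x,y} p(x,y) \log_2 p(x,y)$. Below is a minimal Python sketch verifying $H(X,Y) = H(X) + H(Y) - I(X;Y)$ for a small, purely illustrative joint distribution; the probability values and the entropy helper are assumptions for the sketch, not part of this entry:

    import numpy as np

    # Illustrative joint distribution p(x, y) over two binary random
    # variables; the values are hypothetical and sum to 1.
    p_xy = np.array([[0.30, 0.20],
                     [0.10, 0.40]])

    def entropy(p):
        # Shannon entropy in bits, with the convention 0 * log2(0) = 0.
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    p_x = p_xy.sum(axis=1)   # marginal distribution of X
    p_y = p_xy.sum(axis=0)   # marginal distribution of Y

    H_xy = entropy(p_xy)     # joint entropy H(X, Y)
    H_x, H_y = entropy(p_x), entropy(p_y)

    # Mutual information computed from its own definition:
    # I(X; Y) = sum over x, y of p(x,y) * log2(p(x,y) / (p(x) * p(y)))
    px_py = np.outer(p_x, p_y)
    mask = p_xy > 0
    I_xy = np.sum(p_xy[mask] * np.log2(p_xy[mask] / px_py[mask]))

    # The identity from the usage note: H(X, Y) = H(X) + H(Y) - I(X; Y).
    print(f"H(X,Y)               = {H_xy:.4f} bits")
    print(f"H(X) + H(Y) - I(X;Y) = {H_x + H_y - I_xy:.4f} bits")
    assert np.isclose(H_xy, H_x + H_y - I_xy)

If $X$ and $Y$ were independent (p_xy equal to np.outer(p_x, p_y)), the mutual information would be zero and the joint entropy would reduce to the sum $H(X) + H(Y)$.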

Related terms