# joint entropy

## English

Wikipedia has an article on: joint entropy

### Noun

joint entropy (plural joint entropies)

1. (information theory) The Shannon entropy of a "script" whose "characters" are elements of the Cartesian product of the sets of characters of the component scripts.
If random variables $X$ and $Y$ are mutually independent, then their joint entropy $H(X,Y)$ is simply the sum $H(X)+H(Y)$ of their component entropies. If they are not mutually independent, then their joint entropy is $H(X)+H(Y)-I(X;Y)$, where $I(X;Y)$ is the mutual information of $X$ and $Y$; a numerical sketch follows below.
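To make the identity concrete, here is a minimal sketch (not part of the entry) that computes the joint entropy of a small hypothetical joint distribution `p_xy` in bits and checks that $H(X)+H(Y)-I(X;Y)$ reproduces $H(X,Y)$; the distribution and all variable names are illustrative assumptions.

```python
# A minimal sketch, assuming a hypothetical joint distribution p_xy;
# uses only the standard library. Entropies are in bits.
import math

def entropy(probs):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) for X, Y each in {0, 1}.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y).
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# Joint entropy: each pair (x, y) is one "character" of the joint script.
h_xy = entropy(p_xy.values())

# Mutual information from its definition:
# I(X;Y) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x) * p(y))).
i_xy = sum(p * math.log2(p / (p_x[x] * p_y[y]))
           for (x, y), p in p_xy.items() if p > 0)

h_x, h_y = entropy(p_x.values()), entropy(p_y.values())
print(f"H(X,Y)           = {h_xy:.4f} bits")
print(f"H(X)+H(Y)-I(X;Y) = {h_x + h_y - i_xy:.4f} bits")  # matches H(X,Y)
```

With an independent distribution (e.g. `p_xy = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}`), `i_xy` evaluates to zero and both printed values equal $H(X)+H(Y)=2$ bits, illustrating the independent case above.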