In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.
The joint Shannon entropy of two discrete random variables X and Y is defined as

H(X,Y) = -\sum_{x} \sum_{y} P(x,y) \log_2[P(x,y)]
where x and y are particular values of X and Y, respectively, P(x,y) is the joint probability of these values occurring together, and P(x,y) \log_2[P(x,y)] is defined to be 0 if P(x,y)=0.
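As a concrete illustration, the definition above can be sketched directly in code. This is a minimal example (the distribution and function name are chosen here for illustration, not taken from the article): it sums -P(x,y) \log_2 P(x,y) over a joint probability table, skipping zero-probability terms as the convention requires.

```python
import math

def joint_entropy(pxy):
    """Joint Shannon entropy H(X, Y) in bits.

    pxy maps (x, y) pairs to P(x, y); terms with P(x, y) = 0
    are skipped, matching the convention 0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in pxy.values() if p > 0)

# Two independent fair coin flips: four equally likely outcomes.
pxy = {("H", "H"): 0.25, ("H", "T"): 0.25,
       ("T", "H"): 0.25, ("T", "T"): 0.25}
print(joint_entropy(pxy))  # 2.0 bits
```

Four outcomes of probability 1/4 each give -4 \cdot \tfrac{1}{4} \log_2 \tfrac{1}{4} = 2 bits, as expected for two independent fair coins.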
For more than two variables X_1, ..., X_n this expands to

H(X_1, ..., X_n) = -\sum_{x_1} \cdots \sum_{x_n} P(x_1, ..., x_n) \log_2[P(x_1, ..., x_n)]
where x_1,...,x_n are particular values of X_1,...,X_n, respectively, P(x_1, ..., x_n) is the probability of these values occurring together, and P(x_1, ..., x_n) \log_2[P(x_1, ..., x_n)] is defined to be 0 if P(x_1, ..., x_n)=0.
The joint entropy of a set of variables is greater than or equal to the maximum of the individual entropies of the variables in the set:

H(X_1, ..., X_n) \geq \max(H(X_1), ..., H(X_n))
The joint entropy of a set of variables is less than or equal to the sum of the individual entropies of the variables in the set:

H(X_1, ..., X_n) \leq H(X_1) + ... + H(X_n)

This is an example of subadditivity. The inequality is an equality if and only if the variables are statistically independent.
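The two bounds above can be checked numerically. The sketch below (the joint distribution is a hypothetical example, not from the article) computes the marginals of a dependent pair of binary variables and verifies that the joint entropy sits between the largest marginal entropy and the sum of the marginals.

```python
import math
from collections import defaultdict

def entropy(p):
    """Shannon entropy in bits of a distribution given as a dict of probabilities."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# A dependent pair: Y usually copies X (illustrative numbers).
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions P(x) and P(y).
px, py = defaultdict(float), defaultdict(float)
for (x, y), p in pxy.items():
    px[x] += p
    py[y] += p

h_xy = entropy(pxy)
h_x, h_y = entropy(px), entropy(py)

# max(H(X), H(Y)) <= H(X, Y) <= H(X) + H(Y)
assert max(h_x, h_y) <= h_xy <= h_x + h_y
print(h_x, h_y, h_xy)
```

Here both marginals are fair (1 bit each), the joint entropy is about 1.72 bits, and because X and Y are dependent the subadditivity bound is strict: 1.72 < 2.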
Joint entropy is used in the definition of conditional entropy,

H(X|Y) = H(X,Y) - H(Y)

and of mutual information,

I(X;Y) = H(X) + H(Y) - H(X,Y)
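Both derived quantities follow directly from entropies of the joint and marginal tables. A brief sketch, reusing the same hypothetical dependent distribution as an example:

```python
import math
from collections import defaultdict

def entropy(p):
    """Shannon entropy in bits of a distribution given as a dict of probabilities."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Illustrative dependent pair (not from the article).
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px, py = defaultdict(float), defaultdict(float)
for (x, y), p in pxy.items():
    px[x] += p
    py[y] += p

h_xy, h_x, h_y = entropy(pxy), entropy(px), entropy(py)

h_x_given_y = h_xy - h_y       # H(X|Y) = H(X,Y) - H(Y)
mi = h_x + h_y - h_xy          # I(X;Y) = H(X) + H(Y) - H(X,Y)
print(h_x_given_y, mi)
```

Since X and Y are dependent here, the mutual information comes out strictly positive, while the conditional entropy H(X|Y) is strictly less than H(X).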
In quantum information theory, the joint entropy is generalized into the joint quantum entropy.