The entropy of a distribution $p(x)$ is defined as
$$H(X) = -\sum_x p(x)\log p(x)$$
Entropy represents the uncertainty of the random variable. Among all distributions over a finite region, the uniform distribution has maximum entropy, while among all distributions over the entire real axis with a given variance, the Gaussian distribution has maximum entropy.
The joint entropy of two random variables $X$ and $Y$ is defined as
$$H(X,Y) = -\sum_x \sum_y p(x,y)\log p(x,y)$$
The conditional entropy of $X$ given $Y$ is
$$H(X|Y) = -\sum_x \sum_y p(x,y)\log p(x|y)$$
and the conditional entropy of $Y$ given $X$ is
$$H(Y|X) = -\sum_x \sum_y p(x,y)\log p(y|x)$$
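These definitions can be checked numerically; a useful consequence is the chain rule $H(X,Y) = H(X) + H(Y|X)$. The sketch below (the $2\times 2$ joint distribution and helper names are assumptions for illustration, not from the original) computes $H(Y|X)$ both directly from its definition and via the chain rule:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array of any shape."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution p(x, y) over 2x2 outcomes (rows index x)
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])

H_xy = entropy(pxy)            # joint entropy H(X,Y)
px = pxy.sum(axis=1)           # marginal p(x)
H_x = entropy(px)
H_chain = H_xy - H_x           # chain rule: H(Y|X) = H(X,Y) - H(X)

# Direct computation from the definition: H(Y|X) = -sum p(x,y) log p(y|x)
py_given_x = pxy / px[:, None]
H_direct = -np.sum(pxy * np.log2(py_given_x))

print(H_chain, H_direct)       # the two values agree
```

The agreement of the two values follows from $p(x,y) = p(x)\,p(y|x)$, so $\log p(x,y) = \log p(x) + \log p(y|x)$ inside the sums.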
Ruye Wang
2018-03-26