The Kullback-Leibler (KL) divergence between two distributions $p$ and $q$ is defined as
\begin{displaymath}
D_{KL}(p\,\Vert\,q)=\sum_x p(x)\log\frac{p(x)}{q(x)}
  =-\sum_x p(x)\log q(x)-\left(-\sum_x p(x)\log p(x)\right)
  =H(p,q)-H(p)
\end{displaymath}
where
\begin{displaymath}
H(p)=-\sum_x p(x)\log p(x)
\end{displaymath}
is the entropy of distribution $p$, and
\begin{displaymath}
H(p,q)=-\sum_x p(x)\log q(x)
\end{displaymath}
is the cross-entropy of distributions $p$ and $q$. Their difference, also called the relative entropy, represents the divergence or difference between the two distributions. According to Gibbs' inequality (a consequence of Jensen's inequality applied to the concave function $\log$), $H(p,q)\ge H(p)$, with equality holding if and only if $p=q$. Therefore $D_{KL}(p\,\Vert\,q)\ge 0$.
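
As a numerical sanity check, the following sketch (not part of the original notes; the distributions $p$ and $q$ are arbitrary illustrative choices) computes the entropy, cross-entropy, and KL-divergence of two discrete distributions in nats, and confirms both the identity $D_{KL}(p\,\Vert\,q)=H(p,q)-H(p)$ and the Gibbs' inequality consequence $D_{KL}(p\,\Vert\,q)\ge 0$ with equality at $p=q$:

\begin{verbatim}
import numpy as np

def entropy(p):
    """Entropy H(p) = -sum_x p(x) log p(x), in nats."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    """Cross-entropy H(p,q) = -sum_x p(x) log q(x), in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q))

def kl_divergence(p, q):
    """KL-divergence D(p||q) = sum_x p(x) log(p(x)/q(x))."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

# Two illustrative distributions over the same 3-symbol alphabet
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# D(p||q) equals the cross-entropy minus the entropy ...
print(kl_divergence(p, q))               # ~0.0253 nats
print(cross_entropy(p, q) - entropy(p))  # same value
# ... and is nonnegative, vanishing if and only if p = q
print(kl_divergence(p, p))               # 0.0
\end{verbatim}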
Ruye Wang
2006-10-11