The KL-divergence between two distributions $p(x)$ and $q(x)$ is defined as
$$
D_{KL}(p\,\|\,q)=\sum_x p(x)\log\frac{p(x)}{q(x)}
=-\sum_x p(x)\log q(x)-\Big(-\sum_x p(x)\log p(x)\Big)
=H(p,q)-H(p)
$$
where $H(p)=-\sum_x p(x)\log p(x)$ is the entropy of distribution $p$, $H(p,q)=-\sum_x p(x)\log q(x)$ is the cross-entropy of distributions $p$ and $q$, and their difference, also called the relative entropy, represents the divergence or difference between the two distributions. According to Gibbs' inequality, $D_{KL}(p\,\|\,q)\ge 0$, with equality holding if and only if $p(x)=q(x)$ for all $x$.
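As a small worked example (the two Bernoulli distributions below are chosen purely for illustration and are not from the text), take $p=(0.5,\,0.5)$ and $q=(0.9,\,0.1)$ over a binary variable:
$$
D_{KL}(p\,\|\,q)=0.5\log\frac{0.5}{0.9}+0.5\log\frac{0.5}{0.1}\approx -0.294+0.805\approx 0.511\ \text{nats}.
$$
Reversing the arguments gives $D_{KL}(q\,\|\,p)\approx 0.368$ nats, illustrating that the KL-divergence is not symmetric and therefore is not a true distance metric.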