
Entropy: uncertainty of a random experiment

A random experiment may have binary outcomes (e.g., rain or dry) or multiple outcomes. For example, a die has six possible outcomes with equal probability, while a pixel in a digital image takes one of $2^8=256$ gray levels (from 0 to 255) with not necessarily equal probabilities. In general, these multiple outcomes can be considered as $N$ events $E_i$ with corresponding probabilities $P(E_i)=P_i$ ($i=1,\cdots,N$), which are mutually exclusive and exhaustive, so that $\sum_{i=1}^N P_i=1$.

The uncertainty about the outcome of such a random experiment is the sum of the uncertainties $H(E_i)=-\log P_i$ associated with the individual events $E_i$, each weighted by the probability $P_i$ of that event:

\begin{displaymath}
H(E_1,\cdots,E_N) \stackrel{\triangle}{=} \sum_{i=1}^N P_i\,H(E_i)
= -\sum_{i=1}^N P_i \log P_i
\end{displaymath}

This quantity is called the entropy; it measures the uncertainty of, or equivalently the information contained in, the random experiment. If the logarithm is taken base 2, the unit of entropy is the bit.
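To make the definition concrete, here is a minimal Python sketch (not part of the original notes; the function name and examples are illustrative), computing the entropy of a discrete distribution and applying it to the die and gray-level examples above:

import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum(P_i * log P_i).

    With base=2 the result is in bits. Zero-probability terms are
    skipped, following the convention 0 log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair die: six equally likely outcomes, H = log2(6) ~ 2.585 bits.
print(entropy([1/6] * 6))

# A pixel with 256 equally likely gray levels: H = log2(256) = 8 bits.
print(entropy([1/256] * 256))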

For example, the weather can have two possible outcomes: rain $E_1$ with probability $P_1$, or dry $E_2$ with probability $P_2=1-P_1$. The uncertainty of the weather is therefore the sum of the uncertainty of rainy weather and the uncertainty of dry weather, weighted by their probabilities:

\begin{eqnarray*}
H(E_1, E_2) &=& P_1\,H(E_1) + P_2\,H(E_2) = -P_1 \log P_1 - P_2 \log P_2 \\
&=& -P_1 \log P_1 - (1-P_1)\log(1-P_1)
\end{eqnarray*}

In particular, if $P_1=1$ and $P_2=0$ (or vice versa), we have (note that $x\log x\,\vert_{x=0}=0$ by convention, since $\lim_{x\to 0^+} x\log x=0$):

\begin{displaymath}
H(E_1, E_2) = -1\cdot\log_2 1 - 0\cdot\log_2 0 = 0
\end{displaymath}

i.e., the uncertainty becomes 0. But if $P_1=P_2=1/2$, we have

\begin{displaymath}
H(E_1, E_2) = -\frac{1}{2}\log_2\frac{1}{2} - \frac{1}{2}\log_2\frac{1}{2} = 1
\end{displaymath}

i.e., the uncertainty of this binary experiment reaches its maximum of 1 bit when the two outcomes are equally likely.
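As a numerical check (again a sketch, not from the original notes), the binary entropy $H(E_1,E_2)$ can be tabulated as a function of $P_1$; it vanishes at $P_1=0$ and $P_1=1$, and peaks at 1 bit at $P_1=1/2$:

import math

def binary_entropy(p1):
    """H(E1, E2) = -P1 log2 P1 - (1-P1) log2 (1-P1), with 0 log 0 = 0."""
    h = 0.0
    for p in (p1, 1.0 - p1):
        if p > 0:
            h -= p * math.log2(p)
    return h

for p1 in (0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0):
    print(f"P1 = {p1:.2f}  ->  H = {binary_entropy(p1):.4f} bits")
# Output rises from 0 at P1 = 0 to 1 bit at P1 = 0.5, and back to 0 at P1 = 1.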


Ruye Wang 2009-12-10