
Measurement of uncertainty and information

In the case of multiple events, such as $N$ independent events $E_i$ $(i=1,\cdots,N)$, the associated uncertainty $H(E_1, \cdots, E_N)$ is related to the joint probability that all of the events occur, which is the product of their individual probabilities $P(E_i)$:

\begin{displaymath}P(E_1, \cdots, E_N)=P(E_1) \cdots P(E_N)=\prod_{i=1}^NP(E_i) \end{displaymath}
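As a minimal numerical sketch of this product rule (in Python, with two hypothetical independent events: a fair coin landing heads and a fair die showing a six):

\begin{verbatim}
p_heads = 1 / 2           # probability of a fair coin landing heads
p_six = 1 / 6             # probability of a fair die showing a six

# For independent events the joint probability is the product:
p_joint = p_heads * p_six
print(p_joint)            # 0.0833... = 1/12
\end{verbatim}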

However it is desirable (and intuitively makes sense) for the total uncertainty $H(E_1, \cdots, E_N)$ to be the sum of the individual uncertainties:

\begin{displaymath}H(E_1, \cdots, E_N)=H(E_1)+\cdots+H(E_N)=\sum_{i=1}^N H(E_i) \end{displaymath}

Summarizing the above, we see that the uncertainty $H(E)$ associated with an event $E$ with probability $P(E)$ should satisfy these constraints:

1. $H(E)$ is a monotonically decreasing function of $P(E)$: the more probable an event is, the smaller the uncertainty associated with it.
2. $H(E)\ge 0$, with $H(E)=0$ when $P(E)=1$, i.e., a certain event carries no uncertainty.
3. The uncertainties of independent events are additive: $H(E_1,\cdots,E_N)=H(E_1)+\cdots+H(E_N)$.

The uncertainty, also called surprise, $H(E)$ of an event $E$ is therefore defined as:

\begin{displaymath}H(E)\stackrel{\triangle}{=}log\; \frac{1}{P(E)}=-log\; P(E) \end{displaymath}
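As a quick numerical sketch of this definition (in Python, assuming base-2 logarithms so that the surprise is measured in bits; any base works, differing only by a constant factor):

\begin{verbatim}
import math

def surprise(p):
    """Uncertainty/surprise H(E) = log 1/P(E) of an event with probability p, in bits."""
    if p == 0:
        return math.inf          # impossible event: infinite surprise
    return math.log2(1 / p)

print(surprise(1.0))             # 0.0 bits: a certain event carries no surprise
print(surprise(0.5))             # 1.0 bit:  e.g., a fair coin landing heads
print(surprise(1 / 6))           # ~2.585 bits: e.g., a fair die showing a six
print(surprise(0.0))             # inf: an impossible event
\end{verbatim}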

When $P(E)=1$, e.g., the sun will rise tomorrow, the surprise is 0; when $P(E)=0$, e.g., the sun will rise in the west, the surprise is $\infty$. This definition obviously satisfies the first two constraints. In the multi-event case we have

\begin{eqnarray*}
H(E_1, E_2) &=& -log\; P(E_1,E_2)=-log\; [P(E_1) P(E_2)] \\
  &=& -log\; P(E_1)-log\; P(E_2)=H(E_1)+H(E_2)
\end{eqnarray*}

i.e., the uncertainties are additive as required by the third constraint.
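A short numerical check of this additivity (in Python, again with the hypothetical coin and die above and base-2 logarithms):

\begin{verbatim}
import math

p1, p2 = 1 / 2, 1 / 6            # probabilities of two independent events
h1 = -math.log2(p1)              # H(E1) = 1 bit
h2 = -math.log2(p2)              # H(E2) ~ 2.585 bits
h_joint = -math.log2(p1 * p2)    # H(E1, E2) = -log P(E1)P(E2)

print(math.isclose(h_joint, h1 + h2))   # True: uncertainties add up
\end{verbatim}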

Note that for an impossible event $E$ with $P(E)=0$, the uncertainty is $H_{before}(E)=-log\; 0=\infty$. If you are told that such an impossible event could actually occur ($H_{after}(E)<\infty$) or did occur ($H_{after}(E)=0$), an infinite amount of information would be gained:

\begin{displaymath}I=H_{before}-H_{after}=\infty \end{displaymath}

So you will get no information if someone tells you the sun will rise tomorrow morning ($P(E)=1 \Rightarrow I(E)=0$), but you will get an infinite amount of information if someone tells you the sun will rise in the west ($P(E)=0 \Rightarrow I(E)=\infty$).
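As an illustrative sketch of this information gain (in Python, with a hypothetical event of prior probability 1/8 and base-2 logarithms, so the information is in bits):

\begin{verbatim}
import math

p_before = 1 / 8                     # prior probability of the event
h_before = -math.log2(p_before)      # uncertainty before the report: 3 bits
h_after = 0.0                        # the event is reported to have occurred

info_gained = h_before - h_after     # I = H_before - H_after
print(info_gained)                   # 3.0 bits

# A certain event (P(E) = 1) had no uncertainty to begin with,
# so reporting it conveys I = 0 - 0 = 0 bits.
\end{verbatim}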


Ruye Wang 2021-03-28