
Probability, uncertainty and information

The information about an event $E$ gained during a process (e.g., receiving a message) is defined as the reduction of the uncertainty about $E$. If $H_{before}(E)$ and $H_{after}(E)$ are, respectively, the uncertainty before and after the process, then the amount of information gained is

\begin{displaymath}I(E)\stackrel{\triangle}{=}H_{before}(E)-H_{after}(E) \end{displaymath}

Uncertainty is closely related to the probability of the event. If $E$ is the event that it will rain tomorrow, we have a low uncertainty $H(E)$ about this event in Seattle, where it rains a lot (high $P(E)$), but a high uncertainty $H(E)$ in Los Angeles, where the weather is usually dry (low $P(E)$). For this reason, uncertainty is also called surprise: rain is more of a surprise in Los Angeles than in Seattle.

If the weather forecast reports (correctly!) that it will rain tomorrow, so that $P(E)$ rises to 1 and $E$ is certain to occur, the uncertainty $H_{after}(E)$ is reduced to 0 and some information $I(E)$ is gained:

\begin{displaymath}I(E)=H_{before}(E)-H_{after}(E)=H_{before}(E)-0=H_{before}(E) \end{displaymath}

If the weather forecast instead reports a $90\%$ chance of rain ($P(E)=0.9$), we still gain some information, as long as $P(E)$ has increased: the remaining uncertainty $H_{after}(E)$ is then positive but smaller than $H_{before}(E)$.
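The two forecast cases can be sketched numerically. This is only an illustration, assuming the standard surprise measure $H(E)=-\log_2 P(E)$ (which this section has not yet defined; quantifying uncertainty is the topic of the next section) and a hypothetical prior $P(E)=0.25$ for rain tomorrow:

```python
import math

def surprise(p):
    """Uncertainty (surprise) of an event with probability p, in bits.
    Assumes the measure -log2(p), used here only for illustration."""
    return -math.log2(p)

# Hypothetical prior probability of rain tomorrow (assumed for this sketch).
h_before = surprise(0.25)             # 2.0 bits of uncertainty before the forecast

# Case 1: forecast says rain for certain, P(E) -> 1, so H_after(E) = 0.
i_certain = h_before - surprise(1.0)  # all 2.0 bits are gained

# Case 2: forecast says a 90% chance of rain, P(E) -> 0.9.
h_after = surprise(0.9)               # some uncertainty remains (about 0.152 bits)
i_partial = h_before - h_after        # less information gained than in case 1
```

In both cases the information gained is positive, but the certain forecast removes all of the prior uncertainty while the $90\%$ forecast removes only part of it.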

In general, the uncertainty $H(E)$ of an event $E$ is small if its probability $P(E)$ is large, and vice versa. In particular, when $P(E)=1$, the corresponding uncertainty $H(E)=0$.
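The inverse relationship between $H(E)$ and $P(E)$ can be checked with a short sketch, again assuming the measure $-\log_2 P(E)$ (an assumption here; the measure itself is developed in the next section):

```python
import math

# Sketch: with H(E) assumed to be -log2 P(E), uncertainty decreases
# monotonically as P(E) grows, reaching 0 when P(E) = 1.
uncertainties = [-math.log2(p) for p in (0.1, 0.5, 0.9, 1.0)]
for p, h in zip((0.1, 0.5, 0.9, 1.0), uncertainties):
    print(f"P(E) = {p:<4}  H(E) = {h:.3f} bits")
```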


Ruye Wang 2021-03-28