
Maximization of Entropy

Unlike a single event $E$ (e.g., rainy weather), whose uncertainty is unbounded (when $P(E) \rightarrow 0$, $H(E)=-log\; P(E) \rightarrow \infty$), the uncertainty of a random experiment with a finite number of outcomes $E_1,\cdots,E_N$ is bounded by a maximum entropy, which can be found by determining the probabilities $P_i=P(E_i)$ ($i=1,\cdots,N$) so that the entropy

\begin{displaymath}H(E_1, \cdots, E_N)=-\sum_{i=1}^N P_i \;log \;P_i \end{displaymath}

is maximized with the constraint $\sum_{i=1}^N P_i=1$. This is a constrained maximization problem, which can be solved by the method of Lagrange multipliers:
\begin{displaymath}
\frac{\partial}{\partial P_j} \left[-\sum_{i=1}^N P_i \;log\; P_i
+\lambda\left(\sum_{i=1}^N P_i-1\right)\right]
=-\frac{\partial}{\partial P_j} (P_j \;log\; P_j) +\lambda
= -(log\; P_j +1)+\lambda=0
\end{displaymath}

Solving this we get

\begin{displaymath}P_j=2^{\lambda-1} \;\;\;\;\;(j=1,\cdots,N) \end{displaymath}

indicating that all $P_i$'s must be the same. But as they also have to add up to 1, we have

\begin{displaymath}P_1=P_2=\cdots=P_N=\frac{1}{N} \end{displaymath}
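This conclusion can also be checked numerically. The following is a minimal sketch in Python, assuming NumPy and SciPy are available (neither is part of the original text); it minimizes the negative entropy subject to the constraint $\sum_{i=1}^N P_i=1$ and should recover the uniform distribution $P_i=1/N$.

import numpy as np
from scipy.optimize import minimize

N = 5

def neg_entropy(p):
    # negative entropy in bits; the small epsilon guards against log(0)
    eps = 1e-12
    return np.sum(p * np.log2(p + eps))

# equality constraint: the probabilities must sum to 1
constraint = {'type': 'eq', 'fun': lambda p: np.sum(p) - 1.0}
bounds = [(0.0, 1.0)] * N
p0 = np.random.dirichlet(np.ones(N))   # random starting point on the simplex

result = minimize(neg_entropy, p0, method='SLSQP',
                  bounds=bounds, constraints=[constraint])

print(result.x)     # expected: approximately [0.2, 0.2, 0.2, 0.2, 0.2]
print(-result.fun)  # expected: approximately log2(5), about 2.32 bits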

and the corresponding maximum entropy can be obtained as

\begin{displaymath}H_{max}(E_1,\cdots,E_N)=-\sum_{i=1}^N \frac{1}{N} log_2 \frac{1}{N}
=log_2 N
\end{displaymath}
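For example (a worked case not in the original text), with $N=4$ equally likely outcomes the maximum entropy is $log_2\,4=2$ bits, while any unequal distribution yields less:

\begin{displaymath}
H\left(\frac{1}{2},\frac{1}{4},\frac{1}{8},\frac{1}{8}\right)
=\frac{1}{2}\cdot 1+\frac{1}{4}\cdot 2+2\cdot\frac{1}{8}\cdot 3
=1.75 < 2 = log_2\, 4
\end{displaymath}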

This result, that a random experiment with equally likely outcomes has maximum entropy, makes intuitive sense, as the uncertainty is always reduced if some outcomes are more likely than others. In particular, when $N=2$ we have $P_2=1-P_1$ and the entropy

\begin{displaymath}H(E_1, E_2)=-P_1\; log\; P_1-(1-P_1)\; log\; (1-P_1) \end{displaymath}

reaches its maximum $H_{max}=1$ when the two outcomes are equally likely, $P_1=P_2=0.5$. (This may be why a cow may starve to death when standing between two equally wonderful piles of hay.)
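For instance (a numerical illustration not in the original text), a heavily biased pair of outcomes carries much less uncertainty:

\begin{displaymath}
H(0.9,\,0.1)=-0.9\;log_2\,0.9-0.1\;log_2\,0.1 \approx 0.469 < 1 = H(0.5,\,0.5)
\end{displaymath}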

Figure: entropy.gif


Ruye Wang 2021-03-28