
Gaussian channel

Assume the stimulus $s$ is drawn from a Gaussian distribution with variance $\sigma_s^2$ and the response $x$ is a linear function of the stimulus:

\begin{displaymath}x=gs+n \end{displaymath}

where $g$ is the gain and $n$ is additive noise, also normally distributed, with variance $\sigma_n^2$. The mutual information gained by observing $x$ is

\begin{displaymath}I=\frac{1}{2} \log \left(1+\frac{\sigma_s^2}{\sigma_n^2/g^2}\right)=
\frac{1}{2} \log (1+\mathrm{SNR}) \end{displaymath}

where $\mathrm{SNR}$ is the signal-to-noise ratio, defined as the ratio of the signal variance $\sigma_s^2$ to the effective noise variance $\sigma_n^2/g^2$, i.e., the noise variance referred back to the input.
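
This expression follows by writing the mutual information as a difference of differential entropies. The differential entropy of a Gaussian with variance $\sigma^2$ is $\frac{1}{2}\log(2\pi e\sigma^2)$; since the noise is independent of the stimulus, $H(x\vert s)=H(n)$, and $x$ has variance $g^2\sigma_s^2+\sigma_n^2$:

\begin{displaymath}I=H(x)-H(x\vert s)
=\frac{1}{2}\log\left(2\pi e(g^2\sigma_s^2+\sigma_n^2)\right)
-\frac{1}{2}\log\left(2\pi e\sigma_n^2\right)
=\frac{1}{2}\log\left(1+\frac{g^2\sigma_s^2}{\sigma_n^2}\right) \end{displaymath}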

If the stimulus is not Gaussian (but has the same variance $\sigma_s^2$), then by the maximum entropy property of the Gaussian distribution the response entropy $H(x)$, and therefore the mutual information, is strictly smaller:

\begin{displaymath}I < \frac{1}{2} \log (1+\mathrm{SNR}) \end{displaymath}
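
As a numerical illustration (a minimal Python sketch, not part of the original notes; the values of $g$, $\sigma_s$, $\sigma_n$ and the histogram entropy estimator are arbitrary choices), the identity $I=H(x)-H(n)$ can be used to estimate the information from samples, confirming that a uniform stimulus of the same variance conveys less information than a Gaussian one:

import numpy as np

def entropy_hist(samples, bins=200):
    # Differential entropy (nats) via a histogram:
    # H ~ -sum p_i log p_i + log(bin width)
    counts, edges = np.histogram(samples, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) + np.log(edges[1] - edges[0])

rng = np.random.default_rng(0)
N = 1_000_000
g, sigma_s, sigma_n = 2.0, 1.0, 0.5      # illustrative values, not from the notes
n = rng.normal(0.0, sigma_n, N)          # additive Gaussian noise

# Response to a Gaussian stimulus of variance sigma_s^2
x_gauss = g * rng.normal(0.0, sigma_s, N) + n

# Response to a uniform stimulus with the same variance: Var[U(-a,a)] = a^2/3
a = np.sqrt(3.0) * sigma_s
x_unif = g * rng.uniform(-a, a, N) + n

# I(s;x) = H(x) - H(x|s) = H(x) - H(n); H(n) is known exactly for Gaussian noise
H_n = 0.5 * np.log(2.0 * np.pi * np.e * sigma_n**2)
I_bound = 0.5 * np.log(1.0 + (g * sigma_s / sigma_n)**2)

print("1/2 log(1+SNR) :", I_bound)                      # ~1.417 nats
print("Gaussian s     :", entropy_hist(x_gauss) - H_n)  # ~ matches the bound
print("uniform  s     :", entropy_hist(x_unif) - H_n)   # strictly smaller

The Gaussian-stimulus estimate approaches $\frac{1}{2}\log(1+\mathrm{SNR})$, while the uniform stimulus falls short of it, as the inequality above predicts.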



Ruye Wang
1999-09-12