
Functions of random variables

Assume $X$ is a random variable with probability density $p_x(x)$; then its function $Y=\phi(X)$ is also a random variable. If $\phi$ is invertible, we have $X=\phi^{-1}(Y)=\psi(Y)$, with $dy/dx=\phi'$ and $dx/dy=\psi'=1/\phi'$.
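As a concrete check (an illustration not in the text, assuming a Gaussian $X$, for which the differential entropy has the closed form $H(X)=\frac{1}{2}\log(2\pi e\sigma^2)$), the invertible scaling $Y=aX$ shifts the entropy by exactly $E\{\log\vert\phi'(X)\vert\}=\log\vert a\vert$:

```python
import math

def gaussian_entropy(sigma):
    """Differential entropy (in nats) of a N(0, sigma^2) random variable."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

# Hypothetical numbers: X ~ N(0, 4) and Y = phi(X) = 3X, so Y ~ N(0, 36).
sigma, a = 2.0, 3.0
h_x = gaussian_entropy(sigma)
h_y = gaussian_entropy(abs(a) * sigma)   # scaling multiplies sigma by |a|

# phi is invertible, so the bound is met with equality:
# H(Y) = H(X) + E{log|phi'(X)|} = H(X) + log|a|
assert abs(h_y - (h_x + math.log(abs(a)))) < 1e-12
```

Because $\phi'(x)=a$ is constant here, the expectation $E\{\log\vert\phi'(X)\vert\}$ reduces to $\log\vert a\vert$ regardless of the distribution of $X$.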

If the inverse function $X=\psi(Y)=\phi^{-1}(Y)$ is not unique, then

\begin{displaymath}H(Y)<H(X)+E\; \{\log\;\vert\phi'(X)\vert\} \end{displaymath}

This result can be generalized to multi-variables. If

\begin{displaymath}Y_i=\phi_i(X_1,\cdots,X_n),\;\;\;\;\;(i=1,\cdots,n) \end{displaymath}

then

\begin{displaymath}H(Y_1,\cdots,Y_n) \le H(X_1,\cdots,X_n)+E\;\{ \log\;J(X_1,\cdots,X_n)\} \end{displaymath}

where $J(X_1,\cdots,X_n)$ is the Jacobian of the above transformation:

\begin{displaymath}J(X_1,\cdots,X_n)=\left\vert \begin{array}{ccc}
\frac{\partial \phi_1}{\partial X_1} & \cdots & \frac{\partial \phi_1}{\partial X_n}\\
\vdots & \ddots & \vdots\\
\frac{\partial \phi_n}{\partial X_1} & \cdots & \frac{\partial \phi_n}{\partial X_n}
\end{array} \right\vert \end{displaymath}
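For a concrete (hypothetical, not from the text) instance of this Jacobian, take the polar-to-Cartesian map $Y_1=\phi_1(X_1,X_2)=X_1\cos X_2$, $Y_2=\phi_2(X_1,X_2)=X_1\sin X_2$, whose analytic Jacobian determinant is $J=X_1$. A central-difference approximation of the partial derivatives $\partial\phi_i/\partial X_j$ recovers the same value:

```python
import numpy as np

def phi(x):
    """Polar-to-Cartesian map: (X1, X2) -> (X1*cos(X2), X1*sin(X2))."""
    r, t = x
    return np.array([r * np.cos(t), r * np.sin(t)])

def jacobian_det(f, x, h=1e-6):
    """Numerical Jacobian determinant of f at x via central differences."""
    n = len(x)
    J = np.empty((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (f(x + e) - f(x - e)) / (2 * h)   # column j: d phi_i / d X_j
    return np.linalg.det(J)

x = np.array([2.0, 0.7])                       # arbitrary test point
assert np.isclose(jacobian_det(phi, x), x[0])  # analytic value J = X1
```

The columns of the $2\times 2$ matrix are $(\cos X_2, \sin X_2)$ and $(-X_1\sin X_2, X_1\cos X_2)$, so the determinant is $X_1(\cos^2 X_2+\sin^2 X_2)=X_1$, matching the numerical result.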

In particular, if the functions are linear

\begin{displaymath}Y_i=\sum_{j=1}^n a_{ij} X_j,\;\;\;\;\;(i=1,\cdots,n) \end{displaymath}

then

\begin{displaymath}H(Y_1,\cdots,Y_n) \le H(X_1,\cdots,X_n)+\log\;\vert\det(A)\vert \end{displaymath}

where $\det(A)$ is the determinant of the transform matrix $A=[a_{ij}]_{n\times n}$. Again, equality holds if the transformation is invertible, i.e., one-to-one.
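The equality case of the linear bound can be verified numerically. The sketch below (an assumption not made in the text: $X$ is multivariate Gaussian, so $H(X)=\frac{1}{2}\log\left((2\pi e)^n\det\Sigma\right)$ in closed form) uses the fact that $Y=AX$ has covariance $A\Sigma A^T$:

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (nats) of a zero-mean Gaussian with covariance cov."""
    n = cov.shape[0]
    sign, logdet = np.linalg.slogdet(cov)
    return 0.5 * (n * np.log(2 * np.pi * np.e) + logdet)

rng = np.random.default_rng(0)
n = 3
M = rng.standard_normal((n, n))
Sigma = M @ M.T + n * np.eye(n)        # a positive-definite covariance for X
A = rng.standard_normal((n, n))        # invertible with probability 1

h_x = gaussian_entropy(Sigma)
h_y = gaussian_entropy(A @ Sigma @ A.T)          # covariance of Y = A X
log_abs_det_A = np.linalg.slogdet(A)[1]          # log|det(A)|

# Invertible A meets the bound with equality: H(Y) = H(X) + log|det(A)|
assert np.isclose(h_y, h_x + log_abs_det_A)
```

Since $\det(A\Sigma A^T)=\det(A)^2\det(\Sigma)$, the two entropies differ by exactly $\log\vert\det(A)\vert$, independent of $\Sigma$.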


Ruye Wang 2018-03-26