A data point in a $d$-dimensional space is represented by a vector $\mathbf{x} = [x_1, \ldots, x_d]^T$, whose components are the coordinates along the standard orthonormal basis vectors $\mathbf{e}_1, \ldots, \mathbf{e}_d$ that span the space:

$$\mathbf{x} = \sum_{i=1}^{d} x_i \mathbf{e}_i \tag{44}$$
The space can also be spanned by any other orthonormal basis $\mathbf{u}_1, \ldots, \mathbf{u}_d$ satisfying

$$\mathbf{u}_i^T \mathbf{u}_j = \delta_{ij} = \begin{cases} 1 & i = j \\ 0 & i \ne j \end{cases} \tag{45}$$

so that the same vector can be expanded as

$$\mathbf{x} = \sum_{i=1}^{d} y_i \mathbf{u}_i \tag{46}$$

with coefficients

$$y_i = \mathbf{u}_i^T \mathbf{x}, \qquad i = 1, \ldots, d \tag{47}$$
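As a small numerical sketch (assumed for illustration, not from the text), the expansion in Eqs. (46)-(47) can be checked by projecting a vector onto an orthonormal basis and reconstructing it exactly:

```python
import numpy as np

# Illustrative sketch of Eqs. (45)-(47): expand x in an orthonormal
# basis {u_i} (columns of U) and reconstruct it from the coefficients.
rng = np.random.default_rng(3)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # columns u_i: orthonormal basis

x = np.array([1.0, -2.0, 0.5])
y = np.array([U[:, i] @ x for i in range(3)])      # y_i = u_i^T x   (Eq. 47)

x_rec = sum(y[i] * U[:, i] for i in range(3))      # x = sum_i y_i u_i  (Eq. 46)
assert np.allclose(x_rec, x)                       # exact reconstruction
```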
The basis vectors in $\mathbf{U} = [\mathbf{u}_1, \ldots, \mathbf{u}_d]$ can be considered as a rotated version of the standard basis in $\mathbf{I} = [\mathbf{e}_1, \ldots, \mathbf{e}_d]$, and the norm or length of $\mathbf{x}$ before and after the transform remains the same, where $\mathbf{y} = [y_1, \ldots, y_d]^T = \mathbf{U}^T \mathbf{x}$:

$$\|\mathbf{y}\|^2 = \mathbf{y}^T \mathbf{y} = (\mathbf{U}^T \mathbf{x})^T (\mathbf{U}^T \mathbf{x}) = \mathbf{x}^T \mathbf{U} \mathbf{U}^T \mathbf{x} = \mathbf{x}^T \mathbf{x} = \|\mathbf{x}\|^2 \tag{48}$$
Summarizing the above, we can define an orthogonal transform based on any orthogonal matrix $\mathbf{U}$ satisfying $\mathbf{U}^T \mathbf{U} = \mathbf{U} \mathbf{U}^T = \mathbf{I}$:

$$\mathbf{y} = \mathbf{U}^T \mathbf{x}, \qquad \mathbf{x} = \mathbf{U} \mathbf{y} \tag{49}$$
Any orthogonal transform $\mathbf{y} = \mathbf{U}^T \mathbf{x}$ is actually a rotation of the standard basis $\{\mathbf{e}_i\}$ into another orthonormal basis $\{\mathbf{u}_i\}$ spanning the same space, while $\mathbf{x}$ and $\mathbf{y}$ are just the coordinates or coefficients of the same vector under these two different coordinate systems.
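The rotation interpretation can be sketched numerically (an assumed illustration, not part of the text): any orthogonal $\mathbf{U}$ changes the coordinates of a vector but not its length.

```python
import numpy as np

# Sketch of Eqs. (48)-(49): an orthogonal U rotates the coordinate
# system; the coordinates change but the norm of the vector does not.
rng = np.random.default_rng(0)

# Build a random orthogonal matrix via QR decomposition.
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))

x = rng.standard_normal(4)   # coordinates in the standard basis
y = U.T @ x                  # coordinates in the rotated basis (Eq. 49)

assert np.allclose(U.T @ U, np.eye(4))                   # U is orthogonal
assert np.isclose(np.linalg.norm(y), np.linalg.norm(x))  # norm preserved (Eq. 48)
assert np.allclose(U @ y, x)                             # inverse transform
```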
If $\mathbf{x}$ is treated as a random vector with mean $\boldsymbol{\mu}_x$ and covariance $\boldsymbol{\Sigma}_x$, then the linear transform $\mathbf{y} = \mathbf{U}^T \mathbf{x}$ is also a random vector, and its mean vector and covariance can be found as:

$$\boldsymbol{\mu}_y = E[\mathbf{U}^T \mathbf{x}] = \mathbf{U}^T \boldsymbol{\mu}_x, \qquad \boldsymbol{\Sigma}_y = E\left[ (\mathbf{y} - \boldsymbol{\mu}_y)(\mathbf{y} - \boldsymbol{\mu}_y)^T \right] = \mathbf{U}^T \boldsymbol{\Sigma}_x \mathbf{U} \tag{50}$$
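Eq. (50) can be verified empirically with a sketch like the following (the distribution, mean, and covariance chosen here are assumptions for illustration only):

```python
import numpy as np

# Empirical check of Eq. (50): the mean and covariance of y = U^T x
# are U^T mu_x and U^T Sigma_x U, up to sampling noise.
rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # a random orthogonal matrix

mu_x = np.array([1.0, 2.0, 3.0])                   # assumed mean of x
Sigma_x = np.diag([4.0, 1.0, 0.25])                # assumed covariance of x
X = rng.multivariate_normal(mu_x, Sigma_x, size=200_000)  # samples of x as rows

Y = X @ U                     # each row: y^T = x^T U, i.e. y = U^T x

# Sample moments agree with Eq. (50) up to Monte Carlo error.
assert np.allclose(Y.mean(axis=0), U.T @ mu_x, atol=0.02)
assert np.allclose(np.cov(Y.T), U.T @ Sigma_x @ U, atol=0.05)
```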
In particular, the Karhunen-Loève Transform (KLT) is just one such orthogonal transform, in the form $\mathbf{y} = \mathbf{V}^T \mathbf{x}$, where the orthogonal transform matrix $\mathbf{V} = [\mathbf{v}_1, \ldots, \mathbf{v}_d]$ is the eigenvector matrix of the covariance matrix $\boldsymbol{\Sigma}_x$ of $\mathbf{x}$, composed of the $d$ normalized eigenvectors $\mathbf{v}_i$ of $\boldsymbol{\Sigma}_x$. As in general the covariance matrix $\boldsymbol{\Sigma}_x$ is symmetric and positive definite, its eigenvalues $\lambda_i$ are real and positive, and its eigenvectors are orthogonal, i.e., its eigenvector matrix $\mathbf{V}$ is indeed an orthogonal matrix satisfying $\mathbf{V}^T \mathbf{V} = \mathbf{I}$, or $\mathbf{V}^{-1} = \mathbf{V}^T$.
The eigenvalues $\lambda_i$ and the corresponding eigenvectors $\mathbf{v}_i$ can then be found by solving the eigenequations:

$$\boldsymbol{\Sigma}_x \mathbf{v}_i = \lambda_i \mathbf{v}_i, \qquad i = 1, \ldots, d \tag{51}$$

which can be combined into matrix form:

$$\boldsymbol{\Sigma}_x \mathbf{V} = \mathbf{V} \boldsymbol{\Lambda}, \qquad \boldsymbol{\Lambda} = \mathrm{diag}(\lambda_1, \ldots, \lambda_d) \tag{52}$$

Multiplying by $\mathbf{V}^T$ on the left or right, the covariance matrix is diagonalized and decomposed:

$$\mathbf{V}^T \boldsymbol{\Sigma}_x \mathbf{V} = \boldsymbol{\Lambda} \tag{53}$$

$$\boldsymbol{\Sigma}_x = \mathbf{V} \boldsymbol{\Lambda} \mathbf{V}^T \tag{54}$$
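A minimal sketch of the eigendecomposition in Eqs. (51)-(54), using an assumed $2 \times 2$ covariance matrix:

```python
import numpy as np

# Eigendecomposition of a symmetric positive-definite covariance matrix,
# checking Eqs. (52)-(54). The matrix below is an assumed example.
Sigma = np.array([[3.0, 1.0],
                  [1.0, 2.0]])

lam, V = np.linalg.eigh(Sigma)   # eigenvalues (ascending) and eigenvectors
Lambda = np.diag(lam)

assert np.all(lam > 0)                        # eigenvalues real and positive
assert np.allclose(Sigma @ V, V @ Lambda)     # Eq. (52)
assert np.allclose(V.T @ Sigma @ V, Lambda)   # Eq. (53): diagonalization
assert np.allclose(V @ Lambda @ V.T, Sigma)   # Eq. (54): decomposition
assert np.allclose(V.T @ V, np.eye(2))        # V is orthogonal
```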
Based on the orthogonal eigenvector matrix $\mathbf{V}$, the KLT of $\mathbf{x}$ is defined as:

$$\mathbf{y} = \mathbf{V}^T \mathbf{x} \tag{56}$$

with inverse transform $\mathbf{x} = \mathbf{V} \mathbf{y}$.
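As a hypothetical worked example (the data and numbers below are assumptions for illustration), the KLT decorrelates the components of $\mathbf{x}$: by Eqs. (50) and (53), the covariance of $\mathbf{y} = \mathbf{V}^T \mathbf{x}$ is the diagonal matrix $\boldsymbol{\Lambda}$.

```python
import numpy as np

# Apply the KLT y = V^T x to correlated 2-D samples and verify that
# the transformed covariance is diagonal, with the eigenvalues on it.
rng = np.random.default_rng(2)
X = rng.multivariate_normal([0, 0], [[2.0, 1.2], [1.2, 1.0]], size=100_000)

Sigma_x = np.cov(X.T)              # sample covariance of x
lam, V = np.linalg.eigh(Sigma_x)   # eigenvector matrix of the covariance

Y = X @ V                          # KLT of each sample: y = V^T x
Sigma_y = np.cov(Y.T)              # covariance after the transform

# Sigma_y = V^T Sigma_x V = Lambda: the components of y are uncorrelated.
assert np.allclose(Sigma_y, np.diag(lam), atol=1e-8)
```

Because $\mathbf{V}$ consists of the exact eigenvectors of the sample covariance, the decorrelation here holds to machine precision rather than only approximately.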