Definition: A vector space is a set on which two operations, vector addition and scalar multiplication, are defined for its members, which are referred to as vectors.
Definition:
An inner product on a vector space $V$ is a function $\langle \cdot, \cdot \rangle$ that maps two vectors $x, y \in V$ to a scalar and satisfies the following conditions:

$$\langle x, y \rangle = \overline{\langle y, x \rangle} \qquad \text{(conjugate symmetry)}$$

$$\langle a x + b y, z \rangle = a \langle x, z \rangle + b \langle y, z \rangle \qquad \text{(linearity)}$$

$$\langle x, x \rangle \ge 0, \quad \text{with equality if and only if } x = 0 \qquad \text{(positive definiteness)}$$
Definition: A vector space on which an inner product is defined is called an inner product space.
Definition: When the inner product is defined on the complex vector space $\mathbb{C}^N$, the space is called a unitary space; when it is defined on the real vector space $\mathbb{R}^N$, the space is called a Euclidean space.
Examples
The concept of inner product is of essential importance: a whole set of other important concepts can be defined based on it.
Definition:
If the inner product of two vectors $x$ and $y$ is zero, $\langle x, y \rangle = 0$, they are orthogonal (perpendicular) to each other, denoted by $x \perp y$.
Definition:
The norm (or length) of a vector $x$ is defined as

$$\|x\| = \sqrt{\langle x, x \rangle}$$

The norm is non-negative, and it is zero if and only if $x = 0$. In particular, if $\|x\| = 1$, the vector is said to be normalized and becomes a unit vector. Any vector can be normalized when divided by its own norm: $x / \|x\|$. The vector norm squared $\|x\|^2 = \langle x, x \rangle$ can be considered as the energy of the vector.
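As a small numerical sketch of these definitions (using NumPy; the Hermitian form of the inner product here is an assumption matching the complex case above):

```python
import numpy as np

# Hermitian inner product: <x, y> = sum_n x[n] * conj(y[n])
def inner(x, y):
    return np.sum(x * np.conj(y))

x = np.array([3.0, 4.0])
norm_x = np.sqrt(inner(x, x).real)   # ||x|| = sqrt(<x, x>) = 5
u = x / norm_x                       # normalized unit vector
energy = inner(x, x).real            # ||x||^2 = 25, the "energy"
```

Dividing by the norm always yields a unit vector, since $\|x/\|x\|\| = \|x\|/\|x\| = 1$.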
Example: In an $N$-D unitary space, the p-norm of a vector $x = [x_1, \dots, x_N]^T$ is

$$\|x\|_p = \left( \sum_{n=1}^N |x_n|^p \right)^{1/p}$$

The concept of an $N$-D unitary (or Euclidean) space can be generalized to an infinite-dimensional space, in which case the range of the summation covers all integers $-\infty < n < \infty$. This norm exists only if the summation converges to a finite value; i.e., the vector is an energy signal with finite energy:

$$\|x\|_2^2 = \sum_{n=-\infty}^{\infty} |x_n|^2 < \infty$$
Similarly, in a function space, the p-norm of a function vector $x(t)$ is defined as

$$\|x\|_p = \left( \int |x(t)|^p \, dt \right)^{1/p}$$
Definition:
In a unitary space, the p-norm distance between two vectors $x$ and $y$ is defined as the p-norm of the difference $x - y$:

$$d_p(x, y) = \|x - y\|_p = \left( \sum_n |x_n - y_n|^p \right)^{1/p}$$
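The p-norm and the p-norm distance can be sketched directly from the formulas above (the vectors here are arbitrary illustrative values):

```python
import numpy as np

def p_norm(x, p):
    # ||x||_p = (sum_n |x_n|^p)^(1/p)
    return np.sum(np.abs(x) ** p) ** (1.0 / p)

def p_distance(x, y, p):
    # d_p(x, y) = ||x - y||_p
    return p_norm(x - y, p)

x = np.array([1.0, -2.0, 2.0])
y = np.array([1.0, 0.0, 0.0])
n2 = p_norm(x, 2)          # Euclidean norm: sqrt(1 + 4 + 4) = 3
d1 = p_distance(x, y, 1)   # city-block distance: 0 + 2 + 2 = 4
```

Setting $p = 2$ recovers the Euclidean norm induced by the inner product; $p = 1$ gives the city-block (Manhattan) distance.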
Definition:
The vector space $V$ of all linear combinations of a set of vectors $\{b_1, \dots, b_N\}$ is called the linear span of the vectors:

$$V = \operatorname{span}(b_1, \dots, b_N) = \left\{ \sum_{n=1}^N a_n b_n \right\}$$
Definition: A set of linearly independent vectors that spans a vector space is called a basis of the space. If these vectors are mutually orthogonal and normalized, they form an orthonormal basis.
Any vector $x \in V$ can be uniquely expressed as a linear combination of the basis vectors $\{b_1, \dots, b_N\}$:

$$x = \sum_{n=1}^N a_n b_n$$
Theorem:
Let $x = \sum_n a_n u_n$ and $y = \sum_n c_n u_n$ be any two vectors in a vector space spanned by a set of complete orthonormal (orthogonal and normalized) basis vectors $\{u_n\}$ satisfying

$$\langle u_m, u_n \rangle = \delta_{mn}$$

Then the inner product is conserved by the expansion coefficients:

$$\langle x, y \rangle = \sum_n a_n \overline{c_n}$$

In particular, when $y = x$, this gives Parseval's identity $\|x\|^2 = \sum_n |a_n|^2$.
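The theorem can be checked numerically (a sketch; the orthonormal basis here is generated by a QR decomposition, an illustrative choice not taken from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random orthonormal basis of C^4: the columns of U from a QR
# decomposition of a random complex matrix are orthonormal
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(A)

x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# Expansion coefficients a_n = <x, u_n>, c_n = <y, u_n>
a = U.conj().T @ x
c = U.conj().T @ y

lhs = np.vdot(y, x)            # <x, y> = sum_n x_n conj(y_n)
rhs = np.sum(a * np.conj(c))   # sum_n a_n conj(c_n)
```

Note that `np.vdot(y, x)` conjugates its first argument, which matches the convention $\langle x, y \rangle = \sum_n x_n \overline{y_n}$ used here.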
Example: The space $\mathbb{C}^N$ can be spanned by $N$ orthonormal vectors $\{e_1, \dots, e_N\}$, where the $n$th basis vector is $e_n = [0, \dots, 0, 1, 0, \dots, 0]^T$ (with the single 1 as its $n$th component), that satisfy:

$$\langle e_m, e_n \rangle = \delta_{mn}$$

Any vector $x \in \mathbb{C}^N$ can be expressed as

$$x = \sum_{n=1}^N a_n e_n, \qquad a_n = \langle x, e_n \rangle$$

These two equations can be rewritten as a pair of transforms: the second (analysis) equation obtains the coefficients $a_n$ from $x$, and the first (synthesis) equation reconstructs $x$ from the coefficients. The second equation in the transform pair can also be written in component form as

$$a_n = \langle x, e_n \rangle = \sum_{m=1}^N x_m \overline{e_n[m]}$$
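The analysis/synthesis pair for the standard basis can be sketched as follows (the vector values are arbitrary illustrative data):

```python
import numpy as np

N = 4
E = np.eye(N)                # standard basis e_n as columns of the identity

x = np.array([2.0, -1.0, 0.5, 3.0])

# Analysis: a_n = <x, e_n>
a = np.array([np.vdot(E[:, n], x) for n in range(N)])

# Synthesis: x = sum_n a_n e_n
x_rec = sum(a[n] * E[:, n] for n in range(N))
```

For the standard basis the coefficients are simply the components of $x$ itself ($a_n = x_n$), and the synthesis sum reproduces $x$ exactly.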
Example: In the space $L^2$ composed of all square-integrable functions $x(t)$ defined over an interval $[a, b]$, spanned by a set of orthonormal basis functions $\{\phi_n(t)\}$ satisfying:

$$\langle \phi_m, \phi_n \rangle = \int_a^b \phi_m(t) \overline{\phi_n(t)} \, dt = \delta_{mn}$$
The Fourier transforms
Consider the following four Fourier bases that span four different types of vector spaces for signals that are either continuous or discrete, of finite or infinite duration.
Any vector $x(t)$ in this space can be expressed as

$$x(t) = \sum_n a_n \phi_n(t), \qquad a_n = \langle x, \phi_n \rangle = \int x(t) \overline{\phi_n(t)} \, dt$$
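As one concrete instance (a sketch, not necessarily the basis intended by the text), the normalized complex exponentials $\phi_k(t) = e^{j 2\pi k t / T} / \sqrt{T}$ form an orthonormal basis over $[0, T]$; the expansion coefficients can be approximated by a discrete sum:

```python
import numpy as np

T = 1.0
N = 2000
t = np.arange(N) * (T / N)   # sample grid over [0, T)

# Orthonormal Fourier basis on [0, T]: phi_k(t) = exp(j 2π k t / T) / sqrt(T)
def phi(k):
    return np.exp(2j * np.pi * k * t / T) / np.sqrt(T)

x = np.cos(2 * np.pi * t)    # a signal in this space

# Analysis: a_k = <x, phi_k> = ∫ x(t) conj(phi_k(t)) dt (rectangle rule)
ks = range(-3, 4)
a = {k: np.sum(x * np.conj(phi(k))) * (T / N) for k in ks}

# Synthesis: x(t) ≈ sum_k a_k phi_k(t)
x_rec = sum(a[k] * phi(k) for k in ks)
```

Since $\cos(2\pi t) = \tfrac{1}{2}(e^{j 2\pi t} + e^{-j 2\pi t})$, only the $k = \pm 1$ coefficients are nonzero, and the synthesis sum reproduces the signal.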
Definition:
A linear transformation $A$ is a unitary transformation if it conserves inner products:

$$\langle A x, A y \rangle = \langle x, y \rangle$$

A unitary transformation also conserves any measurement based on the inner product, such as the norm of a vector, the distance and angle between two vectors, and the projection of one vector onto another. Also, if in particular $y = x$, we have

$$\|A x\|^2 = \langle A x, A x \rangle = \langle x, x \rangle = \|x\|^2$$
Definition:
A matrix $A$ is unitary if it conserves inner products:

$$\langle A x, A y \rangle = \langle x, y \rangle$$
Theorem:
A matrix $A$ is unitary if and only if $A^{-1} = A^H$, its inverse equals its conjugate transpose; i.e., the following two statements are equivalent:

$$\langle A x, A y \rangle = \langle x, y \rangle \quad \Longleftrightarrow \quad A^H A = A A^H = I$$
A unitary matrix has the following properties: its column (and row) vectors form an orthonormal basis, its inverse is its conjugate transpose $A^{-1} = A^H$, and $|\det(A)| = 1$.
The identity matrix $I$ is a special orthogonal matrix, as its columns (or rows) $\{e_n\}$ are orthonormal:

$$\langle e_m, e_n \rangle = \delta_{mn}$$
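As an illustration (a sketch; the normalized DFT matrix below is one standard example of a unitary matrix, chosen here for concreteness):

```python
import numpy as np

N = 8
n = np.arange(N)
# Normalized DFT matrix: A[k, m] = exp(-j 2π k m / N) / sqrt(N)
A = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)

# Unitarity: A^H A = I
I_check = A.conj().T @ A

# Inner products (hence norms) are conserved
rng = np.random.default_rng(1)
x = rng.standard_normal(N)
y = rng.standard_normal(N)
lhs = np.vdot(A @ y, A @ x)   # <Ax, Ay>
rhs = np.vdot(y, x)           # <x, y>
```

The columns of this matrix are the orthonormal DFT basis vectors, so $A^H A = I$ holds to machine precision.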
This result can be extended to the continuous transformation for signal vectors in the form of continuous functions. In general, corresponding to any given unitary transformation $A$, a signal vector $x$ can be alternatively represented by a coefficient vector $X = A x$ (where $X$ can be either a set of discrete coefficients or a continuous function). The original signal vector $x$ can always be reconstructed from $X$ by applying $A^{-1} = A^H$ on both sides of $X = A x$ to get $A^H X = A^H A x = x$; i.e., we get a unitary transform pair in the most general form:

$$X = A x \qquad \text{(forward transform)}$$

$$x = A^H X \qquad \text{(inverse transform)}$$
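The transform pair can be sketched for any unitary $A$ (here one is built from a QR decomposition purely for illustration; it stands in for whatever specific transform is used):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 6

# An arbitrary unitary matrix: Q from the QR decomposition of a random
# complex matrix (illustrative choice, not a specific named transform)
M = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
A, _ = np.linalg.qr(M)

x = rng.standard_normal(N) + 1j * rng.standard_normal(N)

X = A @ x                 # forward transform: coefficient vector X = A x
x_rec = A.conj().T @ X    # inverse transform: x = A^H X
```

Because $A^H A = I$, the reconstruction is exact regardless of which unitary transform is used.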