In a general regression problem, the observed data (training data) are

$$\mathcal{D} = \{ (\mathbf{x}_n, y_n) \;|\; n = 1, \dots, N \}$$

where $\{\mathbf{x}_n\}$ is a set of $N$ input vectors of dimensionality $d$, and $y_n$ is the corresponding output scalar, assumed to be generated by some underlying process described by a function $f(\mathbf{x})$ with additive noise, i.e.,

$$y_n = f(\mathbf{x}_n) + e_n$$

The goal of the regression is to infer the function $f(\mathbf{x})$ based on $\mathcal{D}$, and to predict the output $y_*$ when the input is $\mathbf{x}_*$.
The simplest form of regression is linear regression, based on the assumption that the underlying function $f(\mathbf{x})$ is a linear combination of all $d$ components of the input vector with weights $\mathbf{w} = [w_1, \dots, w_d]^T$:

$$f(\mathbf{x}) = \mathbf{x}^T \mathbf{w} = \sum_{i=1}^d x_i w_i$$

This can be expressed in matrix form for all $N$ data points:

$$\mathbf{y} = \mathbf{X}^T \mathbf{w} + \mathbf{e}$$

where $\mathbf{X} = [\mathbf{x}_1, \dots, \mathbf{x}_N]$ is a $d \times N$ matrix whose $n$th column is the input vector $\mathbf{x}_n$, $\mathbf{y} = [y_1, \dots, y_N]^T$ is an $N$-dimensional vector for the output values, and $\mathbf{e} = [e_1, \dots, e_N]^T$ is the noise vector. In general $N > d$, and the linear regression can be solved by the least-squares method to get

$$\hat{\mathbf{w}} = (\mathbf{X}\mathbf{X}^T)^{-1} \mathbf{X} \mathbf{y} = (\mathbf{X}^T)^{-} \mathbf{y}$$

where $(\mathbf{X}^T)^{-} = (\mathbf{X}\mathbf{X}^T)^{-1} \mathbf{X}$ is the pseudo-inverse of the matrix $\mathbf{X}^T$.
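To make the least-squares formula concrete, here is a minimal NumPy sketch (the dimensions, true weights, and noise level are assumptions for illustration, not from the text) that forms $\hat{\mathbf{w}} = (\mathbf{X}\mathbf{X}^T)^{-1}\mathbf{X}\mathbf{y}$ and checks it against a library solver:

import numpy as np

rng = np.random.default_rng(0)
d, N = 3, 50                                 # assumed toy sizes
X = rng.normal(size=(d, N))                  # d x N; nth column is x_n
w_true = np.array([1.0, -2.0, 0.5])          # assumed true weights
y = X.T @ w_true + 0.1 * rng.normal(size=N)  # y = X^T w + e

# w_hat = (X X^T)^{-1} X y, the pseudo-inverse of X^T applied to y
w_hat = np.linalg.solve(X @ X.T, X @ y)

# np.linalg.lstsq solves the same least-squares problem and should agree
w_lstsq, *_ = np.linalg.lstsq(X.T, y, rcond=None)
assert np.allclose(w_hat, w_lstsq)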
Alternatively, the regression problem can be viewed as a Bayesian inference process. We can assume both the model parameters and the noise are normally distributed:

$$\mathbf{w} \sim \mathcal{N}(\mathbf{0}, \Sigma_p), \qquad \mathbf{e} \sim \mathcal{N}(\mathbf{0}, \sigma^2 \mathbf{I})$$

i.e., the noise $e_n$ in the $N$ different data points is independent.
The likelihood of the model parameters $\mathbf{w}$ given the data $\mathcal{D}$ is

$$p(\mathbf{y} \mid \mathbf{X}, \mathbf{w}) = \prod_{n=1}^{N} p(y_n \mid \mathbf{x}_n, \mathbf{w}) = \frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left( -\frac{\|\mathbf{y} - \mathbf{X}^T \mathbf{w}\|^2}{2\sigma^2} \right) = \mathcal{N}(\mathbf{X}^T \mathbf{w},\; \sigma^2 \mathbf{I})$$
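In code, this product of Gaussians is easiest to evaluate as a sum of log terms. A small sketch under the same i.i.d. noise model (the function name and the default value of sigma are mine):

import numpy as np

def log_likelihood(w, X, y, sigma=0.1):
    """Log of p(y | X, w) for i.i.d. Gaussian noise of std sigma."""
    r = y - X.T @ w    # residuals y_n - x_n^T w
    N = y.size
    return -0.5 * N * np.log(2 * np.pi * sigma**2) - (r @ r) / (2 * sigma**2)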
According to Bayes' theorem, the posterior of the parameters is proportional to the product of the likelihood and the prior:

$$p(\mathbf{w} \mid \mathbf{X}, \mathbf{y}) = \frac{p(\mathbf{y} \mid \mathbf{X}, \mathbf{w})\, p(\mathbf{w})}{p(\mathbf{y} \mid \mathbf{X})} \propto p(\mathbf{y} \mid \mathbf{X}, \mathbf{w})\, p(\mathbf{w})$$

where

$$p(\mathbf{y} \mid \mathbf{X}) = \int p(\mathbf{y} \mid \mathbf{X}, \mathbf{w})\, p(\mathbf{w})\, d\mathbf{w}$$

is the marginal likelihood, a normalizing constant independent of $\mathbf{w}$.
The predictive distribution of $y_*$ given $\mathbf{x}_*$ is the average over all possible parameter values weighted by their posterior probability:

$$p(y_* \mid \mathbf{x}_*, \mathbf{X}, \mathbf{y}) = \int p(y_* \mid \mathbf{x}_*, \mathbf{w})\, p(\mathbf{w} \mid \mathbf{X}, \mathbf{y})\, d\mathbf{w}$$
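Under the Gaussian prior and noise assumed above, this integral has a well-known closed form: the posterior is $\mathcal{N}(\bar{\mathbf{w}}, \mathbf{A}^{-1})$ with $\mathbf{A} = \sigma^{-2}\mathbf{X}\mathbf{X}^T + \Sigma_p^{-1}$ and $\bar{\mathbf{w}} = \sigma^{-2}\mathbf{A}^{-1}\mathbf{X}\mathbf{y}$, and the predictive distribution is Gaussian as well. A minimal sketch of that closed form (function and variable names are mine, not from the text):

import numpy as np

def predictive(x_star, X, y, Sigma_p, sigma):
    """Mean and variance of p(y* | x*, X, y) for the Gaussian model."""
    A = (X @ X.T) / sigma**2 + np.linalg.inv(Sigma_p)  # posterior precision
    w_bar = np.linalg.solve(A, X @ y) / sigma**2       # posterior mean
    mean = x_star @ w_bar
    # variance = parameter uncertainty x*^T A^{-1} x* plus noise variance
    var = x_star @ np.linalg.solve(A, x_star) + sigma**2
    return mean, var

The predictive variance has two parts: the term $\mathbf{x}_*^T \mathbf{A}^{-1} \mathbf{x}_*$ reflects remaining uncertainty about $\mathbf{w}$, and $\sigma^2$ is the irreducible observation noise.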
Ruye Wang
2006-11-14