Donald Hebb (1949) speculated that “When neuron A repeatedly and persistently takes part in exciting neuron B, the synaptic connection from A to B will be strengthened.” In other words, simultaneous activation of neurons leads to pronounced increases in synaptic strength between them, or “neurons that fire together wire together; neurons that fire out of sync fail to link.”
For example, the well-known classical conditioning (Pavlov, 1927) can be explained by Hebbian learning: a conditioned stimulus (the bell) repeatedly presented together with an unconditioned stimulus (the food) eventually comes to elicit the response (the salivation) on its own, because the co-activated patterns become associated.
Based on this theory of Hebbian learning, the Hebbian network can be considered as a supervised learning method that learns to establish the associative relationship between any pair of patterns $\mathbf{x}^p$ and $\mathbf{y}^p$ in the given dataset $\{(\mathbf{x}^p,\,\mathbf{y}^p),\;p=1,\dots,P\}$, considered as the training set.
This is a 2-layer network with $n$ nodes in the input layer to receive an input pattern $\mathbf{x}=[x_1,\dots,x_n]^T$ and $m$ nodes in the output layer to produce an output $\mathbf{y}=[y_1,\dots,y_m]^T$. Each output node is fully connected to all $n$ input nodes through its $n$ weights:

$$y_i=\sum_{j=1}^n w_{ij}\,x_j,\qquad i=1,\dots,m \tag{11}$$

or in matrix form, with $\mathbf{W}=[w_{ij}]_{m\times n}$,

$$\mathbf{y}=\mathbf{W}\mathbf{x} \tag{12}$$
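As a concrete illustration of Eqs. (11)-(12), here is a minimal NumPy sketch of the forward pass; the sizes and variable names (n, m, W, x) are illustrative assumptions, not from the text:

    import numpy as np

    n, m = 4, 3                        # n input nodes, m output nodes (example sizes)
    rng = np.random.default_rng(0)
    W = rng.standard_normal((m, n))    # row i holds the n weights of output node i
    x = rng.standard_normal(n)         # input pattern x = [x_1, ..., x_n]^T

    y = W @ x                          # Eq. (12); element-wise this is Eq. (11)
    print(y.shape)                     # (3,): one value y_i per output node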
The Hebbian learning rule is inspired by Hebb's theory, i.e., when both neurons $x_j$ and $y_i$ are activated, the synaptic connectivity, here the weight $w_{ij}$ between them, is enhanced:

$$\Delta w_{ij}=\eta\,y_i\,x_j \tag{13}$$

or in matrix form,

$$\Delta\mathbf{W}=\eta\,\mathbf{y}\,\mathbf{x}^T \tag{14}$$

where $\eta>0$ is the learning rate.
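A one-step sketch of the update in Eqs. (13)-(14), again in NumPy; the learning rate value 0.1 and the random activities are arbitrary assumptions for illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    n, m, eta = 4, 3, 0.1              # layer sizes and learning rate (assumed values)
    W = np.zeros((m, n))               # synaptic weights
    x = rng.standard_normal(n)         # presynaptic (input) activities x_j
    y = rng.standard_normal(m)         # postsynaptic (output) activities y_i

    W += eta * np.outer(y, x)          # Eq. (14): delta W = eta * y x^T

Note that np.outer(y, x)[i, j] equals y_i * x_j, so this single line realizes the element-wise rule of Eq. (13) for all weights at once.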
As in all supervised learning, the Hebbian network is first trained and then used for association.
For simplicity, we assume all weights are initialized to zero, $\mathbf{W}=\mathbf{0}$, and then train the network to find all weights based on all $P$ pattern pairs $(\mathbf{x}^p,\,\mathbf{y}^p)$ in the dataset based on the learning law (with the learning rate absorbed, $\eta=1$):

$$w_{ij}=\sum_{p=1}^P y_i^p\,x_j^p \tag{15}$$

or in matrix form,

$$\mathbf{W}=\sum_{p=1}^P \mathbf{y}^p(\mathbf{x}^p)^T \tag{16}$$
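The whole training procedure of Eqs. (15)-(16) thus reduces to a sum of outer products over the training set. A sketch with randomly generated pattern pairs (all names and sizes are assumed for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    P, n, m = 5, 8, 4                  # P pattern pairs (example sizes)
    X = rng.standard_normal((P, n))    # row p is the input pattern x^p
    Y = rng.standard_normal((P, m))    # row p is the output pattern y^p

    W = np.zeros((m, n))               # weights initialized to zero
    for xp, yp in zip(X, Y):
        W += np.outer(yp, xp)          # Eq. (16): accumulate y^p (x^p)^T

    assert np.allclose(W, Y.T @ X)     # the same sum as a single matrix product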
When one of the patterns $\mathbf{x}^q$ is presented to the network, it produces an output:

$$y_i=\sum_{j=1}^n w_{ij}\,x_j^q=\sum_{p=1}^P y_i^p\sum_{j=1}^n x_j^p\,x_j^q \tag{17}$$

or in matrix form,

$$\mathbf{y}=\mathbf{W}\mathbf{x}^q=\sum_{p=1}^P \mathbf{y}^p\left[(\mathbf{x}^p)^T\mathbf{x}^q\right]=\mathbf{y}^q\,\|\mathbf{x}^q\|^2+\sum_{p\neq q}\mathbf{y}^p\left[(\mathbf{x}^p)^T\mathbf{x}^q\right] \tag{18}$$
If we further assume the input patterns are orthonormal, i.e., $(\mathbf{x}^p)^T\mathbf{x}^q=\delta_{pq}$, all cross terms in Eq. (18) vanish. Under these conditions, the output of the network as its response to input $\mathbf{x}^q$ is

$$\mathbf{y}=\mathbf{W}\mathbf{x}^q=\mathbf{y}^q \tag{19}$$

i.e., the network exactly recalls the pattern $\mathbf{y}^q$ associated with the input $\mathbf{x}^q$.
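To make the recall result of Eq. (19) concrete, the following sketch builds orthonormal input patterns via a QR decomposition (a construction assumed here for the demonstration; any orthonormal set would do), trains the network with Eq. (16), and checks that each stored output is recovered exactly:

    import numpy as np

    rng = np.random.default_rng(0)
    P, n, m = 3, 6, 2                  # example sizes with P <= n
    Q, _ = np.linalg.qr(rng.standard_normal((n, P)))
    X = Q.T                            # rows x^p are orthonormal: (x^p)^T x^q = delta_pq
    Y = rng.standard_normal((P, m))    # arbitrary output patterns y^p

    W = Y.T @ X                        # Eq. (16): W = sum_p y^p (x^p)^T

    for q in range(P):
        y = W @ X[q]                   # Eqs. (17)-(18): present input x^q
        assert np.allclose(y, Y[q])    # Eq. (19): y = y^q, exact recall

If the patterns were merely orthogonal rather than orthonormal, the recalled output in Eq. (18) would instead be scaled by $\|\mathbf{x}^q\|^2$.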