Monday, April 12, 2010

Supervised Hebbian Learning

- Linear Associator

[Figure: the linear associator network architecture]
The linear associator is a neural network that memorizes pattern pairs, so it can be called an "associative memory". When the network receives a stored input (e.g. p = pq), it should produce the corresponding output (a = tq, where q = 1, 2, …, Q).
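As a minimal sketch of this idea (the values below are illustrative, not from the text): with orthonormal prototype inputs, the supervised Hebb rule W = T Pᵀ lets a = W p recall each stored pair exactly.

```python
import numpy as np

# Toy associative memory: two stored input/target pairs.
P = np.eye(2)                 # columns are the prototype inputs p1, p2 (orthonormal)
T = np.array([[1.0, -1.0]])   # columns are the targets t1, t2
W = T @ P.T                   # supervised Hebb rule: W = T P^T

print(W @ P[:, 0])            # presenting p1 recalls t1
```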

Pseudoinverse Rule

When the prototype input vectors are not orthogonal, the weight matrix can be found with the pseudoinverse rule:

W = T P^+

where P = [p1 p2 … pQ] holds the prototype inputs as columns, T = [t1 t2 … tQ] holds the corresponding targets, and P^+ is the Moore-Penrose pseudoinverse of P. When the columns of P are linearly independent, P^+ = (P^T P)^-1 P^T.
Example

If there are four digits (0-3), each displayed on a 5x5-pixel grid, design a network that can recognize them.

[Figure 1: the digits 0-3 displayed as 5x5 pixel patterns]
Solution

Find the input vectors p, using 0 for a white pixel and 1 for a black pixel:

p0 = [ 0, 1, 1, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1, 1, 0]

p1 = [ 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0]

p2 = [ 0, 1, 1, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 1, 1, 1, 1, 1]

p3 = [ 0, 1, 1, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 1, 0]

The corresponding targets are the digit values: t0 = 0, t1 = 1, t2 = 2, t3 = 3.
Next, find the weight matrix W from these equations:

P = [p0' p1' p2' p3']  (a 25x4 matrix whose columns are the transposed patterns)
T = [t0 t1 t2 t3] = [0 1 2 3]
W = T P^+ = T (P^T P)^-1 P^T
Check the network

For t0 = 0: a = W x p0' = 0
For t1 = 1: a = W x p1' = 1
For t2 = 2: a = W x p2' = 2
For t3 = 3: a = W x p3' = 3
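The worked example above can be reproduced numerically. The sketch below (NumPy) stacks the four patterns as columns of P, sets the targets to the digit values, and applies the pseudoinverse rule:

```python
import numpy as np

# The four 5x5 digit patterns from the example, flattened to 25 elements.
p0 = np.array([0,1,1,1,0, 1,0,0,0,1, 1,0,0,0,1, 1,0,0,0,1, 0,1,1,1,0], float)
p1 = np.array([0,0,1,0,0]*5, float)   # the '1' pattern: five identical rows
p2 = np.array([0,1,1,1,0, 1,0,0,0,1, 0,0,1,1,0, 0,1,0,0,0, 1,1,1,1,1], float)
p3 = np.array([0,1,1,1,0, 1,0,0,0,1, 0,0,1,1,0, 1,0,0,0,1, 0,1,1,1,0], float)

P = np.column_stack([p0, p1, p2, p3])   # 25 x 4: one pattern per column
T = np.array([[0.0, 1.0, 2.0, 3.0]])    # 1 x 4: target digit values
W = T @ np.linalg.pinv(P)               # pseudoinverse rule: W = T P^+

print(W @ P)                            # recalls the targets 0, 1, 2, 3
```

Because the four patterns are linearly independent, W P = T holds exactly (up to floating-point error).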

Thus, a linear associator can be used to recognize the stored patterns or digits, but there is a drawback: it is sensitive to noise. If the input contains an unwanted pixel, for example:

[Figure 2: the digit '1' with one corrupted pixel]
Figure 2 looks like the digit '1', but the network learned that digit from figure 1. Therefore, when the network receives the input from figure 2, its output will not be exactly '1'; the network produces an error. This is because a linear associator can only recall: it compares a new input against the patterns it has memorized to determine the output.
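This sensitivity can be demonstrated with the same NumPy sketch; the particular pixel flipped below is an arbitrary choice for illustration:

```python
import numpy as np

# Rebuild W exactly as in the worked example.
p0 = np.array([0,1,1,1,0, 1,0,0,0,1, 1,0,0,0,1, 1,0,0,0,1, 0,1,1,1,0], float)
p1 = np.array([0,0,1,0,0]*5, float)
p2 = np.array([0,1,1,1,0, 1,0,0,0,1, 0,0,1,1,0, 0,1,0,0,0, 1,1,1,1,1], float)
p3 = np.array([0,1,1,1,0, 1,0,0,0,1, 0,0,1,1,0, 1,0,0,0,1, 0,1,1,1,0], float)
P = np.column_stack([p0, p1, p2, p3])
T = np.array([[0.0, 1.0, 2.0, 3.0]])
W = T @ np.linalg.pinv(P)

# Corrupt the '1' pattern: turn on one pixel that the clean '1' does not use.
noisy = p1.copy()
noisy[13] = 1.0

print((W @ p1).item())      # the clean pattern still recalls 1
print((W @ noisy).item())   # the corrupted pattern no longer recalls exactly 1
```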