Friday, March 19, 2010

Single-Neuron Perceptron

The single-neuron perceptron is about finding a decision boundary. It uses a supervised learning rule to determine the decision boundary, and the training data for supervised learning take the form:

{p1, t1}, {p2, t2}, ..., {pQ, tQ}

where each pq is an input and tq is the corresponding correct output (target).


The single-neuron perceptron with two inputs p1 and p2 computes its output as

a = hardlim(Wp + b)

where W = [w1,1 w1,2] is the weight vector, p = [p1 p2]T is the input vector, and b is the bias.
After we determine the output a, we update the weight and bias with the following equations:

e = t - a
wnew = wold + e·pT
bnew = bold + e
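The update rule above can be sketched in a few lines of code. The original post uses MATLAB; this NumPy version is an illustrative translation, not part of the original:

```python
import numpy as np

def hardlim(n):
    """Hard-limit transfer function: a = 1 when n >= 0, a = 0 when n < 0."""
    return 1 if n >= 0 else 0

def perceptron_update(w, b, p, t):
    """One iteration of the perceptron learning rule:
    a = hardlim(w.p + b), e = t - a, wnew = wold + e*pT, bnew = bold + e."""
    a = hardlim(np.dot(w, p) + b)
    e = t - a
    return w + e * p, b + e

# One update step starting from w = [0 0], b = 0 with input [0 0], target 0
w, b = perceptron_update(np.array([0, 0]), 0, np.array([0, 0]), 0)
print(w, b)  # -> [0 0] -1
```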
Let's work through an example.

For the logic function AND, the input & target pairs are:

{p1 = [0 0]T, t1 = 0}, {p2 = [0 1]T, t2 = 0}, {p3 = [1 0]T, t3 = 0}, {p4 = [1 1]T, t4 = 1}

The initial weights are all zeros (w = [0 0]) and the initial bias is zero (b = 0).

Solution
1st iteration: use {p1, t1}

a = hardlim([0 0][0 0]T) = 1
e = t - a = t1 - a = -1
wnew = [0 0] - [0 0] = [0 0]
bnew = bold + e = -1

2nd iteration: use {p2, t2}

a = hardlim([0 0][0 1]T- 1) = 0
e = t - a = t2 - a = 0
wnew = [0 0]
bnew = bold + e = -1

3rd iteration: use {p3, t3}

a = hardlim([0 0][1 0]T - 1) = 0
e = t - a = t3 - a = 0
wnew = [0 0]
bnew = bold + e = -1

4th iteration: use {p4, t4}

a = hardlim([0 0][1 1]T - 1) = 0
e = t - a = t4 - a = 1
wnew = [0 0] + [1 1] = [1 1]
bnew = bold + e = 0

5th iteration: use {p1, t1}

a = hardlim([1 1][0 0]T) = 1
e = t - a = t1 - a = -1
wnew = [1 1]-[0 0] = [1 1]
bnew = bold + e = -1

6th iteration: use {p2, t2}

a = hardlim([1 1][0 1]T - 1) = 1
e = t - a = t2 - a = -1
wnew = [1 1] - [0 1] = [1 0]
bnew = bold + e = -2

7th iteration: use {p3, t3}

a = hardlim([1 0][1 0]T - 2) = 0
e = t - a = t3 - a = 0
wnew = [1 0]
bnew = bold + e = -2

8th iteration: use {p4, t4}

a = hardlim([1 0][1 1]T- 2) = 0
e = t - a = t4 - a = 1 - 0 = 1
wnew = [1 0] + [1 1] = [2 1]
bnew = bold + e = -2 + 1 = -1

9th iteration: use {p1, t1}

a = hardlim([2 1][0 0]T- 1)= 0
e = t - a = t1 - a = 0
wnew = [2 1]
bnew = bold + e = -1

10th iteration: use {p2, t2}

a = hardlim([2 1][0 1]T - 1) = hardlim(0) = 1
e = t - a = t2 - a = -1
wnew = [2 1] - [0 1] = [2 0]
bnew = bold + e = -2

11th iteration: use {p3, t3}

a = hardlim([2 0][1 0]T - 2) = hardlim(0) = 1
e = t - a = t3 - a = -1
wnew = [2 0] - [1 0] = [1 0]
bnew = bold + e = -3

12th iteration: use {p4, t4}

a = hardlim([1 0][1 1]T - 3) = 0
e = t - a = t4 - a = 1 - 0 = 1
wnew = [1 0] + [1 1] = [2 1]
bnew = bold + e = -2

13th iteration: use {p1, t1}

a = hardlim([2 1][0 0]T - 2) = 0
e = t - a = t1 - a = 0
wnew = [2 1]
bnew = bold + e = -2

14th iteration: use {p2, t2}

a = hardlim([2 1][0 1]T - 2) = 0
e = t - a = t2 - a = 0
wnew = [2 1]
bnew = bold + e = -2

15th iteration: use {p3, t3}

a = hardlim([2 1][1 0]T - 2) = hardlim(0) = 1
e = t - a = t3 - a = -1
wnew = [2 1] - [1 0] = [1 1]
bnew = bold + e = -3

16th iteration: use {p4, t4}

a = hardlim([1 1][1 1]T - 3) = 0
e = t - a = t4 - a = 1
wnew = [1 1] + [1 1] = [2 2]
bnew = bold + e = -2

17th iteration: use {p1, t1}

a = hardlim([2 2][0 0]T - 2) = 0
e = t - a = t1 - a = 0
wnew = [2 2]
bnew = bold + e = -2

18th iteration: use {p2, t2}

a = hardlim([2 2][0 1]T - 2) = hardlim(0) = 1
e = t - a = t2 - a = -1
wnew = [2 2] - [0 1] = [2 1]
bnew = bold + e = -3

19th iteration: use {p3, t3}

a = hardlim([2 1][1 0]T - 3) = 0
e = t - a = t3 - a = 0
wnew = [2 1]
bnew = bold + e = -3

20th iteration: use {p4, t4}

a = hardlim([2 1][1 1]T - 3) = hardlim(0) = 1
e = t - a = t4 - a = 1 - 1 = 0
wnew = [2 1]
bnew = bold + e = -3

21st iteration: use {p1, t1}

a = hardlim([2 1][0 0]T - 3) = 0
e = t - a = t1 - a = 0
wnew = [2 1]
bnew = bold + e = -3

22nd iteration: use {p2, t2}

a = hardlim([2 1][0 1]T - 3) = 0
e = t - a = t2 - a = 0
wnew = [2 1]
bnew = bold + e = -3

Iterations 19-22 cover all four input/target pairs with zero error, so the algorithm has converged.

Thus, w = [2 1] and b = -3.
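As a quick sanity check (in Python rather than the post's MATLAB), the final parameters do reproduce the AND truth table:

```python
# Final parameters from the derivation above
w, b = [2, 1], -3

def hardlim(n):
    # a = 1 when n >= 0, a = 0 when n < 0
    return 1 if n >= 0 else 0

# The trained perceptron matches the AND target for every input
for (p1, p2), t in [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]:
    a = hardlim(w[0] * p1 + w[1] * p2 + b)
    print((p1, p2), '->', a, 'target:', t)
```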

Let's solve the same problem in MATLAB:
>> p = [0 0 1 1; 0 1 0 1];
>> t = [0 0 0 1];
>> plotpv(p, t)
(plotpv plots the four input vectors, marking each target class with a different symbol.)
>> sample = newp(minmax(p), 1)
>> sample.trainParam.epochs = 20;
>> sample = train(sample, p, t)
>> y = sim(sample, p)
>> plotpv(p, y)
>> sample.iw{1,1}

Answer: [2 1]

>> sample.b{1}

Answer: -3

>> plotpc(sample.iw{1,1}, sample.b{1})
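For readers without the Neural Network Toolbox, roughly the same session can be reproduced with plain NumPy. This is an illustrative sketch of the perceptron rule, not part of the original post:

```python
import numpy as np

# Same data as the MATLAB session: each column of p is one input vector
p = np.array([[0, 0, 1, 1],
              [0, 1, 0, 1]])
t = np.array([0, 0, 0, 1])

w = np.zeros(2)
b = 0.0
for epoch in range(20):  # cf. sample.trainParam.epochs = 20
    errors = 0
    for i in range(p.shape[1]):
        a = 1 if w @ p[:, i] + b >= 0 else 0  # hardlim
        e = t[i] - a
        if e != 0:
            w = w + e * p[:, i]
            b = b + e
            errors += 1
    if errors == 0:  # one full pass with no updates: converged
        break

print(w, b)  # -> [2. 1.] -3.0
```

The result matches the hand derivation and the MATLAB output: w = [2 1], b = -3.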


Friday, March 12, 2010

Introduction and Single-Input Neuron

A neural network is an adaptive system that can change its model by learning from external and/or internal data. Neural networks have many applications, for example:
  • Pattern recognition, such as image and speech recognition.
  • Function estimation, such as fitting a quadratic function or predicting a time series.
  • Classification, such as pattern and sequence classification.
  • etc.

Single-Input Neuron

This is the simplest model of a neural network. A scalar input p is multiplied by a scalar weight w, the bias b is added to form the net input n = wp + b, and n is passed through a transfer function f to produce the output:

a = f(wp + b)
f is a transfer function. Here are the most frequently used transfer functions:

Name: Hard Limit (hardlim)
Input/Output Relation:

a = 0 when n < 0
a = 1 when n ≥ 0

Name: Saturating Linear (satlin)
Input/Output Relation:

a = 0 when n < 0
a = n when 0 ≤ n ≤ 1
a = 1 when n > 1

Name: Symmetric Saturating Linear (satlins)
Input/Output Relation:

a = -1 when n < -1
a = n when -1 ≤ n ≤ 1
a = 1 when n > 1

Name: Log-Sigmoid (logsig)
Input/Output Relation:

a = 1 / (1 + e^-n)
Name: Hyperbolic Tangent Sigmoid (tansig)
Input/Output Relation:

a = (e^n - e^-n) / (e^n + e^-n)
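The transfer functions listed above can be written directly from their definitions. A Python sketch (illustrative, not from the original post, which works in MATLAB):

```python
import math

def hardlim(n):
    # a = 0 when n < 0; a = 1 when n >= 0
    return 1 if n >= 0 else 0

def satlin(n):
    # saturating linear: clip n to the range [0, 1]
    return min(max(n, 0), 1)

def satlins(n):
    # symmetric saturating linear: clip n to the range [-1, 1]
    return min(max(n, -1), 1)

def logsig(n):
    # log-sigmoid: a = 1 / (1 + e^-n)
    return 1 / (1 + math.exp(-n))

def tansig(n):
    # hyperbolic tangent sigmoid: a = (e^n - e^-n) / (e^n + e^-n) = tanh(n)
    return math.tanh(n)

print(hardlim(-0.5), satlin(0.3), satlins(-2), logsig(0))  # -> 0 0.3 -1 0.5
```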