E27: Computer Vision - Spring 2016 - HOMEWORK 12
Backpropagation in a simple network
The binary XOR function is given by the following truth table:
y1 | y2 | y1 ⊗ y2
---+----+---------
-1 | -1 |   -1
-1 |  1 |    1
 1 | -1 |    1
 1 |  1 |   -1
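As a quick sanity check on this truth table, note that in the -1/+1 encoding XOR reduces to the negated product of the inputs. A minimal sketch (the function name `xor_pm1` is illustrative, not part of the assignment):

```python
def xor_pm1(y1, y2):
    # In the -1/+1 encoding, XOR is simply the negated product:
    # the output is -1 exactly when the inputs agree.
    return -y1 * y2

# Reproduces every row of the truth table above.
table = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
print([xor_pm1(a, b) for a, b in table])  # [-1, 1, 1, -1]
```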
It can be computed by a simple two-layer network with the following structure:
Note that both the input and hidden layers of the network have bias nodes. The output is computed by
y3 = f(x3) = f(w03 + w13 y1 + w23 y2)
y4 = f(x4) = f(w04 + w14 y1 + w24 y2)
y5 = f(x5) = f(w05 + w35 y3 + w45 y4)
where wij is the weight from node i to node j and w0j is the weight from a bias node.
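The three equations above can be transcribed directly into a forward pass. A minimal sketch, assuming f = tanh (a common choice for the -1/+1 encoding; the actual activation is defined in `xor_nnet.py`), with weights stored in a dict keyed by (i, j) to mirror the handout's w_ij notation:

```python
import numpy as np

def f(x):
    # Assumed activation; tanh maps naturally onto -1/+1 targets.
    return np.tanh(x)

def forward(y1, y2, w):
    # w[(i, j)] is the weight from node i to node j; node 0 is the bias.
    x3 = w[(0, 3)] + w[(1, 3)] * y1 + w[(2, 3)] * y2
    y3 = f(x3)
    x4 = w[(0, 4)] + w[(1, 4)] * y1 + w[(2, 4)] * y2
    y4 = f(x4)
    x5 = w[(0, 5)] + w[(3, 5)] * y3 + w[(4, 5)] * y4
    y5 = f(x5)
    return y3, y4, y5
```

With all weights zero, every activation is f(0) = 0, which is a convenient smoke test for the wiring.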
Download the xor_nnet.py file from the course webpage and complete the steps marked # TODO to implement the back-propagation algorithm described in the class handout for this simple network. If you keep the α parameter and tolerance specified in the file, your implementation should converge in 61 iterations.
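The training loop you are asked to complete can be sketched as follows. This is not the assignment's code: the actual α, tolerance, initialization, and activation live in `xor_nnet.py`, so the hyperparameters and f = tanh below are illustrative assumptions. The backward pass uses the chain rule with f'(x) = 1 - tanh²(x):

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR training set in the -1/+1 encoding, matching the truth table.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
T = np.array([-1.0, 1.0, 1.0, -1.0])

alpha = 0.5   # illustrative learning rate; the real value is in xor_nnet.py
tol = 1e-2    # illustrative stopping tolerance

# Bias folded in as column 0: rows of W_hidden are nodes 3 and 4.
W_hidden = rng.normal(scale=0.5, size=(2, 3))  # cols: bias, y1, y2
W_out = rng.normal(scale=0.5, size=3)          # bias, y3, y4

for it in range(10000):
    err = 0.0
    for y_in, t in zip(X, T):
        # Forward pass (f = tanh assumed).
        h = np.tanh(W_hidden @ np.concatenate(([1.0], y_in)))
        y5 = np.tanh(W_out @ np.concatenate(([1.0], h)))
        err += 0.5 * (y5 - t) ** 2

        # Backward pass: delta at the output, then propagated to hidden nodes.
        delta5 = (y5 - t) * (1.0 - y5 ** 2)
        delta_h = delta5 * W_out[1:] * (1.0 - h ** 2)

        # Online gradient-descent updates.
        W_out -= alpha * delta5 * np.concatenate(([1.0], h))
        W_hidden -= alpha * np.outer(delta_h, np.concatenate(([1.0], y_in)))
    if err < tol:
        break
```

The iteration count at convergence depends on the initialization, α, and tolerance, so this sketch will not reproduce the 61 iterations quoted for the provided file.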
Attachment: xor_nnet.py.rar