Lab 7: Neural Networks
Due April 3 by midnight

[Figure: a 4-layer neural network]

Starting point code

Use Teammaker to form your team. You can log in to that site to indicate your partner preference. Once you and your partner have specified each other, a GitHub repository will be created for your team.

Introduction

There are many good software packages that you can use to experiment with neural networks, such as tensorflow, keras, or conx. However, before using these packages, we want you to first understand how backpropagation works at a detailed level. Therefore, in this lab you will implement your own neural network from scratch.

Your neural network will be made up of several classes, including SigmoidNode and Network, which are described below.

Neural Network Implementation

As you are working on your implementation, you should frequently run the unit tests we have provided:

  python3 nn_test.py

There are 16 unit tests in this file. At the very top of the output, you will see 16 characters summarizing your progress: an E indicates a runtime error, an F indicates a failure, and a . indicates a success.
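
For example, a hypothetical summary line like the one below would mean that two tests still raise runtime errors, one fails its assertion, and the other thirteen pass:

  ..E......F....E.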

Below is a short summary of the tasks you need to complete. Note that much more detail about all of these methods can be found in the comments within the file neural_net.py.

  1. Initially you will focus on implementing the SigmoidNode class methods.

    The first three methods store their results in the node's self.activation and self.delta fields, while update_weights changes the weights of the edges stored in self.in_edges. A rough sketch of the underlying math appears after this list.

  2. Next you will work on completing the Network class. It is already set up to initialize a neural network with input, bias, and sigmoid nodes. You must implement the remaining Network methods; the comments in neural_net.py describe each one in detail.
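
The exact method names come from the starter code, but the math they implement is standard backpropagation for sigmoid units. Below is a minimal, hypothetical sketch of that math, assuming edge objects with weight, source, and dest attributes; your starter code's names may differ, so treat this only as a guide to the formulas, not as code to paste in.

  import math

  def sigmoid(x):
      # Logistic activation: squashes any real value into (0, 1).
      return 1.0 / (1.0 + math.exp(-x))

  def compute_activation(in_edges):
      # Weighted sum of the incoming activations, passed through the sigmoid.
      net_input = sum(e.weight * e.source.activation for e in in_edges)
      return sigmoid(net_input)

  def output_delta(activation, target):
      # Error term for an output node: sigmoid derivative times (target - actual).
      return activation * (1 - activation) * (target - activation)

  def hidden_delta(activation, out_edges):
      # Error term for a hidden node: sigmoid derivative times the weighted
      # sum of the deltas of the nodes this node feeds into.
      downstream = sum(e.weight * e.dest.delta for e in out_edges)
      return activation * (1 - activation) * downstream

  def update_weight(edge, learning_rate):
      # Gradient step on one incoming edge.
      edge.weight += learning_rate * edge.dest.delta * edge.source.activation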

The main function sets up a network with a single two-neuron hidden layer and trains it to represent the function XOR.
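
For reference, XOR outputs 1 exactly when its two inputs differ. A hypothetical training set and usage are sketched below; the real constructor and method names are the ones defined in neural_net.py.

  # XOR truth table: output is 1 exactly when the two inputs differ.
  xor_data = [([0, 0], [0]),
              ([0, 1], [1]),
              ([1, 0], [1]),
              ([1, 1], [0])]

  # Hypothetical usage -- substitute the actual constructor and training
  # method from neural_net.py:
  # net = Network(...)        # one hidden layer with two sigmoid nodes
  # net.train(xor_data, ...)  # repeat until outputs are close to the targets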

Once you can successfully learn XOR, move on to the next section.

Explaining your XOR network

In the LaTeX document provided (called nn.tex), fill in the diagram of the XOR network with the specific weights that were found by your program. Then explain how your network has solved the XOR problem. In particular, you need to clearly articulate when each hidden node will be active, and how the activity of the hidden nodes, combined with the weights from the hidden nodes to the output nodes, produces the correct output activations.
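
If it helps to anchor your explanation, you can also state the computation symbolically. The fragment below is one possible way to write it in nn.tex, using placeholder weight symbols rather than your learned values; it assumes two inputs, two hidden nodes, and a single output node.

  % Placeholder symbols: substitute the weights your program actually learned.
  \[
    \sigma(x) = \frac{1}{1 + e^{-x}}, \qquad
    h_i = \sigma(w_{1i}\, x_1 + w_{2i}\, x_2 + b_i), \qquad
    y = \sigma(v_1 h_1 + v_2 h_2 + b_o)
  \]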

To compile your LaTeX document into a PDF, run the following command:

  pdflatex nn.tex

You can then open the resulting PDF from the command line with:

  evince nn.pdf

Be sure to include your name(s) in the title of the document.

Submitting

You should submit the files neural_net.py and nn.tex. As usual, use git add, git commit and git push before the deadline to submit your work.