COMP 4107: Neural Networks Winter 2021
Assignment 4
This assignment may be completed individually or in groups of 2 or 3.
It is recommended that you use your project groups. If you are in a group, one student will
submit all necessary files and the other student(s) will submit a text file specifying the members
of the group and who is submitting. The report must include all students' names and IDs.
In this assignment, you will develop implementations for self-organizing maps and Hopfield
neural networks for the handwritten digit recognition problem.
Description
You may use any and all functionality found in scikit-learn and TensorFlow. You may also find
K-means on MNIST useful.
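For reference, the snippet below is a minimal sketch of loading MNIST through scikit-learn's
fetch_openml utility and subsampling digits '1' and '5' (as needed for Questions 1 and 3);
variable names such as X_15 are illustrative only.

# Minimal sketch: load MNIST with scikit-learn and keep only the '1' and '5' images.
import numpy as np
from sklearn.datasets import fetch_openml

# "mnist_784" is the 70,000-image MNIST dataset hosted on OpenML.
X, y = fetch_openml('mnist_784', version=1, return_X_y=True, as_frame=False)
X = X / 255.0                          # scale pixel values to [0, 1]

# Labels are strings in this dataset; keep only the '1' and '5' images.
mask = np.isin(y, ['1', '5'])
X_15, y_15 = X[mask], y[mask]
print(X_15.shape, y_15.shape)          # roughly 14,000 samples of 784 pixels each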
Note
For any K-fold experimentation performed, ensure that you document the mean and standard
deviation of the performance measures obtained (e.g., accuracy).
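As a sketch of the kind of reporting expected (not a required approach), scikit-learn's
cross_val_score returns the per-fold scores, from which the mean and standard deviation follow
directly; LogisticRegression below is only a stand-in for whichever network a question asks for,
and X_15, y_15 are the arrays from the loading sketch above.

from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Placeholder model; replace with the network required by the question.
clf = LogisticRegression(max_iter=200)
scores = cross_val_score(clf, X_15, y_15, cv=5, scoring='accuracy')
print(f"accuracy: mean = {scores.mean():.4f}, std = {scores.std():.4f}")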
Question 1
[30 marks]
Using the scikit-learn utilities to load the MNIST data, implement a Hopfield network that can
classify the image data for a subset of the handwritten digits. Subsample the data to only
include images of '1' and '5'. Here, correct classification means that if we present an image of a
'1', an image of a '1' will be recovered; however, it may not be the original image, owing to the
degenerate property of this type of network. You are expected to document classification
accuracy as a function of the number of images used to train the network. Remember, a
Hopfield network can only store approximately 0.15N patterns with the "one-shot" learning
described in Lecture 13.
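As a starting point, the sketch below shows the "one-shot" Hebbian outer-product rule with
synchronous recall; the function names and the tiny example patterns are illustrative only, and
binarising the MNIST images to ±1 and measuring accuracy versus the number of stored images
are left to you. (With N = 784 pixels, the 0.15N estimate suggests roughly 118 storable patterns.)

import numpy as np

def hopfield_train(patterns):
    """patterns: (P, N) array of +/-1 values; returns the (N, N) weight matrix."""
    P, N = patterns.shape
    W = patterns.T @ patterns / N       # Hebbian outer-product ("one-shot") rule
    np.fill_diagonal(W, 0.0)            # no self-connections
    return W

def hopfield_recall(W, state, steps=20):
    """Synchronous updates until a fixed point (or the step limit) is reached."""
    s = state.copy()
    for _ in range(steps):
        new_s = np.sign(W @ s)
        new_s[new_s == 0] = 1           # break ties consistently
        if np.array_equal(new_s, s):
            break
        s = new_s
    return s

# Tiny hand-made example (stand-ins for binarised MNIST images).
train_patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                           [1, 1, 1, 1, -1, -1, -1, -1]])
W = hopfield_train(train_patterns)
noisy = train_patterns[0].copy()
noisy[0] *= -1                          # corrupt one "pixel"
print(hopfield_recall(W, noisy))        # recovers the first stored pattern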
Question 2
[30 marks]
Develop a feed-forward RBF neural network in Python that classifies the complete set of images
found in the MNIST dataset. You are to train your neural network using backpropagation. You
should use Gaussian functions as your radial basis functions. You must show that you have:
1. Used K-means to design the hidden layer in your network (see the sketch after this list).
You may use any existing code for running K-means (you do not need to code your own), but
you must cite your sources in the report.
2. Performed K-fold cross-validation.
3. Investigated the performance of your neural network for different sizes of hidden layer.
4. Investigated the performance of your neural network when using dropout in the hidden
layer. A paper on dropout is here.
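Combining items 1 and 4, the sketch below (assuming TensorFlow 2.x with Keras) uses K-means
centres to define a Gaussian RBF hidden layer, applies dropout to the hidden activations, and
trains the softmax output layer by backpropagation. The width heuristic, n_centres, and all
names are illustrative assumptions to be tuned and documented, not requirements.

import numpy as np
import tensorflow as tf
from sklearn.cluster import KMeans

def build_rbf_model(X_train, n_centres=100, n_classes=10, dropout_rate=0.0):
    # Item 1: K-means designs the hidden layer; cluster centres become RBF centres.
    km = KMeans(n_clusters=n_centres, n_init=10).fit(X_train)
    centres = tf.constant(km.cluster_centers_, dtype=tf.float32)

    # Width heuristic (an assumption): sigma = mean pairwise distance between centres.
    pairwise = np.linalg.norm(km.cluster_centers_[:, None, :] -
                              km.cluster_centers_[None, :, :], axis=-1)
    sigma = float(pairwise.mean())

    class RBFLayer(tf.keras.layers.Layer):
        def call(self, x):
            # Squared Euclidean distance from each input to each centre,
            # passed through a Gaussian radial basis function.
            sq = tf.reduce_sum((x[:, None, :] - centres[None, :, :]) ** 2, axis=-1)
            return tf.exp(-sq / (2.0 * sigma ** 2))

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(X_train.shape[1],)),
        RBFLayer(),
        tf.keras.layers.Dropout(dropout_rate),    # item 4: dropout in the hidden layer
        tf.keras.layers.Dense(n_classes, activation='softmax'),
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    return model

# Example usage (labels from fetch_openml are strings, hence the cast to int):
# model = build_rbf_model(X, n_centres=100)
# model.fit(X, y.astype(int), epochs=5, batch_size=128, validation_split=0.1)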
Question 3
[30 marks]
We can use self-organizing maps as a substitute for K-means.
In Question 2, K-means was used to compute the number of hidden layer neurons to be used in
an RBF network. Using a 2D self-organizing map, compare the clusters it produces with those
produced by K-means for the MNIST data. Sample the data to include only images of '1' and '5'.
Use the scikit-learn utilities to load the data. You are expected to (a) document the dimensions
of the SOM computed and the learning parameters used to generate it, and (b) provide 2D plots
of the regions for '1' and '5' for both the SOM and K-means solutions. You may project your
K-means data to 2 dimensions using SVD for display purposes.
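As one possibility, a small NumPy implementation of a 2D SOM is sketched below; the grid size,
learning-rate schedule, and neighbourhood schedule are illustrative assumptions that you should
tune and report, and an existing library such as MiniSom is equally acceptable if cited. The
bmu_of helper maps each image to its winning node, from which the 2D region plots for '1' and
'5' can be built.

import numpy as np

def train_som(X, grid=(10, 10), n_iter=5000, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 2D SOM on the rows of X; returns an (h, w, D) array of node weights."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, X.shape[1]))
    # (h, w, 2) array of grid coordinates, used by the neighbourhood function.
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing='ij'), axis=-1)
    for t in range(n_iter):
        x = X[rng.integers(len(X))]                       # random training sample
        dists = np.linalg.norm(weights - x, axis=-1)      # distance to every node
        bmu = np.unravel_index(np.argmin(dists), (h, w))  # best-matching unit
        # Exponentially decaying learning rate and neighbourhood radius.
        lr = lr0 * np.exp(-t / n_iter)
        sigma = sigma0 * np.exp(-t / n_iter)
        grid_d2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
        neigh = np.exp(-grid_d2 / (2.0 * sigma ** 2))[..., None]
        weights += lr * neigh * (x - weights)             # pull nodes towards x
    return weights

def bmu_of(weights, x):
    """Return the (row, col) of the best-matching unit for a single sample x."""
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(d), weights.shape[:2])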