MATH7502 Final Project
Your final project is to be submitted individually. However, each project may have up to two
named advisors (these are students in the course). A minimum of one advisor per project is
recommended. These advisors are to give you tips and feedback on your project dealing with
the mathematical content, the software, the presentation, and the use of the English language.
It is recommended that in addition to working on your project, you serve as an advisor to one
or two other projects (you may advise a maximum of three projects). If you are strong in A and
need help with B, seek advisors for B and offer your advice on A. You may find advisors using
the Blackboard discussion board.
Plagiarism will not be tolerated. There is a clear difference between receiving advice and copying
the work of others. Advisors can be great at catching mistakes in mathematics, coding style, or
English.
Your project submission should have demonstrations for 3 out of these 6 topics appearing in
[DSUC] (see the pdf file):
• Topic 2 (perceptron).
• Topic 4 (classification).
• Topic 5 (multi-objective and regularization).
• Topic 9 (Gaussians, weighted least squares).
• Topic 11 (gradient descent).
• Topic 12 (PCA).
Choose any three topics of your liking. For each topic, create illustrative demonstrations of
the topic, making references to concepts from linear algebra as they arise. Below is a list of
“questions” that need to be answered by your demonstrations.
For each topic you should carry out the following:
1. Create a formatted one-page summary of the content, highlighting the main methods,
tools, results, and applications. This needs to be a brief and sharp write-up accessible to
other students in the course who haven’t studied the specific topic. A good summary will
possess qualities similar to (part of) a good Wiki summary. It is to be handed in as a
single A4 PDF page. Including formulas and images is encouraged.
2. Create a PDF with Matlab code containing one to three code demonstrations of the results
and methods of the topic. The PDF should also include some equations where appropriate.
This file should be a demonstration of the concepts learned.
Combine all of your files into a single pdf file ready for upload.
In your hand-in, make sure that the document you created has your name on it. Your hand-in
should be a single upload to the course Blackboard site, and it should specify which topics you
chose to work on. The file uploaded should contain:
• The three summaries for (1) above.
• The three code files for (2) above.
Projects will be marked based on the following criteria:
1. 5% - for following instructions. You either get this mark, or lose 5% if you deviate from
the hand-in instructions.
2. 20% - visual presentation. You get 20% if the PDF file appears clean and visually pleasant:
graphs are labelled, images are not pixelated, and text and Matlab code are properly
formatted and appear clean.
3. For each of the three topics, 15% for the PDF summary (full marks for a crisp and precise
summary that answers the “questions” without flaws), and 10% for the Matlab PDF
(full marks if the choice of content is sensible and the code examples work, make sense,
and answer the “questions” sensibly).
So what are the “questions”? Most of the “questions” are very straightforward and only deal
with demonstrating the topics at hand:
• Topic 2 (perceptron):
Q: How does the perceptron work?
Q: What is the convergence theorem for the perceptron, and how is it proved?
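To give a flavour of the kind of demonstration intended, here is a minimal sketch of the perceptron update rule on made-up separable data. (Python is used here only for brevity; your submission should use Matlab, and all data and parameters below are invented for illustration.)

```python
import numpy as np

# Hypothetical linearly separable data: two clusters in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2, 0.5, (20, 2)), rng.normal(-2, 0.5, (20, 2))])
y = np.hstack([np.ones(20), -np.ones(20)])

# Perceptron rule: whenever point i is misclassified, add y_i * x_i to w.
w = np.zeros(2)
b = 0.0
for _ in range(100):                  # at most 100 passes over the data
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:    # misclassified (or on the boundary)
            w += yi * xi
            b += yi
            mistakes += 1
    if mistakes == 0:                 # converged: every point is correct
        break

print(np.all(y * (X @ w + b) > 0))   # all points correctly classified?
```

The convergence theorem guarantees that for separable data this loop terminates after a finite number of mistakes, which is what a demonstration like this illustrates empirically.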
• Topic 4 (classification):
Q: How does least-squares classification work?
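A minimal sketch of least-squares classification on invented two-cluster data (Python for brevity; your own demonstration should be in Matlab): regress the ±1 labels on the features and classify by the sign of the fit.

```python
import numpy as np

# Hypothetical data: two well-separated clusters with +/-1 labels.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(1.5, 0.6, (30, 2)), rng.normal(-1.5, 0.6, (30, 2))])
y = np.hstack([np.ones(30), -np.ones(30)])

# Least-squares classifier: solve min ||A theta - y||^2 with an intercept
# column, then classify each point by the sign of the fitted value.
A = np.hstack([X, np.ones((60, 1))])
theta, *_ = np.linalg.lstsq(A, y, rcond=None)
yhat = np.sign(A @ theta)
print(np.mean(yhat == y))   # training accuracy
```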
• Topic 5 (multi-objective and regularization):
Q: What is the general problem of multi-objective regularization?
Q: What is ridge-regression and how is it a special-case?
Q: What are some ways to choose the regularization parameter for ridge regression?
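As an illustration of the ridge-regression question, here is a sketch (Python for brevity, synthetic data invented for the example) of the regularised normal equations, showing how increasing the regularization parameter shrinks the solution.

```python
import numpy as np

# Synthetic regression problem with a known coefficient vector.
rng = np.random.default_rng(2)
A = rng.normal(size=(50, 5))
x_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
b = A @ x_true + 0.1 * rng.normal(size=50)

# Ridge regression minimises ||Ax - b||^2 + lam * ||x||^2; its solution
# solves the regularised normal equations (A'A + lam I) x = A'b.
def ridge(A, b, lam):
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

for lam in [0.0, 1.0, 100.0]:
    x = ridge(A, b, lam)
    print(lam, np.linalg.norm(x))   # larger lam shrinks ||x|| toward 0
```

Setting lam = 0 recovers ordinary least squares, which is the sense in which ridge regression generalises it.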
• Topic 9 (Gaussians):
Q: What is the multi-variate Gaussian distribution?
Q: How is the bivariate distribution a special case?
Q: How do you carry out computations with Gaussian distributions?
Q: How do you generate random variables from Gaussian distributions using the Cholesky factorization?
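A sketch of the Cholesky sampling question (Python for brevity; the mean and covariance below are invented for illustration): if z ~ N(0, I) and Sigma = LLᵀ is the Cholesky factorization, then mu + Lz ~ N(mu, Sigma).

```python
import numpy as np

# Target bivariate Gaussian: assumed mean mu and covariance Sigma.
mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

# Cholesky factor L satisfies Sigma = L @ L.T; transforming standard
# normal draws z via mu + L z yields draws from N(mu, Sigma).
L = np.linalg.cholesky(Sigma)
rng = np.random.default_rng(3)
Z = rng.standard_normal((100000, 2))
samples = mu + Z @ L.T

print(samples.mean(axis=0))   # approximately mu
print(np.cov(samples.T))      # approximately Sigma
```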
• Topic 11 (gradient descent):
Q: How does the basic gradient descent algorithm work?
Q: What is a demonstration of simple examples where the algorithm performs badly?
Q: What modifications exist and how do they work?
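A sketch of basic gradient descent on a quadratic (Python for brevity; the matrix and step size are invented for the example). A badly conditioned quadratic like this one is also a natural setting for the "performs badly" question: the step size is limited by the largest eigenvalue, so the well-conditioned directions converge slowly.

```python
import numpy as np

# Minimise f(x) = 0.5 x' P x with an ill-conditioned P = diag(1, 50);
# the unique minimiser is the origin.
P = np.diag([1.0, 50.0])
grad = lambda x: P @ x

x = np.array([1.0, 1.0])
alpha = 1.0 / 50.0            # step must stay below 2/lambda_max to converge
for _ in range(500):
    x = x - alpha * grad(x)   # basic gradient descent update

print(np.linalg.norm(x))      # close to the minimiser at the origin
```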
• Topic 12 (PCA):
Q: How does PCA work?
Q: What are some applications of PCA and how do they work?
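Finally, a sketch of PCA via the SVD of the centred data matrix (Python for brevity; the data below are invented, stretched along the direction (1, 1) so the first principal direction is known in advance).

```python
import numpy as np

# Hypothetical 2-D data with most variance along the direction (1, 1).
rng = np.random.default_rng(4)
X = rng.standard_normal((500, 2)) @ np.diag([3.0, 0.3])
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = X @ R.T

# PCA: centre the data, then take the SVD; the rows of Vt are the
# principal directions and s**2/(n-1) are the variances they explain.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
print(Vt[0])                  # first principal direction, approx +/-(1,1)/sqrt(2)
print(s**2 / (len(X) - 1))    # variance explained by each component
```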
The [DSUC] pdf contains links to material for each of the topics.