
CS 486/686 Assignment 3
Winter 2022
(135 marks)
Blake VanBerlo
Due Date: 11:59 PM ET on Wednesday, March 23, 2022
Changes
v1.1: Small changes to hyperparameters in Q2.2a and Q2.2b. Fixed some typos.
v1.2: Fixed typos in function names
v1.3: Added instructions to calculate average cross-entropy loss
CS 486/686 Winter 2022 Assignment 3
Academic Integrity Statement
If your written submission on Learn does not include this academic integrity statement with
your signature (typed name), we will deduct 5 marks from your final assignment mark.
I declare the following statements to be true:
The work I submit here is entirely my own.
I have not shared and will not share any of my code with anyone at any point.
I have not posted and will not post my code on any public or private forum or website.
I have not discussed and will not discuss the contents of this assessment with anyone
at any point.
I have not posted and will not post the contents of this assessment and its solutions
on any public or private forum or website.
I will not search for assessment solutions online.
I am aware that misconduct related to assessments can result in significant penalties,
possibly including failure in the course and suspension. This is covered in Policy 71:
https://uwaterloo.ca/secretariat/policies-procedures-guidelines/policy-71.
Failure to accept the integrity policy will result in your assignment not being graded.
By typing or writing my full legal name below, I confirm that I have read and understood
the academic integrity statement above.
© Blake VanBerlo 2022 v1.3 Page 2 of 12
Instructions
Submit any written solutions in a file named writeup.pdf to the A3 Dropbox on Learn.
If your written submission on Learn does not contain one file named writeup.pdf, we
will deduct 5 marks from your final assignment mark.
Submit any code to Marmoset at https://marmoset.student.cs.uwaterloo.ca/.
Grades for the programming component are determined by the unit tests in the
“Assignment 3 (Final)” project on Marmoset. The “Assignment 3 (Week 1)” and
“Assignment 3 (Week 2)” projects contain ungraded public test suites meant to help
you debug, and they are only temporarily available.
No late assignment will be accepted. This assignment is to be done individually.
I strongly encourage you to complete your write-up in LaTeX, using this source file.
If you do, in your submission, please replace the author with your name and student
number. Please also remove the due date, the Instructions section, and the Learning
goals section. Thanks!
Lead TAs:
– Question 1: Xuejun Du (xuejun.du@uwaterloo.ca)
– Question 2: Sharhad Bashar (sharhad.bashar@uwaterloo.ca)
The TAs’ office hours will be scheduled on MS Teams.
Learning goals
Decision Trees
Compute the entropy of a probability distribution.
Trace the execution of the algorithm for learning a decision tree.
Determine valid splits for real-valued features.
Apply overfitting prevention strategies for decision trees.
Neural networks
Implement a multilayer perceptron.
Implement the backpropagation algorithm, including the forward and backward passes.
Understand and interpret performance metrics in supervised learning.
1 Decision Trees (35 marks)
1. Recall that the entropy of a probability distribution over k outcomes c1, c2, . . . , ck is:
I(P(c1), P(c2), . . . , P(ck)) = −∑_{i=1}^{k} P(ci) log2(P(ci))
In Lecture 13, we learned that the maximum entropy probability distribution over 2
outcomes is the uniform distribution ⟨1/2, 1/2⟩. Interestingly, the maximum entropy
probability distribution over n outcomes is the discrete uniform distribution. Prove
that ⟨1/3, 1/3, 1/3⟩ is the maximum entropy probability distribution over 3 outcomes.
Hint: Define a general distribution over 3 outcomes as ⟨p, q, 1−p−q⟩, where 0 ≤ p, q ≤ 1.
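Concretely, the hint amounts to maximizing the following function of two free variables (a sketch of the setup only, not a full proof):

```latex
H(p, q) = -p \log_2 p \;-\; q \log_2 q \;-\; (1 - p - q)\,\log_2(1 - p - q)
```

Stationary points can then be located by setting the partial derivatives of H with respect to p and q to zero.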
(8 marks) Proof is correct.
(2 marks) Proof is clear and easy to understand.
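As a quick numerical check of the entropy formula above (an illustrative sketch, not part of any required code):

```python
import math

def entropy(probs):
    """Entropy (in bits) of a discrete distribution given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))           # -> 1.0 (uniform over 2 outcomes)
print(round(entropy([1/3] * 3), 4))  # -> 1.585 (uniform over 3 outcomes)
```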
2. Suppose that the computer science department would like to predict whether a student
will pass CS486. Professor X thinks that the following features may be sufficient to
predict success in CS486:
Calc: Whether or not the student took an introductory calculus course in the
past. The domain is {true, false}.
Algo: Whether or not the student took an algorithms and data structures course.
The domain is {true, false}.
Python: Whether or not the student has experience programming in Python. The
domain is {true, false}.
Avg : The student’s grade average in the semester before taking CS486 (expressed
as a percentage).
The target variable is whether or not the student passed CS486 (i.e., “Passed”) and its
domain is {true, false}.
Professor X would like to fit a decision tree to predict whether a student will pass
CS486. They extract information from 20 students who took CS486 in the past,
comprising the training set1, given below.
1This dataset is fictional.
Student Calc Algo Python Avg Passed
1 false false false 54.8 false
2 true true false 63.4 false
3 true false false 68.7 false
4 true true false 71.3 true
5 false true false 73.2 true
6 true false false 73.6 false
7 true false true 75.4 false
8 false true true 80.0 true
9 false true true 80.5 true
10 true true false 84.5 true
11 true true true 85.6 true
12 false true true 86.0 true
13 true true true 88.4 true
14 false true true 89.0 true
15 false true true 91.0 true
16 true true false 92.2 true
17 true true false 93.0 true
18 true true true 93.2 true
19 true true true 95.3 true
20 true true true 99.0 true
Training Set
Using the decision tree learner algorithm presented in class, fit a decision tree for this
training set. Select the next feature to test based on maximum information gain of the
remaining features.
Remember that discrete features can only appear once on any path in the tree. For
real-valued features, consider each possible split as a binary feature. For example, if
you have already tested Calc and Python on the current branch, then find the split
points for Avg for the examples in the branch. If the candidate split points for Avg
are x and y, then you must test the following 3 features: Algo, Avg ≥ x, and Avg ≥ y.
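As a concrete illustration of the split-point idea, the candidate splits for Avg over the full training set can be enumerated mechanically. The sketch below assumes the common convention that candidate split points are midpoints between adjacent sorted values whose labels differ; the helper name is ours, not part of any starter code:

```python
def candidate_splits(values, labels):
    """Midpoints between adjacent sorted values whose class labels differ."""
    pairs = sorted(zip(values, labels))
    splits = []
    for (v1, y1), (v2, y2) in zip(pairs, pairs[1:]):
        if y1 != y2 and v1 != v2:
            splits.append((v1 + v2) / 2)
    return splits

# Avg and Passed columns for students 1..20, read off the table above.
avg = [54.8, 63.4, 68.7, 71.3, 73.2, 73.6, 75.4, 80.0, 80.5, 84.5,
       85.6, 86.0, 88.4, 89.0, 91.0, 92.2, 93.0, 93.2, 95.3, 99.0]
passed = [False, False, False, True, True, False, False] + [True] * 13

print([round(s, 2) for s in candidate_splits(avg, passed)])  # -> [70.0, 73.4, 77.7]
```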
Show all of your steps and calculations. You may round all of your calculations to 4
decimal places. You may also abbreviate entropy calculations with I(p0, p1), where p0
and p1 are probabilities. Be sure to clearly show your final decision tree.
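Each information-gain calculation can also be sanity-checked in a few lines. The sketch below (helper names are ours) computes the gain from splitting the full training set on the boolean feature Algo; the same helpers apply to the derived binary features Avg ≥ x once the split points are fixed:

```python
import math

def entropy(probs):
    """Entropy (in bits) of a discrete distribution given as probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def entropy_of_labels(labels):
    """Entropy of the empirical class distribution of a list of labels."""
    n = len(labels)
    return entropy([labels.count(v) / n for v in set(labels)])

def info_gain(feature, labels):
    """Information gain from splitting `labels` on a boolean `feature` column."""
    n = len(labels)
    remainder = 0.0
    for value in (True, False):
        subset = [y for x, y in zip(feature, labels) if x == value]
        if subset:
            remainder += len(subset) / n * entropy_of_labels(subset)
    return entropy_of_labels(labels) - remainder

# Algo and Passed columns for students 1..20, read off the table above.
algo = [False, True, False, True, True, False, False] + [True] * 13
passed = [False, False, False, True, True, False, False] + [True] * 13

print(round(info_gain(algo, passed), 4))  # -> 0.5414
```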
