
Optimization in Machine Learning (2020 Winter)

Assignment 2

Instructions: For the code parts, submit the completed files on Canvas (both .py file

and .ipynb file are accepted). For the free response parts, type your solution into a separate

electronic document (.pdf file). Physical submissions will NOT be accepted.

To submit, compress all your files into a single compressed file (.zip file) and e-mail a soft copy of your code and answers to rkwon@mie.utoronto.ca.

If you have any questions about this assignment, please e-mail yhe@mie.utoronto.ca.

1. Linear Support Vector Machine

Download the dataset ‘prob1data.csv’. The first two columns of the dataset are the two features, and the third column is the class label (0 and 1 denote the two classes). A plot of the dataset is given below.

In this problem, you will solve both primal and dual optimization problem of the linear

soft-margin support vector machine using CVXPY and analyze the results.

(1a) (Code + Free Response) Complete the function ‘LinearSVM_Primal’ that solves the primal optimization problem of the linear SVM. Using the entire dataset and C = 1, solve the optimization problem and report:

(1) The optimal decision boundary.

(2) The optimal support vectors.

(3) The solution time.


(1b) (Code + Free Response) Complete the function ‘LinearSVM_Dual’ that solves the dual optimization problem of the linear SVM. Using the entire dataset and C = 1, solve the optimization problem and report:

(1) The optimal dual solution.

(2) The optimal decision boundary.

(3) The optimal support vectors.

(4) The solution time.

(1c) (Free Response) Discuss whether the decision boundary of the linear SVM changes as the value of C is increased or decreased. If the decision boundary changes, briefly discuss how it changes with C and why. If it does not change with C, discuss the reason.

(1d) (Code + Free Response) Complete the function ‘Linearly_separable’ that outputs 1 if the dataset is linearly separable and 0 otherwise. Determine whether the given dataset is linearly separable. For any given dataset with multiple features, how can one conclude whether the dataset is linearly separable based on the optimal solution (the optimal decision boundary) and the optimal objective function value? (Hint: consider varying C values.)

In the following problems, we will consider an alternative soft-margin method, known as the l2-norm soft-margin SVM. This new algorithm is given by the following primal optimization problem (notice that the slack penalties are now squared; n is the total number of data points):
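The optimization problem itself seems to have been lost when this page was extracted. Based on the surrounding description (squared slack penalties, n data points), the standard l2-norm soft-margin primal reads as follows; this is a reconstruction, and the original handout may scale the penalty term differently (e.g. C rather than C/n):

```latex
\min_{w,\,b,\,\xi}\quad \frac{1}{2}\lVert w \rVert_2^2 \;+\; \frac{C}{n}\sum_{i=1}^{n}\xi_i^2
\qquad \text{s.t.}\quad y_i\,(w^\top x_i + b) \;\ge\; 1 - \xi_i,\quad i = 1,\dots,n.
```

Note that with squared penalties the nonnegativity constraints ξ_i ≥ 0 can be dropped: any feasible point with some ξ_i < 0 is dominated by setting that ξ_i to 0, which loosens nothing and lowers the objective.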

(2a) (Code) Complete the function ‘gaussian_kernel_sigma’ that returns a function ‘gaussian_kernel’ with the specified σ value.
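A sketch of (2a), using the (X, Y) → Gram-matrix signature that sklearn's SVC expects from a callable kernel; the identity ‖x − y‖² = ‖x‖² + ‖y‖² − 2 x·y vectorizes the pairwise distances:

```python
import numpy as np

def gaussian_kernel_sigma(sigma):
    """Return a Gaussian (RBF) kernel K(x, y) = exp(-||x-y||^2 / (2 sigma^2))
    as a callable usable directly as SVC(kernel=...)."""
    def gaussian_kernel(X, Y):
        # pairwise squared Euclidean distances between rows of X and Y
        sq = (np.sum(X**2, axis=1)[:, None]
              + np.sum(Y**2, axis=1)[None, :]
              - 2.0 * X @ Y.T)
        return np.exp(-sq / (2.0 * sigma**2))
    return gaussian_kernel
```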

(2b) (Code + Free Response) Use ‘SVC’ from ‘sklearn.svm’ and the ‘gaussian_kernel_sigma’ coded in (2a) to build a kernel SVM that classifies the training data X_train. Use C = 1 and σ = 0.1. Report:

(1) Number of support vectors.

(2) Prediction error (ratio) in the test set X_test.

(3) An approximate plot of the decision boundary.
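Putting (2a) and (2b) together might look like the following sketch; fit_and_report and its return values are illustrative, and the kernel helper from (2a) is repeated so the snippet is self-contained:

```python
import numpy as np
from sklearn.svm import SVC

def gaussian_kernel_sigma(sigma):
    # same closure as in (2a): (X, Y) -> Gram matrix for SVC
    def gaussian_kernel(X, Y):
        sq = (np.sum(X**2, axis=1)[:, None]
              + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T)
        return np.exp(-sq / (2.0 * sigma**2))
    return gaussian_kernel

def fit_and_report(X_train, y_train, X_test, y_test, C=1.0, sigma=0.1):
    """Fit the Gaussian-kernel SVM; return (n_support_vectors, test_error)."""
    clf = SVC(C=C, kernel=gaussian_kernel_sigma(sigma))
    clf.fit(X_train, y_train)
    n_sv = int(clf.support_.size)                  # support-vector count
    test_err = float(np.mean(clf.predict(X_test) != y_test))
    return n_sv, test_err
```

For the plot in (3), evaluating clf.predict on a dense grid and contouring the result is one common way to draw the boundary approximately.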

Download the file ‘votes.csv’. This dataset consists of over 3000 counties in the United States along with their socioeconomic and demographic information and voting records in the 2016 US election; each row corresponds to a single county.

(2c) (Code) The response variable will be prefer_trump, which is 0 or 1, indicating whether the percentage of people in that county who voted for Trump is greater than the percentage who voted for Clinton. Compute the response variable y.
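Assuming votes.csv exposes the two vote shares as columns, (2c) reduces to a single comparison. The column names "trump" and "clinton" below are assumptions, not confirmed by the handout; substitute whatever the file actually calls them:

```python
import pandas as pd

def compute_response(votes: pd.DataFrame) -> pd.Series:
    """prefer_trump: 1 where Trump's county-level vote share exceeds
    Clinton's, 0 otherwise. Column names are assumed, not confirmed."""
    return (votes["trump"] > votes["clinton"]).astype(int)
```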

(2d) (Code + Free Response) Use ‘SVC’ from ‘sklearn.svm’ to implement a polynomial-kernel SVM with C = 10.0 and max_iter=1e6. Implement the SVM with the kernel degree set to 1, then 2, 3, 4, and 5. For each model, report:

(1) Number of support vectors.

(2) Prediction error (ratio) in the train set X_train.

(3) Prediction error (ratio) in the test set X_test.
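The five models in (2d) can be fit in one loop; sklearn's polynomial kernel is selected with kernel="poly" and degree=d. The reporting structure below is illustrative, not the required interface:

```python
import numpy as np
from sklearn.svm import SVC

def poly_svm_report(X_train, y_train, X_test, y_test,
                    degrees=(1, 2, 3, 4, 5), C=10.0):
    """Fit one polynomial-kernel SVM per degree and collect the three
    quantities requested in (2d)."""
    results = []
    for d in degrees:
        clf = SVC(C=C, kernel="poly", degree=d, max_iter=int(1e6))
        clf.fit(X_train, y_train)
        results.append({
            "degree": d,
            "n_support_vectors": int(clf.support_.size),
            "train_error": float(np.mean(clf.predict(X_train) != y_train)),
            "test_error": float(np.mean(clf.predict(X_test) != y_test)),
        })
    return results
```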

(2e) (Free Response) Based on the 5 models trained in (2d), how does the prediction error change with the degree of the polynomial kernel? Explain why. How does the number of support vectors change with the degree of the polynomial kernel? Explain why.
