
Optimization in Machine Learning (2020 Winter)

Assignment 2

Instructions: For the code parts, submit the completed files on Canvas (both .py file

and .ipynb file are accepted). For the free response parts, type your solution into a separate

electronic document (.pdf file). Physical submissions will NOT be accepted.

To submit, compress all your files into a single compressed file (.zip file). E-mail a softcopy

of your code and answers to rkwon@mie.utoronto.ca.

If you have any questions about this assignment, please e-mail yhe@mie.utoronto.ca.

1. Linear Support Vector Machine

Download the dataset ‘prob1data.csv’. The dataset consists of two features as the first two

columns and the classification label as the third column (0 and 1 refer to two distinct types). A

plot of the dataset is given below:

In this problem, you will solve both primal and dual optimization problem of the linear

soft-margin support vector machine using CVXPY and analyze the results.

(1a) (Code + Free Response) Complete the function ‘LinearSVM_Primal’ that solves the

primal optimization problem of the linear SVM. Using the entire dataset and C = 1, solve the

optimization problem and report:

(1) The optimal decision boundary.

(2) The optimal support vectors.

(3) The solution time.


(1b) (Code + Free Response) Complete the function ‘LinearSVM_Dual’ that solves the dual

optimization problem of the linear SVM. Using the entire dataset and C = 1, solve the optimization problem and report:

(1) The optimal dual solution.

(2) The optimal decision boundary.

(3) The optimal support vectors.

(4) The solution time.

(1c) (Free Response) Discuss whether the decision boundary of the linear SVM will change with

increased or decreased C values. If the decision boundary changes, briefly discuss how it

changes with C and why. If the decision boundary does not change with C, discuss the reason.

(1d) (Code + Free Response) Complete the function ‘Linearly_separable’ that outputs 1 if

the dataset is linearly separable and 0 otherwise. Determine if the given dataset is linearly

separable. For any given dataset with multiple features, how can one conclude if the dataset

is linearly separable based on the optimal solution (optimal decision boundary) and optimal

objective function value solved? (Hint: consider varying C values.)

In the following problems, we will consider an alternative soft-margin method, known as the

l2-norm soft-margin SVM. This new algorithm is given by the following primal optimization

problem (notice that the slack penalties are now squared; n is the total number of data points):

min_{w,b,ξ} (1/2)‖w‖² + C Σ_{i=1}^{n} ξ_i²

s.t. y_i (wᵀx_i + b) ≥ 1 − ξ_i, i = 1, …, n

(2a) (Code) Complete the function ‘gaussian_kernel_sigma’ that returns a function ‘gaussian_kernel’

with the specified σ value.
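A possible shape for this closure is sketched below, assuming it must match the callable-kernel form scikit-learn's SVC expects, i.e. kernel(A, B) returning the (len(A), len(B)) Gram matrix:

```python
import numpy as np

def gaussian_kernel_sigma(sigma):
    """Return a Gram-matrix function for the Gaussian (RBF) kernel
    k(x, z) = exp(-||x - z||^2 / (2 * sigma^2))."""
    def gaussian_kernel(A, B):
        # squared pairwise distances via ||a||^2 + ||b||^2 - 2 a.b
        sq = (np.sum(A**2, axis=1)[:, None]
              + np.sum(B**2, axis=1)[None, :]
              - 2.0 * A @ B.T)
        # clip tiny negatives caused by floating-point cancellation
        return np.exp(-np.maximum(sq, 0.0) / (2.0 * sigma**2))
    return gaussian_kernel
```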

(2b) (Code + Free Response) Use ‘SVC’ from ‘sklearn.svm’ and the ‘gaussian_kernel_sigma’

coded in (2a) to build a kernel SVM to classify the training data X_train. Use C = 1 and

σ = 0.1. Report:

(1) Number of support vectors.

(2) Prediction error (ratio) on test set X_test.

(3) An approximate plot of the decision boundary.
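The fit-and-report step might look like the sketch below. The helper name `fit_gaussian_svm` is hypothetical, and a stand-in for the (2a) closure is inlined to keep the sketch self-contained; SVC's `support_` attribute gives the support-vector indices:

```python
import numpy as np
from sklearn.svm import SVC

# Stand-in for gaussian_kernel_sigma from (2a).
def gaussian_kernel_sigma(sigma):
    def gaussian_kernel(A, B):
        sq = (np.sum(A**2, axis=1)[:, None]
              + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T)
        return np.exp(-np.maximum(sq, 0.0) / (2.0 * sigma**2))
    return gaussian_kernel

def fit_gaussian_svm(X_train, y_train, X_test, y_test, C=1.0, sigma=0.1):
    """Fit an RBF-kernel SVC and return (#support vectors, test error ratio)."""
    clf = SVC(C=C, kernel=gaussian_kernel_sigma(sigma))
    clf.fit(X_train, y_train)
    n_sv = clf.support_.size                            # support-vector count
    test_err = float(np.mean(clf.predict(X_test) != y_test))  # error ratio
    return n_sv, test_err
```

For the approximate boundary plot, one common trick is to evaluate `clf.predict` on a dense 2-D grid and draw the 0/1 contour with `matplotlib.pyplot.contourf`.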

Download the file ‘votes.csv’. This dataset consists of over 3000 counties in the

United States along with their socioeconomic and demographic information and voting

records in the 2016 US election; each row corresponds to a single county.

(2c) (Code) The response variable will be prefer trump, which is 0 or 1 indicating whether

the percentage of people who voted for Trump in that county is greater than the percentage

who voted for Clinton. Compute the response variable y.
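The computation reduces to an element-wise comparison of two columns. In the sketch below, the column names `"trump"` and `"clinton"` are assumptions; substitute the actual headers from ‘votes.csv’:

```python
import pandas as pd

def prefer_trump(votes, trump_col="trump", clinton_col="clinton"):
    """0/1 response: 1 where the Trump vote share exceeds the Clinton share.

    `votes` is the DataFrame loaded from votes.csv; the default column
    names are placeholders for the real vote-share columns.
    """
    return (votes[trump_col] > votes[clinton_col]).astype(int)
```

Usage would be along the lines of `y = prefer_trump(pd.read_csv("votes.csv"))`.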

(2d) (Code + Free Response) Use ‘SVC’ from ‘sklearn.svm’ to implement polynomial kernel

SVM with C = 10.0, max_iter=1e6. Implement the SVM with kernel degree set to 1, then 2, 3,

4, and 5. For each model, report:

(1) Number of support vectors.

(2) Prediction error (ratio) on training set X_train.

(3) Prediction error (ratio) on test set X_test.
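The degree sweep can be organized as a single loop that collects the three requested statistics per model; the helper name and the returned dict layout below are illustrative choices, not a required interface:

```python
import numpy as np
from sklearn.svm import SVC

def poly_svm_report(X_train, y_train, X_test, y_test, degrees=(1, 2, 3, 4, 5)):
    """Fit one polynomial-kernel SVC per degree and collect the stats
    requested in (2d): support-vector count, train error, test error."""
    rows = []
    for d in degrees:
        clf = SVC(C=10.0, kernel="poly", degree=d, max_iter=int(1e6))
        clf.fit(X_train, y_train)
        rows.append({
            "degree": d,
            "n_support_vectors": clf.support_.size,
            "train_error": float(np.mean(clf.predict(X_train) != y_train)),
            "test_error": float(np.mean(clf.predict(X_test) != y_test)),
        })
    return rows
```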

(2e) (Free Response) Based on the 5 models trained in (2d), how does the predictive error

change with the degree of the polynomial kernel? Explain why. How does the number of

support vectors change with the degree of the polynomial kernel? Explain why.
