### NPTEL INTRODUCTION TO MACHINE LEARNING ASSIGNMENT 4

### These are the solutions of NPTEL INTRODUCTION TO MACHINE LEARNING ASSIGNMENT 4 WEEK 4

Course Name: INTRODUCTION TO MACHINE LEARNING

Q1. Consider the 1-dimensional dataset:
State true or false: The dataset becomes linearly separable after using basis expansion with the following basis function ϕ(x) = [1, x³].

a. True
b. False
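The idea behind Q1 is that a basis expansion can turn a dataset that is not linearly separable in its original 1-D form into one that is separable in the expanded feature space. The assignment's dataset figure is not reproduced here, so the sketch below uses a made-up 1-D dataset and the illustrative expansion ϕ(x) = [x, x²] (not the assignment's ϕ) to show the mechanism:

```python
import numpy as np

# Hypothetical 1-D dataset: class +1 on the outside, class -1 near 0.
# No single threshold on x separates the classes, so it is not
# linearly separable in 1-D.
x = np.array([-3.0, -2.0, -0.5, 0.0, 0.5, 2.0, 3.0])
y = np.array([1, 1, -1, -1, -1, 1, 1])

# Basis expansion phi(x) = [x, x^2] (illustrative choice, not the
# assignment's phi): the classes now separate on the x^2 coordinate.
phi = np.column_stack([x, x ** 2])

# A linear rule in the expanded space: predict +1 when x^2 - 2 > 0.
w, b = np.array([0.0, 1.0]), -2.0
pred = np.where(phi @ w + b > 0, 1, -1)
print((pred == y).all())  # → True: linearly separable after expansion
```

Whether the assignment's specific ϕ(x) = [1, x³] separates the assignment's dataset depends on that dataset; note that x³ is monotonic, so it preserves the ordering of points on the line.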

Q2. Consider a linear SVM trained with n labeled points in R² without slack penalties and resulting in k = 2 support vectors, where n > 100. By removing one labeled training point and retraining the SVM classifier, what is the maximum possible number of support vectors in the resulting solution?
a. 1
b. 2
c. 3
d. n − 1
e. n


Q3. Which of the following are valid kernel functions?
a. (1 + <x, x'>)^d
b. tanh(K1 <x, x'> + K2)
c. exp(−γ ||x − x'||²)

Answer: a, b, c
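A function is a valid kernel when every Gram matrix it produces is symmetric positive semi-definite (Mercer's condition). A quick numerical spot check of this property for the RBF kernel in option (c), on made-up data:

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * sq_dists)

X = np.random.RandomState(0).randn(20, 2)  # arbitrary sample points
K = rbf_kernel(X)

# All eigenvalues of a valid kernel's Gram matrix are >= 0
# (up to numerical round-off).
eigs = np.linalg.eigvalsh(K)
print(eigs.min() >= -1e-8)  # → True
```

This is only a spot check on one sample, not a proof; Mercer's condition must hold for every finite set of points.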

Q4. Consider the following dataset:

Which of these is not a support vector when using a Support Vector Classifier with a polynomial kernel with degree = 3, C = 1, and gamma = 0.1?
a. 3
b. 1
c. 9
d. 10


Q5. Consider an SVM with a second-order polynomial kernel. Kernel 1 maps each input data point x to K1(x) = [x, x²]. Kernel 2 maps each input data point x to K2(x) = [3x, 3x²]. Assume the hyperparameters are fixed. Which of the following options is true?
a. The margin obtained using K2(x) will be larger than the margin obtained using K1(x).
b. The margin obtained using K2(x) will be smaller than the margin obtained using K1(x).
c. The margin obtained using K2(x) will be the same as the margin obtained using K1(x).

Answer: c. The margin obtained using K2(x) will be the same as the margin obtained using K1(x).
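The intuition is that uniformly scaling the feature map rescales the weight vector and bias but does not change which separating hyperplane is learned. A sketch checking that the two feature maps yield the same classifications, assuming scikit-learn and using made-up separable data (a large C approximates the hard-margin setting of the question):

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical 1-D data, labeled by the squared coordinate so it is
# linearly separable in the expanded feature space.
rng = np.random.RandomState(0)
x = rng.uniform(-2, 2, size=40)
y = np.where(x ** 2 > 1, 1, -1)

phi1 = np.column_stack([x, x ** 2])  # K1(x) = [x, x^2]
phi2 = 3 * phi1                      # K2(x) = [3x, 3x^2]

# Large C ~ hard margin, matching the question's no-slack setting.
svm1 = SVC(kernel="linear", C=1e6).fit(phi1, y)
svm2 = SVC(kernel="linear", C=1e6).fit(phi2, y)

# Same decision boundary, hence identical predictions.
print((svm1.predict(phi1) == svm2.predict(phi2)).all())
```

With finite C the uniform scaling folds into the effective regularization strength, so the equivalence is cleanest in the hard-margin limit.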

Q6. Train a Linear perceptron classifier on the modified iris dataset. Report the best classification accuracy for l1 and elasticnet penalty terms.
a. 0.82, 0.64
b. 0.90, 0.71
c. 0.84, 0.82
d. 0.78, 0.64

Answer: b. 0.90, 0.71
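A sketch of the Q6 experiment, assuming scikit-learn and the standard iris dataset (the assignment's "modified" iris dataset and its train/test split are not reproduced here, so the printed accuracies need not match the listed options):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Compare the two penalty terms asked about in the question.
for penalty in ("l1", "elasticnet"):
    clf = Perceptron(penalty=penalty, random_state=0).fit(X_tr, y_tr)
    print(penalty, clf.score(X_te, y_te))
```

For the assignment itself, the same loop would be run on the provided modified dataset, reporting the best accuracy for each penalty.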


Q7. Train an SVM classifier on the modified iris dataset. We encourage you to explore the impact of varying different hyperparameters of the model. Specifically, try different kernels and the associated hyperparameters. As part of the assignment, train models with the following set of hyperparameters:
poly kernel, gamma = 0.4, one-vs-rest classifier, no feature normalization.

a. 0.98
b. 0.96
c. 0.92
d. 0.94
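A sketch of the Q7 configuration, again assuming scikit-learn and substituting the standard iris dataset for the assignment's modified one (so the printed accuracy need not match any option):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# poly kernel, gamma = 0.4, one-vs-rest, no feature normalization
# (features are used raw, with no scaling step).
clf = SVC(kernel="poly", gamma=0.4, decision_function_shape="ovr")
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
```

Varying `kernel`, `gamma`, `degree`, and `C` as the question suggests is a matter of editing the `SVC(...)` arguments above.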


More weeks solution of this course: https://progies.in/answers/nptel/introduction-to-machine-learning

More NPTEL Solutions: https://progies.in/answers/nptel

* The material and content uploaded on this website are for general information and reference purposes only. Please attempt the assignments on your own first.
