### NPTEL INTRODUCTION TO MACHINE LEARNING ASSIGNMENT 5

**These are the solutions of NPTEL INTRODUCTION TO MACHINE LEARNING ASSIGNMENT 5 WEEK 5**

Course Name: INTRODUCTION TO MACHINE LEARNING

**Q1. If the step size in gradient descent is too large, what can happen?**

a. Overfitting

b. The model will not converge

c. We can reach maxima instead of minima

d. None of the above

**Answer: b. The model will not converge**
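As a quick illustration (a sketch, not course code), gradient descent on f(x) = x² shows both behaviours: with a small step size the iterates shrink toward the minimum at 0, while a step size above 1.0 makes the update x ← x − lr·2x overshoot and diverge.

```python
# Gradient descent on f(x) = x^2 (minimum at x = 0); illustrative sketch,
# not course code. The update is x <- x - lr * f'(x) with f'(x) = 2x.

def gradient_descent(lr, x0=1.0, steps=20):
    x = x0
    for _ in range(steps):
        x -= lr * 2 * x  # gradient step
    return x

small = gradient_descent(lr=0.1)  # |1 - 2*0.1| = 0.8 < 1: converges toward 0
large = gradient_descent(lr=1.1)  # |1 - 2*1.1| = 1.2 > 1: |x| grows every step
print(abs(small), abs(large))
```

Each step multiplies x by (1 − 2·lr), so the iteration converges exactly when |1 − 2·lr| < 1; a too-large step size flips and amplifies x instead of shrinking it.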

**Q2. Recall the XOR (tabulated below) example from class where we did a transformation of features to make it linearly separable. Which of the following transformations can also work?**

a. X′1 = X1², X′2 = X2²

b. X′1 = 1 + X1, X′2 = 1 − X2

c. X′1 = X1X2, X′2 = −X1X2

d. X′1 = (X1 − X2)², X′2 = (X1 + X2)²

**Answer: c. X′1 = X1X2, X′2 = −X1X2**
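A small check of transformation (c), assuming the ±1 input encoding used in the lecture (the table itself is not reproduced here): the product X1·X2 is +1 when the inputs agree (XOR = 0) and −1 when they differ (XOR = 1), so a single threshold on X′1 separates the classes.

```python
# XOR with the +/-1 encoding (an assumption; the lecture's table is not shown
# here): label is 1 exactly when the two inputs differ.
points = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
labels = [0, 1, 1, 0]

# Transformation (c): X'1 = X1*X2, X'2 = -X1*X2.
transformed = [(x1 * x2, -x1 * x2) for x1, x2 in points]

# In the new space X'1 = -1 for every positive example and +1 for every
# negative one, so the line X'1 = 0 linearly separates the classes.
for (xp1, _), y in zip(transformed, labels):
    assert (xp1 < 0) == (y == 1)
```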

**Q3. What is the effect of using activation function f(x) = x for hidden layers in an ANN?**

a. No effect. It’s as good as any other activation function (sigmoid, tanh etc).

b. The ANN is equivalent to doing multi-output linear regression.

c. Backpropagation will not work.

d. We can model highly complex non-linear functions.

**Answer: b. The ANN is equivalent to doing multi-output linear regression.**
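A small NumPy sketch (illustrative, not course code) confirms this: with the identity activation, a composition of affine layers collapses algebraically into one affine map, so the network can only represent what multi-output linear regression already represents.

```python
import numpy as np

rng = np.random.default_rng(0)

# A 3 -> 4 -> 2 network with identity "activation" f(x) = x.
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

x = rng.normal(size=3)

# Forward pass with the identity activation on the hidden layer:
h = W1 @ x + b1
y = W2 @ h + b2

# The same map written as a single affine (linear-regression) layer:
W = W2 @ W1
b = W2 @ b1 + b2
y_linear = W @ x + b

assert np.allclose(y, y_linear)  # the two-layer net is just one linear map
```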

**Q4. Which of the following functions can be used on the last layer of an ANN for classification?**

a. Softmax

b. Sigmoid

c. Tanh

d. Linear

**Answer: b, c**
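For reference, minimal NumPy versions of the usual output-layer functions (a sketch, not course code). Softmax produces a probability distribution over classes and sigmoid squashes a single logit into (0, 1); whether tanh or a linear output is acceptable for classification is exactly what the question probes.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())  # subtract the max for numerical stability
    return e / e.sum()

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

logits = np.array([2.0, 1.0, 0.1])
p = softmax(logits)               # valid probability distribution over classes
assert abs(p.sum() - 1.0) < 1e-12
assert sigmoid(0.0) == 0.5        # single-logit output in (0, 1)
```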

**Q5. Statement: The threshold function cannot be used as an activation function for hidden layers. Reason: Threshold functions do not introduce non-linearity.**

a. Statement is true and reason is false.

b. Statement is false and reason is true.

c. Both are true and the reason explains the statement.

d. Both are true and the reason does not explain the statement.

**Answer: a. Statement is true and reason is false.**
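The real obstacle, which the stated "reason" misses, is differentiability: the step function's derivative is 0 everywhere it is defined, so backpropagated gradients vanish even though the function is clearly non-linear. A tiny check (illustrative, not course code):

```python
# Why a threshold (step) activation breaks gradient-based training: its
# derivative is 0 everywhere it is defined, so backpropagation gets no signal.
# The step IS non-linear, which is why the "reason" in Q5 is false.

def step(x):
    return 1.0 if x >= 0 else 0.0

def numerical_derivative(f, x, eps=1e-6):
    return (f(x + eps) - f(x - eps)) / (2 * eps)

# Away from the discontinuity at 0 the derivative is exactly 0:
assert numerical_derivative(step, 1.5) == 0.0
assert numerical_derivative(step, -2.0) == 0.0

# Non-linearity check: step(a + b) != step(a) + step(b) in general.
assert step(1.0 + 1.0) != step(1.0) + step(1.0)
```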

**Q6. We use several techniques to ensure the weights of the neural network are small (such as random initialization around 0 or regularisation). What conclusions can we draw if the weights of our ANN are large?**

a. Model has overfitted.

b. It was initialized incorrectly.

c. At least one of (a) or (b).

d. None of the above.

**Answer: c. At least one of (a) or (b).**

**Q7. On different initializations of your neural network, you get significantly different values of loss. What could be the reason for this?**

a. Overfitting

b. Some problem in the architecture

c. Incorrect activation function

d. Multiple local minima

**Answer: d. Multiple local minima**

**Q8. The likelihood L(θ|X) is given by:**

a. P(θ|X)

b. P(X|θ)

c. P(X)·P(θ)

d. P(θ)/P(X)

**Answer: b. P(X|θ)**

**Q9. You are trying to estimate the probability of it raining today using maximum likelihood estimation. Given that in n days, it rained n_r times, what is the probability of it raining today?**

a. n_r / n

b. n_r / (n_r + n)

c. n / (n_r + n)

d. None of the above.

**Answer: a. n_r / n**
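This can be verified numerically (a sketch with made-up counts, not course data): under a Bernoulli model, the log-likelihood of n_r rainy days out of n is n_r·log p + (n − n_r)·log(1 − p), and it is maximised at p = n_r / n.

```python
import math

def log_likelihood(p, n, n_r):
    # log P(data | p) for n_r rainy days out of n under a Bernoulli model
    return n_r * math.log(p) + (n - n_r) * math.log(1 - p)

n, n_r = 10, 3          # hypothetical counts
p_mle = n_r / n         # the MLE, p = n_r / n = 0.3

# The MLE beats every other candidate probability:
candidates = [0.1, 0.2, 0.4, 0.5, 0.9]
assert all(log_likelihood(p_mle, n, n_r) >= log_likelihood(p, n, n_r)
           for p in candidates)
```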

**Q10. Choose the correct statement (multiple may be correct):**

a. MLE is a special case of MAP when the prior is a uniform distribution.

b. MLE acts as regularisation for MAP.

c. MLE is a special case of MAP when the prior is a beta distribution.

d. MAP acts as regularisation for MLE.

**Answer: a, d**
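Both correct options can be checked numerically. A minimal grid-search sketch (the Bernoulli model and the Beta(2,2) prior are my choices, not from the assignment): with a uniform prior the MAP estimate coincides with the MLE (option a), while an informative prior pulls the estimate toward 0.5, acting like a regulariser (option d).

```python
import math

def log_lik(p, n, n_r):
    # Bernoulli log-likelihood for n_r successes out of n
    return n_r * math.log(p) + (n - n_r) * math.log(1 - p)

def log_beta_prior(p, a=2, b=2):
    # log of a Beta(2,2) prior density (up to a constant)
    return (a - 1) * math.log(p) + (b - 1) * math.log(1 - p)

grid = [i / 1000 for i in range(1, 1000)]
n, n_r = 10, 9  # hypothetical counts: 9 successes in 10 trials

mle = max(grid, key=lambda p: log_lik(p, n, n_r))
# Uniform prior: the log-prior is constant, so MAP == MLE (option a).
map_uniform = max(grid, key=lambda p: log_lik(p, n, n_r) + 0.0)
# Beta(2,2) prior: the estimate is pulled toward 0.5 (option d).
map_beta = max(grid, key=lambda p: log_lik(p, n, n_r) + log_beta_prior(p))

assert map_uniform == mle
assert map_beta < mle  # the prior shrinks the extreme estimate of 0.9
```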

More weeks solution of this course: https://progies.in/answers/nptel/introduction-to-machine-learning

More NPTEL Solutions: https://progies.in/answers/nptel

* The material and content uploaded on this website are for general information and reference purposes only. Please attempt the assignments on your own first.