NPTEL INTRODUCTION TO MACHINE LEARNING ASSIGNMENT 6

                     

These are the solutions of NPTEL INTRODUCTION TO MACHINE LEARNING ASSIGNMENT 6 WEEK 6

Course Name: INTRODUCTION TO MACHINE LEARNING



Q1. Which of the following properties are characteristic of decision trees?
a. Low bias
b. High variance
c. Lack of smoothness of prediction surfaces
d. Unbounded parameter set

Answer: b, c, d


Q2. Consider the following dataset:
What is the initial entropy of Malignant?

a. 0.543
b. 0.9798
c. 0.8732
d. 1

Answer: b. 0.9798
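
The dataset table itself is not reproduced above, so the check below is only a sketch under an assumption: a class split of 5 malignant vs. 7 non-malignant records (inferred from the answer, not stated in the question) gives exactly this entropy value.

import math

def entropy(counts):
    # Shannon entropy (base 2) of a class distribution given as counts
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# Assumed class counts: 5 malignant, 7 non-malignant (the table is not shown above)
print(round(entropy([5, 7]), 4))   # 0.9799, i.e. option (b) up to rounding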




Q3. For the same dataset, what is the info gain of Vaccination?
a. 0.4763
b. 0.2102
c. 0.1134
d. 0.9355

Answer: b. 0.2102
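
Since the table is not shown here, the following is only a generic sketch of how the info gain of an attribute such as Vaccination is computed: the parent entropy minus the weighted average entropy of the subsets created by the attribute's values. The variable names are illustrative, not taken from the question.

import math
from collections import Counter

def entropy(labels):
    # Shannon entropy (base 2) of a list of class labels
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def info_gain(attribute_values, labels):
    # information gain = parent entropy - weighted child entropy after the split
    total = len(labels)
    gain = entropy(labels)
    for value in set(attribute_values):
        subset = [lab for a, lab in zip(attribute_values, labels) if a == value]
        gain -= (len(subset) / total) * entropy(subset)
    return gain

# Hypothetical usage (the actual column values are not reproduced above):
# vaccination = ['yes', 'no', 'yes', ...]
# malignant   = ['no', 'yes', 'no', ...]
# print(round(info_gain(vaccination, malignant), 4))   # should come to 0.2102 on the real table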


Q4. Consider the following statements:
Statement 1: Decision Trees are linear non-parametric models.
Statement 2: A decision tree may be used to explain the complex function learned by a neural network.
a. Both the statements are True.
b. Statement 1 is True, but Statement 2 is False.
c. Statement 1 is False, but Statement 2 is True.
d. Both the statements are False.

Answer: c. Statement 1 is False, but Statement 2 is True.




Q5. Which of the following machine learning models can solve the XOR problem without any transformations on the input space?
a. Linear Perceptron
b. Neural Networks
c. Decision Trees
d. Logistic Regression

Answer: b, c
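
A quick way to see why a decision tree (option c) handles XOR on the raw inputs is to fit one on the four XOR points. The sketch below assumes scikit-learn, which is not mentioned in the question, purely for illustration; a linear perceptron or logistic regression, by contrast, cannot separate these labels with a single hyperplane.

from sklearn.tree import DecisionTreeClassifier

# The four XOR points and their labels
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

# An unrestricted tree keeps splitting until its leaves are pure,
# so it can represent XOR without any transformation of the inputs.
tree = DecisionTreeClassifier(random_state=0).fit(X, y)
print(tree.predict(X))   # [0 1 1 0]: all four points classified correctly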


Q6. Which of the following is/are major advantages of decision trees over other supervised learning techniques? (Note that more than one choice may be correct.)
a. Theoretical guarantees of performance
b. Higher performance
c. Interpretability of classifier
d. More powerful in its ability to represent complex functions

Answer: a, b, c, d




Q7. Consider a dataset with only one attribute (categorical). Suppose there are q unordered values in this attribute. How many possible combinations are needed to find the best split-point for building the decision tree classifier?
a. q
b. q^2
c. 2^(q-1)
d. 2^(q-1) - 1

Answer: d. 2^(q-1) - 1
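
The count in option (d) follows because each of the q values can go to either side of a binary split (2^q assignments), the two assignments that leave one side empty are useless, and each remaining split is counted twice by symmetry: (2^q - 2) / 2 = 2^(q-1) - 1. A small brute-force check in Python (the helper name is made up for illustration):

from itertools import combinations

def count_binary_splits(values):
    # count distinct two-way partitions of an unordered categorical attribute
    q = len(values)
    splits = set()
    for r in range(1, q):
        for left in combinations(values, r):
            right = frozenset(values) - frozenset(left)
            # a split and its mirror image are the same partition
            splits.add(frozenset([frozenset(left), right]))
    return len(splits)

for q in range(2, 7):
    # brute-force count equals 2^(q-1) - 1 for every q
    print(q, count_binary_splits(range(q)), 2 ** (q - 1) - 1)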



More weeks solution of this course: https://progies.in/answers/nptel/introduction-to-machine-learning

More NPTEL Solutions: https://progies.in/answers/nptel


* The material and content uploaded on this website are for general information and reference purposes only. Please attempt the assignment on your own before referring to these solutions.

