NPTEL INTRODUCTION TO MACHINE LEARNING ASSIGNMENT 8

                     

These are the solutions for NPTEL INTRODUCTION TO MACHINE LEARNING ASSIGNMENT 8 (WEEK 8).

Course Name: INTRODUCTION TO MACHINE LEARNING


Q1. The figure below shows a Bayesian Network with 9 variables, all of which are binary.
Which of the following is/are always true for the above Bayesian Network?
a. P(A,B|G) = P(A|G) P(B|G)
b. P(A,I) = P(A) P(I)
c. P(B,H|E,G) = P(B|E,G) P(H|E,G)
d. P(C|B,F) = P(C|F)

Answer: c, d
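
The network figure is not reproduced here, so the statements above cannot be re-derived from it directly. As a generic illustration of how such conditional-independence claims can be checked, the sketch below (Python) builds a hypothetical three-node chain B -> F -> C, in which C is d-separated from B given F (the same form as statement d), and verifies P(C|B,F) = P(C|F) by brute-force enumeration of the joint distribution. All CPT numbers are made up for illustration only.

from itertools import product

# Hypothetical CPTs for the chain B -> F -> C (all variables binary)
P_B = {0: 0.6, 1: 0.4}
P_F_given_B = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}   # P(F=f | B=b) = P_F_given_B[b][f]
P_C_given_F = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}   # P(C=c | F=f) = P_C_given_F[f][c]

# Joint distribution from the factorisation P(B) P(F|B) P(C|F)
joint = {}
for b, f, c in product([0, 1], repeat=3):
    joint[(b, f, c)] = P_B[b] * P_F_given_B[b][f] * P_C_given_F[f][c]

def marg(fixed):
    # Sum the joint over all assignments consistent with the fixed values.
    return sum(p for (b, f, c), p in joint.items()
               if all({"B": b, "F": f, "C": c}[k] == v for k, v in fixed.items()))

for b, f in product([0, 1], repeat=2):
    p_c_given_bf = marg({"B": b, "F": f, "C": 1}) / marg({"B": b, "F": f})
    p_c_given_f = marg({"F": f, "C": 1}) / marg({"F": f})
    print(b, f, round(p_c_given_bf, 6), round(p_c_given_f, 6))   # the two columns match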


Q2. Consider the following data for 20 budget phones, 30 mid-range phones, and 20 high-end phones:
Consider a phone with 2 SIM card slots and NFC but no 5G compatibility. Calculate the probabilities of this phone being a budget phone, a mid-range phone, and a high-end phone using the Naive Bayes method. The correct ordering of the phone type from the highest to the lowest probability is?

a. Budget, Mid-Range, High End
b. Budget, High End, Mid-Range
c. Mid-Range, High End, Budget
d. High End, Mid-Range, Budget

Answer: c. Mid-Range, High End, Budget
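
The count table for the 70 phones is not reproduced above, so the sketch below only shows the shape of the Naive Bayes calculation: multiply each class prior by the per-class probabilities of the three observed feature values and rank the classes by the resulting unnormalised posteriors. The likelihood values used here are placeholders, not the assignment's data.

priors = {"budget": 20 / 70, "mid_range": 30 / 70, "high_end": 20 / 70}

# Hypothetical P(feature value | class); replace with values from the actual table
likelihoods = {
    "budget":    {"two_sim": 0.9, "nfc": 0.05, "no_5g": 0.5},
    "mid_range": {"two_sim": 0.8, "nfc": 0.6,  "no_5g": 0.5},
    "high_end":  {"two_sim": 0.5, "nfc": 0.9,  "no_5g": 0.1},
}

# Unnormalised posterior for a phone with 2 SIM slots, NFC, and no 5G
scores = {c: priors[c]
             * likelihoods[c]["two_sim"]
             * likelihoods[c]["nfc"]
             * likelihoods[c]["no_5g"]
          for c in priors}

for phone_type, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{phone_type}: {score:.4f}")   # ranked from highest to lowest probability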


Q3. Consider the following dataset where outlook, temperature, humidity, and wind are independent features, and play is the dependent feature.
Find the probability that the student will not play given that x = (Outlook=sunny, Temperature=66, Humidity=90, Windy=True) using the Naive Bayes method. (Assume the continuous features are represented as Gaussian distributions).
a. 0.0001367
b. 0.0000358
c. 0.0000236
d. 1

Answer: c. 0.0000236
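
The training table is not shown above, so the following sketch only illustrates how the mixed categorical/Gaussian Naive Bayes score for play = no is assembled: prior x P(sunny | no) x N(66; temperature mean, std for no) x N(90; humidity mean, std for no) x P(windy | no). The class statistics in the code are placeholders; plugging in the statistics from the actual table is what yields the answer key's 0.0000236.

import math

def gaussian_pdf(x, mean, std):
    # Density of N(mean, std^2) evaluated at x
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

# Placeholder statistics for the play = no class (read these off the real table)
prior_no = 5 / 14
p_sunny_given_no = 3 / 5
p_windy_given_no = 3 / 5
temp_mean_no, temp_std_no = 74.6, 8.0
humid_mean_no, humid_std_no = 86.2, 9.7

# Unnormalised Naive Bayes score for x = (Outlook=sunny, Temperature=66, Humidity=90, Windy=True)
score_no = (prior_no
            * p_sunny_given_no
            * gaussian_pdf(66, temp_mean_no, temp_std_no)
            * gaussian_pdf(90, humid_mean_no, humid_std_no)
            * p_windy_given_no)
print(score_no)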


Q4. Which among Gradient Boosting and AdaBoost is less susceptible to outliers considering their respective loss functions?
a. AdaBoost
b. Gradient Boost
c. On average, both are equally susceptible.

Answer: b. Gradient Boost
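
A quick way to see why: AdaBoost minimises the exponential loss exp(-y f(x)), which blows up for badly misclassified points such as outliers, whereas gradient boosting is typically paired with losses like the logistic deviance or Huber loss that grow far more slowly. The small Python sketch below just compares the two losses as the margin becomes more negative.

import math

for margin in [0.0, -1.0, -3.0, -5.0]:            # margin = y * f(x); negative = misclassified
    exp_loss = math.exp(-margin)                   # AdaBoost's exponential loss
    logistic_loss = math.log1p(math.exp(-margin))  # logistic deviance used in gradient boosting
    print(f"margin={margin:+.1f}  exponential={exp_loss:8.2f}  logistic={logistic_loss:6.2f}")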


Q5. How do you prevent overfitting in random forest models?
a. Increasing Tree Depth.
b. Increasing the number of variables sampled at each split.
c. Increasing the number of trees.
d. All of the above.

Answer: c. Increasing the number of trees. (Increasing tree depth or the number of variables sampled at each split makes the individual trees more complex and more prone to overfitting, not less.)
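
In practice (sketch below, assuming scikit-learn's RandomForestClassifier) the number of trees is the safe lever: adding trees only stabilises the ensemble, while tree depth, leaf size, and the number of features sampled per split are usually capped to keep individual trees from overfitting. The dataset and hyperparameter values are illustrative only.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

rf = RandomForestClassifier(
    n_estimators=300,      # more trees -> a more stable ensemble
    max_depth=8,           # cap tree depth instead of increasing it
    max_features="sqrt",   # limit the variables sampled at each split
    min_samples_leaf=5,    # require larger leaves
    random_state=0,
)
print(cross_val_score(rf, X, y, cv=5).mean())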


Q6. A dataset with two classes is plotted below.
Does the data satisfy the Naive Bayes assumption?
a. Yes
b. No
c. The given data is insufficient
d. None of these

Answer: a. Yes
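
The Naive Bayes assumption is class-conditional independence: within each class, the features are independent of one another. Since the plot is not reproduced here, the sketch below only shows one simple numerical check on synthetic data, namely that the within-class feature correlation is close to zero when the assumption holds.

import numpy as np

rng = np.random.default_rng(0)
# Two classes whose two features are drawn independently within each class
class0 = rng.normal(loc=[0.0, 0.0], scale=[1.0, 2.0], size=(500, 2))
class1 = rng.normal(loc=[3.0, 3.0], scale=[2.0, 1.0], size=(500, 2))

for name, data in [("class 0", class0), ("class 1", class1)]:
    corr = np.corrcoef(data[:, 0], data[:, 1])[0, 1]
    print(f"{name}: within-class feature correlation = {corr:+.3f}")   # close to 0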


Q7. Ensembling in random forest classifier helps in achieving:
a. reduction of bias error
b. reduction of variance error
c. reduction of data dimension
d. none of the above

Answer: b. reduction of variance error. (Averaging many decorrelated trees lowers the variance of the predictions; it does not reduce the dimensionality of the data.)
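
A small sketch (assuming scikit-learn) illustrating the variance-reduction effect: a single deep decision tree's test accuracy fluctuates noticeably across random train/test splits, while averaging many decorrelated trees in a random forest gives more stable results. The dataset is synthetic and for illustration only.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)

tree_scores, forest_scores = [], []
for seed in range(10):
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=seed)
    tree_scores.append(
        DecisionTreeClassifier(random_state=seed).fit(Xtr, ytr).score(Xte, yte))
    forest_scores.append(
        RandomForestClassifier(n_estimators=200, random_state=seed).fit(Xtr, ytr).score(Xte, yte))

print("single tree   : mean=%.3f  std=%.3f" % (np.mean(tree_scores), np.std(tree_scores)))
print("random forest : mean=%.3f  std=%.3f" % (np.mean(forest_scores), np.std(forest_scores)))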


More weekly solutions for this course: https://progies.in/answers/nptel/introduction-to-machine-learning

More NPTEL Solutions: https://progies.in/answers/nptel


* The material and content uploaded on this website are for general information and reference purposes only. Please attempt the assignment on your own first.

