### NPTEL Introduction To Machine Learning IITKGP ASSIGNMENT 2



**1. In a binary classification problem, out of 30 data points, 12 belong to class I and 18 belong to class II. What is the entropy of the data set?**

**A. 0.97**

B. 0

C. 1

D. 0.67

**Answer:- A** (the entropy is −(12/30)·log₂(12/30) − (18/30)·log₂(18/30) ≈ 0.97)
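As a quick check, the entropy of the 12/18 split can be computed with a short Python snippet (Shannon entropy with base-2 logarithms):

```python
import math

def entropy(counts):
    """Shannon entropy (in bits) of a class distribution given as counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# 12 points in class I, 18 in class II
print(round(entropy([12, 18]), 2))  # → 0.97
```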

**2. Decision trees can be used for problems where**

A. the attributes are categorical.

B. the attributes are numeric valued.

C. the attributes are discrete valued.

**D. In all the above cases.**

**Answer:- D**

**3. Which of the following is false?**

A. Variance is the error of the trained classifier with respect to the best classifier in the concept class.

B. Variance depends on the training set size.

**C. Variance increases with more training data.**

D. Variance increases with more complicated classifiers.

**Answer:- C**

**4. In linear regression, our hypothesis is hθ(x) = θ₀ + θ₁x and the training data is given in the table. What is the value of J(θ) when θ = (1, 1)?**

A. 0

B. 1

C. 2

D. 0.5

**Answer:- B**
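The question's data table is not reproduced above, but J(θ) here is the usual squared-error cost. A minimal sketch, using made-up data points purely for illustration (not the table from the assignment):

```python
def cost_J(theta0, theta1, data):
    """Squared-error cost J(θ) = (1/2m) · Σ (θ0 + θ1·x − y)² over m samples."""
    m = len(data)
    return sum((theta0 + theta1 * x - y) ** 2 for x, y in data) / (2 * m)

# Hypothetical data (NOT the table from the question):
data = [(1, 1), (2, 2), (3, 3)]
print(cost_J(1, 1, data))  # → 0.5
```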


**5. The value of information gain in the following decision tree is:**

**A. 0.380**

B. 0.620

C. 0.190

D. 0.477

**Answer:- A**
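The tree from the question is not reproduced here, but information gain is computed the same way in every case: parent entropy minus the size-weighted entropies of the children. A sketch with a hypothetical split (the counts below are assumptions for illustration only):

```python
import math

def entropy(counts):
    """Shannon entropy (in bits) of a class distribution given as counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def information_gain(parent, children):
    """IG = H(parent) − Σ (|child| / |parent|) · H(child)."""
    total = sum(parent)
    weighted = sum(sum(ch) / total * entropy(ch) for ch in children)
    return entropy(parent) - weighted

# Hypothetical 9/5 parent split into two children (not the question's tree):
print(round(information_gain([9, 5], [[6, 2], [3, 3]]), 3))
```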

**6. Which of the following is true for Stochastic Gradient Descent?**

A. In every iteration, model parameters are updated for multiple training samples

**B. In every iteration, model parameters are updated for one training sample**

C. In every iteration, model parameters are updated for all training samples

D. None of the above

**Answer:- B**
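Option B is what distinguishes stochastic gradient descent from batch (all samples per update) and mini-batch (several samples per update) variants. A minimal sketch for a linear model; the data and learning rate below are assumptions for illustration, not from the question:

```python
import random

def sgd_step(theta, sample, lr=0.1):
    """One SGD update: the parameters move using a SINGLE (x, y) sample,
    following the gradient of the squared error for h(x) = theta[0] + theta[1]*x."""
    x, y = sample
    error = theta[0] + theta[1] * x - y
    return [theta[0] - lr * error, theta[1] - lr * error * x]

# Each iteration picks one random sample and updates immediately:
random.seed(0)
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # made-up points on y = 2x
theta = [0.0, 0.0]
for _ in range(200):
    theta = sgd_step(theta, random.choice(data))
print(theta)  # should approach [0, 2]
```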

**7. The entropy of the entire dataset is**

A. 0.5

B. 1

C. 0

D. 0.1

**Answer:- C**

**8. Which attribute will be the root of the decision tree?**

A. Green

**B. Legs**

C. Height

D. Smelly

**Answer:- B**

**9. In Linear Regression the output is:**

A. Discrete

B. Continuous and always lies in a finite range

**C. Continuous**

D. May be discrete or continuous

**Answer:- C**

**10. Identify whether the following statement is true or false: Overfitting is more likely when the set of training data is small.**

**A. True**

B. False

**Answer:- A**


* The material and content uploaded on this website are for general information and reference purposes only. Please attempt the assignment on your own first.