Introduction to Machine Learning NPTEL Week 5 Solutions NPTEL 2023

This set of MCQs (multiple choice questions) focuses on the Introduction to Machine Learning NPTEL Week 5 Solutions for NPTEL 2023.

With the increased availability of data from varied sources, there has been increasing attention paid to data-driven disciplines such as analytics and machine learning. In this course we intend to introduce some of the basic concepts of machine learning from a mathematically well-motivated perspective. We will cover the different learning paradigms and some of the more popular algorithms and architectures used in each of these paradigms.

Course layout

Answers COMING SOON! Kindly Wait!

Week 1: Assignment answers
Week 2: Assignment answers
Week 3: Assignment answers
Week 4: Assignment answers
Week 5: Assignment answers
Week 6: Assignment answers
Week 7: Assignment answers
Week 8: Assignment answers
Week 9: Assignment answers
Week 10: Assignment answers
Week 11: Assignment answers
Week 12: Assignment answers

NOTE: You can check your answer immediately by clicking the show answer button. “Introduction to Machine Learning NPTEL Week 5 Solutions” contains 15 questions.

Now, start attempting the quiz.

Introduction to Machine Learning NPTEL Week 5 Solutions

Q1. What would be the ideal complexity of the curve which can be used for separating the two classes shown in the image below?

a) Linear
b) Quadratic
c) Cubic
d) Insufficient data to draw a conclusion

Answer: a)


Q2. If you remove any one of the red points from the data shown below, will the decision boundary change?

a) Yes
b) No

Answer: a)

Q3. What do you mean by a hard margin in SVM Classification?

a) The SVM allows very low error in classification
b) The SVM allows high amount of error in classification
c) Both are True
d) Both are False

Answer: a)

Q4. Which of the following statements accurately compares linear regression and logistic regression?

a) Linear regression is used for classification tasks, while logistic regression is used for regression tasks.
b) Linear regression models the relationship between input features and continuous target variables, while logistic regression models the probability of binary outcomes.
c) Linear regression and logistic regression are identical in their mathematical formulation and can be used interchangeably.
d) Linear regression and logistic regression both handle multi-class classification tasks equally effectively.

Answer: b)
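
To make the contrast in option (b) concrete, here is a minimal scikit-learn sketch (the one-dimensional toy data is made up for illustration): linear regression predicts an unbounded continuous value, while logistic regression outputs a probability for a binary outcome.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

# Hypothetical one-dimensional feature
X = np.arange(10, dtype=float).reshape(-1, 1)

# Continuous target -> linear regression predicts real values
y_continuous = 3.0 * X.ravel() + 2.0
lin = LinearRegression().fit(X, y_continuous)
print(lin.predict([[12.0]]))            # an unbounded real-valued prediction

# Binary target -> logistic regression models P(y = 1 | x)
y_binary = (X.ravel() > 4).astype(int)
log_reg = LogisticRegression().fit(X, y_binary)
print(log_reg.predict_proba([[12.0]]))  # probabilities in [0, 1] summing to 1
```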


Q5. After training an SVM, we can discard all examples which are not support vectors and can still classify new examples.

a) True
b) False

Answer: a)
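
The reason is that the SVM decision function depends only on the support vectors. Below is a minimal sketch (toy blobs data assumed) that rebuilds a linear SVM's decision function from `support_vectors_`, `dual_coef_`, and `intercept_` alone, without the other training points.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Hypothetical two-class toy data
X, y = make_blobs(n_samples=100, centers=2, random_state=0)

clf = SVC(kernel="linear", C=1.0).fit(X, y)

# Decision function rebuilt from the support vectors only:
# f(x) = sum_i (alpha_i * y_i) * <sv_i, x> + b
manual = X @ clf.support_vectors_.T @ clf.dual_coef_.ravel() + clf.intercept_

# Matches sklearn's own decision_function, so the non-support vectors
# contribute nothing and can be discarded after training.
print(np.allclose(manual, clf.decision_function(X)))  # True
```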


Q6. Suppose you are building an SVM model on data X. The data X can be error-prone, which means that you should not trust any specific data point too much. Now suppose you want to build an SVM model with a quadratic kernel function (polynomial kernel of degree 2) that uses the slack variable C as one of its hyperparameters.
What would happen when you use a very large value of C (C -> infinity)?

a) We can still classify the data correctly for the given setting of the hyperparameter C.
b) We cannot classify the data correctly for the given setting of the hyperparameter C.
c) None of the above

Answer: a)

Q7. Following Question 6, what would happen when you use a very small value of C (C ~ 0)?

a) Data will be correctly classified
b) Misclassification would happen
c) None of these

Answer: b)
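
A rough sketch of the intuition behind Q6 and Q7, using hypothetical toy data that a quadratic boundary can separate (the dataset and the two C values are assumptions made for this illustration):

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Hypothetical toy data that a degree-2 (quadratic) boundary can separate
X, y = make_circles(n_samples=200, noise=0.05, factor=0.5, random_state=0)

for C in (1e6, 1e-3):  # C -> infinity vs. C ~ 0
    clf = SVC(kernel="poly", degree=2, coef0=1.0, C=C).fit(X, y)
    n_sv = clf.support_vectors_.shape[0]
    print(f"C={C:g}: train accuracy={clf.score(X, y):.3f}, "
          f"support vectors={n_sv}/{len(X)}")

# A very large C penalizes slack heavily (hard-margin behaviour), so the
# quadratic kernel fits this separable data with (near-)zero training error
# and few support vectors.  A tiny C lets the margin term dominate: almost
# every point ends up violating the margin, and in general this underfit
# regime leads to misclassification, which is the idea behind Q7's answer.
```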


Q8. If g(z) is the sigmoid function, then its derivative with respect to z may be written in terms of g(z) as

a) g(z)(1-g(z))
b) g(z)(1+g(z))
c) -g(z)(1+g(z))
d) g(z)(g(z)-1)

Answer: a)
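
The identity follows from differentiating g(z) = 1 / (1 + e^(-z)); a quick numerical check (the grid of z values and the step size are arbitrary choices):

```python
import numpy as np

def g(z):
    """Sigmoid function g(z) = 1 / (1 + exp(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

z = np.linspace(-5, 5, 11)
h = 1e-6

numeric = (g(z + h) - g(z - h)) / (2 * h)  # central finite difference
analytic = g(z) * (1 - g(z))               # option (a)

# The two agree up to floating-point error (difference on the order of 1e-10)
print(np.max(np.abs(numeric - analytic)))
```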


Q9. In the linearly non-separable case, what effect does the C parameter have on the SVM model?

a) It determines how many data points lie within the margin
b) It is a count of the number of data points which do not lie on the correct side of the hyperplane
c) It allows us to trade off the number of misclassified points in the training data and the size of the margin
d) It counts the support vectors

Answer: c)
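
For a linear kernel the margin width is 2 / ||w||, so the trade-off in option (c) can be observed directly. A sketch on hypothetical overlapping blobs (the centres, spread, and C values are assumptions for illustration):

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Hypothetical overlapping (linearly non-separable) toy data
X, y = make_blobs(n_samples=200, centers=[[0, 0], [2.5, 2.5]],
                  cluster_std=1.5, random_state=1)

for C in (0.01, 1.0, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    margin = 2.0 / np.linalg.norm(clf.coef_)   # geometric margin width
    errors = int((clf.predict(X) != y).sum())  # misclassified training points
    print(f"C={C:>6}: margin width={margin:.2f}, training errors={errors}")

# Smaller C -> wider margin but typically more misclassified training points;
# larger C -> narrower margin that tries harder to classify every point.
```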

Q10. What type of kernel function is commonly used for non-linear classification tasks in SVM?

a) Linear kernel
b) Polynomial kernel
c) Sigmoid kernel
d) Radial Basis Function (RBF) kernel

Answer: d)
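
As an illustration (the two-moons toy data is an assumption for this sketch), an RBF kernel handles a curved class boundary that a linear kernel cannot:

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# Hypothetical non-linearly separable toy data (two interleaving half-moons)
X, y = make_moons(n_samples=300, noise=0.1, random_state=0)

for kernel in ("linear", "rbf"):
    clf = SVC(kernel=kernel, C=1.0).fit(X, y)
    print(f"{kernel:>6} kernel: training accuracy = {clf.score(X, y):.3f}")

# A straight line cannot follow the curved boundary between the two moons,
# while the RBF kernel implicitly maps the data to a space where the classes
# become (almost) separable, giving a noticeably higher accuracy.
```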


Q11. Which of the following statements is/are true about kernels in SVM?
1. Kernel functions map low-dimensional data to a high-dimensional space
2. It’s a similarity function

a) 1 is true but 2 is False
b) 1 is False but 2 is True
c) Both are True
d) Both are False

Answer: c)
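
Both statements can be seen with the RBF kernel K(x, z) = exp(-gamma * ||x - z||^2): it corresponds to an inner product in a much higher-dimensional implicit feature space, and its value behaves like a similarity score. A minimal sketch with made-up points:

```python
import numpy as np

def rbf_kernel(x, z, gamma=1.0):
    """RBF kernel K(x, z) = exp(-gamma * ||x - z||^2), a value in (0, 1]."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

a = np.array([1.0, 2.0])
b = np.array([1.1, 2.1])  # close to a
c = np.array([5.0, 9.0])  # far from a

print(rbf_kernel(a, b))   # close to 1 -> "similar" points
print(rbf_kernel(a, c))   # close to 0 -> "dissimilar" points
```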


Q12. The soft-margin SVM is preferred over the hard-margin SVM when:

a) The data is linearly separable
b) The data is noisy
c) The data contains overlapping points

Answer: b), c)

Q13. Let us assume that the black-colored circles represent the positive class whereas the white-colored circles represent the negative class. Which of the following among H1, H2 and H3 is the maximum-margin hyperplane?

a) H1
b) H2
c) H3
d) None of the above

Answer: c)


Q14. What is the primary advantage of Kernel SVM compared to traditional SVM with a linear kernel?

a) Kernel SVM requires less computational resources
b) Kernel SVM does not require tuning of hyperparameters
c) Kernel SVM can capture complex non-linear relationships between data points.
d) Kernel SVM is more robust to noisy data

Answer: c)


Q15. What is the sigmoid function’s role in logistic regression?

a) The sigmoid function transforms the input features to a higher-dimensional space.
b) The sigmoid function calculates the dot product of input features and weights.
c) The sigmoid function defines the learning rate for gradient descent.
d) The sigmoid function maps the linear combination of features to a probability value.

Answer: d)
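
This role can be verified directly with scikit-learn (toy data assumed): applying the sigmoid to the linear combination of features, weights, and intercept reproduces the model's predicted probabilities.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Hypothetical binary-classification toy data
X, y = make_classification(n_samples=100, n_features=4, random_state=0)

clf = LogisticRegression().fit(X, y)

# Linear combination of features and weights, squashed by the sigmoid
# into a probability in (0, 1)
z = X @ clf.coef_.ravel() + clf.intercept_
p_manual = 1.0 / (1.0 + np.exp(-z))

# Matches the probability of class 1 reported by the model
print(np.allclose(p_manual, clf.predict_proba(X)[:, 1]))  # True
```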


<< Previous- Introduction to Machine Learning Week 4 Assignment Solutions

>> Next- Introduction to Machine Learning Week 6 Assignment Solutions


DISCLAIMER: Use these answers for reference purposes only. Quizermania doesn't claim these answers to be 100% correct. So, make sure you submit your assignments on the basis of your own knowledge.

For discussion about any question, join the comment section below and get the solution to your query. Also, feel free to share your thoughts about the topics covered in this particular quiz.

Check out more NPTEL courses: Click Here!
