Deep Learning | NPTEL | Week 10 answers

This set of MCQs (multiple-choice questions) covers the Deep Learning NPTEL Week 10 answers.

Course layout


Week 1: Assignment Answers
Week 2: Assignment Answers
Week 3: Assignment Answers
Week 4: Assignment Answers
Week 5: Assignment Answers
Week 6: Assignment Answers
Week 7: Assignment Answers
Week 8: Assignment Answers
Week 9: Assignment Answers
Week 10: Assignment Answers
Week 11: Assignment Answers
Week 12: Assignment Answers

NOTE: You can check your answer immediately by clicking the "Show Answer" button. This set of "Deep Learning NPTEL Week 10 answers" contains 10 questions.

Now, start attempting the quiz.

Deep Learning NPTEL 2022 Week 10 answers

Q1. What is not a reason for using batch-normalization?

a) Prevent overfitting
b) Faster convergence
c) Faster inference time
d) Prevent covariate shift

Answer: c)


Answer: b)

Q3. How can we prevent underfitting?

a) Increase the number of data samples
b) Increase the number of features
c) Decrease the number of features
d) Decrease the number of data samples

Answer: b)

Q4. How do we generally calculate mean and variance during testing?

a) Batch normalization is not required during testing
b) Mean and variance based on test image
c) Estimated mean and variance statistics during training
d) None of the above

Answer: c)
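The idea behind this answer can be sketched in a few lines of NumPy. This is a minimal, hypothetical batch-norm layer (not the NPTEL reference code): during training it normalizes with the current batch's statistics and accumulates running estimates; at test time it reuses those estimates instead of recomputing from the test image. The momentum value 0.1 is an assumed default.

```python
import numpy as np

class BatchNorm1d:
    """Minimal batch-norm sketch: batch statistics during training,
    running (estimated) statistics at test time."""
    def __init__(self, num_features, momentum=0.1, eps=1e-5):
        self.gamma = np.ones(num_features)    # learnable scale
        self.beta = np.zeros(num_features)    # learnable shift
        self.running_mean = np.zeros(num_features)
        self.running_var = np.ones(num_features)
        self.momentum = momentum
        self.eps = eps

    def forward(self, x, training=True):
        if training:
            mean = x.mean(axis=0)
            var = x.var(axis=0)
            # accumulate estimates that will be reused at test time
            self.running_mean = (1 - self.momentum) * self.running_mean + self.momentum * mean
            self.running_var = (1 - self.momentum) * self.running_var + self.momentum * var
        else:
            # test time: statistics estimated during training (option c)
            mean, var = self.running_mean, self.running_var
        x_hat = (x - mean) / np.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta

rng = np.random.default_rng(0)
x = rng.normal(2.0, 5.0, size=(8, 3))      # toy batch, not centered
bn = BatchNorm1d(3)
y_train = bn.forward(x, training=True)     # uses batch statistics
y_test = bn.forward(x, training=False)     # uses running estimates
```

Note that `y_train` has (approximately) zero mean per feature, while `y_test` generally does not, because the running estimates have only been updated once.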

Q5. Which one of the following is not an advantage of dropout?

a) Regularization
b) Prevent Overfitting
c) Improve Accuracy
d) Reduce computational cost during testing

Answer: c)
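Option d) is true because of how dropout is usually implemented ("inverted dropout", assumed here): the surviving units are scaled by 1/(1-p) during training, so the test-time forward pass is a plain identity with no extra computation. A small illustrative sketch:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero units with probability p and scale the
    survivors by 1/(1-p) during training, so that at test time the
    layer does nothing at all (no extra computational cost)."""
    if not training or p == 0.0:
        return x  # test time: identity, no masking or scaling
    rng = rng or np.random.default_rng()
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask

x = np.ones((4, 4))
out_train = dropout(x, p=0.5, training=True, rng=np.random.default_rng(0))
out_test = dropout(x, p=0.5, training=False)   # identical to x
```

During training each entry of `out_train` is either 0 (dropped) or 2 (kept and rescaled), while `out_test` equals the input exactly.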


Q6. What is the main advantage of layer normalization over batch normalization?

a) Faster convergence
b) Lesser computation
c) Useful in recurrent neural network
d) None of these

Answer: c)
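Why c)? Layer normalization computes its statistics over the feature dimension of each sample, not over the batch, so it works with a batch size of 1 and at every step of a recurrent network, where batch statistics are ill-defined. A minimal sketch (without the usual learnable scale/shift, omitted for brevity):

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    """Normalize each sample over its own feature (last) axis.
    The statistics do not depend on other samples in the batch,
    which is what makes this usable inside an RNN."""
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

h = np.array([[1.0, 2.0, 3.0, 4.0]])  # a single sample: fine for layer norm
out = layer_norm(h)
```

Batch normalization applied to this single-sample input would have to normalize each feature against itself, which is degenerate; layer norm has no such problem.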

Q7. While training a neural network for an image recognition task, we plot the training error and validation error curves. Which point is best for early stopping?

a) A
b) B
c) C
d) D

Answer: d)
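The referenced graph is not reproduced here, but the principle behind early stopping can be sketched: stop once validation error has stopped improving for a few epochs, and keep the weights from the epoch with the lowest validation error. The patience-based heuristic below is a common choice, not the course's specific method.

```python
def early_stopping_epoch(val_errors, patience=2):
    """Return the epoch with the lowest validation error, stopping the
    scan once `patience` epochs pass with no improvement (a common
    early-stopping heuristic; the patience value is an assumption)."""
    best_err, best_epoch, waited = float("inf"), 0, 0
    for epoch, err in enumerate(val_errors):
        if err < best_err:
            best_err, best_epoch, waited = err, epoch, 0
        else:
            waited += 1
            if waited >= patience:
                break  # validation error has turned upward: stop
    return best_epoch

# training error keeps falling, but validation error bottoms out early
val = [0.50, 0.40, 0.35, 0.37, 0.42, 0.48]
stop_at = early_stopping_epoch(val)  # epoch 2, the validation minimum
```

The best stopping point is where validation error is minimal, even though training error would keep decreasing afterward.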


Q8. Which among the following is NOT a data augmentation technique?

a) Random horizontal and vertical flip of image
b) Random shuffle all the pixels of an image
c) Random color jittering
d) All the above are data augmentation techniques

Answer: d)
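Note the actual answer key marks b) as the technique that is NOT augmentation in spirit: randomly shuffling all pixels destroys the image content, so the label no longer applies. A valid augmentation is label-preserving, like the horizontal flip sketched below (an illustrative helper, not from the course):

```python
import numpy as np

def random_horizontal_flip(img, p=0.5, rng=None):
    """Flip the image left-right with probability p. The objects in the
    image are preserved, so the original label stays valid, which is
    what makes this a legitimate data augmentation."""
    rng = rng or np.random.default_rng()
    return img[:, ::-1] if rng.random() < p else img

img = np.arange(6).reshape(2, 3)            # [[0, 1, 2], [3, 4, 5]]
flipped = random_horizontal_flip(img, p=1.0)  # p=1.0 forces the flip
```

Here `flipped` is `[[2, 1, 0], [5, 4, 3]]`: columns reversed, content intact.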

Q9. Which of the following is true about model capacity (where model capacity means the ability of neural network to approximate complex functions)?

a) As number of hidden layers increase, model capacity increases
b) As dropout ratio increases, model capacity increases
c) As learning rate increases, model capacity increases
d) None of these

Answer: a)

Q10. Batch Normalization is helpful because

a) It normalizes all the input before sending it to the next layer
b) It returns back the normalized mean and standard deviation of weights
c) It is a very efficient back-propagation technique
d) None of these

Answer: a)


Disclaimer: Quizermaina doesn’t claim these answers to be 100% correct. Use these answers only for your reference. Always submit your assignment to the best of your knowledge.

For discussion about any question, use the comment section below to get your query resolved. Also feel free to share your thoughts about the topics covered in this quiz.
