1000 Machine Learning MCQ (Multiple Choice Questions) - Sanfoundry
Here are 1000 MCQs on Machine Learning (Chapterwise).
Answer: c
Explanation: Machine learning is the autonomous acquisition of knowledge through the
use of computer programs.
Answer: a
Explanation: KNN doesn’t build a parametric model of the data. Instead, it directly
classifies new data points based on the k nearest points in the training data.
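This non-parametric behavior is easy to see in code: a minimal pure-Python sketch (made-up data; a real project would typically use scikit-learn's KNeighborsClassifier) stores the training points as-is and does all the work at prediction time.

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (point, label) pairs with points as float tuples.
    No model is built in advance; all work happens at prediction time,
    which is what makes k-NN non-parametric.
    """
    neighbors = sorted(train, key=lambda pl: math.dist(pl[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]
```

For example, with two labeled clusters, a query near the first cluster is assigned that cluster's label by the vote.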
Answer: a
Explanation: Decision trees, SVMs (support vector machines) for classification problems, and Naïve Bayes are examples of supervised machine learning algorithms. K-means is an example of an unsupervised machine learning algorithm.
4. What’s the key benefit of using deep learning for tasks like recognizing images?
a) They need less training data than other methods.
b) They’re easier to explain and understand than other models.
c) They can learn complex details from the data on their own.
d) They work faster and are more efficient computationally.
Answer: c
Explanation: Deep learning excels at learning intricate features from data on its own, especially in tasks like image recognition.
Answer: b
Explanation: Decision Trees are versatile and can be used for classification problems,
particularly for binary classification, where the output is divided into two classes.
Answer: a
Explanation: The presence or absence of labeled data in the training set distinguishes
supervised and unsupervised learning approaches.
7. Which type of machine learning algorithm falls under the category of “unsupervised
learning”?
a) Linear Regression
b) K-means Clustering
c) Decision Trees
d) Random Forest
Answer: b
Explanation: K-means Clustering is an example of unsupervised learning used for
clustering unlabeled data based on similarities.
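The clustering loop itself is short: a one-dimensional sketch (toy numbers; a real project would use a library implementation) alternates the two k-means steps, assigning each point to its nearest center and then moving each center to its cluster's mean.

```python
def kmeans_1d(points, centers, iters=10):
    """One-dimensional k-means: alternate assignment and mean-update steps.

    `points` is a list of floats and `centers` the initial cluster centers.
    """
    for _ in range(iters):
        # Assignment step: each point joins its nearest center's cluster.
        clusters = [[] for _ in centers]
        for x in points:
            nearest = min(range(len(centers)), key=lambda j: abs(x - centers[j]))
            clusters[nearest].append(x)
        # Update step: each center moves to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return centers
```

No labels are used anywhere: the centers are recovered from the data's own structure, which is what makes the method unsupervised.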
Answer: c
Explanation: AdaBoost is generally less prone to overfitting than many other algorithms, although it can overfit on noisy datasets. With very simple weak learners, the algorithm is much less prone to overfitting and classification accuracy improves, so the complexity of the weak learner matters in AdaBoost.
9. Which one of the following models is a generative model used in machine learning?
a) Support vector machines
b) Naïve Bayes
c) Logistic Regression
d) Linear Regression
Answer: b
Explanation: Naïve Bayes is a generative model used in machine learning. Linear Regression, Logistic Regression, and support vector machines are discriminative models.
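The generative character of Naïve Bayes shows up directly in code: it estimates the prior P(y) and the likelihoods P(x_i | y) from counts and scores classes via Bayes' rule (a toy categorical sketch without smoothing, with made-up feature values):

```python
from collections import Counter, defaultdict

def train_nb(data):
    """Fit a tiny categorical Naive Bayes classifier (no smoothing).

    `data` is a list of (features_tuple, label).  Being generative, the
    model estimates the joint P(x, y) = P(y) * prod_i P(x_i | y) from
    counts, unlike discriminative models that learn P(y | x) directly.
    """
    labels = Counter(y for _, y in data)
    cond = defaultdict(Counter)        # (feature index, label) -> counts
    for feats, y in data:
        for i, v in enumerate(feats):
            cond[(i, y)][v] += 1
    n = len(data)

    def predict(feats):
        def score(y):
            p = labels[y] / n                           # prior P(y)
            for i, v in enumerate(feats):
                p *= cond[(i, y)][v] / labels[y]        # likelihood P(x_i | y)
            return p
        return max(labels, key=score)
    return predict
```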
10. An artificially intelligent car decreases its speed based on its distance from the car in
front of it. Which algorithm is used?
a) Naïve-Bayes
b) Decision Tree
c) Linear Regression
d) Logistic Regression
Answer: c
Explanation: The output is numerical: it determines the speed of the car. Hence this is not a classification problem. Decision trees, naïve Bayes, and logistic regression are all classification algorithms. Linear regression, on the other hand, outputs numerical values from its inputs, so it can be used here.
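A sketch of the idea with ordinary least squares on one feature, using hypothetical distance-to-speed training pairs (illustrative numbers only):

```python
def fit_line(xs, ys):
    """Ordinary least squares for one feature: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

# Hypothetical training pairs: distance to the car ahead (m) -> speed (km/h).
distances = [5, 10, 20, 40]
speeds = [10, 20, 40, 80]
a, b = fit_line(distances, speeds)
# The fitted line maps any measured distance to a numeric speed: a + b * distance.
```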
Answer: b
Explanation: Ensemble learning is not an unsupervised learning algorithm. It is a supervised learning approach that combines several machine learning techniques into one predictive model to decrease variance and bias. The ensemble can be trained and then used to make predictions, and such ensembles can be shown to have more flexibility in the functions they can represent.
12. Which of the following statements is true about stochastic gradient descent?
a) It processes one training example per iteration
b) It is not preferred, if the number of training examples is large
c) It processes all the training examples for each iteration of gradient descent
d) It is computationally very expensive, if the number of training examples is large
Answer: a
Explanation: Stochastic gradient descent processes one training example per iteration. That is, it updates the weight vector based on one data point at a time. The other three options describe batch gradient descent.
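The contrast is easy to see in code: a batch method would sum gradients over the whole training set before each update, while SGD updates after every single example (a toy one-dimensional linear-regression sketch with made-up step size and epoch count):

```python
import random

def sgd_linear(data, lr=0.1, epochs=300, seed=0):
    """Stochastic gradient descent for y = w*x + b, one example per update.

    Batch gradient descent would sum the gradient over *all* examples
    before each weight update; SGD instead updates after every single
    point, which is why it scales to large training sets.
    """
    rng = random.Random(seed)
    data = list(data)                  # avoid mutating the caller's list
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(data)              # visit examples in random order
        for x, y in data:
            err = (w * x + b) - y      # prediction error on ONE point
            w -= lr * err * x          # gradient of 0.5*err**2 w.r.t. w
            b -= lr * err              # gradient w.r.t. b
    return w, b
```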
13. Decision trees use the inductive learning approach.
a) False
b) True
Answer: b
Explanation: Decision trees use the inductive learning approach. Inductive learning enables the system to recognize patterns and regularities in prior knowledge or training data and to extract general rules from them. A decision tree is an inductive learning task because it uses particular facts to draw more generalized conclusions.
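The "particular facts to general rules" step is concrete in a decision tree: the tree measures, via information gain, which feature split best generalizes the training examples (a small sketch; feature names and data are made up):

```python
import math

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

def info_gain(rows, labels, feature):
    """Reduction in label entropy from splitting the rows on `feature`.

    `rows` is a list of feature dicts; the split with the highest gain
    is the general rule the tree induces from the particular examples.
    """
    remainder = 0.0
    for v in set(r[feature] for r in rows):
        sub = [y for r, y in zip(rows, labels) if r[feature] == v]
        remainder += len(sub) / len(labels) * entropy(sub)
    return entropy(labels) - remainder
```

A split that perfectly separates the classes drives the remaining entropy to zero, which is exactly the kind of general rule induction extracts from specific cases.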
Answer: d
Explanation: A set of instances is required, along with a set of candidate hypotheses. These are applied to the training data, and the list of hypotheses consistent with the data is output in accordance with the candidate-elimination algorithm.
Answer: a
Explanation: Boosting does not increase bias and variance; it mainly reduces both. It is a technique for solving two-class classification problems, and it tries to generate complementary base learners by training the next learner (by increasing the weights) on the mistakes (misclassified data) of the previous learners.
The section contains multiple choice questions and answers on the statistical learning framework, the empirical risk minimization framework, and PAC learning.
The section contains questions and answers on version spaces, the Find-S algorithm, and the candidate elimination algorithm.
The section contains Machine Learning MCQs on VC-dimension and the Fundamental
Theorem of PAC Learning.
The section contains Machine Learning multiple choice questions and answers on linear
regression in machine learning, linear regression cost functions, and gradient descent.
The section contains Machine Learning questions and answers on multivariate linear regression, gradient descent for multiple variables, and polynomial regression.
The section contains multiple choice questions and answers on ensemble learning,
covering error-correcting output codes, model combination schemes, boosting weak
learnability, the AdaBoost algorithm, and stacking.
The section contains Machine Learning MCQs on kernels and kernel trick.
The section contains multiple choice questions and answers on support vector machines
(SVMs), covering key concepts like the large margin intuition, margins and hard/soft SVMs,
norm regularization, optimality conditions and support vectors, and finally, implementing
soft SVMs using Stochastic Gradient Descent (SGD).
The section contains questions and answers on decision trees, covering core concepts
such as decision tree pruning, inductive bias, classification trees, regression trees, and the
powerful Random Forest algorithm.
The section contains MCQs on K-Nearest Neighbor Algorithm and Nearest Neighbor
Analysis.
The section contains multiple choice questions and answers on Naive-Bayes Algorithm.
The section contains multiple choice questions and answers on nonlinear hypothesis,
neurons and the brain, model representation, multiclass classification, cost function,
gradient checking, and random initialization.
Wish you the best in your endeavor to learn and master Machine Learning!
If you find a mistake in a question, option, or answer, kindly take a screenshot and email it to [email protected]