ML Questions
Unit 2
Short answer Questions:
1. Define Linear Regression.
2. Compare Ridge Regression and Lasso Regression.
3. Define Decision Tree.
4. Describe multiple Linear Regression.
5. Define Bayesian Linear Regression.
6. Explain Laplace approximation.
7. Define Probabilistic Generative models.
8. Describe random forest model.
9. Define nearest neighbor.
10. Define logistic regression.
Long answer questions:
1. Explain different Regression Algorithms. (Linear Regression, Multiple Linear Regression,
Polynomial Regression, Logistic Regression, Ridge Regression, Lasso Regression.)
2. Explain the steps involved in Laplace Approximation.
3. Explain the K-nearest neighbor algorithm (KNN) with an example.
4. Define Decision Tree and Explain the ID3 Decision Tree learning Algorithm with an example.
5. Discuss Probabilistic Generative models.
6. Explain Probabilistic Discriminative models.
7. Explain Simple Linear Regression with an example.
8. Explain Logistic Regression with suitable example.
9. Describe the assumptions of Linear Regression model.
10. Describe random forest model with an example.
11. Discuss Support Vector Machine model with an example.
12. Explain Bayesian Linear Regression with an example.
13. Explain Discriminant Functions used in Classification.
14. Explain Bias-Variance Decomposition.
15. Differentiate between Linear Regression and Logistic Regression.
16. Explain Ridge Regression and Lasso Regression with suitable equations.
17. Calculate the entropy for the collection of training examples given in the below table and find the
information gain of attributes A1 and A2 using ID3 algorithm.
Instance (Example)   Target Class   A1   A2
1                    Yes            T    T
2                    Yes            T    T
3                    No             T    F
4                    Yes            F    F
5                    No             F    T
6                    No             F    T
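The expected numbers for Question 17 can be checked with a short Python sketch (not part of the original question set; the function names are illustrative). It computes the entropy of the collection and the information gain of A1 and A2 exactly as ID3 does:

```python
from math import log2

def entropy(labels):
    # H(S) = -sum over classes c of p_c * log2(p_c)
    n = len(labels)
    return -sum((labels.count(c) / n) * log2(labels.count(c) / n)
                for c in set(labels))

def info_gain(rows, attr, labels):
    # Gain(S, A) = H(S) - sum |S_v|/|S| * H(S_v)
    n = len(labels)
    gain = entropy(labels)
    for v in set(r[attr] for r in rows):
        subset = [labels[i] for i, r in enumerate(rows) if r[attr] == v]
        gain -= len(subset) / n * entropy(subset)
    return gain

# (A1, A2) values and the target class for the six training examples
rows = [("T", "T"), ("T", "T"), ("T", "F"), ("F", "F"), ("F", "T"), ("F", "T")]
labels = ["Yes", "Yes", "No", "Yes", "No", "No"]

print(round(entropy(labels), 4))             # 1.0 (3 Yes, 3 No)
print(round(info_gain(rows, 0, labels), 4))  # Gain(A1) ≈ 0.0817
print(round(info_gain(rows, 1, labels), 4))  # Gain(A2) = 0.0
```

Since Gain(A1) > Gain(A2), ID3 would split on A1 first.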
18. Apply the K-Nearest Neighbor algorithm to the following dataset to classify the new data
instance (157, 54). Assume K=3.
Height   Weight   Target Concept
150      50       Medium
155      55       Medium
160      60       Large
161      59       Large
158      65       Large
19. Apply the K-Nearest Neighbor algorithm to the following dataset to classify the new data
instance (170, 57). Assume K=5.
Height (Centimeters)   Weight (KG)   Target Class
167                    51            Underweight
182                    62            Normal
176                    69            Normal
173                    64            Normal
172                    65            Normal
174                    56            Underweight
169                    58            Normal
173                    57            Normal
170                    55            Normal
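Questions 18 and 19 can be sanity-checked with one small KNN sketch (illustrative names, Euclidean distance, plain majority vote; not part of the original question set):

```python
from math import dist
from collections import Counter

def knn_classify(data, query, k):
    # sort by Euclidean distance to the query, take the k nearest,
    # and return the majority label among them
    nearest = sorted(data, key=lambda row: dist(row[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Question 18: (Height, Weight) -> Target Concept
q18 = [((150, 50), "Medium"), ((155, 55), "Medium"),
       ((160, 60), "Large"), ((161, 59), "Large"), ((158, 65), "Large")]

# Question 19: (Height, Weight) -> Target Class
q19 = [((167, 51), "Underweight"), ((182, 62), "Normal"),
       ((176, 69), "Normal"), ((173, 64), "Normal"),
       ((172, 65), "Normal"), ((174, 56), "Underweight"),
       ((169, 58), "Normal"), ((173, 57), "Normal"),
       ((170, 55), "Normal")]

print(knn_classify(q18, (157, 54), k=3))  # Large
print(knn_classify(q19, (170, 57), k=5))  # Normal
```

For (157, 54) the three nearest points are (155, 55), (161, 59), (160, 60), so two of the three votes are Large.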
20. Construct a decision tree for the following data to decide whether a tree is Oak or Pine, using
the ID3 Algorithm.
Density   Grain   Hardness   Class
Heavy     Small   Hard       Oak
Heavy     Large   Hard       Oak
Heavy     Small   Hard       Oak
Light     Large   Soft       Oak
Light     Large   Hard       Pine
Heavy     Small   Soft       Pine
Heavy     Large   Soft       Pine
Heavy     Small   Soft       Pine
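For Question 20, a compact recursive ID3 sketch (illustrative helper names, not part of the original question set) builds the whole tree from this table. On this data, Hardness has the highest information gain at the root, and Density perfectly separates each branch:

```python
from math import log2
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def best_attribute(rows, attrs):
    labels = [r["Class"] for r in rows]
    def gain(a):
        g = entropy(labels)
        for v in set(r[a] for r in rows):
            sub = [r["Class"] for r in rows if r[a] == v]
            g -= len(sub) / len(rows) * entropy(sub)
        return g
    return max(attrs, key=gain)   # attribute with the highest information gain

def id3(rows, attrs):
    labels = [r["Class"] for r in rows]
    if len(set(labels)) == 1:     # pure node -> leaf
        return labels[0]
    if not attrs:                 # no attributes left -> majority-class leaf
        return Counter(labels).most_common(1)[0][0]
    a = best_attribute(rows, attrs)
    rest = [x for x in attrs if x != a]
    return {a: {v: id3([r for r in rows if r[a] == v], rest)
                for v in set(r[a] for r in rows)}}

cols = ("Density", "Grain", "Hardness", "Class")
data = [dict(zip(cols, row)) for row in [
    ("Heavy", "Small", "Hard", "Oak"),  ("Heavy", "Large", "Hard", "Oak"),
    ("Heavy", "Small", "Hard", "Oak"),  ("Light", "Large", "Soft", "Oak"),
    ("Light", "Large", "Hard", "Pine"), ("Heavy", "Small", "Soft", "Pine"),
    ("Heavy", "Large", "Soft", "Pine"), ("Heavy", "Small", "Soft", "Pine")]]

tree = id3(data, ["Density", "Grain", "Hardness"])
print(tree)   # root splits on Hardness, then Density on each branch
```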
21. Apply a simple linear regression model to the following dataset to predict the glucose level for
a new person whose age is 55.
Example Number   Age (X)   Glucose Level (Y)
1                43        99
2                21        65
3                25        79
4                42        75
5                57        87
6                59        81
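Question 21's expected prediction can be checked with a few lines of Python using the closed-form least-squares formulas (a sketch, not part of the original question set):

```python
xs = [43, 21, 25, 42, 57, 59]   # Age (X)
ys = [99, 65, 79, 75, 87, 81]   # Glucose Level (Y)
n = len(xs)

# least-squares estimates for y = a + b*x
b = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) \
    / (n * sum(x * x for x in xs) - sum(xs) ** 2)
a = sum(ys) / n - b * sum(xs) / n

print(round(b, 4), round(a, 4))   # slope ≈ 0.3852, intercept ≈ 65.1416
print(round(a + b * 55, 2))       # predicted glucose level at age 55 ≈ 86.33
```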
Unit 3
Short answer Questions:
1. Define Clustering.
2. Define hard and soft clustering.
3. What is hierarchical clustering?
4. Describe Spectral Clustering.
5. What is dimensionality reduction?
6. Define Linear Discriminant Analysis.
7. Define Principal Component analysis.
8. Explain Distance metrics used in clustering algorithms.
9. Discuss Topic modelling.
10. Define Latent variable models.
Long answer questions:
1. Explain the K-Means Clustering algorithm.
2. Solve the K-Means Clustering algorithm with K=2 for the given data points below:
{(185,72),(170,56),(168,60),(179,68),(182,72),(188,77),(180,71),(180,70),
(183,84),(180,88),(180,67),(177,76)}, where initial centroids are {(185,72),(170,56)}.
3. Apply K-Means Clustering Algorithm for the following dataset and find the three cluster
centers after the second iteration and draw the final three clusters in a 2D Plane. Assume Initial
cluster centers are C1(2, 10), C2(5, 8), C3(1,2).
(Assume Number of Clusters K=3)
Point Label   X   Y
A1            2   10
A2            2   5
A3            8   4
A4            5   8
A5            7   5
A6            6   4
A7            1   2
A8            4   9
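Questions 2 and 3 above can both be checked with one short K-means sketch (Lloyd's algorithm with fixed iterations; illustrative names, not part of the original question set):

```python
from math import dist

def kmeans(points, centroids, iters):
    # Lloyd's algorithm for a fixed number of iterations
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            # assign each point to its nearest centroid
            i = min(range(len(centroids)), key=lambda j: dist(p, centroids[j]))
            clusters[i].append(p)
        # recompute centroids as cluster means (assumes no cluster goes empty)
        centroids = [tuple(sum(v) / len(v) for v in zip(*cl)) for cl in clusters]
    return centroids

# Question 2 (K=2): extra iterations past convergence change nothing
q2_pts = [(185, 72), (170, 56), (168, 60), (179, 68), (182, 72), (188, 77),
          (180, 71), (180, 70), (183, 84), (180, 88), (180, 67), (177, 76)]
q2_centers = kmeans(q2_pts, [(185, 72), (170, 56)], iters=10)
print(q2_centers)   # [(181.4, 74.5), (169.0, 58.0)]

# Question 3 (K=3): cluster centers after exactly two iterations
q3_pts = [(2, 10), (2, 5), (8, 4), (5, 8), (7, 5), (6, 4), (1, 2), (4, 9)]
q3_centers = kmeans(q3_pts, [(2, 10), (5, 8), (1, 2)], iters=2)
print(q3_centers)   # [(3.0, 9.5), (6.5, 5.25), (1.5, 3.5)]
```

For Question 3, after two iterations the clusters are {A1, A8}, {A3, A4, A5, A6} and {A2, A7}.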
4. Illustrate Agglomerative Hierarchical Clustering algorithm with suitable example.
5. Differentiate between Agglomerative and Divisive Hierarchical clustering algorithm.
6. Explain Gaussian mixture model with Expectation-Maximization algorithm.
7. Describe Spectral Clustering with an example.
8. Differentiate between classification and clustering.
9. Illustrate Dirichlet Process Mixture Model with an example
10. Apply Agglomerative Hierarchical Clustering to the following dataset to find the optimal
clusters using the single-link technique, and draw the dendrogram of the clusters.
Example Number   X      Y
P1               0.40   0.53
P2               0.22   0.38
P3               0.35   0.32
P4               0.26   0.19
P5               0.08   0.41
P6               0.45   0.30
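A brute-force single-link sketch (illustrative names, not part of the original question set) prints the merge order and distances for Question 10, which is exactly the information needed to draw the dendrogram:

```python
from math import dist

points = {"P1": (0.40, 0.53), "P2": (0.22, 0.38), "P3": (0.35, 0.32),
          "P4": (0.26, 0.19), "P5": (0.08, 0.41), "P6": (0.45, 0.30)}

def cluster_dist(a, b):
    # single link: distance between the closest pair of points across clusters
    return min(dist(points[p], points[q]) for p in a for q in b)

def single_link(names):
    clusters = [frozenset([name]) for name in names]
    merges = []
    while len(clusters) > 1:
        # merge the two closest clusters at each step
        a, b = min(((a, b) for a in clusters for b in clusters if a != b),
                   key=lambda ab: cluster_dist(*ab))
        merges.append((sorted(a | b), round(cluster_dist(a, b), 4)))
        clusters = [c for c in clusters if c not in (a, b)] + [a | b]
    return merges

merges = single_link(points)
for step in merges:
    print(step)   # first merge: (['P3', 'P6'], 0.102)
```

The closest pair is P3 and P6 (distance ≈ 0.102), so they merge first; the final step joins all six points.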
11. Define the Curse of Dimensionality. Consider the following dataset and compute the Principal
Component using the Principal Component Analysis (PCA) algorithm.
X Y
2 1
3 5
4 3
5 6
6 7
7 8
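For Question 11, the first principal component of a 2-D dataset can be found by hand from the 2x2 covariance matrix; the sketch below (not part of the original question set) does the same arithmetic with the quadratic formula for the eigenvalues:

```python
from math import sqrt

xs = [2, 3, 4, 5, 6, 7]
ys = [1, 5, 3, 6, 7, 8]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

# sample covariance matrix [[sxx, sxy], [sxy, syy]]
sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
syy = sum((y - my) ** 2 for y in ys) / (n - 1)
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)

# largest eigenvalue of the 2x2 covariance matrix (quadratic formula)
tr, det = sxx + syy, sxx * syy - sxy ** 2
lam = (tr + sqrt(tr ** 2 - 4 * det)) / 2

# corresponding unit eigenvector = first principal component
vx, vy = sxy, lam - sxx          # satisfies (sxx - lam)*vx + sxy*vy = 0
norm = sqrt(vx ** 2 + vy ** 2)
print(round(lam, 4), (round(vx / norm, 4), round(vy / norm, 4)))
```

With these numbers the covariance matrix is [[3.5, 4.4], [4.4, 6.8]], the largest eigenvalue is about 9.849, and the first principal component points along roughly (0.570, 0.822).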
12. Explain the following Dimensionality Reduction techniques.
i) PCA ii) Linear Discriminant Analysis iii) Singular Value Decomposition (SVD)
13. What is a latent variable? Explain latent variable models.
14. What is Topic modeling? Explain working of Latent Dirichlet Allocation method with an
example.
15. Explain unsupervised learning with suitable examples.
Unit 4
Short answer Questions:
1. Define Hidden Markov Model.
2. What are Bayesian networks?
3. Describe Markov random fields.
4. What is text classification?
5. Write the assumptions of the Naïve Bayes classifier.
6. What is a joint probability distribution?
7. Compare the Markov model and HMM.
8. List different types of graphical models.
9. Write the limitations of the Markov model.
10. Define state, transition and emission probabilities in the Hidden Markov Model.
Long answer questions:
1. Describe Bayesian Networks (or) Bayesian Belief Networks.
2. What is Conditional Independence? Why do we need conditional independence?
3. Explain about Hidden Markov Model with a neat sketch and write the assumptions of Hidden
Markov Model.
4. Compare Markov Model and Hidden Markov Model.
5. Explain Local Markov Property and Global Markov Property.
6. What is the significance of the Markov model, and what are its limitations?
7. Illustrate the state, transition and emission probabilities in Hidden Markov Model with example.
8. How can we learn the values of the Hidden Markov Model (HMM) parameters A and B given
some data?
9. Explain the Naive Bayes Classifier algorithm with an example.
10. What is Markov Random Fields Learning? Explain with example.
11. Explain different types of graphical models.
12. Apply the Naïve Bayes Classifier algorithm to the following dataset to classify the new data
instance (Color=Red, Type=SUV, Origin=Domestic).
Color    Type     Origin     Stolen Vehicle
Red      Sports   Domestic   YES
Red      Sports   Domestic   NO
Red      Sports   Domestic   YES
Yellow   Sports   Domestic   NO
Yellow   Sports   Imported   YES
Yellow   SUV      Imported   NO
Yellow   SUV      Imported   YES
Yellow   SUV      Domestic   NO
Red      SUV      Imported   NO
Red      Sports   Imported   YES
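The expected answer to Question 12 can be verified with a minimal Naïve Bayes sketch (empirical frequencies, no smoothing; illustrative names, not part of the original question set):

```python
data = [("Red", "Sports", "Domestic", "YES"), ("Red", "Sports", "Domestic", "NO"),
        ("Red", "Sports", "Domestic", "YES"), ("Yellow", "Sports", "Domestic", "NO"),
        ("Yellow", "Sports", "Imported", "YES"), ("Yellow", "SUV", "Imported", "NO"),
        ("Yellow", "SUV", "Imported", "YES"), ("Yellow", "SUV", "Domestic", "NO"),
        ("Red", "SUV", "Imported", "NO"), ("Red", "Sports", "Imported", "YES")]

def nb_score(query, cls):
    rows = [r for r in data if r[-1] == cls]
    score = len(rows) / len(data)            # prior P(class)
    for i, value in enumerate(query):        # likelihoods P(feature | class)
        score *= sum(r[i] == value for r in rows) / len(rows)
    return score

query = ("Red", "SUV", "Domestic")
scores = {c: nb_score(query, c) for c in ("YES", "NO")}
print(scores)                        # ≈ {YES: 0.024, NO: 0.072}
print(max(scores, key=scores.get))   # NO -> the car is classified as not stolen
```

The unnormalized scores are P(YES)·P(Red|YES)·P(SUV|YES)·P(Domestic|YES) = 0.5·0.6·0.2·0.4 = 0.024 versus 0.5·0.4·0.6·0.6 = 0.072 for NO, so the instance is classified NO.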
13. Apply the Naïve Bayes Classifier algorithm to the following dataset and classify the new data
instance (Outlook=Sunny, Temperature=Cool, Humidity=High, Wind=Strong).
14. Illustrate the Naïve Bayes algorithm for learning and classifying text, and apply the Naïve Bayes
Classifier algorithm to the following dataset to classify the new data instance ("I hated the poor
acting").
Document ID   Text in the Document        Class
1             I loved the movie           YES
2             I hated the movie           NO
3             a great movie good movie    YES
4             Poor acting                 NO
5             great acting a good movie   YES
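The text-classification answer for Question 14 can be checked with a multinomial Naïve Bayes sketch using Laplace (add-one) smoothing, the usual treatment for this dataset (illustrative names, lower-cased text, not part of the original question set):

```python
from collections import Counter

docs = [("i loved the movie", "YES"),
        ("i hated the movie", "NO"),
        ("a great movie good movie", "YES"),
        ("poor acting", "NO"),                 # text lower-cased for matching
        ("great acting a good movie", "YES")]

vocab = {w for text, _ in docs for w in text.split()}   # |V| = 10

def nb_score(words, cls):
    class_texts = [t for t, c in docs if c == cls]
    counts = Counter(w for t in class_texts for w in t.split())
    total = sum(counts.values())
    score = len(class_texts) / len(docs)       # prior P(class)
    for w in words:
        # multinomial likelihood with Laplace (add-one) smoothing
        score *= (counts[w] + 1) / (total + len(vocab))
    return score

query = "i hated the poor acting".split()
scores = {c: nb_score(query, c) for c in ("YES", "NO")}
print(max(scores, key=scores.get))   # NO -> the review is classified negative
```

Every query word occurs once in the NO documents, giving (2/16)^5 · (2/5) for NO, which beats the YES score, so the instance is classified NO.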
15. Consider the following full joint probability distribution for the Toothache, Cavity, Catch world:

           toothache              ∼toothache
           catch     ∼catch       catch     ∼catch
cavity     0.108     0.012        0.072     0.008
∼cavity    0.016     0.064        0.144     0.576
16. Consider the following Bayesian belief network and calculate the probability that the alarm has
sounded, there was neither a burglary nor an earthquake, and David and Sophia both called
Harry.
Unit 5
Short answer Questions:
1. Define advanced learning.
2. What is Representation learning?
3. Explain Ensemble Learning methods.
4. Compare boosting and bagging.
5. Define deep learning.
6. What is reinforcement learning?
7. Define neural network.
8. Define cost function.
9. Compare CNN and RNN.
10. Define active learning.
Long answer questions:
1. Explain Reinforcement Learning with suitable example.
2. Explain about Representation learning.
3. Discuss how a multi-layer neural network learns using a Back Propagation algorithm.
4. Explain Ensemble Learning methods and benefits of Ensemble Learning.
5. Differentiate between bagging and boosting.
6. Explain Bootstrap Aggregation with an example.
7. What are Gradient Boosting Machines? Explain with an example.
8. What is deep learning? Explain the different types of deep learning techniques.
9. Explain about types of Neural Networks.
10. What is Active learning? Describe the different types of active learning algorithms.
11. Explain how reinforcement learning methods are related to dynamic programming.
12. Explain Temporal Difference learning in Reinforcement Learning.
13. What is the need of advanced learning?
14. Explain Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN).
15. Explain simple Perceptron with suitable example.
16. Explain the steps in Back Propagation algorithm.
17. The neural network given below takes two binary-valued inputs X1, X2 ∈ {0, 1}, and the
activation function is the binary threshold function (h(x)=1 if x>0; 0 otherwise). Which of the
following logical functions does it compute?
18. Design neural networks for the XOR, AND, and NAND gates.
19. Can you construct a neural network implementing the 2-input XOR operation using a perceptron?
Justify your answer.
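Questions 18 and 19 can be illustrated with threshold perceptrons in a few lines of Python (one workable choice of weights among many; not part of the original question set). XOR is not linearly separable, so no single perceptron computes it, but a two-layer combination does:

```python
def step(x):
    # binary threshold activation: 1 if x > 0, else 0
    return 1 if x > 0 else 0

def perceptron(weights, bias):
    return lambda *xs: step(sum(w * x for w, x in zip(weights, xs)) + bias)

# one workable choice of weights for the linearly separable gates
AND = perceptron((1, 1), -1.5)
OR = perceptron((1, 1), -0.5)
NAND = perceptron((-1, -1), 1.5)

# XOR needs a hidden layer: XOR(x1, x2) = AND(OR(x1, x2), NAND(x1, x2))
def XOR(x1, x2):
    return AND(OR(x1, x2), NAND(x1, x2))

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, XOR(x1, x2))   # truth table: 0, 1, 1, 0
```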
20. Differentiate between supervised learning, unsupervised learning and reinforcement learning.
21. Explain Gradient Descent algorithm.
22. Explain multi-layer Neural Networks.
23. Explain the advantages and disadvantages of Reinforcement Learning.