ML Question Bank for Semester End Examination [Final]

The document is a question bank for the Semester End Examination (SEE) for the Machine Learning course in the B. Tech program under the KGR21 regulation for the academic year 2024-25. It includes short answer and long answer questions categorized by Bloom's Taxonomy levels and course outcomes, covering various topics such as decision trees, neural networks, Bayesian learning, genetic algorithms, and analytical learning. The course coordinator is Mr. CH Ranga Swami, Assistant Professor.


Question Bank for Semester End Examination (SEE)

Year/Sem: III/I
Program: B. Tech
Regulation: KGR21
Name of the Course: Machine Learning
Course Code: KG21CM502
Branch: CSE(AI&ML)
Academic Year: 2024-25
Name & Details of the Course Coordinator: Mr. CH Ranga Swami, Assistant Professor

K1-Remembering; K2-Understanding; K3-Applying; K4-Analyzing; K5-Evaluating; K6-Creating

Short Answer Questions

Each Question Carries two Marks:

Q. No. | Question | Course Outcome | Bloom’s Taxonomy Level | Marks | Unit No.
1 | Define a well-posed learning problem with an example. | CO1 | K1 | 2 Marks | Unit-I
2 | State the concept of inductive bias in machine learning. | CO1 | K1 | 2 Marks | Unit-I
3 | Explain the Find-S algorithm with a simple example. | CO1 | K1 | 2 Marks | Unit-I
4 | Find the maximally specific hypothesis using the Find-S algorithm for a given set of training data. | CO1 | K1 | 2 Marks | Unit-I
5 | Define the version space in concept learning. | CO1 | K1 | 2 Marks | Unit-I
6 | Define a neural network. | CO2 | K1 | 2 Marks | Unit-II
7 | State the role of perceptrons in neural networks. | CO2 | K1 | 2 Marks | Unit-II
8 | Explain the significance of the back-propagation algorithm in training neural networks. | CO2 | K1 | 2 Marks | Unit-II
9 | Identify appropriate problems for neural network learning. | CO2 | K1 | 2 Marks | Unit-II
10 | Define the concept of hypothesis evaluation in machine learning. | CO2 | K1 | 2 Marks | Unit-II
11 | Define Bayesian learning. | CO3 | K1 | 2 Marks | Unit-III
12 | State Bayes’ theorem and its application in machine learning. | CO3 | K1 | 2 Marks | Unit-III
13 | Explain the concept of maximum likelihood hypothesis. | CO3 | K1 | 2 Marks | Unit-III
Mid-II
14 | Identify the role of the Naïve Bayes classifier in text classification tasks. | CO3 | K1 | 2 Marks | Unit-III
15 | Define the EM algorithm and its significance in machine learning. | CO3 | K1 | 2 Marks | Unit-III
16 | Define genetic algorithms and their application in machine learning. | CO4 | K1 | 2 Marks | Unit-IV
17 | State the key principles behind reinforcement learning. | CO4 | K1 | 2 Marks | Unit-IV
18 | Explain the process of hypothesis space search in genetic algorithms. | CO4 | K1 | 2 Marks | Unit-IV
19 | Identify the role of Q-learning in reinforcement learning tasks. | CO4 | K1 | 2 Marks | Unit-IV
20 | Define the concept of non-deterministic rewards and actions in reinforcement learning. | CO4 | K1 | 2 Marks | Unit-IV
21 | Define analytical learning and its role in machine learning. | CO5 | K1 | 2 Marks | Unit-V
22 | State the significance of using prior knowledge in analytical learning. | CO5 | K1 | 2 Marks | Unit-V
23 | Explain the PROLOG-EBG approach in explanation-based learning. | CO5 | K1 | 2 Marks | Unit-V
24 | Identify the challenges in combining inductive and analytical learning methods. | CO5 | K1 | 2 Marks | Unit-V
25 | Define the concept of explanation-based generalization (EBG) in machine learning. | CO5 | K1 | 2 Marks | Unit-V
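
For Questions 3 and 4 above, the following is a minimal Python sketch of the Find-S algorithm. The EnjoySport-style attribute names and the toy training data are illustrative assumptions, not part of the question bank.

```python
# Minimal sketch of the Find-S algorithm, assuming each training example is
# an (attribute_tuple, label) pair with label True for positive examples.
def find_s(examples):
    """Return the maximally specific hypothesis consistent with the positives."""
    hypothesis = None
    for attributes, is_positive in examples:
        if not is_positive:
            continue  # Find-S ignores negative examples
        if hypothesis is None:
            hypothesis = list(attributes)      # first positive example: copy as-is
        else:
            for i, value in enumerate(attributes):
                if hypothesis[i] != value:
                    hypothesis[i] = "?"        # generalise mismatched attributes
    return hypothesis

# Illustrative EnjoySport-style dataset (assumed for the example)
data = [
    (("Sunny", "Warm", "Normal", "Strong"), True),
    (("Sunny", "Warm", "High",   "Strong"), True),
    (("Rainy", "Cold", "High",   "Strong"), False),
]
print(find_s(data))  # ['Sunny', 'Warm', '?', 'Strong']
```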

Long Answer Questions


Each Question Carries ten Marks:

Q. No. | Question | Course Outcome | Bloom’s Taxonomy Level | Marks | Unit No.
1 | State and explain the general-to-specific ordering in concept learning with examples. | CO2 | K5 | 10 Marks | Unit-I
2 | Using the Candidate Elimination algorithm, find the version space for a given set of hypotheses. | CO2 | K4 | 10 Marks | Unit-I
3 | Derive an expression for the inductive bias and explain its significance in decision tree learning. | CO2 | K4 | 10 Marks | Unit-I
4 | Using an example, illustrate the process of constructing a decision tree from training data. | CO2 | K4 | 10 Marks | Unit-I
5 | Draw the decision tree for a given dataset and discuss the issues related to decision tree learning. | CO2 | K5 | 10 Marks | Unit-I
6 | Explain the architecture of a multilayer neural network and the process of back-propagation. | CO3 | K5 | 10 Marks | Unit-II
7 | Derive the equations used in the back-propagation algorithm for updating the weights in a neural network. | CO3 | K4 | 10 Marks | Unit-II
8 | Analyze the challenges faced by neural networks in learning from complex datasets. | CO3 | K4 | 10 Marks | Unit-II
9 | Compare and contrast the performance of different neural network architectures in solving real-world problems. | CO3 | K4 | 10 Marks | Unit-II
10 | Evaluate the effectiveness of hypothesis evaluation techniques in improving the accuracy of machine learning models. | CO3 | K5 | 10 Marks | Unit-II
11 | Explain the process of Bayesian learning with the help of an example. | CO4 | K5 | 10 Marks | Unit-III
12 | Derive the maximum likelihood hypothesis for predicting probabilities in machine learning. | CO4 | K4 | 10 Marks | Unit-III
13 | Analyze the role of Bayesian belief networks in capturing uncertainty in machine learning models. | CO4 | K4 | 10 Marks | Unit-III
Mid-II
14 | Compare the performance of the Naïve Bayes classifier with other classification algorithms in text classification. | CO4 | K4 | 10 Marks | Unit-III
15 | Evaluate the effectiveness of the EM algorithm in handling missing data in machine learning. | CO4 | K5 | 10 Marks | Unit-III
16 | Explain the working of genetic algorithms with an example. | CO3 | K5 | 10 Marks | Unit-IV
17 | Derive the fitness function used in a genetic algorithm for solving optimization problems. | CO4 | K4 | 10 Marks | Unit-IV
18 | Analyze the relationship between genetic algorithms and evolutionary learning models. | CO4 | K4 | 10 Marks | Unit-IV
19 | Compare the effectiveness of rule-based learning with genetic algorithms in complex problem-solving scenarios. | CO4 | K4 | 10 Marks | Unit-IV
20 | Evaluate the impact of reinforcement learning in improving the performance of intelligent agents. | CO4 | K5 | 10 Marks | Unit-IV
21 | Derive the equation for explanation-based generalization in analytical learning. | CO5 | K4 | 10 Marks | Unit-V
22 | Analyze the challenges and benefits of combining inductive and analytical learning methods. | CO5 | K4 | 10 Marks | Unit-V
23 | Compare the effectiveness of analytical learning and inductive learning in different machine learning tasks. | CO5 | K4 | 10 Marks | Unit-V
24 | Evaluate the role of prior knowledge in enhancing the performance of machine learning models through analytical learning. | CO5 | K5 | 10 Marks | Unit-V
25 | Explain the process of analytical learning with an example. | CO5 | K5 | 10 Marks | Unit-V
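
For Question 7 above, the sketch below shows one gradient-descent weight update following the standard back-propagation equations for a single-hidden-layer network with sigmoid units and squared-error loss. The layer sizes, learning rate, and toy data are assumptions made purely for illustration.

```python
# One back-propagation weight update for a small sigmoid network (illustrative sketch).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = np.array([0.5, -0.2])            # single training input (assumed)
t = np.array([1.0])                  # target output (assumed)
W1 = rng.normal(size=(3, 2))         # hidden-layer weights (3 hidden units)
W2 = rng.normal(size=(1, 3))         # output-layer weights
eta = 0.1                            # learning rate (assumed)

# Forward pass
h = sigmoid(W1 @ x)                  # hidden activations
y = sigmoid(W2 @ h)                  # network output

# Backward pass: error terms from the standard derivation for squared error
delta_out = (y - t) * y * (1 - y)                # output-layer error term
delta_hidden = (W2.T @ delta_out) * h * (1 - h)  # hidden-layer error term

# Gradient-descent updates: each weight moves opposite to its error gradient
W2 -= eta * np.outer(delta_out, h)
W1 -= eta * np.outer(delta_hidden, x)
```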
