DEPARTMENT OF INFORMATION TECHNOLOGY
COURSE PLAN-THEORY
Course Code AL3451
Course Name MACHINE LEARNING
Regulation 2021
Name of the Course Instructor(s) [Link]
Name of the Course Coordinator
Academic Year: 2024-25 (EVEN)
Branch / Year / Semester IT/II/IV
Date of Commencement of Class 21.02.2025
Date of Completion of Class
Revision No 1
Prepared By: [Link], AP/IT
Approved By: HoD/IT
SYLLABUS
COURSE CODE | COURSE NAME | L | T | P | C
AL3451 | MACHINE LEARNING | 3 | 0 | 0 | 3
COURSE OBJECTIVES:
● To understand the basic concepts of machine learning.
● To understand and build supervised learning models.
● To understand and build unsupervised learning models.
● To evaluate the algorithms based on the corresponding metrics identified.
UNIT I INTRODUCTION TO MACHINE LEARNING 8
Review of Linear Algebra for machine learning; Introduction and motivation for machine learning; Examples of machine
learning applications, Vapnik-Chervonenkis (VC) dimension, Probably Approximately Correct (PAC) learning, Hypothesis
spaces, Inductive bias, Generalization, Bias variance trade-off.
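The bias-variance trade-off listed above lends itself to a small numerical illustration. The sketch below is an assumed example (the sine target, polynomial degrees, noise level, and sample sizes are choices made here, not part of the prescribed syllabus): it refits polynomials of low and higher degree to many resampled noisy training sets and estimates the bias² and variance of the predictions at a fixed test point.

```python
# Illustrative sketch: estimating bias^2 and variance of polynomial regression
# estimators of different complexity (target function and settings are assumed).
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sin(2 * np.pi * x)          # assumed "true" function
x_test, n_train, n_repeats, noise = 0.3, 30, 500, 0.2

for degree in (1, 7):                        # low vs. higher model complexity
    preds = []
    for _ in range(n_repeats):
        x = rng.uniform(0.0, 1.0, n_train)   # fresh training set each repeat
        t = f(x) + rng.normal(0.0, noise, n_train)
        coeffs = np.polyfit(x, t, degree)    # least-squares polynomial fit
        preds.append(np.polyval(coeffs, x_test))
    preds = np.array(preds)
    bias2 = (preds.mean() - f(x_test)) ** 2
    variance = preds.var()
    print(f"degree={degree}: bias^2={bias2:.4f}, variance={variance:.4f}")
```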
UNIT II SUPERVISED LEARNING 11
Linear Regression Models: Least squares, single & multiple variables, Bayesian linear regression, gradient descent, Linear
Classification Models: Discriminant function – Perceptron algorithm, Probabilistic discriminative model - Logistic
regression, Probabilistic generative model – Naive Bayes, Maximum margin classifier – Support vector machine, Decision
Tree, Random Forests
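As a quick preview of the regression topics in this unit, the following sketch (synthetic data, learning rate, and iteration count are illustrative assumptions) solves a multiple-variable least-squares problem both in closed form via the normal equation and by batch gradient descent, so the two solutions can be compared.

```python
# Illustrative sketch: least-squares linear regression with several variables on
# synthetic data, solved by the normal equation and by batch gradient descent.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                        # 200 samples, 3 features
true_w, true_b = np.array([2.0, -1.0, 0.5]), 4.0
y = X @ true_w + true_b + rng.normal(0.0, 0.1, 200)  # noisy targets

Xb = np.hstack([X, np.ones((200, 1))])               # append a bias column

# Closed form (normal equation): w = (Xb^T Xb)^{-1} Xb^T y
w_closed = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)

# Batch gradient descent on the mean-squared-error loss
w_gd, lr = np.zeros(4), 0.1
for _ in range(2000):
    grad = (2.0 / len(y)) * Xb.T @ (Xb @ w_gd - y)
    w_gd -= lr * grad

print("normal equation :", np.round(w_closed, 3))
print("gradient descent:", np.round(w_gd, 3))
```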
UNIT III ENSEMBLE TECHNIQUES AND UNSUPERVISED LEARNING 9
Combining multiple learners: Model combination schemes, Voting, Ensemble Learning - bagging, boosting, stacking,
Unsupervised learning: K-means, Instance Based Learning: KNN, Gaussian mixture models and Expectation maximization.
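For the unsupervised-learning portion of this unit, a minimal K-means sketch is given below; the toy Gaussian blobs, the value of K, and the stopping rule are assumptions made for illustration only.

```python
# Illustrative sketch: Lloyd's algorithm for K-means on toy 2-D data
# (three Gaussian blobs, K = 3, and the stopping rule are assumptions).
import numpy as np

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(100, 2))
               for c in ([0, 0], [4, 4], [0, 5])])

k = 3
centroids = X[rng.choice(len(X), size=k, replace=False)]   # random initial centroids
for _ in range(100):
    # Assignment step: each point joins its nearest centroid (Euclidean distance)
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # Update step: each centroid moves to the mean of its assigned points
    new_centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                              else centroids[j] for j in range(k)])
    if np.allclose(new_centroids, centroids):
        break
    centroids = new_centroids

print("final centroids:\n", np.round(centroids, 2))
```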
UNIT IV NEURAL NETWORKS 9
Multilayer perceptron, activation functions, network training – gradient descent optimization – stochastic gradient descent,
error backpropagation, from shallow networks to deep networks –Unit saturation (aka the vanishing gradient problem) –
ReLU, hyperparameter tuning, batch normalization, regularization, dropout.
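The training loop covered in this unit (forward pass, ReLU activation, error backpropagation, gradient-descent updates) can be previewed with a tiny NumPy multilayer perceptron; the XOR task, layer sizes, learning rate, and iteration count below are illustrative assumptions, not prescribed material.

```python
# Illustrative sketch: a 2-8-1 multilayer perceptron with ReLU hidden units,
# trained on the XOR problem by full-batch gradient descent with manual
# backpropagation (all sizes and hyperparameters are assumed for the demo).
import numpy as np

rng = np.random.default_rng(3)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(0.0, 1.0, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0.0, 1.0, (8, 1)), np.zeros(1)
lr = 0.3

for _ in range(5000):
    # Forward pass
    h = np.maximum(0.0, X @ W1 + b1)              # ReLU hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))      # sigmoid output
    # Backward pass (gradients of the binary cross-entropy loss)
    dz2 = (p - y) / len(X)
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * (h > 0)                  # ReLU derivative mask
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)
    # Gradient-descent parameter update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("XOR predictions:", np.round(p.ravel(), 2))  # expected to approach 0, 1, 1, 0
```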
UNIT V DESIGN AND ANALYSIS OF MACHINE LEARNING EXPERIMENTS 8
Guidelines for machine learning experiments, Cross Validation (CV) and resampling – K-fold CV, bootstrapping, measuring
classifier performance, assessing a single classification algorithm and comparing two classification algorithms – t test,
McNemar’s test, K-fold CV paired t test
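The K-fold cross-validation paired t test from this unit can be sketched without any ML library. In the example below, two deliberately simple classifiers (a nearest-centroid rule and a 1-nearest-neighbour rule, chosen here only for illustration) are compared on synthetic data, and the paired t statistic over per-fold accuracy differences is computed by hand.

```python
# Illustrative sketch: K-fold cross-validation paired t test comparing two toy
# classifiers (nearest-centroid vs. 1-nearest-neighbour) on synthetic data.
import numpy as np

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)), rng.normal(2.0, 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
order = rng.permutation(len(y))
X, y = X[order], y[order]

def nearest_centroid(Xtr, ytr, Xte):
    c0, c1 = Xtr[ytr == 0].mean(axis=0), Xtr[ytr == 1].mean(axis=0)
    return (np.linalg.norm(Xte - c1, axis=1) < np.linalg.norm(Xte - c0, axis=1)).astype(int)

def one_nn(Xtr, ytr, Xte):
    d = np.linalg.norm(Xte[:, None, :] - Xtr[None, :, :], axis=2)
    return ytr[d.argmin(axis=1)]

K = 10
folds = np.array_split(np.arange(len(y)), K)
diffs = []
for fold in folds:
    train = np.ones(len(y), dtype=bool)
    train[fold] = False
    acc_a = (nearest_centroid(X[train], y[train], X[fold]) == y[fold]).mean()
    acc_b = (one_nn(X[train], y[train], X[fold]) == y[fold]).mean()
    diffs.append(acc_a - acc_b)

d = np.array(diffs)
t_stat = d.mean() / (d.std(ddof=1) / np.sqrt(K))   # paired t statistic, K-1 dof
print(f"mean accuracy difference = {d.mean():.3f}, t = {t_stat:.2f}")
```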
TOTAL: 45 Periods
CONTENT BEYOND SYLLABUS: Cascading
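Cascading chains classifiers so that a cheap early stage handles the inputs it is confident about and passes uncertain inputs to a costlier later stage. The fragment below is a hypothetical illustration of that control flow (both stage models and the confidence threshold are made-up stand-ins), not a reference implementation.

```python
# Hypothetical illustration of a two-stage cascade: a cheap first stage labels the
# inputs it is confident about; only low-confidence inputs reach the second stage.
import numpy as np

def stage1_scores(X):
    # Stand-in cheap scorer: maps each input to a confidence-like score in (0, 1)
    return 1.0 / (1.0 + np.exp(-X.sum(axis=1)))

def stage2_predict(X):
    # Stand-in for a costlier model (e.g. an SVM or a deep network)
    return (X.sum(axis=1) + 0.1 * X.prod(axis=1) > 0).astype(int)

def cascade_predict(X, threshold=0.8):
    scores = stage1_scores(X)
    preds = (scores >= 0.5).astype(int)
    uncertain = np.abs(scores - 0.5) < (threshold - 0.5)   # low-confidence cases
    preds[uncertain] = stage2_predict(X[uncertain])        # defer to stage 2
    return preds

X = np.random.default_rng(5).normal(size=(10, 2))
print(cascade_predict(X))
```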
COURSE OUTCOMES:
At the end of this course, the students will be able to:
CO1 : Explain the basic concepts of machine learning.
CO2 : Construct supervised learning models.
CO3 : Construct unsupervised learning algorithms.
CO4 : Evaluate and compare different models.
TEXT BOOKS:
T1. Ethem Alpaydin, “Introduction to Machine Learning”, Fourth Edition, MIT Press, 2020.
T2. Stephen Marsland, “Machine Learning: An Algorithmic Perspective”, Second Edition, CRC Press, 2014.
REFERENCE BOOKS/LINKS:
R1. Christopher M. Bishop, “Pattern Recognition and Machine Learning”, Springer, 2006.
R2. Tom Mitchell, “Machine Learning”, McGraw Hill, 1997.
R3. Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar, “Foundations of Machine Learning”, Second Edition, MIT Press, 2018.
R4. Ian Goodfellow, Yoshua Bengio, Aaron Courville, “Deep Learning”, MIT Press, 2016.
R5. Sebastian Raschka, Vahid Mirjalili, “Python Machine Learning”, Third Edition, Packt Publishing, 2019.
PLAN OF DELIVERY
Sl. No | Topic Covered | Ref. Book Code | Page No | Teaching Hours | Cumulative Hours | Teaching Aid | Teaching Methodology
UNIT I – INTRODUCTION TO MACHINE LEARNING 8
1 | Review of Linear Algebra for Machine Learning | T1 | 9 | 1 | 1 | BB | -
2 | Introduction and Motivation for Machine Learning | T2 | 4 | 1 | 2 | BB | -
3 | Examples of Machine Learning applications | T2 | 5 | 1 | 3 | BB | -
4 | Vapnik-Chervonenkis (VC) dimension | T1 | 27 | 1 | 4 | BB | -
5 | Probably Approximately Correct (PAC) learning | T1 | 29 | 1 | 5 | BB | -
6 | Hypothesis spaces - Inductive bias | T1 | 34 | 1 | 6 | BB | -
7 | Generalization | T1 | 37 | 1 | 7 | BB | -
8 | Bias variance trade-off | T1 | 39 | 1 | 8 | BB | -
UNIT II – SUPERVISED LEARNING 11
9 | Linear Regression Models: Least squares - single & multiple variables | T1 | 61, 85 | 1 | 9 | BB | -
10 | Bayesian linear regression | T1 | 173 | 2 | 11 | BB | -
11 | Gradient descent | T1 | 201 | 1 | 12 | BB | -
12 | Linear Classification Models: Discriminant function | T1, T2 | 216, 130 | 2 | 14 | BB | -
13 | Perceptron algorithm | T2 | 43 | 2 | 16 | BB+PPT | Think Pair Share
14 | Probabilistic discriminative model - Logistic regression | T2 | 66 | 1 | 17 | BB | -
15 | Probabilistic generative model | T1 | 396 | 1 | 18 | BB | -
16 | Naive Bayes | T2 | 30 | 2 | 20 | BB | -
17 | Maximum margin classifier – Support vector machine | T2 | 169 | 1 | 21 | BB | -
18 | Decision Tree | T2 | 250 | 1 | 22 | BB | -
19 | Random Forests | T2 | 275 | 1 | 23 | BB | -
UNIT III – ENSEMBLE TECHNIQUES AND UNSUPERVISED LEARNING 9
20 | Combining multiple learners: Model combination schemes | T1 | 491 | 1 | 24 | BB | -
21 | Voting | T1 | 492 | 1 | 25 | BB | -
22 | Ensemble Learning - bagging | T2 | 274 | 1 | 26 | BB | -
23 | Boosting | T2 | 268 | 1 | 27 | BB | -
24 | Stacking | T1 | 504 | 1 | 28 | BB | -
25 | Unsupervised learning - K-means | T2, T1 | 281, 163 | 1 | 29 | BB | -
26 | Instance Based Learning: KNN | T1 | 190 | 1 | 30 | BB | -
27 | Gaussian mixture models | T2 | 153 | 1 | 31 | PPT+BB | -
28 | Expectation maximization | T2 | 154 | 1 | 32 | BB | -
UNIT IV – NEURAL NETWORKS 9
29 | Multilayer perceptron, activation functions | T1 | 279 | 1 | 33 | BB | -
30 | Network training - gradient descent optimization | T1 | 274, 248 | 1 | 34 | BB | -
31 | Stochastic gradient descent | T1 | 275 | 1 | 35 | BB | -
32-33 | Error backpropagation, from shallow networks to deep networks; Unit saturation (aka the vanishing gradient problem) – ReLU | T1 | 283, 287 | 1 | 36 | BB | -
34 | Hyperparameter tuning - Batch normalization | T1, Web Reference | 297 | 1 | 37 | BB | -
35 | Regularization - Dropout | T1 | 301, 309 | 1 | 38 | BB+PPT | Adaptive Learning
UNIT V – DESIGN AND ANALYSIS OF MACHINE LEARNING EXPERIMENTS 8
36 | Guidelines for machine learning experiments | T1 | 547 | 1 | 39 | BB | Flipped Classroom
37 | Cross Validation (CV) - Resampling | T1 | 560, 558 | 1 | 40 | BB, PPT+BB | -
38 | K-fold CV | T1 | 559 | 1 | 41 | PPT+BB | -
39 | Bootstrapping | T1 | 561 | 1 | 42 | BB | -
40 | Measuring classifier performance | T1 | 562 | 1 | 43 | BB | Flip Learning
41 | Assessing a single classification algorithm and comparing two classification algorithms – t test | T1 | 570 | 1 | 44 | BB | -
42-43 | McNemar’s test; K-fold CV paired t test | T1 | 573, 574 | 1 | 45 | BB | -
Content Beyond Syllabus
44 | Cascading | Web material | NA | 1 | 46 | PPT+BB | Quiz in Quick learn
ASSESSMENT PLAN
ASSESSMENT SCHEDULE - TEST

TEST | TEST NO. | PORTION FOR TEST | DATE PLANNED | DATE CONDUCTED
Class Test I | I | | |
Internal Assessment Test I | I | | |
Class Test II | II | | |
Internal Assessment Test II | II | | |

ASSESSMENT SCHEDULE - ASSIGNMENT

Assignment No | Mode | Group/Common/Individual | Date of Submission
I | Seminar | Individual |
II | Written | Individual |
ASSESSMENT PATTERN
ITEM | WEIGHTAGE | OVERALL WEIGHTAGE
Continuous Assessment-I | 40 | 40
Internal Assessment Test – I | 60 |
Continuous Assessment-II | 40 |
Internal Assessment Test – II | 60 |
End Semester Examination | | 60
Total | | 100