Machine Learning For Intelligent System

CSE Course Curriculum, GSFC University, Computer Science

BTCS705  Machine Learning for Intelligent Systems
Total Credits: 4 | Total Hours in semester: 45

1. Course Pre-requisites:
2. Course Category: Professional Courses
3. Course Revision/Approval Date:
4. Course Objectives:
   4.1 Understand the basic concepts of machine learning.
   4.2 Understand the basic skills needed to decide which learning algorithm to use for which problem.
   4.3 Be able to code up your own learning algorithm, and to evaluate and debug it.
   4.4 Understand various kernel methods and be able to create your own kernels.
   4.5 Gain a better understanding of deep learning and artificial neural networks.

Course Content (Weightage | Contact hours | Pedagogy)

Unit 1: Introduction (30% | 07 | Chalk-Duster, PPT, Notes)
Introduction to Machine Learning. Supervised learning setup: feature vectors, 0/1 loss, squared loss, absolute loss, train/test split. K-Nearest Neighbor: hypothesis classes, the Nearest Neighbor classifier, sketch of the Cover and Hart proof (that the 1-NN error converges to at most 2x the Bayes error in the large-sample limit), curse of dimensionality. Perceptron: linear classifiers, perceptron convergence proof. Estimating probabilities from data: MLE, MAP, Bayesian vs. frequentist statistics.

Unit 2: Naive Bayes and Gradient Descent (20% | 10 | Chalk-Duster, PPT, Notes)
The Naive Bayes assumption. Logistic Regression: formulation, relationship of LR with Naive Bayes. Gradient Descent: Taylor expansion, proof that GD decreases the objective at every step if the step size is small enough, Newton's method. Linear Regression.

Unit 3 ( | 10 | PPT, Notes)
Linear SVM: margin of a hyperplane classifier, derivation of a max-margin classifier. Empirical Risk Minimization: setup of loss function and regularizer; classification loss functions: hinge loss, log loss, zero-one loss, exponential loss; regression loss functions: absolute loss, squared loss, Huber loss, log-cosh; special cases: OLS, Ridge regression, Lasso, Logistic Regression. Overfitting/underfitting: k-fold cross-validation, regularization, debugging ML algorithms. Bias/variance trade-off: noise, and how error decomposes into bias, variance, and noise.

Unit 4 (20% | 10 | Chalk-Duster, PPT, Notes)
Kernels (reducing bias): RBF kernel, polynomial kernel, linear kernel; constructing new kernels; kernel SVM. Gaussian Processes and Bayesian global optimization: properties of Gaussian distributions, Gaussian Process assumptions, GPs are kernel machines. K-nearest-neighbor data structures: how to construct and use a KD-tree, how to construct and use a Ball-tree, comparison of KD-trees and Ball-trees. Decision/Regression Trees: the ID3 algorithm, Gini index, entropy splitting function, regression trees.

Unit 5 (20% | 08 | Chalk-Duster, PPT, Notes)
Bagging, bootstrapping, Random Forests. Boosting: the weak learner (what assumption is made here), reduction of bias, the AnyBoost algorithm, the gradient boosting algorithm, the AdaBoost algorithm, derivation of the weight of a weak learner. Artificial Neural Networks / Deep Learning: forward propagation, backward propagation, squashing functions, Convolutional Neural Networks, convolution/pooling operations, batch normalization, ResNets, DenseNets.

Learning Resources
1. Textbooks:
   1. Machine Learning: A Probabilistic Perspective, Kevin P. Murphy.
2. Reference Books:
   1. Hastie, Tibshirani, Friedman, The Elements of Statistical Learning.
3. Journals & Periodicals:
4. Other Electronic Resources:

Evaluation Scheme (Total Marks)
Mid Semester Marks: 20
End Semester Marks: 30
Continuous Evaluation Marks:
   Attendance: 5
   Quiz: 5
   Skill enhancement activities / case study:
   Presentation / miscellaneous activities: 5

Course Outcomes
1. Evaluate and debug your own learning algorithm.
2. Apply the learning algorithms to various types of problems.
3. Apply various decision learning algorithms to problems.
4. Be able to construct and use a KD-tree for problem solving.
5. Apply the gradient descent method and identify that GD decreases with every step if the step size is small enough.
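The Nearest Neighbor classifier listed in Unit 1 can be sketched in a few lines; the function name and toy data below are illustrative, not part of the syllabus. A 1-NN classifier simply returns the label of the training point closest to the query under Euclidean distance.

```python
import math

def nearest_neighbor_predict(train, query):
    """1-NN: return the label of the training point closest to `query`
    (squared Euclidean distance; `train` is a list of (features, label))."""
    best_label, best_dist = None, math.inf
    for features, label in train:
        d = sum((a - b) ** 2 for a, b in zip(features, query))
        if d < best_dist:
            best_dist, best_label = d, label
    return best_label

# Tiny illustrative dataset
train = [((0.0, 0.0), "neg"), ((1.0, 1.0), "pos"), ((0.9, 0.8), "pos")]
print(nearest_neighbor_predict(train, (1.0, 0.9)))  # closest point is (1.0, 1.0) -> "pos"
```

The linear scan here costs O(n) per query; the KD-tree and Ball-tree structures covered in Unit 4 exist precisely to speed up this search.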
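The claim from Unit 2 (and Course Outcome 5) that gradient descent decreases the objective at every step when the step size is small enough can be checked numerically. The sketch below (objective and step size chosen purely for illustration) runs GD on f(w) = w^2, whose gradient is 2w, with a step size well below the 1/L threshold for this L = 2 smooth function.

```python
def f(w):
    return w * w          # smooth objective; gradient is 2w, so L = 2

def grad(w):
    return 2.0 * w

eta = 0.1                 # step size below 1/L = 0.5, so each step must decrease f
w = 5.0
values = [f(w)]
for _ in range(20):
    w = w - eta * grad(w)         # gradient descent update
    values.append(f(w))

# Monotone decrease: every step lowers the objective
assert all(a > b for a, b in zip(values, values[1:]))
```

With eta = 0.1 the iterate shrinks by a factor of 0.8 each step, so f shrinks by 0.64; choosing eta too large (here, eta > 1) would instead make the objective diverge, which is the point of the step-size condition proved in the unit.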
