Course Contents: Study Group - 1 (Concept Focused)

The document outlines a course containing three study groups focused on machine learning. Study group 1 covers the Coursera machine learning course by Andrew Ng over 6 weeks. Study group 2 covers Udemy's machine learning course in Python or R over 8 weeks. Study group 3 combines content from both courses over 8 weeks using only Python.

Uploaded by

Anshit Bansal
Copyright
© All Rights Reserved

Course Contents

As discussed earlier, there will be three categories of study groups. The details and
the prescribed timeline for completing each group are given below.

Study Group -1 (Concept Focused):


We will be going through the Coursera Machine Learning Course by Andrew Ng.

Week 1:
Topics to be covered:

1. Supervised Learning
2. Unsupervised Learning
3. Linear Regression
a. Model Representation
b. Cost Function
4. Gradient Descent
5. Gradient Descent For Linear Regression
6. Linear Algebra Review
7. Multivariate Linear Regression
8. Gradient Descent For Multiple Variables
9. Gradient Descent in Practice
a. Feature Scaling
b. Learning Rates
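
This week centres on gradient descent for linear regression and feature scaling. As a rough illustration, here is a minimal pure-Python sketch (the toy data and function names are our own, not from the course) of mean normalization followed by batch gradient descent:

```python
# Batch gradient descent for univariate linear regression,
# h(x) = theta0 + theta1 * x, after mean-normalizing the feature.

def scale(xs):
    """Feature scaling by mean normalization: (x - mean) / range."""
    mean = sum(xs) / len(xs)
    rng = max(xs) - min(xs)
    return [(x - mean) / rng for x in xs]

def gradient_descent(xs, ys, alpha=0.1, iters=1000):
    """Minimize the squared-error cost over theta0, theta1."""
    m = len(xs)
    theta0 = theta1 = 0.0
    for _ in range(iters):
        errors = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        theta0 -= alpha * sum(errors) / m
        theta1 -= alpha * sum(e * x for e, x in zip(errors, xs)) / m
    return theta0, theta1

# Toy data from y = 2x; after scaling, the best fit is y = 5 + 6 * x_scaled.
xs = scale([1.0, 2.0, 3.0, 4.0])
ys = [2.0, 4.0, 6.0, 8.0]
theta0, theta1 = gradient_descent(xs, ys)
```

With the feature scaled to a small zero-mean range, a fixed learning rate of 0.1 converges quickly; on the raw feature a much smaller rate would be needed, which is exactly the point of the feature-scaling topic.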

Week 2:
Topics to be covered:

1. Features and Polynomial Regression


2. Normal Equation for Computing Parameters
3. Classification and Hypothesis Representation
4. Decision Boundary
5. Logistic Regression
a. Cost Function and Gradient Descent
b. Advanced Optimization
6. Multiclass Classification
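
For the logistic regression items, a compact pure-Python sketch (toy data invented here) of the sigmoid hypothesis, the cross-entropy cost, and the gradient descent update:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost(theta0, theta1, xs, ys):
    """Logistic regression (cross-entropy) cost J(theta)."""
    m = len(xs)
    return sum(-y * math.log(sigmoid(theta0 + theta1 * x))
               - (1 - y) * math.log(1 - sigmoid(theta0 + theta1 * x))
               for x, y in zip(xs, ys)) / m

def fit(xs, ys, alpha=0.5, iters=2000):
    """Gradient descent; the update has the same form as for linear
    regression, but with the sigmoid inside the error term."""
    m = len(xs)
    theta0 = theta1 = 0.0
    for _ in range(iters):
        errs = [sigmoid(theta0 + theta1 * x) - y for x, y in zip(xs, ys)]
        theta0 -= alpha * sum(errs) / m
        theta1 -= alpha * sum(e * x for e, x in zip(errs, xs)) / m
    return theta0, theta1

# Toy 1-D data: label 1 when x > 0, so the decision boundary is near x = 0.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]
t0, t1 = fit(xs, ys)
predict = lambda x: 1 if sigmoid(t0 + t1 * x) >= 0.5 else 0
```

The "Advanced Optimization" topic replaces this hand-rolled loop with library minimizers that only need the cost and gradient.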
Week 3:

1. Overfitting
2. Regularization
3. Neural Networks:
a. Non-linear Hypothesis
b. Model Representation
c. Examples and Intuitions

Week 4:
1. Multiclass Classification
2. Cost Function and Back Propagation
3. Implementation: Parameter Unrolling
4. Gradient Checking
5. Random Initialization
6. Autonomous Driving
7. Evaluating a Hypothesis 
8. Model Selection and Train/Validation/Test Sets
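
Model selection with train/validation/test sets can be sketched in a few lines; the helper name and the 60/20/20 ratios here are illustrative:

```python
import random

def split(data, train=0.6, val=0.2, seed=0):
    """Shuffle, then cut into train / cross-validation / test sets."""
    data = list(data)
    random.Random(seed).shuffle(data)
    n_train = int(len(data) * train)
    n_val = int(len(data) * val)
    return (data[:n_train],
            data[n_train:n_train + n_val],
            data[n_train + n_val:])

# Fit candidate models on the training set, pick the one with the
# lowest validation error, and report that model's error on the test set.
train_set, val_set, test_set = split(range(100))
```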

Week 5:
1. Diagnosing Bias vs. Variance 
2. Regularization and Bias/Variance 
3. Learning Curves 
4. Prioritizing What to Work On 
5. Error Analysis 
6. Error Metrics for Skewed Classes 
7. Trading Off Precision and Recall 
8. Data For Machine Learning 
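
The error metrics for skewed classes reduce to the precision/recall trade-off. A small sketch (toy labels invented here) of computing precision, recall, and the F1 score from predictions:

```python
# Precision, recall, and F1 from true/predicted binary labels.

def metrics(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Positives are rare here, so accuracy alone can look good for a model
# that always predicts 0; precision and recall expose that failure.
y_true = [0, 0, 0, 0, 1, 0, 0, 0, 1, 1]
y_pred = [0, 0, 0, 1, 1, 0, 0, 0, 1, 0]
p, r, f1 = metrics(y_true, y_pred)
```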
Week 6: 

1. Support Vector Machines:


a. Optimization Objective 
b. Large Margin Intuition 
c. Mathematics Behind Large Margin Classification 
d. Kernels
e. Using An SVM

2. Unsupervised Learning: 
a. Unsupervised Learning: Introduction 
b. K-Means Algorithm 
c. Optimization Objective 
d. Random Initialization 
e. Choosing the Number of Clusters 
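
The K-Means items above fit in a short sketch. A 1-D pure-Python illustration (toy points and starting centroids invented) of the assignment and update steps, given an initialization:

```python
# K-Means: alternate between assigning each point to its nearest
# centroid and moving each centroid to the mean of its assigned points.

def k_means(points, k, centroids, iters=20):
    """1-D K-Means; `centroids` is the (random) initialization."""
    for _ in range(iters):
        # Assignment step: index of the nearest centroid per point.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: (p - centroids[c]) ** 2)
            clusters[nearest].append(p)
        # Update step: move each centroid to its cluster's mean
        # (an empty cluster keeps its previous centroid).
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

# Two well-separated groups around 1 and 10.
points = [0.9, 1.0, 1.1, 9.9, 10.0, 10.1]
centers = sorted(k_means(points, 2, centroids=[0.0, 5.0]))
```

In practice the course's "Random Initialization" topic rerun this from several random seeds and keep the clustering with the lowest distortion.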
Week 7:
 
1. Dimensionality Reduction:
a. Motivation I: Data Compression 
b. Motivation II: Visualization 
c. Principal Component Analysis Problem Formulation 
d. Principal Component Analysis Algorithm 
e. Reconstruction from Compressed Representation 
f. Choosing the Number of Principal Components 
g. Advice for Applying PCA 
2. Anomaly Detection: 
a. Problem Motivation 
b. Gaussian Distribution 
c. Algorithm 
d. Developing and Evaluating an Anomaly Detection System 
e. Anomaly Detection vs. Supervised Learning 
f. Choosing What Features to Use 
g. Multivariate Gaussian Distribution 
h. Anomaly Detection using the Multivariate Gaussian Distribution 
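
For the anomaly detection items, a sketch of the basic density-estimation algorithm: fit a univariate Gaussian to each feature, model p(x) as their product, and flag examples with p(x) below a threshold epsilon. The data and threshold are invented; the multivariate-Gaussian variant replaces the per-feature product with a full covariance matrix.

```python
import math

def fit_gaussian(column):
    """Mean and variance of one feature across the training set."""
    mu = sum(column) / len(column)
    var = sum((x - mu) ** 2 for x in column) / len(column)
    return mu, var

def density(x, params):
    """p(x) as a product of per-feature univariate Gaussians."""
    p = 1.0
    for xi, (mu, var) in zip(x, params):
        p *= (math.exp(-(xi - mu) ** 2 / (2 * var))
              / math.sqrt(2 * math.pi * var))
    return p

# Fit on normal 2-feature examples.
train = [(1.0, 10.0), (1.1, 10.2), (0.9, 9.8), (1.05, 10.1), (0.95, 9.9)]
params = [fit_gaussian(col) for col in zip(*train)]

# In practice epsilon is chosen on a labeled cross-validation set.
epsilon = 1e-3
normal = density((1.0, 10.0), params)
anomaly = density((5.0, 3.0), params)
```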
 
 
 
Week 8:

1. Large Scale Machine Learning


a. Learning With Large Datasets 
b. Stochastic Gradient Descent 
c. Mini-Batch Gradient Descent 
d. Stochastic Gradient Descent Convergence 
e. Online Learning 
f. Map Reduce and Data Parallelism 
2. Application Example: Photo OCR
a. Problem Description and Pipeline 
b. Sliding Windows 
c. Getting Lots of Data and Artificial Data 
d. Ceiling Analysis: What Part of the Pipeline to Work on Next 
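
Stochastic gradient descent, the core of the large-scale methods above, updates the parameters after each example rather than after a full pass over the data. A pure-Python sketch on invented noiseless data:

```python
import random

def sgd(data, alpha=0.05, epochs=50, seed=0):
    """Stochastic gradient descent for y = theta0 + theta1 * x."""
    data = list(data)
    rng = random.Random(seed)
    theta0 = theta1 = 0.0
    for _ in range(epochs):
        rng.shuffle(data)            # visit examples in random order
        for x, y in data:            # one cheap update per example
            err = theta0 + theta1 * x - y
            theta0 -= alpha * err
            theta1 -= alpha * err * x
    return theta0, theta1

# Noiseless data from y = 2x + 1; SGD should recover the line.
data = [(x / 10.0, 2 * (x / 10.0) + 1) for x in range(-10, 11)]
t0, t1 = sgd(data)
```

The mini-batch variant averages the per-example gradient over small batches, and convergence is monitored by plotting the cost averaged over the most recent examples, as the course discusses.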
 
 

 
Study Group- 2 (Application Focus):
We are focusing on Udemy’s Machine Learning A-Z™: Hands-On Python & R In Data Science
in this category.
Note: You can choose Python or R for the implementation of the different topics.
Week 1:
1. Data Preprocessing
Regression
2. Simple Linear Regression in Python
3. Multiple Linear Regression
4. Polynomial Regression in Python
5. Support Vector Regression
Week 2:
1. Decision Tree Regression
2. Random Forest Regression
3. Evaluating Regression Models Performance
Classification
4. Logistic Regression
5. K-Nearest Neighbors (K-NN)
6. Support Vector Machine (SVM)
7. Kernel SVM
Week 3:
1. Naive Bayes
2. Decision Tree
3. Random Forest Classification
4. Evaluating Classification Models Performance
Week 4:
Clustering
1. K-Means Clustering
2. Hierarchical Clustering
Week 5:
Association Rule Learning
1. Apriori
2. Eclat
Reinforcement Learning
3. Upper Confidence Bound
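
Upper Confidence Bound fits in a short sketch. This UCB1 variant (the arm probabilities and helper names are invented for illustration) plays the arm with the best optimistic estimate at each round:

```python
import math
import random

def ucb1(pull, n_arms, rounds, seed=0):
    """UCB1: play the arm maximizing mean reward + exploration bonus."""
    rng = random.Random(seed)
    counts = [0] * n_arms          # times each arm was played
    sums = [0.0] * n_arms          # total reward per arm
    for t in range(1, rounds + 1):
        if t <= n_arms:
            arm = t - 1            # play every arm once to start
        else:
            arm = max(range(n_arms),
                      key=lambda a: sums[a] / counts[a]
                      + math.sqrt(2 * math.log(t) / counts[a]))
        reward = pull(arm, rng)
        counts[arm] += 1
        sums[arm] += reward
    return counts

# Hypothetical Bernoulli arms with success rates 0.2, 0.5, 0.8:
# the algorithm should concentrate its plays on the last arm.
rates = [0.2, 0.5, 0.8]
counts = ucb1(lambda arm, rng: 1.0 if rng.random() < rates[arm] else 0.0,
              n_arms=3, rounds=2000)
```

The bonus term shrinks as an arm is played more, so exploration fades in favour of the empirically best arm.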
Week 6:
1. Thompson Sampling
2. Natural Language Processing
Week 7:
Deep Learning
1. Artificial Neural Networks
2. Convolutional Neural Networks
Week 8:
Dimensionality Reduction
1. Principal Component Analysis (PCA)
2. Linear Discriminant Analysis (LDA)
3. Kernel PCA
Model Selection and Boosting
1. Model Selection
2. XGBoost

Study Group- 3 (All Round Skill Development):


We will be going through the Coursera Machine Learning Course by Andrew Ng and
Udemy’s Machine Learning A-Z™: Hands-On Python & R In Data Science in this category.

Note: We will be working only with Python, so you can ignore the R implementations of the
topics.

Week 1:
Topics to be covered:

Coursera:
1. Supervised Learning
2. Unsupervised Learning
3. Linear Regression
a. Model Representation
b. Cost Function
4. Gradient Descent
5. Gradient Descent For Linear Regression
6. Linear Algebra Review
7. Multivariate Linear Regression
8. Gradient Descent For Multiple Variables
9. Gradient Descent in Practice
a. Feature Scaling
b. Learning Rates
Udemy:
10. Data Preprocessing
Regression
11. Simple Linear Regression in Python
12. Multiple Linear Regression
13. Polynomial Regression in Python
14. Support Vector Regression

Week 2:
Topics to be covered:
Coursera:
1. Features and Polynomial Regression
2. Normal Equation for Computing Parameters
3. Classification and Hypothesis Representation
4. Decision Boundary
5. Logistic Regression
a. Cost Function and Gradient Descent
b. Advanced Optimization
6. Multiclass Classification
Udemy:
7. Decision Tree Regression
8. Random Forest Regression
9. Evaluating Regression Models Performance
Classification
10. Logistic Regression
11. K-Nearest Neighbors (K-NN)
12. Support Vector Machine (SVM)
13. Kernel SVM
Week 3:
Coursera:
1. Overfitting
2. Regularization
3. Neural Networks:
a. Non-linear Hypothesis
b. Model Representation
c. Examples and Intuitions
Udemy:
4. Naive Bayes
5. Decision Tree
6. Random Forest Classification
7. Evaluating Classification Models Performance
Week 4:
Coursera:
1. Multiclass Classification
2. Cost Function and Back Propagation
3. Implementation: Parameter Unrolling
4. Gradient Checking
5. Random Initialization
6. Autonomous Driving
7. Evaluating a Hypothesis
8. Model Selection and Train/Validation/Test Sets
Udemy:
Clustering
9. K-Means Clustering
10. Hierarchical Clustering

Week 5:
Coursera:
1. Diagnosing Bias vs. Variance
2. Regularization and Bias/Variance
3. Learning Curves
4. Prioritizing What to Work On
5. Error Analysis
6. Error Metrics for Skewed Classes
7. Trading Off Precision and Recall
8. Data For Machine Learning
Udemy:
Association Rule Learning
9. Apriori
10. Eclat
Reinforcement Learning
11. Upper Confidence Bound
 
Week 6: 
Coursera:
1. Support Vector Machines:
a. Optimization Objective
b. Large Margin Intuition
c. Mathematics Behind Large Margin Classification
d. Kernels
e. Using An SVM
2. Unsupervised Learning:
a. Unsupervised Learning: Introduction
b. K-Means Algorithm
c. Optimization Objective
d. Random Initialization
e. Choosing the Number of Clusters
Udemy:
3. Thompson Sampling
4. Natural Language Processing

Week 7:
Coursera: 
1. Dimensionality Reduction:
a. Motivation I: Data Compression
b. Motivation II: Visualization
c. Principal Component Analysis Problem Formulation
d. Principal Component Analysis Algorithm
e. Reconstruction from Compressed Representation
f. Choosing the Number of Principal Components
g. Advice for Applying PCA
2. Anomaly Detection:
a. Problem Motivation
b. Gaussian Distribution
c. Algorithm
d. Developing and Evaluating an Anomaly Detection System
e. Anomaly Detection vs. Supervised Learning
f. Choosing What Features to Use
g. Multivariate Gaussian Distribution
h. Anomaly Detection using the Multivariate Gaussian Distribution
Udemy:
Deep Learning
3. Artificial Neural Networks
4. Convolutional Neural Networks
Week 8:
Coursera:
1. Large Scale Machine Learning
a. Learning With Large Datasets
b. Stochastic Gradient Descent
c. Mini-Batch Gradient Descent
d. Stochastic Gradient Descent Convergence
e. Online Learning
f. Map Reduce and Data Parallelism
2. Application Example: Photo OCR
a. Problem Description and Pipeline
b. Sliding Windows
c. Getting Lots of Data and Artificial Data
d. Ceiling Analysis: What Part of the Pipeline to Work on Next
Udemy:
Dimensionality Reduction
3. Principal Component Analysis (PCA)
4. Linear Discriminant Analysis (LDA)
5. Kernel PCA
Model Selection and Boosting
6. Model Selection
7. XGBoost
 
