
CMPS 142/242 Midterm Topics (Fall 2011)

The midterm will be in class Monday, November 21. The exam will be closed-book, although students may have one 3x5 card of handwritten notes (both sides). The reading assignments and lecture slides have been posted on the class web page, and reviewing the homework problems is recommended. An old sample exam has been posted. Here are this quarter's topics:

1. What is Machine Learning?
2. Supervised batch learning, examples
3. Regression
4. Unsupervised learning
5. Concept Learning: hypothesis class, domain, target, inductive bias, version space
6. Noise and its causes (label errors, attribute errors, features or hypothesis class may not fit the phenomena exactly)
7. Model Selection, Generalization, and Overfitting: using test sets and cross validation to estimate generalization error
8. Feature Selection/Creation
9. Basic probability (sample space, events, random variables, independence, conditional probability, Bayes' rule, sum rule)
10. Estimating probabilities (e.g. the bias of a coin flip): maximum likelihood, priors, maximum a posteriori, mean a posteriori, predictive distribution, and Laplacian estimates of probabilities (sketch below)
11. Hypotheses as models generating the data, the maximum likelihood hypothesis, the maximum a posteriori hypothesis, and the predictive posterior distribution
12. Discriminative and Generative Models
13. Decision Theory with asymmetric losses, Bayes risk, and Bayes optimal predictions
14. Estimating Gaussian distributions: maximum likelihood, biased estimation
15. Prediction by using Gaussians to model class-conditional distributions
16. Instance-based learning: Nearest Neighbor, curse of dimensionality, kNN, density estimation, instance-based regression
17. Decision Trees: greedy construction of trees, applying a split criterion (an impurity function, e.g. information gain) to select tests at nodes, over-fitting and pruning (sketch below)
18. Naive Bayes and its independence assumption, Naive Bayes for text classification
19. Linear Regression: least squares as maximum likelihood, bias-variance decomposition, regularization, Bayesian linear regression
20. Linear Classification: add-a-dimension trick, Fisher's Linear Discriminant (LDA)
21. Perceptron algorithm and convergence (sketch below)
22. Logistic regression and the softmax function
23. Feed-forward artificial neural networks, backprop algorithm (gradient descent)
24. Support vector machines and margin maximization, Lagrangian, dual problem, and kernel functions
25. Clustering: hierarchical vs. partitional methods, K-means (sketch below)
26. Mixtures of Gaussians: learning with EM, use for clustering and density estimation
27. A very little bit on Hidden Markov Models
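As a review aid for item 10, here is a minimal sketch (in Python, not from the course materials) of the point estimates for the bias of a coin, assuming a Beta(alpha, beta) prior; the function name and default prior parameters are illustrative.

    def coin_estimates(h, n, alpha=2.0, beta=2.0):
        """Estimates of a coin's heads-probability from h heads in n flips,
        under an assumed Beta(alpha, beta) prior (parameters illustrative)."""
        ml = h / n                                          # maximum likelihood
        map_est = (h + alpha - 1) / (n + alpha + beta - 2)  # maximum a posteriori
        mean_post = (h + alpha) / (n + alpha + beta)        # mean a posteriori
        laplace = (h + 1) / (n + 2)                         # Laplace add-one estimate
        return ml, map_est, mean_post, laplace

    # 7 heads in 10 flips:
    print(coin_estimates(7, 10))   # (0.7, 0.666..., 0.642..., 0.666...)

Note that the Laplace estimate coincides with the posterior mean under a uniform Beta(1, 1) prior.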
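For item 17, a short sketch of computing the information gain of a candidate split; it assumes numpy and that both branches of the split receive at least one example.

    import numpy as np

    def entropy(labels):
        """Shannon entropy (in bits) of a non-empty label array."""
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    def information_gain(labels, split_mask):
        """Entropy reduction from splitting `labels` by the boolean test
        `split_mask`; assumes both branches are non-empty."""
        n = len(labels)
        left, right = labels[split_mask], labels[~split_mask]
        remainder = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
        return entropy(labels) - remainder

    y = np.array([1, 1, 0, 0, 1])
    test = np.array([True, True, False, False, False])
    print(information_gain(y, test))   # about 0.42 bits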
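For item 21, a minimal sketch of the perceptron's mistake-driven update, assuming labels in {-1, +1} and that X already carries the constant feature from the add-a-dimension trick of item 20.

    import numpy as np

    def perceptron(X, y, max_epochs=100):
        """Perceptron training; X is an n-by-d array whose first column is
        all ones (add-a-dimension trick), y holds labels in {-1, +1}."""
        w = np.zeros(X.shape[1])
        for _ in range(max_epochs):
            mistakes = 0
            for x_i, y_i in zip(X, y):
                if y_i * np.dot(w, x_i) <= 0:   # misclassified (or on the boundary)
                    w += y_i * x_i              # mistake-driven update
                    mistakes += 1
            if mistakes == 0:                   # data separated: converged
                return w
        return w                                # may not have converged

Recall the convergence theorem: if the data are linearly separable with margin gamma and every example has norm at most R, the algorithm makes at most (R/gamma)^2 mistakes.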
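And for item 25, a sketch of Lloyd's algorithm for K-means, again assuming numpy and that no cluster ever becomes empty during training.

    import numpy as np

    def kmeans(X, k, iters=100, seed=0):
        """Lloyd's algorithm: alternate nearest-center assignment and
        mean updates; assumes no cluster ever becomes empty."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]  # random init
        for _ in range(iters):
            # assignment step: index of the nearest center per point
            dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            # update step: move each center to the mean of its points
            new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
            if np.allclose(new_centers, centers):
                break                            # assignments stabilized
            centers = new_centers
        return centers, labels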
