06b Discriminant Analysis

Classification

Discriminant Analysis

slides thanks to Greg Shakhnarovich (CS195-5, Brown Univ., 2006)

Example of applying Fisher’s LDA

[Figure: two 1-D projections of the same two-class data. Left panel: projection that maximizes separation of the class means. Right panel: projection that maximizes Fisher’s LDA criterion, giving better class separation.]
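For reference, the criterion named above is not written out on the slide; in standard notation (the between-class and within-class scatter matrices $S_B$ and $S_W$ below are the usual textbook definitions, not taken from this deck) it is

$$J(\mathbf{w}) = \frac{\mathbf{w}^\top S_B\, \mathbf{w}}{\mathbf{w}^\top S_W\, \mathbf{w}}$$

where, for two classes with means $\mathbf{m}_1, \mathbf{m}_2$, $S_B = (\mathbf{m}_1 - \mathbf{m}_2)(\mathbf{m}_1 - \mathbf{m}_2)^\top$ and $S_W$ is the sum of the per-class scatter matrices. Maximizing $J$ gives $\mathbf{w} \propto S_W^{-1}(\mathbf{m}_1 - \mathbf{m}_2)$.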


Using LDA for classification in one dimension

 Fisher’s LDA gives an optimal choice of w, the vector for projection down to one dimension.
 For classification, we still need to select a threshold to compare the projected values against. Two possibilities:
– No explicit probabilistic assumptions: find the threshold that minimizes the empirical classification error (a sketch of this follows below).
– Make assumptions about the data distributions of the classes, and derive the theoretically optimal decision boundary.
 The usual choice for the class distributions is multivariate Gaussian.
 We will also need a bit of decision theory.

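A minimal MATLAB sketch of the first option, assuming two classes; the function name fisher_lda and the variables X1, X2 are illustrative, not from the course demo:

    % Fisher's LDA for two classes with an empirically chosen threshold.
    % Save as fisher_lda.m. X1, X2: n1-by-d and n2-by-d training samples.
    function [w, thresh] = fisher_lda(X1, X2)
        mu1 = mean(X1, 1)';                      % class means as column vectors
        mu2 = mean(X2, 1)';
        Sw  = cov(X1) * (size(X1,1) - 1) + ...   % pooled within-class scatter
              cov(X2) * (size(X2,1) - 1);
        w   = Sw \ (mu1 - mu2);                  % Fisher direction S_W^{-1}(m1 - m2)
        w   = w / norm(w);

        % Project training data onto w, then scan midpoints between sorted
        % projections for the threshold with minimum empirical error.
        z  = [X1; X2] * w;
        y  = [ones(size(X1,1),1); 2 * ones(size(X2,1),1)];
        zs = sort(z);
        cand = (zs(1:end-1) + zs(2:end)) / 2;
        best = inf;
        thresh = cand(1);
        for t = cand'
            pred = 1 + (z < t);                  % threshold rule (one orientation)
            err  = min(mean(pred ~= y), mean((3 - pred) ~= y));  % try both orientations
            if err < best, best = err; thresh = t; end
        end
    end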


Decision theory

To minimize classification error: at a given point x in feature space, choose as the predicted class the one with the greatest posterior probability at x:

$$\hat{y} = \arg\max_{C}\; p(C \mid x)$$


Decision theory

The same rule, $\hat{y} = \arg\max_{C}\, p(C \mid x)$, shown graphically:

[Figure: left panel, probability densities for classes C1 and C2; right panel, relative (posterior) probabilities for classes C1 and C2.]
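A hedged one-dimensional illustration of this rule in MATLAB; the means, variances, and priors are invented for the example, and normpdf is from the Statistics Toolbox:

    % Bayes decision rule with 1-D Gaussian class-conditional densities.
    mu    = [0, 2];        % class means (illustrative values)
    sigma = [1, 0.5];      % class standard deviations
    prior = [0.5, 0.5];    % class priors p(C_k)

    x = 1.3;                                  % query point
    post = prior .* normpdf(x, mu, sigma);    % proportional to p(C_k | x)
    [~, yhat] = max(post);                    % predicted class = argmax over k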


MATLAB interlude

 Classification via discriminant analysis, using the classify() function.
 Data for each class is modeled as multivariate Gaussian.

matlab_demo_06.m

class = classify( sample, training, group, 'type' )

– class: predicted test labels
– sample: test data
– training: training data
– group: training labels
– 'type': model for the class covariances
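A small usage sketch with synthetic data (the data values are illustrative; the classify() call itself follows the signature above):

    % Two Gaussian classes in 2-D; train on synthetic data, label two test points.
    rng(0);                                      % reproducible synthetic data
    training = [randn(50,2); randn(50,2) + 3];   % 50 samples per class
    group    = [ones(50,1); 2*ones(50,1)];       % training labels
    sample   = [0.5 0.5; 3.1 2.8];               % test data

    class = classify(sample, training, group, 'linear');  % predicted test labels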


MATLAB classify() function

Models for class covariances:

– 'linear': all classes have the same covariance matrix → linear decision boundary
– 'diaglinear': all classes have the same diagonal covariance matrix → linear decision boundary
– 'quadratic': classes have different covariance matrices → quadratic decision boundary
– 'diagquadratic': classes have different diagonal covariance matrices → quadratic decision boundary
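One hedged way to compare the four models on the same data; everything below is synthetic, and only the type string changes between calls:

    % Compare the four covariance models on the same synthetic data.
    rng(2);
    training = [randn(80,2); randn(80,2) * 0.5 + 2];
    group    = [ones(80,1); 2*ones(80,1)];
    sample   = [randn(40,2); randn(40,2) * 0.5 + 2];
    truth    = [ones(40,1); 2*ones(40,1)];        % hypothetical test labels

    types = {'linear', 'diaglinear', 'quadratic', 'diagquadratic'};
    for k = 1:numel(types)
        pred = classify(sample, training, group, types{k});
        fprintf('%-14s test error: %.3f\n', types{k}, mean(pred ~= truth));
    end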


MATLAB classify() function

Example with the 'quadratic' model of class covariances.

[Figure: classification example using the 'quadratic' covariance model.]
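The original figure is not reproduced here; a sketch that builds a comparable 'quadratic' example, with all data and parameter values invented for illustration:

    % Two classes with different covariances; 'quadratic' yields a curved boundary.
    rng(1);
    X1 = randn(100,2);                                % class 1: isotropic
    X2 = randn(100,2) * chol([3 0.8; 0.8 0.5]) + 2;   % class 2: elongated, shifted
    training = [X1; X2];
    group    = [ones(100,1); 2*ones(100,1)];

    % Label a grid of points to visualize the quadratic decision regions.
    [gx, gy] = meshgrid(linspace(-4, 7, 200));
    labels = classify([gx(:) gy(:)], training, group, 'quadratic');
    contourf(gx, gy, reshape(labels, size(gx)));      % decision regions
    hold on
    plot(X1(:,1), X1(:,2), 'o', X2(:,1), X2(:,2), '+');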
