Machine Learning - Lecture 03
LECTURE – 03
K-NEAREST-NEIGHBOR ALGORITHM
K Nearest Neighbor Classification
K-Nearest Neighbors, or KNN, is one of the simplest machine learning algorithms:
◦ easy to implement
◦ easy to understand
When we feed training data to this algorithm, the algorithm does not train itself at all. KNN does not learn any discriminative function from the training data; instead, it simply memorizes the training data set.
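As a tiny sketch of what that means in code (the class name and attributes are illustrative, not part of the lecture), "fitting" a KNN model amounts to nothing more than storing the data:

class LazyKNN:
    """Illustrative sketch: KNN 'training' only memorizes the data set."""

    def fit(self, X, y):
        # No parameters are learned; the stored data set itself is the model.
        self.X = list(X)
        self.y = list(y)
        return self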
The table above represents our data set. We have two feature columns, and each row carries a class label.
How to Calculate Euclidean Distance in the K-Nearest Neighbors Algorithm
Here's the new data entry:
To know its class, we have to calculate the distance from the new entry to the other entries in the data set using the Euclidean distance formula.
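For reference, the Euclidean distance between two entries with feature vectors p and q of length n is

$$ d(p, q) = \sqrt{\sum_{i=1}^{n} (p_i - q_i)^2} $$

With two feature columns as in this example (n = 2), this reduces to $\sqrt{(p_1 - q_1)^2 + (p_2 - q_2)^2}$.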
Here's what the table will look like after all the distances have been calculated:
Since we chose 5 as the value of K, we'll only consider the five closest records. That is:
As you can see above, the majority class within the 5 nearest neighbors to the new entry is Red. Therefore, we'll classify the new entry as Red.
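A minimal Python sketch of the whole procedure is given below. The numbers in data are illustrative placeholders standing in for the slide's table, and euclidean / knn_classify are hypothetical helper names, not code from the lecture.

import math
from collections import Counter

def euclidean(p, q):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

def knn_classify(data, new_entry, k=5):
    """Classify new_entry by majority vote among its k nearest records.

    data: list of (feature_vector, class_label) pairs.
    """
    # Distance from the new entry to every record in the data set
    distances = [(euclidean(features, new_entry), label) for features, label in data]
    # Keep only the k closest records
    nearest = sorted(distances, key=lambda pair: pair[0])[:k]
    # Majority class among those k neighbors
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Illustrative two-feature data set with Red/Blue class labels
data = [
    ((40, 20), "Red"), ((50, 50), "Blue"), ((60, 90), "Blue"),
    ((10, 25), "Red"), ((70, 70), "Blue"), ((60, 10), "Red"),
    ((25, 80), "Blue"),
]
print(knn_classify(data, (20, 35), k=5))  # majority of the 5 nearest -> "Red"

Sorting all distances and taking the first k mirrors the table-based walkthrough above.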
KNN - EXAMPLE
HOW TO SELECT THE VALUE OF K
There is no particular way of choosing the value of K, but here are some common conventions to keep in mind:
◦ Use the K that gives the lowest average error over the N training examples, estimated for example with leave-one-out cross-validation, since plain training error would always favor K = 1 (see the sketch below).
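A sketch of that convention, assuming scikit-learn as the library (the lecture does not name one): leave-one-out cross-validation classifies each of the N training examples using the remaining N - 1 and keeps the K with the lowest average error. The data values below are illustrative placeholders.

from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Illustrative two-feature data set with Red/Blue labels
X = [[40, 20], [50, 50], [60, 90], [10, 25], [70, 70], [60, 10], [25, 80]]
y = ["Red", "Blue", "Blue", "Red", "Blue", "Red", "Blue"]

best_k, best_error = None, float("inf")
for k in range(1, 6):                                  # candidate values of K
    model = KNeighborsClassifier(n_neighbors=k)        # Euclidean distance by default
    accuracy = cross_val_score(model, X, y, cv=LeaveOneOut()).mean()
    error = 1 - accuracy                               # average error over the N examples
    if error < best_error:
        best_k, best_error = k, error

print(best_k, best_error)

Using leave-one-out rather than plain training error avoids trivially picking K = 1.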
PROS AND CONS OF KNN
Pros
◦ simple to implement and easy to understand
◦ no training phase: the model only stores the training data
Cons
◦ classifying a new entry requires computing its distance to every record in the data set, which becomes expensive as the data set grows
◦ the result depends on the choice of K, for which there is no single agreed rule
THANK YOU!