Classification Metrics
Using scikit-learn:
Terminology:
True Positive (TP)
True Negative (TN)
False Positive (FP)
False Negative (FN)
Example confusion matrix for a 3-class Iris classifier
(rows = actual values, columns = predicted values):

             Setosa  Versicolor  Virginica
Setosa          5        0           0
Versicolor      0        3           1
Virginica       0        1           5

Accuracy:
Accuracy is the ratio of correctly classified instances to the
total number of instances; for the matrix above it is
(5 + 3 + 5) / 15 ≈ 0.87.

    Accuracy = (TP + TN) / (TP + TN + FP + FN)
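A minimal sketch of how the matrix and accuracy above can be
computed with scikit-learn. The y_true and y_pred arrays are
hypothetical labels chosen only to reproduce the matrix above
(0 = Setosa, 1 = Versicolor, 2 = Virginica), not real model output.

    from sklearn.metrics import accuracy_score, confusion_matrix

    # Hypothetical labels reproducing the matrix above.
    y_true = [0] * 5 + [1] * 4 + [2] * 6
    y_pred = [0] * 5 + [1] * 3 + [2] + [1] + [2] * 5

    print(confusion_matrix(y_true, y_pred))
    # [[5 0 0]
    #  [0 3 1]
    #  [0 1 5]]

    print(accuracy_score(y_true, y_pred))  # 13 correct out of 15 ≈ 0.867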
Precision:
Precision is the ratio of true positives (the number of
correctly classified positive instances) to the total
number of positive predictions made by the model.

    Precision = TP / (TP + FP)
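A short sketch with scikit-learn's precision_score; the binary
labels below are hypothetical, chosen so the counts are easy to
check by hand.

    from sklearn.metrics import precision_score

    # Hypothetical binary labels: 1 = positive, 0 = negative.
    y_true = [1, 1, 1, 0, 0, 0, 1, 0]
    y_pred = [1, 1, 0, 1, 0, 0, 1, 0]

    # TP = 3, FP = 1 -> precision = 3 / (3 + 1)
    print(precision_score(y_true, y_pred))  # 0.75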
Recall:
Recall is the ratio of true positives to the total
number of actual positive instances in the
dataset.
    Recall = TP / (TP + FN)
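The same kind of sketch for recall_score, again with hypothetical
labels so that TP and FN can be counted directly.

    from sklearn.metrics import recall_score

    # Hypothetical binary labels: 1 = positive, 0 = negative.
    y_true = [1, 1, 1, 1, 0, 0, 0, 0]
    y_pred = [1, 1, 0, 0, 1, 0, 0, 0]

    # TP = 2, FN = 2 -> recall = 2 / (2 + 2)
    print(recall_score(y_true, y_pred))  # 0.5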
F1 score:
The F1 score is the harmonic mean of precision
and recall. It is calculated as 2 * (precision * recall)
/ (precision + recall).

    F1 score = 2 * (Precision * Recall) / (Precision + Recall)
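A sketch showing that scikit-learn's f1_score matches the harmonic
mean computed by hand from precision and recall, using the same
hypothetical labels as the recall example.

    from sklearn.metrics import f1_score, precision_score, recall_score

    # Hypothetical binary labels: 1 = positive, 0 = negative.
    y_true = [1, 1, 1, 1, 0, 0, 0, 0]
    y_pred = [1, 1, 0, 0, 1, 0, 0, 0]

    p = precision_score(y_true, y_pred)  # 2 / 3
    r = recall_score(y_true, y_pred)     # 1 / 2
    print(2 * p * r / (p + r))           # 0.571... by the formula
    print(f1_score(y_true, y_pred))      # same value from scikit-learn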
ROC and AUC:
The ROC (Receiver Operating Characteristic) curve is a plot of
the true positive rate (TPR) against the false positive rate
(FPR) at different classification thresholds, and it is used to
evaluate a classification model across those thresholds. The AUC
(Area Under the Curve) summarizes the ROC curve as a single
number between 0 and 1, where higher values indicate better
separation of the classes.
Formulas:

    TPR = TP / (TP + FN)
    FPR = FP / (FP + TN)
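A minimal sketch using roc_curve and roc_auc_score; the labels and
scores are hypothetical values standing in for a model's predicted
probabilities of the positive class.

    from sklearn.metrics import roc_auc_score, roc_curve

    # Hypothetical labels and predicted positive-class scores.
    y_true  = [0, 0, 0, 0, 1, 1, 1, 1]
    y_score = [0.1, 0.4, 0.35, 0.8, 0.45, 0.6, 0.7, 0.9]

    # One (FPR, TPR) point per score threshold.
    fpr, tpr, thresholds = roc_curve(y_true, y_score)
    print(fpr)  # false positive rate at each threshold
    print(tpr)  # true positive rate at each threshold

    print(roc_auc_score(y_true, y_score))  # area under the curve: 0.8125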
Thank You!