Lecture 12 Dr. Lamiaa
Bayesian classification is a probabilistic approach based on Bayes' Theorem, which provides a principled
way of making decisions under uncertainty.
Understanding Bayes' Theorem:
Bayes' Theorem: named after Reverend Thomas Bayes, this theorem describes the probability of an event based on prior knowledge of conditions that might be related to the event. Mathematically, it is expressed as:
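P(A | B) = P(B | A) · P(A) / P(B)
where P(A | B) is the posterior probability of A given evidence B, P(B | A) is the likelihood of the evidence given A, P(A) is the prior probability of A, and P(B) is the probability of the evidence.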
The fundamental Naive Bayes assumption is that each feature makes an independent and equal contribution to the outcome (the factorization this yields is shown after the list):
•Feature independence: The features of the data are conditionally independent of each
other, given the class label.
•Features are equally important: All features are assumed to contribute equally to the
prediction of the class label.
•No missing data: The data should not contain any missing values.
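Under these assumptions, for a class label y and a feature vector X = (x1, x2, ..., xn), the posterior is proportional to the prior times the per-feature likelihoods:
P(y | x1, ..., xn) ∝ P(y) · P(x1 | y) · P(x2 | y) · ... · P(xn | y)
and the classifier predicts the class y that maximizes this product.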
Example 1: Text Classification with Naive Bayes
Consider a spam detection problem where we classify emails as 'spam' or 'not spam'. Our
features are words in the email, and the classes are 'spam' (S) and 'not spam' (¬S).
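To make this concrete, the sketch below implements a small multinomial Naive Bayes spam classifier from scratch in Python. The four training emails and the test phrases are invented toy data (not from the lecture), so the numbers are purely illustrative.

```python
import math
from collections import Counter

# Hypothetical toy training set (not from the lecture): tokenized emails with labels.
train = [
    ("win money now".split(), "spam"),
    ("limited offer win prize".split(), "spam"),
    ("meeting schedule attached".split(), "not spam"),
    ("project update and schedule".split(), "not spam"),
]

# Class frequencies and per-class word frequencies.
class_counts = Counter(label for _, label in train)
word_counts = {c: Counter() for c in class_counts}
for words, label in train:
    word_counts[label].update(words)
vocab = {w for words, _ in train for w in words}

def predict(words):
    """Return the class with the highest Naive Bayes log-score for the given words."""
    best_class, best_score = None, float("-inf")
    total_docs = sum(class_counts.values())
    for c in class_counts:
        score = math.log(class_counts[c] / total_docs)           # log prior P(c)
        denom = sum(word_counts[c].values()) + len(vocab)         # Laplace smoothing
        for w in words:
            score += math.log((word_counts[c][w] + 1) / denom)    # log P(w | c)
        if score > best_score:
            best_class, best_score = c, score
    return best_class

print(predict("win a prize now".split()))                # -> 'spam'
print(predict("schedule the project meeting".split()))   # -> 'not spam'
```

Log-probabilities are used here to avoid numerical underflow when many word likelihoods are multiplied together.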
Example 2
In the weather ('play tennis') dataset, each instance records Outlook, Temperature, Humidity, and Windy, and the label is whether play takes place. One training instance and its label:
X = (Rainy, Hot, High, False), y = No
The new instance to classify is:
today = (Sunny, Hot, Normal, False)
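The lecture's weather table did not survive into this text, so the sketch below assumes the classic 14-row 'play tennis' dataset usually used with this example (its first row matches X above). The table, and therefore the exact probabilities, are an assumption; the computation itself is the standard Naive Bayes calculation.

```python
# Assumed data: the classic 14-row weather / "play tennis" table (Outlook,
# Temperature, Humidity, Windy, Play). It stands in for the lecture's own table.
data = [
    ("Rainy",    "Hot",  "High",   False, "No"),
    ("Rainy",    "Hot",  "High",   True,  "No"),
    ("Overcast", "Hot",  "High",   False, "Yes"),
    ("Sunny",    "Mild", "High",   False, "Yes"),
    ("Sunny",    "Cool", "Normal", False, "Yes"),
    ("Sunny",    "Cool", "Normal", True,  "No"),
    ("Overcast", "Cool", "Normal", True,  "Yes"),
    ("Rainy",    "Mild", "High",   False, "No"),
    ("Rainy",    "Cool", "Normal", False, "Yes"),
    ("Sunny",    "Mild", "Normal", False, "Yes"),
    ("Rainy",    "Mild", "Normal", True,  "Yes"),
    ("Overcast", "Mild", "High",   True,  "Yes"),
    ("Overcast", "Hot",  "Normal", False, "Yes"),
    ("Sunny",    "Mild", "High",   True,  "No"),
]

today = ("Sunny", "Hot", "Normal", False)

def posteriors(query):
    """Score each class with P(class) * product over features of P(value | class)."""
    scores = {}
    for c in {row[-1] for row in data}:
        rows_c = [row for row in data if row[-1] == c]
        score = len(rows_c) / len(data)                       # prior P(c)
        for i, value in enumerate(query):                     # likelihoods P(x_i | c)
            score *= sum(1 for row in rows_c if row[i] == value) / len(rows_c)
        scores[c] = score
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}          # normalize over classes

result = posteriors(today)
print(result)                          # posterior over {'Yes', 'No'} for today
print(max(result, key=result.get))     # the predicted class
```

Under this assumed table, the classifier predicts 'Yes' for today: the Normal-humidity and no-wind likelihoods, together with the larger prior for 'Yes', outweigh the slightly higher likelihoods of Sunny and Hot under 'No'.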
Evaluating your Naive Bayes classifier