
Program Name: B.Tech CSE
Semester: 5th
Course Name: Machine Learning
Course Code: PEC-CS-D-501 (I)
Facilitator Name: Aastha
Classification
Discussion on: Classification by Naïve Bayes
Contents

● What is Conditional Probability?
● What is Bayes' Theorem?
● What is a Naive Bayes Classifier?
● Types of Naive Bayes Algorithm
● Classification as Supervised Learning
● Unsupervised Classification
Conditional Probability

1. In probability theory, conditional probability is a measure of the probability
of an event given that another event has already occurred.

2. If the event of interest is A and the event B is known or assumed to have
occurred, "the conditional probability of A given B", or "the probability of A
under the condition B", is usually written as P(A|B), or sometimes P_B(A).
Examples
Chances of a Cough

The probability that any given person has a cough on any given day may be
only 5%. But if we know or assume that the person has a cold, then they are
much more likely to be coughing. The conditional probability of coughing
given that the person has a cold might be a much higher 75%.
Marbles in a Bag

2 blue and 3 red marbles are in a bag.

What are the chances of getting a blue marble?

???
Marbles in a Bag
2 blue and 3 red marbles are in a bag.

What are the chances of getting a blue marble?

Answer: The chance is 2 in 5, i.e. P(blue) = 2/5.
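The marble example can be checked in a few lines. This is a minimal sketch using exact fractions; the conditional-probability step (drawing a second marble without replacement) is an added illustration, not part of the slide:

```python
from fractions import Fraction

# Bag contents: 2 blue and 3 red marbles.
blue, red = 2, 3
total = blue + red

# P(blue) on a single draw.
p_blue = Fraction(blue, total)  # 2/5

# Conditional probability: P(second draw is blue | first draw was blue),
# drawing without replacement -- one blue marble is already gone.
p_blue_given_blue = Fraction(blue - 1, total - 1)  # 1/4

print(p_blue)             # 2/5
print(p_blue_given_blue)  # 1/4
```

Using `Fraction` keeps the arithmetic exact, which matches the "2 in 5" phrasing of the answer.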


Bayes Theorem

1. In probability theory and statistics, Bayes' theorem (alternatively Bayes'
law or Bayes' rule) describes the probability of an event based on prior
knowledge of conditions that might be related to the event.

2. For example, if cancer is related to age, then, using Bayes' theorem, a
person's age can be used to more accurately assess the probability that
they have cancer, compared to an assessment of the probability of cancer
made without knowledge of the person's age.
Classification by Bayes' Theorem

The formula for Bayes' theorem:

P(H|E) = P(E|H) * P(H) / P(E)

where

1. P(H) is the probability of hypothesis H being true. This is known as the
prior probability.
2. P(E) is the probability of the evidence (regardless of the hypothesis).
3. P(E|H) is the probability of the evidence given that the hypothesis is true.
4. P(H|E) is the probability of the hypothesis given that the evidence is
present. This is known as the posterior probability.
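The formula above can be applied directly. This is a small sketch with illustrative numbers that are not from the slides: a hypothesis with a 1% prior, evidence that appears 90% of the time when the hypothesis is true, and 8% of the time overall:

```python
def bayes(p_h, p_e_given_h, p_e):
    """Posterior via Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    return p_e_given_h * p_h / p_e

# Illustrative numbers (assumed for this example):
# prior P(H) = 0.01, likelihood P(E|H) = 0.9, evidence P(E) = 0.08.
posterior = bayes(p_h=0.01, p_e_given_h=0.9, p_e=0.08)
print(posterior)  # ≈ 0.1125
```

Note how weak evidence still moves a 1% prior up to roughly 11%: the posterior depends on all three terms, not on the likelihood alone.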
NAIVE BAYES CLASSIFIER

● Naive Bayes is a kind of classifier that uses Bayes' theorem.

● It predicts membership probabilities for each class, i.e. the probability
that a given record or data point belongs to a particular class.

● The class with the highest probability is considered the most likely
class. This is also known as the Maximum A Posteriori (MAP) decision rule.
Assumption
The Naive Bayes classifier assumes that all the features are unrelated to
each other: the presence or absence of a feature does not influence the
presence or absence of any other feature.

"A fruit may be considered to be an apple if it is red, round, and about 4″ in
diameter. Even if these features depend on each other or upon the existence
of the other features, a naive Bayes classifier considers all of these properties
to independently contribute to the probability that this fruit is an apple."

In real datasets, we test a hypothesis given multiple evidences (features), so
the calculations become complicated. To simplify the work, the feature
independence assumption is used to 'uncouple' the multiple evidences and
treat each one as independent.

P(H|Multiple Evidences) = P(E1|H) * P(E2|H) * ... * P(En|H) * P(H) / P(Multiple Evidences)
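The multiple-evidence formula can be sketched on the fruit example. The priors and per-feature likelihood tables below are invented for illustration; note that the shared denominator P(Multiple Evidences) is dropped, since it does not change which class scores highest:

```python
# Hypothetical priors and per-feature likelihoods (assumed numbers,
# not from the slides) for two candidate classes.
priors = {"apple": 0.5, "cherry": 0.5}
likelihoods = {
    "apple":  {"red": 0.7, "round": 0.90, "large": 0.80},
    "cherry": {"red": 0.9, "round": 0.95, "large": 0.05},
}

def posterior_score(cls, evidences):
    # P(E1|H) * P(E2|H) * ... * P(En|H) * P(H); the denominator
    # P(Multiple Evidences) is the same for every class, so it is
    # omitted when we only need the argmax.
    score = priors[cls]
    for e in evidences:
        score *= likelihoods[cls][e]
    return score

# Observed features: red, round, and about 4" in diameter ("large").
evidences = ["red", "round", "large"]
scores = {c: posterior_score(c, evidences) for c in priors}
best = max(scores, key=scores.get)  # the MAP class
print(best, scores)  # apple wins: 0.252 vs 0.021375
```

Each feature multiplies in independently, which is exactly the "uncoupling" the independence assumption buys.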
Aravali College of Engineering And Management
Jasana, Tigoan Road, Neharpar, Faridabad, Delhi NCR
Toll Free Number: 91-8527538785
09/02/2020 Website: www.acem.edu.in
