Machine Learning PPT (Alisha)

The document discusses the Naive Bayes algorithm and Gibbs sampling. Naive Bayes is a supervised learning algorithm used for classification, based on Bayes' theorem: it predicts the probability of a class from an object's feature values. Example applications include spam filtering and sentiment analysis. Gibbs sampling is introduced as a technique for approximating a joint distribution when direct sampling is difficult, by instead sampling from the conditional distributions in a Markov chain.


Naïve Bayes & Gibbs Algorithm

Presented By: Alisha Inamdar
What is the Naive Bayes Algorithm?

• Naive Bayes is a supervised learning algorithm, based on Bayes' theorem, that is used for solving classification problems.
• It is mainly used for text classification, which involves high-dimensional training datasets.
• The Naive Bayes classifier is one of the simplest and most effective classification algorithms; it helps build fast machine learning models that can make quick predictions.
• It is a probabilistic classifier, which means it predicts on the basis of the probability of an object belonging to each class.
• Popular applications of the Naive Bayes algorithm include spam filtering, sentiment analysis, and article classification (see the sketch after this list).
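
As a quick illustration, here is a minimal sketch of a Naive Bayes spam filter, assuming scikit-learn is available; the tiny corpus and labels are invented purely for demonstration.

# Hypothetical toy corpus; a real training set would be much larger.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

texts = ["win a free prize now", "meeting at noon tomorrow",
         "free money claim now", "lunch with the project team"]
labels = ["spam", "ham", "spam", "ham"]

# Turn each message into a vector of word counts (the high-dimensional features).
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(texts)

# Fit the classifier and predict the class of a new message.
model = MultinomialNB()
model.fit(X, labels)
print(model.predict(vectorizer.transform(["claim your free prize"])))  # likely ['spam']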
The Mathematics of the Naive Bayes Algorithm

• Bayes' theorem finds the probability of an event occurring given the probability of another event that has already occurred. Bayes' theorem is stated mathematically as the following equation:

P(A|B) = P(B|A) · P(A) / P(B)

where A and B are events and P(B) ≠ 0.

• Basically, we are trying to find the probability of event A, given that event B is true. Event B is also termed the evidence.
• P(A) is the prior probability of A, i.e. the probability of the event before the evidence is seen. The evidence is an attribute value of an unknown instance (here, event B).
• P(A|B) is the posterior probability of A given B, i.e. the probability of the event after the evidence is seen. A worked numeric example follows below.
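
A small worked example, with invented numbers: suppose 20% of emails are spam, the word "free" appears in 60% of spam emails and in only 5% of non-spam emails. Bayes' theorem then gives the probability that an email containing "free" is spam.

# Worked Bayes' theorem example with hypothetical probabilities.
p_spam = 0.2                 # prior: P(spam)
p_free_given_spam = 0.6      # likelihood: P("free" | spam)
p_free_given_ham = 0.05      # likelihood: P("free" | not spam)

# Evidence: total probability of observing the word "free".
p_free = p_free_given_spam * p_spam + p_free_given_ham * (1 - p_spam)

# Posterior via Bayes' theorem: P(spam | "free").
p_spam_given_free = p_free_given_spam * p_spam / p_free
print(p_spam_given_free)     # ≈ 0.75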
Where is Naive Bayes Used?

You can use Naive Bayes for the following things:


• Face Recognition:
As a classifier, it can be used to identify faces or facial features such as the nose, mouth, and eyes.
• Weather Prediction:
It can be used to predict whether the weather will be good or bad.
• Medical Diagnosis:
Doctors can diagnose patients using the information the classifier provides. Healthcare professionals can use Naive Bayes to indicate whether a patient is at high risk for certain diseases and conditions, such as heart disease, cancer, and other ailments.
Example
• Shopping Example
• Problem statement: predict whether a person will purchase a product for a specific combination of day, discount, and free delivery, using a Naive Bayes classifier (a sketch with a toy dataset follows).
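
A minimal sketch of how this problem could be set up, assuming scikit-learn is available; the purchase records below are a hypothetical toy dataset invented for illustration, not real data.

# Hypothetical records: [day, discount, free_delivery] -> purchased?
from sklearn.naive_bayes import CategoricalNB
from sklearn.preprocessing import OrdinalEncoder

records = [["weekday", "yes", "yes"], ["weekend", "no", "no"],
           ["weekday", "yes", "no"],  ["holiday", "yes", "yes"],
           ["weekend", "yes", "yes"], ["weekday", "no", "no"]]
purchased = ["yes", "no", "yes", "yes", "yes", "no"]

# CategoricalNB expects integer-coded categorical features.
encoder = OrdinalEncoder()
X = encoder.fit_transform(records)

model = CategoricalNB()
model.fit(X, purchased)

# Predicted class and class probabilities for one feature combination.
query = encoder.transform([["holiday", "yes", "yes"]])
print(model.predict(query), model.predict_proba(query))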
Gibbs Sampling: Basic Idea

• Gibbs sampling is useful when the joint distribution is not known explicitly or is difficult to sample from directly, but the conditional distributions are known or easy to sample from.
• Even when the joint distribution is known, the computational burden of calculating it directly may be huge.
• The Gibbs sampling algorithm generates a sequence of samples drawn from the individual conditional distributions; this sequence constitutes a Markov chain whose samples approximate the joint distribution (see the sketch below).
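
A minimal sketch of the idea: for a bivariate standard normal with correlation rho, each full conditional is a univariate normal that is easy to sample, so alternately sampling x given y and y given x produces a Markov chain that approximates the joint distribution. The target distribution and parameter values here are chosen purely for illustration.

import numpy as np

rng = np.random.default_rng(0)
rho = 0.8                          # correlation of the target bivariate normal
n_samples, burn_in = 10000, 1000

x, y = 0.0, 0.0                    # arbitrary starting point
samples = []
for i in range(n_samples + burn_in):
    # Full conditional: x | y ~ N(rho * y, 1 - rho**2)
    x = rng.normal(rho * y, np.sqrt(1 - rho ** 2))
    # Full conditional: y | x ~ N(rho * x, 1 - rho**2)
    y = rng.normal(rho * x, np.sqrt(1 - rho ** 2))
    if i >= burn_in:               # discard early samples before the chain mixes
        samples.append((x, y))

samples = np.array(samples)
print(np.corrcoef(samples[:, 0], samples[:, 1])[0, 1])  # should be close to 0.8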
