S6, S7, S8 CS - U4 Getter Setter EM Algorithm
Unit 4
S7- Getting values & setting values
S8- EM algorithm
Getting values & setting values
• The main purpose of using getters and setters in
object-oriented programs is to ensure data
encapsulation.
• Private variables in Python are not actually hidden
fields as in other object-oriented languages; prefixing
an attribute with an underscore (e.g., _age) only signals
that it is intended for internal use.
Getting values & setting values
# Python program showing the use
# of getter and setter methods
class User:
    def __init__(self, age=0):
        self._age = age

    # getter method
    def get_age(self):
        return self._age

    # setter method
    def set_age(self, age):
        self._age = age
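A brief usage sketch (an illustrative addition, not part of the original slide) showing how the getter and setter are called:

user = User(21)
print(user.get_age())   # prints 21
user.set_age(22)        # update the internal _age attribute via the setter
print(user.get_age())   # prints 22

In idiomatic Python the same encapsulation is often expressed with the @property decorator, but explicit get/set methods mirror the style of the example above.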
EM algorithm
• What is the EM algorithm?
• The Expectation-Maximization (EM) algorithm is
  – an iterative approach that underlies various unsupervised
    machine learning algorithms
  – used to determine the local maximum likelihood
    estimates (MLE) or maximum a posteriori (MAP) estimates
    for the unobservable variables in statistical models.
• In other words, it is a technique for finding maximum
likelihood estimates when latent variables are present,
and it is commonly applied to what is called the latent
variable model.
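Written out in standard notation (an addition for clarity; here X denotes the observed data, Z the latent variables, and θ the model parameters), the quantity EM tries to maximize is the observed-data log-likelihood, which requires summing over the unobserved Z:

\log p(X \mid \theta) = \log \sum_{Z} p(X, Z \mid \theta)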
EM algorithm
• A latent variable model consists of both
observable and unobservable variables: the
observable variables can be measured directly,
while the unobservable ones are inferred from
the observed variables.
• These unobservable variables are known as
latent variables.
EM algorithm
• The EM algorithm underlies various unsupervised
ML algorithms, such as the k-means clustering
algorithm.
• Being an iterative approach, it alternates between
two modes.
• In the first mode, we estimate the missing or
latent variables; hence it is referred to as
the Expectation/estimation step (E-step).
• The other mode optimizes the parameters of the
model so that it explains the data more clearly.
This second mode is known as the Maximization
step (M-step).
EM algorithm
Expectation step (E-step): estimate (guess) all missing or latent values in the
dataset, so that after this step there are no missing values left.
Maximization step (M-step): use the data completed in the E-step to update the
model parameters.
Repeat the E-step and M-step until the values converge.
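In the usual notation (not on the slide; X is the observed data, Z the latent variables, θ the parameters, and θ^(t) the current estimate), one EM iteration can be written as:

\text{E-step:}\quad Q(\theta \mid \theta^{(t)}) = \mathbb{E}_{Z \mid X,\, \theta^{(t)}}\!\left[ \log p(X, Z \mid \theta) \right]
\text{M-step:}\quad \theta^{(t+1)} = \arg\max_{\theta}\, Q(\theta \mid \theta^{(t)})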
EM algorithm
Steps in EM Algorithm
The EM algorithm is completed mainly in 4 steps: the Initialization Step,
Expectation Step, Maximization Step, and Convergence Step. These steps are
explained as follows:
1st Step: Initialize the parameter values. The system is provided with incomplete
observed data, under the assumption that the data are obtained from a specific model.
2nd Step: The Expectation or E-step, which uses the observed data to estimate or
guess the values of the missing or incomplete data. The E-step primarily updates
the estimates of the latent variables.
3rd Step: The Maximization or M-step, where we use the complete data obtained
from the 2nd step to update the parameter values. The M-step primarily updates
the hypothesis (the model parameters).
4th Step: Check whether the values of the latent variables are converging. If yes,
stop the process; otherwise, repeat from the 2nd step until convergence occurs.
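The following is a minimal sketch of these four steps for a two-component, one-dimensional Gaussian mixture. The synthetic data, the choice of two components, and the convergence tolerance are illustrative assumptions, not part of the slide:

import numpy as np

rng = np.random.default_rng(0)
# Synthetic observed data drawn from two Gaussians (true parameters unknown to EM)
x = np.concatenate([rng.normal(-2.0, 1.0, 150), rng.normal(3.0, 1.5, 100)])

def gauss_pdf(x, mu, var):
    # Density of a univariate Gaussian with mean mu and variance var
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

# 1st Step: initialize the parameters (mixing weights, means, variances)
w = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

prev_ll = -np.inf
for _ in range(200):
    # 2nd Step (E-step): estimate the responsibility of each component
    # for each observation, given the current parameters.
    dens = w * gauss_pdf(x[:, None], mu, var)          # shape (n, 2)
    resp = dens / dens.sum(axis=1, keepdims=True)

    # 3rd Step (M-step): update the parameters using the responsibilities.
    nk = resp.sum(axis=0)
    w = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk

    # 4th Step: check convergence of the observed-data log-likelihood.
    ll = np.log(dens.sum(axis=1)).sum()
    if abs(ll - prev_ll) < 1e-6:
        break
    prev_ll = ll

print("weights:", w, "means:", mu, "variances:", var)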
Gaussian Mixture Model (GMM)
https://www.kaggle.com/code/charel/learn-by-example-expectation-maximization/notebook
The Gaussian Mixture Model (GMM) is a mixture model that
represents the data as a weighted combination of several
Gaussian probability distributions whose parameters are not
specified in advance; the EM algorithm is commonly used to
estimate them.
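In practice a GMM is usually fitted with a library routine that runs EM internally. A minimal sketch, assuming scikit-learn is available (the data and settings below are illustrative):

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Illustrative 1-D data drawn from two clusters
X = np.concatenate([rng.normal(-2.0, 1.0, (150, 1)),
                    rng.normal(3.0, 1.5, (100, 1))])

# GaussianMixture estimates the mixture parameters with the EM algorithm
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
print(gmm.weights_)       # estimated mixing weights
print(gmm.means_)         # estimated component means
print(gmm.covariances_)   # estimated component covariances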