Bayesian Belief Networks (BBNs) are probabilistic graphical models that represent dependencies among attributes, overcoming limitations of Naïve Bayesian classifiers which assume conditional independence. BBNs consist of a directed acyclic graph and conditional probability tables, allowing for the computation of joint probability distributions. Training involves understanding network topology and variables, utilizing methods like gradient descent to optimize conditional probability table entries.
Bayesian Belief Networks
By Umakanth N

Session Outcomes
Train & Use Bayesian belief networks
Bayesian Belief Networks
Probabilistic graphical models that allow the representation of dependencies among subsets of attributes.
Limitation of the Naïve Bayesian classifier: it assumes that the attributes are conditionally independent of each other.
If this condition holds, the Naïve Bayesian classifier is the most accurate among classifiers; in practice, however, dependencies do exist between attributes.
A Bayesian belief network is also known as a belief network, Bayesian network, or probabilistic network.

Bayesian Belief Networks (Contd.)
It is defined by two components:
A directed acyclic graph, in which each node is a random variable that can be discrete or continuous-valued
A set of conditional probability tables (CPTs), one per variable, giving the probability of that variable conditioned on its parents
Together, these two components specify the full joint probability distribution (a small sketch follows below).
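To illustrate how the DAG and the CPTs yield a joint distribution, here is a minimal sketch using the standard factorization P(X1, ..., Xn) = Π P(Xi | Parents(Xi)). The tiny Rain/WetGrass network, the dictionary layout, and the probability values are assumptions made only for this example; they are not from the slides.

```python
# Each node stores its parents and a CPT keyed by (parent values) -> P(node = True).
network = {
    "Rain":     {"parents": [],       "cpt": {(): 0.2}},
    "WetGrass": {"parents": ["Rain"], "cpt": {(True,): 0.9, (False,): 0.1}},
}

def joint_probability(assignment):
    """P(assignment) = product over nodes of P(node | parents), read from the CPTs."""
    prob = 1.0
    for name, node in network.items():
        parent_values = tuple(assignment[p] for p in node["parents"])
        p_true = node["cpt"][parent_values]
        prob *= p_true if assignment[name] else (1.0 - p_true)
    return prob

# Example: P(Rain = True, WetGrass = True) = 0.2 * 0.9 = 0.18
print(joint_probability({"Rain": True, "WetGrass": True}))
```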
Training Bayesian Belief Networks
The keys to training a Bayesian belief network are the network topology and the network variables (observable / hidden).
If the network topology is known and all variables are observable, the network can be trained by computing the Conditional Probability Table (CPT) entries, as in Naïve Bayesian classification.
If the topology is given and some variables are hidden/missing, the gradient descent method is used for learning the values of the CPT entries: it searches for the CPT values w_ijk that best model the training data.

Gradient Descent Method
Performs a greedy hill-climbing approach: in each iteration, the algorithm moves towards the best solution, without backtracking.
Steps (sketched in code after this list):
Initialize the weights w_ijk with random probability values
Compute the gradients
Take a small step in the direction of the gradients
Renormalize the weights
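Below is a hedged sketch of the four steps above. It assumes the CPT entries are stored as w[i][k][j] = P(Y_i = y_ij | U_i = u_ik), and that a separate inference routine posterior(d, i, j, k) supplies P(Y_i = y_ij, U_i = u_ik | X_d) for each training tuple X_d; these names, the data layout, and the learning rate are illustrative assumptions, not part of the slides.

```python
def train_cpts(w, data, posterior, learning_rate=0.01, iterations=100):
    """Gradient-based learning of CPT entries w[i][k][j] = P(Y_i = y_ij | U_i = u_ik).

    Step 1 (initializing w with random probability values) is assumed to have been
    done by the caller. posterior(d, i, j, k) is assumed to return
    P(Y_i = y_ij, U_i = u_ik | X_d), computed by BBN inference (not shown here).
    """
    for _ in range(iterations):
        for i in w:                      # each network variable Y_i
            for k in w[i]:               # each parent configuration u_ik
                # Step 2: gradient of the log-likelihood for each CPT entry
                gradients = {
                    j: sum(posterior(d, i, j, k) / w[i][k][j] for d in data)
                    for j in w[i][k]
                }
                # Step 3: take a small step in the direction of the gradients
                for j in w[i][k]:
                    w[i][k][j] = max(w[i][k][j] + learning_rate * gradients[j], 1e-9)
                # Step 4: renormalize so the entries for this parent configuration sum to 1
                total = sum(w[i][k].values())
                for j in w[i][k]:
                    w[i][k][j] /= total
    return w
```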
This form of learning is called Adaptive Probabilistic Networks.

Queries?