
NISS, Computer Engineering

IT00CH91-3003 Artificial Intelligence

Assignment 4
on
Exercise Set 4

By: Tigabu Yaya Gishene


[email protected]

Submitted to: Prof. Johan Lilius


Submission Date: Dec 10th, 2023
1.
Bayesian Network:

        CarryVirus
         /      \
     TestA      TestB

We need to determine the probabilities of having the virus based on the results of two tests.
"CarryVirus" indicates the presence of the virus. Therefore, we aim to calculate
P(CarryVirus|TestA) and P(CarryVirus|TestB).

P(CarryVirus | TestA) = P(TestA | CarryVirus) P(CarryVirus) / P(TestA)

P(TestA) = P(TestA | CarryVirus) P(CarryVirus) + P(TestA | ¬CarryVirus) P(¬CarryVirus)
         = 0.95 × 0.01 + 0.10 × 0.99
         = 0.1085

P(CarryVirus | TestA) = (0.95 × 0.01) / 0.1085 ≈ 0.0876

P(CarryVirus | TestB) = P(TestB | CarryVirus) P(CarryVirus) / P(TestB)

P(TestB) = P(TestB | CarryVirus) P(CarryVirus) + P(TestB | ¬CarryVirus) P(¬CarryVirus)
         = 0.90 × 0.01 + 0.05 × 0.99
         = 0.0585

P(CarryVirus | TestB) = (0.90 × 0.01) / 0.0585 ≈ 0.1538

Hence:
P(CarryVirus | TestB) > P(CarryVirus | TestA)
- Test B provides a stronger indication of a person having the virus.
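
As a quick sanity check, the two posteriors can be reproduced in a few lines of Python. This is only a sketch of the calculation above; the helper name posterior is ours, and the numbers are the 1% prior, the sensitivities, and the false-positive rates used in the hand calculation.

```python
# Sanity check for Problem 1: Bayes' rule with the values used above
# (P(CarryVirus) = 0.01; P(TestA|V) = 0.95, P(TestA|~V) = 0.10;
#  P(TestB|V) = 0.90, P(TestB|~V) = 0.05).

def posterior(prior, p_pos_given_virus, p_pos_given_no_virus):
    """P(CarryVirus | positive test) via Bayes' rule."""
    evidence = p_pos_given_virus * prior + p_pos_given_no_virus * (1 - prior)
    return p_pos_given_virus * prior / evidence

print(posterior(0.01, 0.95, 0.10))  # TestA -> ~0.0876
print(posterior(0.01, 0.90, 0.05))  # TestB -> ~0.1538
```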
2.
One possible way to solve:
P(A = T) = 0.6
P(B = T) = 0.15 + x
P(A = T) P(B = T) = P(A = T, B = T)    (independence)
0.6 (0.15 + x) = 0.15
x = 0.15 / 0.6 − 0.15 = 0.1
y = 0.4 − x = 0.4 − 0.1 = 0.3
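
The same answer can be checked programmatically. The sketch below assumes the joint table implied by the working above: P(A=T, B=T) = 0.15, P(A=F, B=T) = x, and P(A=F, B=F) = y (the exercise's original table is not reproduced here).

```python
# Check of the Problem 2 solution under the assumed joint-table layout:
# P(A=T, B=T) = 0.15, P(A=F, B=T) = x, P(A=F, B=F) = y,
# with P(A=T) = 0.6 and A, B independent.
p_a = 0.6
p_ab = 0.15                      # P(A=T, B=T)
p_b = p_ab / p_a                 # independence: P(B=T) = P(A=T, B=T) / P(A=T) = 0.25
x = p_b - p_ab                   # P(A=F, B=T) = P(B=T) - P(A=T, B=T)
y = (1 - p_a) - x                # the A=F row must sum to 0.4
print(round(x, 4), round(y, 4))  # 0.1 0.3
```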
3.
Proof:
P(A, B | e) = P(A, B, e) / P(e)             (definition of conditional probability)
            = P(A | B, e) P(B, e) / P(e)    (chain rule)
            = P(A | B, e) P(B | e)          (definition of conditional probability)
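
The identity can also be illustrated numerically on an arbitrary joint distribution over three binary variables; the numbers below are made up purely for the check and are not part of the exercise.

```python
# Numeric illustration of P(A, B | e) = P(A | B, e) P(B | e).
from itertools import product

probs = [0.10, 0.05, 0.20, 0.15, 0.05, 0.25, 0.10, 0.10]    # sums to 1.0
joint = dict(zip(product([True, False], repeat=3), probs))  # keys are (a, b, e)

def marg(a=None, b=None, e=None):
    """Marginal probability of the partially specified assignment."""
    return sum(p for (av, bv, ev), p in joint.items()
               if (a is None or av == a) and (b is None or bv == b)
               and (e is None or ev == e))

lhs = marg(a=True, b=True, e=True) / marg(e=True)                   # P(A, B | e)
rhs = (marg(a=True, b=True, e=True) / marg(b=True, e=True)) \
      * (marg(b=True, e=True) / marg(e=True))                       # P(A|B,e) P(B|e)
print(abs(lhs - rhs) < 1e-12)  # True
```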
4.
Proof: Assume P(A | B) = P(A).
P(¬A | B) = 1 − P(A | B)    (probabilities sum to 1)
          = 1 − P(A)        (by assumption)
          = P(¬A)
5. Inference by enumeration answers a query by summing the joint probabilities of the
complete assignments (atomic events) that are consistent with it. Each joint probability is
read off the network as a product of conditional probabilities, one per variable given its
parents: Pr(P1, ..., Pn) = Pr(P1 | parents(P1)) × ... × Pr(Pn | parents(Pn)). This method
does not exploit conditional independence to simplify inference further. While it is a
straightforward and easily formalized algorithm, it comes with a high computational cost:
its complexity grows exponentially with the number of variables.
Pr(¬p3) = Σ_{P1, P2, P4} Pr(P1, P2, ¬p3, P4)
        = Σ_{P1, P2, P4} Pr(P1) Pr(P2 | P1) Pr(¬p3 | P2) Pr(P4 | P2)
        = Pr(P1) Pr(P2 | P1) Pr(¬p3 | P2) Pr(P4 | P2) +
          Pr(P1) Pr(P2 | P1) Pr(¬p3 | P2) Pr(¬P4 | P2) +
          Pr(P1) Pr(¬P2 | P1) Pr(¬p3 | ¬P2) Pr(P4 | ¬P2) +
          Pr(P1) Pr(¬P2 | P1) Pr(¬p3 | ¬P2) Pr(¬P4 | ¬P2) +
          Pr(¬P1) Pr(P2 | ¬P1) Pr(¬p3 | P2) Pr(P4 | P2) +
          Pr(¬P1) Pr(P2 | ¬P1) Pr(¬p3 | P2) Pr(¬P4 | P2) +
          Pr(¬P1) Pr(¬P2 | ¬P1) Pr(¬p3 | ¬P2) Pr(P4 | ¬P2) +
          Pr(¬P1) Pr(¬P2 | ¬P1) Pr(¬p3 | ¬P2) Pr(¬P4 | ¬P2)
        = 0.4 × 0.8 × 0.8 × 0.8 +
          0.4 × 0.8 × 0.8 × 0.2 +
          0.4 × 0.2 × 0.7 × 0.5 +
          0.4 × 0.2 × 0.7 × 0.5 +
          0.6 × 0.5 × 0.8 × 0.8 +
          0.6 × 0.5 × 0.8 × 0.2 +
          0.6 × 0.5 × 0.7 × 0.5 +
          0.6 × 0.5 × 0.7 × 0.5
        = 0.2048 + 0.0512 + 0.028 + 0.028 + 0.192 + 0.048 + 0.105 + 0.105
        = 0.762
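
The same number can be obtained mechanically. The sketch below assumes the network structure P1 → P2, P2 → P3, P2 → P4 and reads the CPT entries off the products in the hand calculation above (the exercise's original tables are not reproduced here).

```python
# Inference by enumeration for part (a). Structure and CPT values are
# assumptions reconstructed from the products above.
from itertools import product

PR_P1 = {True: 0.4, False: 0.6}                 # Pr(P1)
PR_P2 = {True: {True: 0.8, False: 0.2},         # Pr(P2 | P1 = True)
         False: {True: 0.5, False: 0.5}}        # Pr(P2 | P1 = False)
PR_P3 = {True: {True: 0.2, False: 0.8},         # Pr(P3 | P2 = True)
         False: {True: 0.3, False: 0.7}}        # Pr(P3 | P2 = False)
PR_P4 = {True: {True: 0.8, False: 0.2},         # Pr(P4 | P2 = True)
         False: {True: 0.5, False: 0.5}}        # Pr(P4 | P2 = False)

def joint(p1, p2, p3, p4):
    """Pr(P1, P2, P3, P4) as the product of the network's CPT entries."""
    return PR_P1[p1] * PR_P2[p1][p2] * PR_P3[p2][p3] * PR_P4[p2][p4]

# Part a: Pr(~P3) -- sum the joint over P1, P2, P4 with P3 fixed to False.
print(sum(joint(p1, p2, False, p4)
          for p1, p2, p4 in product([True, False], repeat=3)))  # 0.762
```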

Pr(P2 | ¬p3) = Pr(P2, ¬p3) / Pr(¬p3) = 0.496 / 0.762 ≈ 0.6509
Pr(P2, ¬p3) = Σ_{P1, P4} Pr(P1, P2, ¬p3, P4)
            = Σ_{P1, P4} Pr(P1) Pr(P2 | P1) Pr(¬p3 | P2) Pr(P4 | P2)
            = Pr(P1) Pr(P2 | P1) Pr(¬p3 | P2) Pr(P4 | P2) +
              Pr(P1) Pr(P2 | P1) Pr(¬p3 | P2) Pr(¬P4 | P2) +
              Pr(¬P1) Pr(P2 | ¬P1) Pr(¬p3 | P2) Pr(P4 | P2) +
              Pr(¬P1) Pr(P2 | ¬P1) Pr(¬p3 | P2) Pr(¬P4 | P2)
            = 0.4 × 0.8 × 0.8 × 0.8 +
              0.4 × 0.8 × 0.8 × 0.2 +
              0.6 × 0.5 × 0.8 × 0.8 +
              0.6 × 0.5 × 0.8 × 0.2
            = 0.2048 + 0.0512 + 0.192 + 0.048
            = 0.496
Pr(P1 | P2, ¬p3) = Pr(P1, P2, ¬p3) / Pr(P2, ¬p3) = 0.256 / 0.496 ≈ 0.5161

Pr(P1, P2, ¬p3) = Σ_{P4} Pr(P1, P2, ¬p3, P4)
                = Σ_{P4} Pr(P1) Pr(P2 | P1) Pr(¬p3 | P2) Pr(P4 | P2)
                = Pr(P1) Pr(P2 | P1) Pr(¬p3 | P2) Pr(P4 | P2) +
                  Pr(P1) Pr(P2 | P1) Pr(¬p3 | P2) Pr(¬P4 | P2)
                = 0.4 × 0.8 × 0.8 × 0.8 +
                  0.4 × 0.8 × 0.8 × 0.2
                = 0.2048 + 0.0512
                = 0.256
Pr(P2, ¬p3) = Pr(P1, P2, ¬p3) + Pr(¬P1, P2, ¬p3)
            = 0.256 + 0.24
            = 0.496

Pr(¬P1, P2, ¬p3) = Σ_{P4} Pr(¬P1, P2, ¬p3, P4)
                 = Σ_{P4} Pr(¬P1) Pr(P2 | ¬P1) Pr(¬p3 | P2) Pr(P4 | P2)
                 = Pr(¬P1) Pr(P2 | ¬P1) Pr(¬p3 | P2) Pr(P4 | P2) +
                   Pr(¬P1) Pr(P2 | ¬P1) Pr(¬p3 | P2) Pr(¬P4 | P2)
                 = 0.6 × 0.5 × 0.8 × 0.8 +
                   0.6 × 0.5 × 0.8 × 0.2
                 = 0.192 + 0.048
                 = 0.24

Pr(P1 | ¬p3, P4) = Pr(P1, ¬p3, P4) / Pr(¬p3, P4) = 0.2328 / 0.5298 ≈ 0.4394

Pr(P1, ¬p3, P4) = Σ_{P2} Pr(P1, P2, ¬p3, P4)
                = Σ_{P2} Pr(P1) Pr(P2 | P1) Pr(¬p3 | P2) Pr(P4 | P2)
                = Pr(P1) Pr(P2 | P1) Pr(¬p3 | P2) Pr(P4 | P2) +
                  Pr(P1) Pr(¬P2 | P1) Pr(¬p3 | ¬P2) Pr(P4 | ¬P2)
                = 0.4 × 0.8 × 0.8 × 0.8 +
                  0.4 × 0.2 × 0.7 × 0.5
                = 0.2048 + 0.028
                = 0.2328

Pr(¬p3, P4) = Pr(P1, ¬p3, P4) + Pr(¬P1, ¬p3, P4)
            = 0.2328 + 0.297
            = 0.5298

Pr(¬P1, ¬p3, P4) = Σ_{P2} Pr(¬P1, P2, ¬p3, P4)
                 = Σ_{P2} Pr(¬P1) Pr(P2 | ¬P1) Pr(¬p3 | P2) Pr(P4 | P2)
                 = Pr(¬P1) Pr(P2 | ¬P1) Pr(¬p3 | P2) Pr(P4 | P2) +
                   Pr(¬P1) Pr(¬P2 | ¬P1) Pr(¬p3 | ¬P2) Pr(P4 | ¬P2)
                 = 0.6 × 0.5 × 0.8 × 0.8 +
                   0.6 × 0.5 × 0.7 × 0.5
                 = 0.192 + 0.105
                 = 0.297
Hence:
a. Pr(¬p3) = 0.762
b. Pr(P2 | ¬p3) ≈ 0.6509
c. Pr(P1 | P2, ¬p3) ≈ 0.5161
d. Pr(P1 | ¬p3, P4) ≈ 0.4394
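
For completeness, parts (b)-(d) can be verified with the same assumed structure and CPTs as the enumeration sketch for part (a), by dividing the appropriate marginals:

```python
# Verification of parts (b)-(d) by enumeration, under the same assumed model.
from itertools import product

PR_P1 = {True: 0.4, False: 0.6}
PR_P2 = {True: {True: 0.8, False: 0.2}, False: {True: 0.5, False: 0.5}}
PR_P3 = {True: {True: 0.2, False: 0.8}, False: {True: 0.3, False: 0.7}}
PR_P4 = {True: {True: 0.8, False: 0.2}, False: {True: 0.5, False: 0.5}}

def joint(p1, p2, p3, p4):
    """Pr(P1, P2, P3, P4) as the product of the network's CPT entries."""
    return PR_P1[p1] * PR_P2[p1][p2] * PR_P3[p2][p3] * PR_P4[p2][p4]

def marginal(**fixed):
    """Sum the joint over every variable not pinned in `fixed` (keys p1..p4)."""
    total = 0.0
    for p1, p2, p3, p4 in product([True, False], repeat=4):
        values = {'p1': p1, 'p2': p2, 'p3': p3, 'p4': p4}
        if all(values[name] == v for name, v in fixed.items()):
            total += joint(p1, p2, p3, p4)
    return total

print(marginal(p2=True, p3=False) / marginal(p3=False))                    # b: ~0.6509
print(marginal(p1=True, p2=True, p3=False) / marginal(p2=True, p3=False))  # c: ~0.5161
print(marginal(p1=True, p3=False, p4=True) / marginal(p3=False, p4=True))  # d: ~0.4394
```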
