Artificial Intelligence

The document discusses various informed search algorithms used in artificial intelligence including greedy best-first search, A* search, hill climbing, simulated annealing, local beam search, genetic algorithms, and adversarial search algorithms like minimax and alpha-beta pruning. It provides descriptions of each algorithm, including how they work, examples of their use, and comparisons of their advantages and disadvantages. The search algorithms are used to efficiently find optimal or near-optimal solutions to problems by evaluating states based on heuristics.

Artificial Intelligence

Informed Searches

Dr. Muhammad Awais


Outline
• Greedy best first search
• A* Search
• Hill Climbing Search
• Simulated annealing
• Local beam search
• Genetic algorithm (GA)
• Minmax algorithm (adversarial search)
• Alpha-beta pruning
Greedy best first search
• Expand the node that is closest to the goal
• (Closest to the goal) the state that is most likely to
lead to the goal
• Node evaluation by a heuristic function
• f(n) = h(n)
• h(n) = estimated cost of the cheapest
path from node n to a goal node
o Straight-line distance heuristic
(shortest path heuristic)
o Good degree (job heuristic)
o Paddle press speed heuristic
Greedy best first search
• h(n) takes a node as input
• h(n) considers only the state of the node
currently under consideration
• h(n) = straight-line distance = hSLD
• hSLD cannot be computed from the problem
description itself (informed search)
• Greedy best-first search resembles DFS in
complexity, i.e. O(b^m)
• Disadvantages of greedy best-first search
(like DFS)
• Not optimal
• Incomplete
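As a concrete illustration (not from the slides), a minimal greedy best-first search in Python: the fringe is a priority queue ordered by h(n) alone. The road-map fragment and hSLD values are the standard Romania example assumed by the figures.

```python
import heapq

# Road-map fragment (adjacency lists) and straight-line-distance heuristic hSLD
GRAPH = {
    'Arad': ['Sibiu', 'Timisoara', 'Zerind'],
    'Sibiu': ['Arad', 'Fagaras', 'Oradea', 'Rimnicu Vilcea'],
    'Fagaras': ['Sibiu', 'Bucharest'],
    'Rimnicu Vilcea': ['Sibiu', 'Pitesti'],
    'Pitesti': ['Rimnicu Vilcea', 'Bucharest'],
}
H_SLD = {'Arad': 366, 'Sibiu': 253, 'Timisoara': 329, 'Zerind': 374,
         'Fagaras': 176, 'Oradea': 380, 'Rimnicu Vilcea': 193,
         'Pitesti': 100, 'Bucharest': 0}

def greedy_best_first_search(start, goal):
    """Expand the node with the smallest f(n) = h(n)."""
    fringe = [(H_SLD[start], start, [start])]
    visited = set()
    while fringe:
        _, node, path = heapq.heappop(fringe)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for nxt in GRAPH.get(node, []):
            if nxt not in visited:
                heapq.heappush(fringe, (H_SLD[nxt], nxt, path + [nxt]))
    return None
```

On this map the greedy path is Arad, Sibiu, Fagaras, Bucharest (cost 450) rather than the optimal 418 route, illustrating that greedy best-first search is not optimal.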
Arad to Bucharest (Greedy best first search)

[Figure frames: the Romania map. Greedy best-first search reaches Bucharest via Sibiu and Fagaras with path cost 450, while the optimal route through Rimnicu Vilcea and Pitesti costs 418.]
A* Search
• Node evaluation by the combination of two cost
functions, i.e.
• f(n) = g(n) + h(n)
• g(n) : cost to reach node n from the start node
• h(n) : estimated cost to reach the goal from node n
• A* is optimal given that h(n) is an admissible heuristic (AH)
• An AH never overestimates the cost to reach the
goal
• The heuristic h(n) is consistent
• if h(n) ≤ c(n, a, n') + h(n')
• c(n, a, n') : step cost from n to n' by action a
• h(n') : estimated cost from n' to the goal node
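A minimal A* sketch in Python, using the same assumed Romania fragment with step costs: the fringe is ordered by f(n) = g(n) + h(n), and hSLD is admissible here, so the returned path is optimal.

```python
import heapq

# Step costs on the Romania fragment; hSLD is the admissible heuristic
COSTS = {
    'Arad': [('Sibiu', 140), ('Timisoara', 118), ('Zerind', 75)],
    'Sibiu': [('Fagaras', 99), ('Oradea', 151), ('Rimnicu Vilcea', 80)],
    'Fagaras': [('Bucharest', 211)],
    'Rimnicu Vilcea': [('Pitesti', 97)],
    'Pitesti': [('Bucharest', 101)],
}
H_SLD = {'Arad': 366, 'Sibiu': 253, 'Timisoara': 329, 'Zerind': 374,
         'Fagaras': 176, 'Oradea': 380, 'Rimnicu Vilcea': 193,
         'Pitesti': 100, 'Bucharest': 0}

def a_star(start, goal):
    """Expand the node with the smallest f(n) = g(n) + h(n)."""
    fringe = [(H_SLD[start], 0, start, [start])]
    best_g = {start: 0}          # cheapest g(n) found so far per node
    while fringe:
        f, g, node, path = heapq.heappop(fringe)
        if node == goal:
            return path, g
        for nxt, step in COSTS.get(node, []):
            g2 = g + step
            if g2 < best_g.get(nxt, float('inf')):
                best_g[nxt] = g2
                heapq.heappush(fringe, (g2 + H_SLD[nxt], g2, nxt, path + [nxt]))
    return None, float('inf')
```

Unlike the greedy search, A* finds the cost-418 route through Rimnicu Vilcea and Pitesti.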
A* Search
Hill Climbing Search
• The search loops continually, moving in the
direction of increasing value
• Terminates when it reaches a peak
• i.e., no neighbor has a higher value
• No tree structure is maintained (used)
• Stochastic hill climbing
• Random selection of uphill moves (neighbors)
• Selection probability can vary w.r.t uphill
steepness
• First-choice hill climbing
• Random generation of successor states
• Generation until a better successor state is
found
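A minimal sketch of steepest-ascent hill climbing (the basic variant, before the stochastic refinements above). The one-dimensional objective and neighbor function are toy assumptions for illustration.

```python
def hill_climbing(start, neighbors, value):
    """Steepest-ascent hill climbing: move to the best neighbor until
    no neighbor has a higher value (a peak, possibly only local)."""
    current = start
    while True:
        best = max(neighbors(current), key=value)
        if value(best) <= value(current):
            return current          # peak reached
        current = best

# Toy objective with a single peak at x = 3
value = lambda x: -(x - 3) ** 2
neighbors = lambda x: [x - 1, x + 1]
```

Starting from any integer, the climb ends at x = 3; on a multi-peaked objective it could get stuck on a local maximum instead.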
Hill Climbing Search
Simulated annealing
• Heat treatment that alters a material to increase its ductility
and to make it more workable
• Maximization of objective function
• A move is always accepted if the situation is improved
• Otherwise the acceptance probability is less than 1
• Acceptance probability (otherwise case) decreases w.r.t
move badness
• ΔE = Value [next] – Value[current]
• ΔE corresponds to the amount of badness of a move
• The acceptance probability also decreases as the temperature T goes down
• Bad moves may be allowed initially (high T)
• Bad moves are unlikely to be permitted at the end (low T)
• At high temperatures, the parameter space is explored broadly
• At lower temperatures, exploration is restricted
• A slow cooling schedule for T finds the global optimum with
probability approaching 1
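The acceptance rule above can be sketched as follows. The objective, the integer neighbor move, and the geometric cooling schedule are all illustrative assumptions; only the exp(ΔE/T) acceptance test comes from the description.

```python
import math, random

def simulated_annealing(start, neighbor, value, t0=10.0, cooling=0.95, steps=2000):
    """Always accept an improving move; accept a worsening move with
    probability exp(dE / T), which shrinks as the move gets worse and
    as the temperature T is lowered by the cooling schedule."""
    current, t = start, t0
    for _ in range(steps):
        nxt = neighbor(current)
        dE = value(nxt) - value(current)      # dE < 0 means a bad move
        if dE > 0 or random.random() < math.exp(dE / max(t, 1e-12)):
            current = nxt
        t *= cooling                          # cool down
    return current

value = lambda x: -(x - 3) ** 2               # single global maximum at x = 3
neighbor = lambda x: x + random.choice([-1, 1])
```

Early on (high T) the walk accepts small losses and explores; once T is small, only uphill moves survive and the state settles on the optimum.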
Simulated annealing
Local beam Search
1. Begin with K randomly generated states
2. At each step, all successors of the K states are
generated
3. If a goal is found, the algorithm halts
4. Otherwise, the K best states are selected from the
successors
5. Repeat from step 2
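The five steps above can be sketched as follows. The tree and node values in the example are hypothetical stand-ins for the slide's figure, chosen only so the search has something to run on.

```python
def local_beam_search(start_states, successors, value, is_goal, k=2):
    """Keep only the k best states at each step; halt as soon as a goal
    appears among the generated successors."""
    beam = list(start_states)
    while beam:
        children = [c for s in beam for c in successors(s)]
        for c in children:                     # step 3: goal check
            if is_goal(c):
                return c
        beam = sorted(children, key=value, reverse=True)[:k]   # step 4
    return None

# Hypothetical tree with node values (names and numbers are made up)
TREE = {'A': ['B', 'C'], 'B': ['D', 'E'], 'C': ['F', 'G'], 'F': ['J', 'Goal']}
VALUES = {'B': 20, 'C': 18, 'D': 16, 'E': 15, 'F': 17, 'G': 13, 'J': 10}
```

With k = 2 the beam keeps {B, C}, then {F, D}, and the goal is found among F's successors.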
Local beam Search (K=2)

[Figure: a search tree rooted at Start/A with successors B and C (values 20, 18), grandchildren D, E, F, G (16, 15, 17, 13), then H, I, J, K (9, 12, 10, 11), and leaves L, M and Goal (8, 2); at each step only the two best states are retained.]
Genetic algorithm (GA)
• Successor states are generated by combining two
parent states
• GA begins with a set of k randomly generated
states (the population)
• A state is represented by a string over a finite
alphabet
• Each state is rated by an evaluation function
(the fitness function)
• States are selected for reproduction at random, with
probability determined by the fitness function
• For mating between two states, a crossover point is
chosen at random
• Each location is subject to random mutation with
an independent probability
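The loop above (fitness-proportionate selection, single-point crossover, independent per-location mutation) can be sketched as follows. The OneMax objective (count the 1 bits of a bit string) is a toy problem chosen for the example, not one from the slides.

```python
import random

def genetic_algorithm(population, fitness, generations=60, p_mut=0.02):
    """GA skeleton: roulette-wheel selection, single-point crossover,
    independent per-bit mutation."""
    for _ in range(generations):
        weights = [fitness(ind) for ind in population]   # fitness ratings
        nxt = []
        for _ in range(len(population)):
            # fitness-proportionate selection of two parents
            p1, p2 = random.choices(population, weights=weights, k=2)
            cut = random.randrange(1, len(p1))           # crossover point
            child = p1[:cut] + p2[cut:]
            # each bit flips independently with probability p_mut
            child = ''.join(b if random.random() > p_mut else str(1 - int(b))
                            for b in child)
            nxt.append(child)
        population = nxt
    return max(population, key=fitness)

ones = lambda s: s.count('1')    # OneMax fitness: number of 1 bits
```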
Genetic algorithm (GA)

• 24, 23, 20 and 11 are the fitness values calculated by a fitness
function for the respective population candidates
• Similarly 31%, 29%, 26% and 14% are the selection probabilities for the
respective population candidates, i.e.,
• f(x1,2,3,4) = 24, 23, 20, 11
• Σ f(x1,2,3,4) = 24 + 23 + 20 + 11 = 78
• f(xi) / Σ f(x1,2,3,4) =
• 24/78 = 0.31, 23/78 = 0.29, 20/78 = 0.26, 11/78 = 0.14
Minmax Algorithm
• Adversarial search algorithm
• The minmax algorithm operates in a DFS fashion
• The minmax value of a node is the utility of being in
the corresponding state
• Assumption : both players play optimally
• Max prefers to move to a maximum-value state
• Min prefers to move to a minimum-value state
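A minimal recursive sketch of the minmax value computation, run on the example tree used in the following slides (leaves 3, 12, 8 | 2, 4, 6 | 14, 5, 2).

```python
def minimax(state, is_max, successors, utility):
    """Minmax value: utility at the leaves; Max takes the maximum and
    Min the minimum of the children's minmax values (DFS recursion)."""
    children = successors(state)
    if not children:
        return utility(state)
    values = [minimax(c, not is_max, successors, utility) for c in children]
    return max(values) if is_max else min(values)

# The three-ply example tree from the slides
TREE = {'S0': ['S01', 'S02', 'S03'],
        'S01': ['S011', 'S012', 'S013'],
        'S02': ['S021', 'S022', 'S023'],
        'S03': ['S031', 'S032', 'S033']}
LEAVES = {'S011': 3, 'S012': 12, 'S013': 8,
          'S021': 2, 'S022': 4, 'S023': 6,
          'S031': 14, 'S032': 5, 'S033': 2}
```

The backed-up values are min(3,12,8) = 3, min(2,4,6) = 2, min(14,5,2) = 2, so the minmax value of S0 is max(3,2,2) = 3.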
Minmax Algorithm

[Figure: game tree. Max node S0 has Min children S01, S02, S03, whose children S011-S013, S021-S023, S031-S033 are leaves with utilities 3, 12, 8 | 2, 4, 6 | 14, 5, 2.]
Minmax Algorithm

[Figure frames: the values are backed up bottom-up: min(3, 12, 8) = 3 at S01, min(2, 4, 6) = 2 at S02, min(14, 5, 2) = 2 at S03; finally max(3, 2, 2) = 3 at S0.]
Minmax Algorithm (trace of the recursive calls)

1- V ← Max-value(S0)
2- V = -∞
3- V ← Max(V = -∞, Min-value(S01))
4- V = ∞
5- V ← Min(V = ∞, Max-value(S011))
7- V ← Min(V = ∞, 3)
8- V ← Min(V = 3, 12)
9- V ← Min(V = 3, 8)
10- return V = 3 after the 3rd call of Max-value inside the Min function
11- V ← Max(V = 3, Min-value(S02))
12- V = ∞
13- V ← Min(V = ∞, Max-value(S021))
14- V ← Min(V = ∞, 2)
15- V ← Min(V = 2, 4)
16- V ← Min(V = 2, 6)
17- return V = 2 to the Max of step 11
18- V ← Max(V = 3, Min-value(S03))
19- V = ∞
20- V ← Min(V = ∞, Max-value(S031))
21- V ← Min(V = ∞, 14)
22- V ← Min(V = 14, 5)
23- V ← Min(V = 5, 2)
24- return V = 2 to the Max of step 18
Minmax Algorithm

[Figure: final backed-up values: S01 = 3, S02 = 2, S03 = 2, so the minmax value of S0 is max(3, 2, 2) = 3 and Max moves to S01.]
AlphaBeta pruning Algorithm
MinMax problem :
1. Has to exhaustively visit the state space
2. The number of states to examine grows exponentially with the number of moves
3. This exponential growth cannot be removed entirely

Alpha Beta pruning :

1. Cuts the exponential factor to the smallest possible value
2. αß applied to minmax gives a more efficient result w.r.t. plain minmax
3. αß prunes the minmax tree branches that cannot possibly influence
the final decision
AlphaBeta pruning Algorithm

At a max node:
α = largest child utility found so far
β = β of the parent
At a min node:
α = α of the parent
β = smallest child utility found so far
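The α/β bookkeeping above can be sketched as follows, reusing the minmax example tree from the earlier slides.

```python
def alphabeta(state, is_max, successors, utility,
              alpha=float('-inf'), beta=float('inf')):
    """Minmax with alpha-beta pruning: alpha is the best value found so
    far for Max, beta the best for Min; a node stops expanding as soon
    as alpha >= beta, since that branch cannot affect the decision."""
    children = successors(state)
    if not children:
        return utility(state)
    if is_max:
        v = float('-inf')
        for c in children:
            v = max(v, alphabeta(c, False, successors, utility, alpha, beta))
            alpha = max(alpha, v)
            if alpha >= beta:
                break               # beta cutoff
        return v
    v = float('inf')
    for c in children:
        v = min(v, alphabeta(c, True, successors, utility, alpha, beta))
        beta = min(beta, v)
        if alpha >= beta:
            break                   # alpha cutoff
    return v

# Same example tree as on the minmax slides
TREE = {'S0': ['S01', 'S02', 'S03'],
        'S01': ['S011', 'S012', 'S013'],
        'S02': ['S021', 'S022', 'S023'],
        'S03': ['S031', 'S032', 'S033']}
LEAVES = {'S011': 3, 'S012': 12, 'S013': 8,
          'S021': 2, 'S022': 4, 'S023': 6,
          'S031': 14, 'S032': 5, 'S033': 2}
```

On this tree the value 3 is found after evaluating only 7 of the 9 leaves: once S021 yields 2 ≤ α = 3, the leaves S022 and S023 are pruned.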
AlphaBeta pruning Algorithm Example

[Figure frames: the minmax example tree, annotated step by step with α/β values.]

• S0 starts with α = -∞, β = +∞; these bounds are passed down to S01
• At S01, the leaves drive β: after leaf 3, β = 3, and 12 and 8 do not lower it; S01 returns 3 and α at S0 becomes 3
• S02 is entered with α = 3, β = +∞; its first leaf sets β = 2 ≤ α, so the remaining leaves 4 and 6 are pruned and S02 returns 2
• S03 is entered with α = 3, β = +∞; its leaves lower β to 14, then 5, then 2, so all three are examined and S03 returns 2
• The minmax value of S0 is max(3, 2, 2) = 3, obtained without examining the leaves 4 and 6
Artificial Intelligence

Informed Searches

Dr. Muhammad Awais


CSP definition
•Set of variables : {x1, x2, x3, …, xn}

•Set of constraints : {c1, c2, c3, …, cn}

•Domain of each variable is non-empty : Di ≠ {}

•Each constraint ci specifies the allowable combinations of values for
some subset of the variables, e.g., {{x11, x23, x35}, {x41, x15}, etc.}

•CSP state : assignment of values to some or all variables

•Legal / consistent assignment : an assignment that does not
violate any constraint

•CSP solution : a complete assignment satisfying all the
constraints
CSP Example (map coloring)
CSP Example
•Constraint : neighboring regions must have distinct colors
CSP Formulation:
•Variables : WA, NT, Q, NSW, V, SA, and T
•Domain of all variables = {red, green, blue}
•Constraint C for WA and NT, with the allowable combinations
•{(red, green), (red, blue), (green, red), (green, blue), (blue, red),
(blue, green)}
•C = col(WA) ≠ col(NT)
Visualization of a CSP as Constraint graph
•Nodes correspond to the variables of the CSP
•Arcs correspond to the constraints
CSP as a search problem
•Initial state : the empty assignment; all variables are unassigned
•Successor function : assign a value to a variable without
conflicting with previously assigned values (according to the constraints)
•Goal test : the assignment is complete
•Path cost : a constant cost per step
Example CSPs
•Finite Domain:
• Variables : Discrete, Finite domains
• Examples: Coloring problem, 8-Queen problem
• VAR : Q1,…,Q8, Q1:{1,…,8}
•Boolean CSPs:
• Variables can be true/false
•Infinite Domain CSPs
• Set of integers/strings
•Constraint language
• When all possible value combinations cannot be enumerated, a
constraint language is used, e.g., startJob1 + 5 ≤ startJob2
•Linear constraints
• Example : startJob1 + 5 ≤ startJob2
•Non-linear constraints:
• Non-linear functions, etc.
•Continuous Domain CSP:
•Unary constraint CSP:
•Binary CSP:
•Linear programming:
• Continuous domain CSP
• Constraint : linear inequalities forming convex region
Back Tracking Search CSP
•Operates in a depth-first search fashion
•Chooses a value for one variable at a time
•Backtracks when a variable has no legal value left
•Incremental successor generation, one at a time
•Extends the current assignment to generate a successor
assignment
•Plain backtracking is an uninformed search
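A minimal backtracking-search sketch on the map-coloring CSP from the slides (variable order is simply dictionary order here; the ordering heuristics of the next slides are not applied).

```python
def backtracking_search(variables, domains, conflicts, assignment=None):
    """Depth-first backtracking: assign one variable at a time and
    backtrack when a variable has no legal value left."""
    if assignment is None:
        assignment = {}
    if len(assignment) == len(variables):
        return dict(assignment)                  # complete assignment
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        if not conflicts(var, value, assignment):
            assignment[var] = value
            result = backtracking_search(variables, domains, conflicts, assignment)
            if result is not None:
                return result
            del assignment[var]                  # back track
    return None

# Map-coloring CSP (Australia) from the slides
NEIGHBORS = {'WA': ['NT', 'SA'], 'NT': ['WA', 'SA', 'Q'],
             'SA': ['WA', 'NT', 'Q', 'NSW', 'V'], 'Q': ['NT', 'SA', 'NSW'],
             'NSW': ['Q', 'SA', 'V'], 'V': ['SA', 'NSW'], 'T': []}
VARS = list(NEIGHBORS)
DOMAINS = {v: ['red', 'green', 'blue'] for v in VARS}

def conflicts(var, value, assignment):
    """A value conflicts if any already-assigned neighbor has it."""
    return any(assignment.get(n) == value for n in NEIGHBORS[var])
```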
Back tracking search CSP
Informed CSP search solution
•Which variable should be assigned next?
•In what order should the values of a variable be tried?
•What are the implications of the current variable assignment for the
unassigned variables?
•When a path fails (a state is reached in which some variable has no
legal value), can the search avoid repeating this failure in
subsequent paths?
Variable and value ordering
•MRV (minimum remaining values heuristic)
•Degree heuristic
•Least constraining value
MRV
•BT simply selects the next unassigned variable for value assignment
•With no intelligence in the ordering, this is seldom efficient
• Example : WA = red & NT = green
• ⇒ SA can only take the value blue
• ⇒ WA = red & NT = green & SA = blue
• ⇒ this constrains the next assignments
•MRV : choose the variable with the fewest legal
values (the most constrained variable), the "fail-first" heuristic
• If some variable X is left with zero legal values, MRV will select X
Degree heuristic
•Suggests which variable to choose first for value assignment
• Choose the variable involved in the largest number of constraints
• E.g. SA, degree(SA) = 5
• Attempts to reduce the branching factor of future choices
• The root of the search tree then has the highest number of children
• Later successors have fewer branches, or none at all
• E.g., starting with SA and then choosing the colors for the remaining regions,
• WA, NT, Q, NSW, V need no branching at all
Least constraining value
•Determines the order in which values are assigned to the selected variable
•Prefers the value that
• rules out the fewest choices for the neighboring variables
• WA = red, NT = green
• Q = blue is bad : it eliminates the last legal value for SA
Forward checking (FC)
•If X is assigned a value, then delete the inconsistent values from the domain of
each unassigned variable Y connected to X by a constraint
•Forward checking efficiently computes the information that MRV needs
to do its job
Constraint propagation
•Propagates the implications of a constraint on one variable onto the other variables (beyond FC)
•Problem with FC
•FC does not detect inconsistencies between pairs of unassigned adjacent variables
•E.g., both NT & SA may keep blue as a legal value
•But NT and SA are adjacent ⇒ inconsistency
Arc Consistency
• "Arc" refers to a directed arc in the constraint graph
• Domain of SA = Do(SA)
• Domain of NSW = Do(NSW)
• The arc SA → NSW is consistent
• if for every x ∈ Do(SA) there is some y ∈ Do(NSW) consistent with x
• Consistency of an arc is achieved by deleting the values of its tail
variable that have no consistent partner at its head
• Example : Do(SA) = {blue} & Do(NSW) = {red, blue}
• For SA = blue there is NSW = red, so
• the arc SA → NSW is consistent
• The arc NSW → SA is inconsistent :
• for NSW = blue there is no consistent value left, i.e., Do(SA) would be {}
• Deleting blue gives Do(NSW) = {red}, which makes the arc consistent
Arc consistency is a repeated process
•Making an arc consistent (deleting a value) may make a previously consistent arc
inconsistent
•AC-3 uses a queue of arcs to check for inconsistency
•Consider an arc (Xi, Xj)

• Xi and Xj are CSP variables, i.e., nodes of the constraint graph

• x ∈ D(Xi) is deleted if

• there is no y ∈ D(Xj) such that (x, y) satisfies the constraint between Xi and Xj

•When a value is deleted from D(Xi), the arcs (Xk, Xi) pointing to Xi are reinserted into the queue

[Figure: constraint-graph fragment with arcs from Xk and Xj into Xi.]
Arc consistency
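The repeated revision process can be sketched as AC-3 below. The binary constraint is fixed to "values must differ", the map-coloring assumption; the example domains are the ones from the SA/NSW slide.

```python
from collections import deque

def ac3(domains, neighbors, satisfies=lambda x, y: x != y):
    """AC-3: make every arc (Xi, Xj) consistent. When a value is removed
    from D(Xi), every arc (Xk, Xi) pointing at Xi goes back on the queue."""
    queue = deque((xi, xj) for xi in domains for xj in neighbors[xi])
    while queue:
        xi, xj = queue.popleft()
        removed = False
        for x in list(domains[xi]):
            # delete x if no y in D(Xj) satisfies the constraint with x
            if not any(satisfies(x, y) for y in domains[xj]):
                domains[xi].remove(x)
                removed = True
        if removed:
            for xk in neighbors[xi]:
                if xk != xj:
                    queue.append((xk, xi))
    return all(domains.values())    # False if some domain became empty

# The slide's example: Do(SA) = {blue}, Do(NSW) = {red, blue}
domains = {'SA': {'blue'}, 'NSW': {'red', 'blue'}}
neighbors = {'SA': ['NSW'], 'NSW': ['SA']}
```

Running AC-3 deletes blue from Do(NSW), exactly the revision worked through above.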
Artificial Intelligence

Search based Solution

Dr. Muhammad Awais


Outline
• Problem Solving agents
• Example problems
• Solution search
• Uniformed search
• Partial information search
• Summary
Problem solving agents
• Problem solving agents
• Goal Formulation
• Problem Formulation (selection of actions & states concerning a specific problem)
• Search (action sequence)
Problem → Search Algorithm → Solution
• Agent Design assumptions
• Static
• Observable
• Discrete
• Deterministic
• open loop
Problem solving agents
Path search example

Note : State description is abstract as compared to the real world states


Problem & Solution definition
• Initial state
• e.g. In(Arad)
• Successor function Ψ
• Σ (generated nodes) = Ψ(x)
• Σ = {(ai, si)}
• e.g. {<go(Sibiu), In(Sibiu)>,
<go(Timisoara), In(Timisoara)>,
<go(Zerind), In(Zerind)>}
• Goal testing
• e.g. a chess goal test, In(Bucharest)
• Path cost
• numeric cost
• cost function (reflecting the performance measure)
• the path cost is the sum of all the step costs
• step cost c(x, a, y)
• simplified map due to abstraction
Example problem
• Toy problem
• Vacuum cleaner problem
• 8-puzzle problem
• 8-queen problem, etc.
• Real world problem
• Route finding problem
• VLSI layout
• Robot navigation
• Automatic assembly
Vacuum cleaner problem
• States :
• Two squares
• The agent can be in either of the two squares
• Each square may or may not contain dirt
• State space : 2 × 2² = 8 states
• Initial state:
• Any of the eight states of the vacuum cleaner
example
• Successor function
• Generates the next legal states from the actions (left,
right and suck)
• Goal test:
• Whether all the squares are clean
• Path cost:
• uniform cost (each step costs 1)
Vacuum cleaner problem
• Move left : If (right) then move left else do nothing
• Move right : If (left) then move right else do nothing
• Suck : if (dirty) then suck else do nothing
8-puzzle problem
• The 8-puzzle is a sliding-block puzzle
• Blocks are slid into a required configuration
• The sliding-block class of problems is NP-complete
• 9!/2 = 181,440 reachable states
8-puzzle problem
• States :
• Location of each number and the blank space at time ti
• Initial state:
• Any configuration of the numbers and the blank space
• Successor function
• Generates the next legal states from the actions (move the
blank left, right, up or down)
• Goal test:
• Whether the required configuration is reached
• Path cost:
• uniform cost (each step costs 1)
Real world problem
• Traveling Salesperson problem
• Visit each city exactly once
• Shortest path for the tour
• VLSI layout
• Planning millions of component and connections in
minimum place
• Robot navigation
• Route finding problem
• 2 dimensional for a simple robot
• > 2 dimensional for complex robot (legs, arms)
• Automatic assembly
• Task sequence search
• Internet searching
• Conceptualization of internet as a graph
• Finding the answer by traversing the nodes optimally
Solution finding using search
• Search tree
• Generated by the initial state (I) and the successor function (S)
• The search space is defined by I and S
• Search node (SN) (tree root)
• State
• A configuration of the world, e.g.,
• People, traffic, and other objects on the street
• People and objects in the office, home and workplace, etc.
• Node
• Data structure for the tree representation
• A node corresponds to a state
• Different nodes can correspond to the same state
• Parent node (PN): generator of the current node (CN)
• Action: CN = Action(PN)
• Path cost: cost to reach CN
• Depth: number of steps from SN to CN
Solution search
• Node expansion
• Σ = Ψ(x)
• State expansion depends on the search strategy (Ψ)

[Figure frames: the search tree rooted at Arad. The successors are Sibiu, Timisoara, Zerind; Sibiu expands to Arad, Fagaras, Oradea, Rimnicu Vilcea; Timisoara expands to Arad, Lugoj; Zerind expands to Arad, Oradea.]
Informal Search algorithm (tree based)
Function : Tree-Search
Input : Problem, Strategy
Output : Solution, Failure
Initialize : fringe ← {Search-Node(initial state)}
Procedure :
loop
    if (fringe is empty) then
        return Failure
    else
        Node = get_Node(fringe)
        if (Goal_Test(Node)) then
            return Solution(Node)
        else
            fringe = expand(Strategy, Node)

fringe : a collection containing the generated but not yet expanded nodes
Breadth-first search (BFS)
1. A tree-search implementation
2. FIFO insertion into the fringe
3. Newly generated nodes from the successor function are placed at
the end of the fringe
4. Complete if the shallowest goal is at depth d
5. Time complexity is O(b^(d+1))
a. d = depth
b. b = branching factor
6. Space complexity
a. b + b^2 + b^3 + … + b^d + b^(d+1) = O(b^(d+1))
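The Tree-Search skeleton with a FIFO fringe can be sketched as follows; the small tree used for illustration is a made-up example, not one from the slides.

```python
from collections import deque

def breadth_first_search(start, goal, successors):
    """Tree search with a FIFO fringe: new nodes go to the end of the
    queue, so the shallowest goal is found first."""
    fringe = deque([[start]])           # fringe holds whole paths
    while fringe:
        path = fringe.popleft()         # FIFO: oldest (shallowest) first
        node = path[-1]
        if node == goal:
            return path
        for child in successors(node):
            fringe.append(path + [child])
    return None

# A small hypothetical tree for illustration
TREE = {'A': ['B', 'C'], 'B': ['D', 'E'], 'C': ['F']}
```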
Breadth-first search (BFS)
Depth-first search (DFS)
• A tree-search implementation
• LIFO insertion into the fringe
• Newly generated nodes from the successor function are
placed at the front of the fringe
• Incomplete if a path at depth m is of infinite length
• Low memory-space requirement
• Expanded leaf nodes are dropped from the fringe
• Stores a single path from the root node to a leaf
node, along with the unexpanded sibling nodes
• Space complexity O(bm), for branching factor b and maximum depth m
• Time complexity O(b^m)
Depth-first search (DFS)
Back tracking (variant DFS)
• Only one successor is generated at a time instead of all
successors
• Each partially expanded node remembers which
successor to generate next
• Space requirement O(m) instead of O(bm)
• Depth-limited search performs DFS with a predefined
depth limit l
Iterative deepening DFS
• Iterative deepening depth-first search
• A variation of DFS
• The depth limit is increased gradually until the goal is
found, i.e., 0, 1, 2, …, d
• Memory requirement O(bd)
• States are regenerated multiple times (from the root node)
• Leaf nodes are generated only once
• Generated nodes
o (d)b + (d-1)b^2 + … + (1)b^d
• Time complexity O(b^d)
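The depth-limited / iterative-deepening pair can be sketched as follows; the example tree is a made-up illustration.

```python
def depth_limited_search(node, goal, successors, limit):
    """DFS that does not expand nodes below the depth limit."""
    if node == goal:
        return [node]
    if limit == 0:
        return None
    for child in successors(node):
        result = depth_limited_search(child, goal, successors, limit - 1)
        if result is not None:
            return [node] + result
    return None

def iterative_deepening_search(start, goal, successors, max_depth=25):
    """Run depth-limited DFS with limits 0, 1, 2, ...: the shallowest
    goal is found with only O(bd) memory."""
    for limit in range(max_depth + 1):
        result = depth_limited_search(start, goal, successors, limit)
        if result is not None:
            return result
    return None

TREE = {'A': ['B', 'C'], 'B': ['D', 'E'], 'C': ['F']}
```

Shallow levels are regenerated on every iteration, but as the slide notes, the deepest (and by far largest) level is generated only once.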


Iterative deepening DFS
Bidirectional search
• Begins with both the start node and the goal node
• The two searches proceed in opposite directions towards each other
• If a node in the search tree of one search (start-to-goal /
goal-to-start) appears in the fringe of the other, a
solution has been found
• Time and space complexity is O(b^(d/2))
Sensorless search
• Agent has no sensors
• Agent may exist in any of available state
• Action leads to next legal state
• Vacuum cleaner agent
• Action : move left, move right, suck
Sensorless vacuum cleaner
• Search in the space of belief states rather than physical
states
• The environment is deterministic, i.e., each action leads to a
known state
• Initial belief state : {1,…,8}
• Action (move right) : {2,4,6,8}
• Action (suck) : {4,8}
• Action (move left) : {3,7}
• Action (suck) : {7} (the goal state)
Sensorless vacuum cleaner
Contingency problem
• Arises in a partially observable environment OR with uncertain
actions
• Percepts provide new information after each action
• Vacuum cleaner example
• Available sensors
• Position
• Local dirt
Percept [Left, Dirty] → Action [Suck, Right, Suck]
Belief state {1,3}
• Suck → {5, 7} → Right → {6, 8} → Suck → {8}
• If Suck sometimes deposits dirt when the square is clean
• then Suck may yield {6} (a contingency problem)
• Solution : [Suck, Right, if [R, Dirty] then Suck]
E.g., people keep their eyes open while moving on the street
Artificial Intelligence

Logical Agents

Dr. Muhammad Awais


Logical agents
•Agents that can form representations of the world

•Use a process of inference to derive new information about
the world

•Use the new information to deduce what to do

Motivation for Knowledge based (KB) agents
• Knowledge and reasoning are necessary for successful
behavior (hard to achieve otherwise)
• Agents with knowledge of action outcomes may perform
well
• A reflex agent can hardly find a path from Arad to
Bucharest
• The knowledge of a problem-solving agent is specific and
inflexible
• E.g., a chess program calculates legal moves but does
not know that a piece cannot be in two places at the same time
• Knowledge and reasoning play a crucial role in partially
observable environments
Motivation for Knowledge based (KB) agents
•The agent combines knowledge with the current percept to infer the
hidden state
• A doctor infers a disease state (not directly observable)
• Natural language understanding (intention as hidden
state)
• E.g., "John threw the brick through the window and broke it"
•Knowledge-based agents are flexible
• Accept new tasks described explicitly
• Achieve competence by being told or by learning
• Adapt to changes by updating the relevant knowledge
KB Agents
• The central component of a KB agent is its KB
• The KB is a set of sentences
• Related to, but not identical with, English sentences
• A knowledge representation language is used for
representation
• A sentence represents an assertion about the world
• TELL (standard) adds new sentences to the KB
• ASK (standard) queries the KB
• TELL & ASK may involve inference
• Deriving new sentences from old ones
• Inference obeys the fundamental requirement that derived
answers follow from what the KB has been told
General KB Agent
Input : percept
Output : action
Given : KB (initially containing background knowledge)
Procedure :
• TELL the KB about the percept
• ASK the KB which action to perform
• Extensive reasoning can be performed to select an
action w.r.t. the current state of the world
• TELL the KB about the selected action, then execute it
• MAKE-PERCEPT-SENTENCE(percept, t)
• Returns a sentence asserting the received
percept at time t
• MAKE-ACTION-QUERY(t)
• Returns a sentence that asks what action to
perform at time t
• MAKE-ACTION-SENTENCE(action, t)
• Returns a sentence asserting that the action was executed
General KB Agent
• KB agent is different from other known agents
• KB agent follows the KB and not an arbitrary program
• Declarative approach : Knowledge representation by
sentences
• Procedural approach : Desired behavior directly as
program code
Wumpus World
• Rooms connected by passageways in a cave

• A Wumpus exists in one room and eats anyone who enters

• The agent can shoot the Wumpus with an arrow, but has only one
arrow

• Some rooms contain bottomless pits

• One room contains gold
Wumpus World (Precise definition)

Performance measure :

• +1000 for picking the gold

• -1000 for falling into pit/eaten by Wumpus

• -1 for taking an action

• -10 for using the arrow


Wumpus World (Precise definition)

Environment

• 4x4 grid of rooms

• The agent starts in [1,1]

• The gold and Wumpus locations are chosen uniformly at random

• Any other square contains a pit with probability 0.2
Wumpus World (Precise definition)
Actuators

• The agent can move forward, or turn left or right by 90
degrees

• The agent dies if it enters a square containing a pit or the Wumpus

• Moving forward has no effect if a wall is in front

• Grab is used to pick up the gold

• Shoot (action) is used to fire the arrow
Wumpus World (Precise definition)
Sensors
• In the square containing the Wumpus, and in the (non-diagonally)
adjacent squares, the agent perceives a stench
• In squares adjacent to a pit, the agent perceives a breeze
• In the square with the gold, the agent perceives a glitter
• Walking into a wall, the agent perceives a bump
• When the Wumpus is killed, it emits a scream heard in all squares
• An agent percept can be
• [Stench, Breeze, None, None, None]
Wumpus World
KB exploration by Wumpus Agent
• The agent's initial KB contains the rules of the environment
• The KB grows as the agent explores and infers
• KB ⊨ [1,1] is safe
• Percepts at [1,1]
• [None, None, None, None, None]
• The neighboring squares [col,row] [1,2] and [2,1] are safe
• The agent detects a breeze in [2,1]
• ⇒ a pit in [2,2] or [3,1], or in both
• The safety-conscious agent moves back to [1,1] and then to [1,2]
• Percepts in [1,2]
• [Stench, None, None, None, None]
KB exploration by Wumpus Agent
Fundamental of logical representation and reasoning
1. Syntax
2. Semantic
3. Truth
4. Possible world
5. Model
6. Entailment
7. Soundness
8. Truth preserving
9. Completeness
Fundamentals
• Syntax (the agreed way of writing statements)
• The KB consists of sentences
• Sentences are described / scripted according to a given syntax
• Syntax of ordinary arithmetic :
• x + y = 4 is a well-formed sentence
• x2y+= is not a sentence of the arithmetic system
• Different symbols (Greek letters, mathematical symbols) are used for logical
languages
• The KB sentences of an agent are part of the agent's physical configuration
• E.g., different action configurations concerning percepts
• Reasoning corresponds to generating and manipulating those
configurations
Fundamentals
• Semantics
• Informal definition = the meaning of a sentence in the KB
• Formal : the truth of each sentence ∈ KB in each possible world
• Truth
• The arithmetic sentence x + y = 4 is true in each world (configuration)
where x = 2 and y = 2
• Model
• A precise description of a possible world
• m is a model of α
• Sentence α is true in model m
• E.g., x + y = 4
• x = number of men and y = number of women
• The possible models are all possible assignments of x and y
• The models of the sentence (x + y = 4) are the assignments of x and y
for which x + y = 4
Fundamentals

• Tautology
– A Sentence which is true in all models /
assignments
– E.g., P ∨ ¬ P
• Contradiction
– A Sentence which is false in all models /
assignments
– E.g., P ˄ ¬ P
Entailment
• We say that a sentence φ logically entails a sentence ψ (written φ ⊨ ψ) if and only if
every truth assignment that satisfies φ also satisfies ψ. More generally, we say that
a set of sentences Δ logically entails a sentence ψ (written Δ ⊨ ψ) if and only if
every truth assignment that satisfies all of the sentences in Δ also satisfies ψ.

• For example, the sentence p logically entails the sentence (p ∨ q). Since a
disjunction is true whenever one of its disjuncts is true, then (p ∨ q) must be true
whenever p is true. On the other hand, the sentence p does not logically entail (p ∧
q). A conjunction is true if and only if both of its conjuncts are true, and q may be
false. Of course, any set of sentences containing both p and q does logically entail
(p ∧ q).

• Once again, consider the case of (p ∧ q). Although p does not logically entail this
sentence, it is possible that both p and q are true and, therefore, (p ∧ q) is true.
However, the logical entailment does not hold because it is also possible that q is
false and, therefore, (p ∧ q) is false.
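The definition above can be checked mechanically by enumerating truth assignments. A minimal sketch, with sentences represented as Python functions of an assignment dict (an encoding chosen for the example):

```python
from itertools import product

def entails(premises, conclusion, symbols):
    """Truth-table check of Delta |= psi: every assignment that
    satisfies all premises must also satisfy the conclusion."""
    for values in product([True, False], repeat=len(symbols)):
        a = dict(zip(symbols, values))
        if all(p(a) for p in premises) and not conclusion(a):
            return False       # found a counterexample world
    return True
```

On the text's examples: p entails (p ∨ q), p does not entail (p ∧ q), while {p, q} does entail (p ∧ q).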
Entailment : Propositional Example

Conjunction, Disjunction, Tautology, Contradiction

P  Q  |  P ˄ Q  P ∨ Q  ¬P ∨ P  ¬P ˄ P
T  T  |    T      T       T       F
F  T  |    F      T       T       F
F  F  |    F      F       T       F
T  F  |    F      T       T       F
Entailment : 2nd Propositional Example

P  Q  |  ¬(P ∨ Q)  P ˄ Q  P ∨ Q  ¬P ∨ P  ¬P ˄ P
T  T  |     F        T      T       T       F
F  T  |     F        F      T       T       F
F  F  |     T        F      F       T       F
T  F  |     F        F      T       T       F
Propositional Logic (Boolean logic)

• Syntax & semantics determine the truth of a sentence

• Entailment

• The "follows from" relation between two sentences

Syntax
• Atomic sentences : indivisible syntactic elements
• A single proposition symbol
• A symbol represents / stands for a proposition / assertion
• An assertion / proposition is either true or false
• Symbols are represented by uppercase letters
• P, Q, R, etc.
• Symbols are independent but may have mnemonic value,
e.g.
• W1,3 represents / stands for the proposition that the
Wumpus is in [1,3]
• W1,3 is atomic : W, 1 and 3 are not components of
W1,3
• Two proposition symbols with fixed meanings
1. True : always proposes / asserts true
2. False : always proposes / asserts false
Complex sentences (CS)
• CS construction by the logical connection of simple (may be atomic)
sentences
• Logical connectors
• Negation (¬) (not)
• Application : applied before a sentences , e.g., ¬ W1,3
• Result : After application it inverts the proposition / assertion of
sentences, e.g.,
• W1,3 : Wumpus is in [1, 3]
• ¬ W1,3 : Wumpus is not in [1, 3]
• Conjunction : (Λ) (And)
• Application : applied between two sentences, e.g.,
• Conjunction : (atomic sentence) W1,3 Λ (atomic sentence) P3,1 (pit in cell
[3,1])
• Conjuncts : W1,3 and P3,1
• Result : All the conjuncts (part of conjunction) must be true for the
sentence (made by Λ) to be true
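The negation and conjunction semantics above can be captured in a few lines. A minimal Python sketch (the nested-tuple representation and the `holds` helper are assumptions of this sketch, not the slides' notation) that evaluates a sentence in a model:

```python
# Represent sentences as nested tuples and evaluate them in a model
# (a dict mapping proposition symbols to True/False).
def holds(sentence, model):
    if isinstance(sentence, str):      # atomic sentence, e.g. "W13"
        return model[sentence]
    op, *args = sentence
    if op == "not":                    # ¬ inverts the assertion
        return not holds(args[0], model)
    if op == "and":                    # all conjuncts must be true
        return all(holds(a, model) for a in args)
    raise ValueError(op)

model = {"W13": True, "P31": False}
print(holds(("and", "W13", ("not", "P31")), model))  # W1,3 Λ ¬P3,1 → True
```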
Implication (→)

  P   Q   P → Q
  T   T   T
  F   T   T
  F   F   T
  T   F   F

  P = T, Q = T : P → Q = T
  P = T, Q = F : P → Q = F
  P = F, Q = T : P → Q = T
  P = F, Q = F : P → Q = T

Comparison against the Universal Truth / Initial Fact:

  S#   P   Q   P → Q
  1    T   T   T        Universal Truth / Initial Fact
  2    F   T   T        Doesn't conflict with Universal Truth
  3    F   F   T        Doesn't conflict with Universal Truth
  4    T   F   F        Doesn't match with Universal Truth
Comparison against the Given Fact:

  S#   P   Q   P → Q
  1    T   T   T        Doesn't conflict with Given Fact, as the Wumpus cannot be adjacent to [1,2]
  2    F   T   T        Doesn't conflict with Given Fact, as the Wumpus can be in [2,2], or adjacent to [1,2]
  3    F   F   T        Doesn't match with Given Fact
  4    T   F   F        Given Fact
  P   Q   P → Q   P only if Q
  T   T   T       T
  F   T   T       T
  F   F   T       T
  T   F   F       F

• Implication is also written as P only if Q

• P implies Q

Bidirectional (↔)

  P   Q   P ↔ Q
  T   T   T
  F   T   F
  T   F   F
  F   F   T

To combine two sets, conditions, or
expressions by a logical AND [wiki]

Relation between Implication & Bidirectional
• P=T, Q=T : P→Q = T

• Q=T, P=F : Q→P = F (the converse of a true implication can be false)

• If a person is the UK PM → he/she lives in the PM house at 10 Downing Street TRUE
  P=T Q=T P→Q=T

• If a person lives in the PM house at 10 Downing Street → he/she is the UK PM TRUE
  Q=T P=T Q→P=T

• If P → Q ˄ Q → P then P ↔ Q, i.e., UK PM ↔ 10 Downing Street
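The claim that P ↔ Q is the conjunction of the two implications can be verified by brute force over all four models. A small Python check (the `implies` helper is an assumption of this sketch):

```python
from itertools import product

implies = lambda a, b: (not a) or b  # truth-functional P → Q

for P, Q in product([True, False], repeat=2):
    both_ways = implies(P, Q) and implies(Q, P)  # (P → Q) Λ (Q → P)
    assert both_ways == (P == Q)                 # same table as P ↔ Q
print("P <-> Q is equivalent to (P -> Q) AND (Q -> P) in every model")
```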


Necessary Condition

• If P then Q, P ⇒ Q: whenever the implication holds and P = T, then
• Q must be true, i.e., Q = T (a necessary condition for P)
Sufficient Condition
• If P then Q, P ⇒ Q (conditional statement),
• P = Premises / Antecedents, Q = consequent
• Sufficient Condition: P is considered as sufficient condition for Q
• If the conditional statement (P ⇒ Q) is true
• And the consequent Q is also true
• Then the Antecedent P can be true / false
• P is the sufficient condition for Q, i.e.,
• The value of P is not required to have some specific value for Q
  P   Q   P ⇒ Q
  T   T   T
  F   T   T
• Example : Trains run according to schedule (Wikipedia)
• Boarding a train assures that one can reach on time
• Boarding a train is not necessary, as there are other means to reach
on time, e.g., vehicles, airplanes, etc.
• Problems with the track, etc., can put a train out of schedule.
• Trains are a sufficient but not a necessary condition to reach on time.
Necessary & Sufficient Condition
• If P then Q, P ⇒ Q, P = Premises / Antecedents, Q = consequent
• Necessary Condition: From Previous slide we know Q is necessary condition for P
• i.e., if P = T then Q must be true, i.e., Q = T
• Necessary & Sufficient Condition: For the Necessary and Sufficient condition to hold /
true,
• if P = T then Q must be true, i.e., Q = T , i.e., P ⇒ Q (previous slide) and
• if Q = T then P must also be true, i.e., P = T, i.e., Q ⇒ P (as above)
• i.e., P ⇒ Q ∧ Q ⇒ P (Necessary and Sufficient Condition)
• In P ⇒ Q, Q is a necessary condition for P
• In Q ⇒ P, Q is a sufficient condition for P
• Similarly in Q ⇒ P, P is a necessary condition for Q
• in P ⇒ Q, P is a sufficient condition for Q
• Necessary & Sufficient condition is also described as P if and only if Q
• P only if Q (P ⇒ Q) and Q only if P (Q ⇒ P)
• And symbolically written as P ⟺ Q
• Example: P = Brother Q = male sibling
• If P = T then Q = T TRUE
• If P = F then Q = T FALSE
• If P = T then Q = F FALSE
• If P = F then Q = F TRUE
Logical operator precedence
• High to low
• ¬, Λ, ∨, →, ↔
• Sentence ¬P ∨ Q Λ R → S
• ((¬P) ∨ (Q Λ R)) → S
• Ambiguity
• A Λ B Λ C ≡ ((A Λ B) Λ C)
• A Λ B Λ C ≡ (A Λ (B Λ C))
• Allowed Ambiguities
• AΛ B Λ C
• A∨ B ∨ C
• A↔B↔C
• Not Allowed Ambiguities
• A→B→C
KB Construction
• Wumpus world example
• Deal only with pits
• Vocabulary of proposition symbols
• Pi,j : pit in [i, j]
• Pi,j = true if [i, j] is pit and vice versa
• Bi,j = Breeze in [i, j]
• Bi,j = true if there is a breeze in [i, j] and vice versa
• R1 : ¬ P1,1
• R2 : B1,1 ↔ (P1,2 V P2,1) [1]
• R3 : B2,1 ↔ (P1,1 V P2,2 V P3,1) [1]
• R4 : ¬ B1,1
• R5 : B2,1
• KB : R1,…,R5
• KB : R1 Λ … Λ R5

• [1] R2 is stated as per general statement / sentence that if there is a Breeze in a cell then the
neighboring cells (one / more) may have Pit.
Inference:
• The table enumerates all the models (assignments) of the KB's proposition symbols
• 2^n = 2^7 = 128 models, n = number of proposition symbols in the KB

• Models of the KB are the assignments in which the KB is TRUE
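The enumeration can be reproduced directly. A Python sketch (assuming R3 relates B2,1 to the pits adjacent to [2,1]) that builds all 128 models, filters those satisfying R1–R5, and checks that the KB entails ¬P1,2:

```python
from itertools import product

symbols = ["P11", "P12", "P21", "P22", "P31", "B11", "B21"]

def kb(m):
    # R1..R5 from the slides; == encodes the biconditional ↔
    r1 = not m["P11"]
    r2 = m["B11"] == (m["P12"] or m["P21"])
    r3 = m["B21"] == (m["P11"] or m["P22"] or m["P31"])
    r4 = not m["B11"]
    r5 = m["B21"]
    return r1 and r2 and r3 and r4 and r5

models = [dict(zip(symbols, vals))
          for vals in product([True, False], repeat=len(symbols))]
kb_models = [m for m in models if kb(m)]
print(len(models), len(kb_models))           # 128 models, 3 satisfy the KB
print(all(not m["P12"] for m in kb_models))  # KB |= ¬P1,2 → True
```

Since ¬P1,2 holds in every model of the KB, the KB entails it; this is exactly the model-checking view of entailment described above.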


General Algorithm for Entailment Decision in
Propositional Logic
Equivalence, Validity and Satisfiability
• Equivalence :
• Sentences : α, β
• α ≡ β : α and β are true in the same models
• E.g., α : P Λ Q , β : Q Λ P
• α ≡ β if and only if α |= β and β |= α
• Validity :
• α (sentence) is valid if true in all models
• P ∨ ¬P is valid
• P ∨ ¬P is a tautology
• Usage : deduction theorem
• α |= β if and only if (α → β) is valid
• E.g., P ∨ ¬P
• Satisfiability :
• α (sentence) is satisfiable if it is true in some model
• if α is true in model m, then m satisfies α and m is a model of α
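All three notions reduce to enumerating models. A hedged Python sketch (the helper names `valid` and `satisfiable` are mine, not the slides'), including the deduction-theorem check that α |= β iff (α → β) is valid:

```python
from itertools import product

def valid(f):        # true in every model over (P, Q)
    return all(f(P, Q) for P, Q in product([True, False], repeat=2))

def satisfiable(f):  # true in at least one model
    return any(f(P, Q) for P, Q in product([True, False], repeat=2))

# Validity: P ∨ ¬P is a tautology; P Λ ¬P is unsatisfiable.
print(valid(lambda P, Q: P or not P))          # True
print(satisfiable(lambda P, Q: P and not P))   # False

# Deduction theorem: (P Λ Q) |= (P ∨ Q) because (P Λ Q) → (P ∨ Q) is valid.
alpha = lambda P, Q: P and Q
beta  = lambda P, Q: P or Q
print(valid(lambda P, Q: (not alpha(P, Q)) or beta(P, Q)))  # True
```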
Inference rules
Modus Ponens (MP) :
• given α, β sentences as follows
• α → β , α
• β
And Elimination (AE)
• αΛβ
• α
MP and AE are sound [1]

Bidirectional elimination :
• α↔β
• (α → β) Λ ( β → α)

• [1] An inference algorithm that derives only entailed sentences is called Sound or Truth-preserving
Example inference
• KB : R1 Λ R2 Λ R3 Λ R4 Λ R5
• R1 : ¬ P1,1
• R2 : B1,1 ↔ (P1,2 V P2,1)
• R3 : B2,1 ↔ (P1,1 V P2,2 V P3,1)
• R4 : ¬ B1,1
• R5 : B2,1
• Objective : Prove no pit in [1, 2] , i.e., ¬ P1,2
• Bidirectional elimination on R2
• R6 : (B1,1 → (P1,2 V P2,1)) Λ ((P1,2 V P2,1) → B1,1)
• And elimination
• R7 : ((P1,2 V P2,1) → B1,1)
• Logical equivalence for contrapositive
• R8 : (¬B1,1 → ¬(P1,2 V P2,1))
• MP with percept (¬B1,1)
• R9 : ¬(P1,2 V P2,1)
• De Morgan’s rule
• R10 : ¬P1,2 Λ ¬P2,1
Resolution
• Unit resolution
• Takes a clause (a disjunction of literals) and a single literal complementary
to one of its literals, and produces a new clause
• The full resolution rule applies to any two clauses with complementary literals
• Example : l1 or l2 , ¬l2 or l3

• resolve to l1 or l3
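The resolution step can be sketched in a few lines of Python (the clause representation, a frozenset of (symbol, sign) literals, is an assumption of this sketch):

```python
# A clause is a frozenset of literals; a literal is ("P", True) for P
# and ("P", False) for ¬P. resolve() returns every resolvent of two clauses.
def resolve(c1, c2):
    resolvents = []
    for (sym, sign) in c1:
        if (sym, not sign) in c2:  # complementary pair found
            rest = (c1 - {(sym, sign)}) | (c2 - {(sym, not sign)})
            resolvents.append(frozenset(rest))
    return resolvents

c1 = frozenset({("l1", True), ("l2", True)})   # l1 ∨ l2
c2 = frozenset({("l2", False), ("l3", True)})  # ¬l2 ∨ l3
print(resolve(c1, c2))                         # one resolvent: l1 ∨ l3
```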


Introduction to Prolog
Concepts & Syntax

Dr. Muhammad Awais


Outline
• Facts, Rules, and Queries
• Knowledge Base
• A collection of facts and rules is called a knowledge base
• Prolog Programs / Syntax
• Prolog programs are knowledge bases
• describe some collection of relationships
• So how do we use a Prolog program?
• By posing queries.
• by asking questions about the information stored in the knowledge base.
• Atoms
• Numbers
• Variables
• Complex terms

Dr. Muhammad Awais Department of Software Engineering, GCUF 2


Facts, Rules, and Queries

• Knowledge Base 1
• Knowledge Base 2
• Knowledge Base 3
• Knowledge Base 4
• Knowledge Base 5



Knowledge Base 1
• A collection of Facts
• woman(mia). mia is a woman
• woman(jody). jody is a woman
• woman(yolanda). yolanda is a woman
• playsAirGuitar(jody). jody plays air Guitar
• The first letter is lower-case, according to the syntax for atoms
• KB construction in Prolog
• A file with extension .pl, say KB1.pl
• KB1.pl contains all the facts
• Load the file in SWI-Prolog
• By clicking Consult in the File menu




SWI-Prolog
• Posing queries to KB (KB1.pl)
• Prompt symbol of SWI-Prolog
• ?-
• Statement ending character
• .
• ?- woman(mia).
• true
• A fact explicitly put in KB1.pl
• ?- playsAirGuitar(mia).
• false
• As the KB doesn't have the fact
• ?- tatood(jody).
• The predicate doesn't exist in the KB
• Prolog reports an error: unknown procedure



Knowledge Base 2
• A collection of Facts and Rules
• listensToMusic(mia). Fact
• Mia listens music
• happy(yolanda). Fact
• Yolanda is happy
• playsAirGuitar(mia) :- listensToMusic(mia). Rule
• Consequent / head (Prolog) :- Antecedents / Premises / Body (Prolog) / goal (Prolog)
• Mia plays air guitar if Mia listens to the music
• :- is read as “if” or “is implied by”
• if the body of the rule is true, then the head of the rule is true too.
• playsAirGuitar(yolanda) :- listensToMusic(yolanda). Rule
• Yolanda plays air guitar if Yolanda listens to the music
• listensToMusic(yolanda):- happy(yolanda). Rule
• Yolanda listens to the music if Yolanda is happy



Knowledge Base 2
• Rules state information
• that is conditionally true
• if a KB contains a rule head :- body
• If KB also contains the body as fact
• then Prolog can infer head.
• listensToMusic(mia). Fact
• playsAirGuitar(mia) :- listensToMusic(mia). Rule
• Prolog can infer playsAirGuitar(mia) as true
• Deduction step is what logicians call modus ponens.





Knowledge Base 2
• ?- playsAirGuitar(mia).
• true
• Although playsAirGuitar(mia) doesn't exist as a fact, KB2 contains
• listensToMusic(mia). Fact
• playsAirGuitar(mia) :- listensToMusic(mia). Rule
• Using Modus Ponens, Prolog can infer
• playsAirGuitar(mia)
• ?- playsAirGuitar(yolanda).
• true
• Since KB2 contains
• happy(yolanda). Fact
• listensToMusic(yolanda):- happy(yolanda). Rule
• Prolog infers listensToMusic(yolanda) using Modus Ponens
• Prolog uses the result of Modus Ponens (above) and applies Modus Ponens again
• to the rule playsAirGuitar(yolanda) :- listensToMusic(yolanda).
• To infer playsAirGuitar(yolanda)
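The chain of Modus Ponens steps above can be mimicked with a tiny forward-chaining loop. A Python sketch (real Prolog works backwards from the query, so this is an illustration of the inference, not of Prolog's strategy; the string encoding is mine):

```python
# Facts are strings; rules are (head, [body goals]) pairs, mirroring KB2.
facts = {"listensToMusic(mia)", "happy(yolanda)"}
rules = [
    ("playsAirGuitar(mia)",     ["listensToMusic(mia)"]),
    ("playsAirGuitar(yolanda)", ["listensToMusic(yolanda)"]),
    ("listensToMusic(yolanda)", ["happy(yolanda)"]),
]

changed = True
while changed:                   # keep applying Modus Ponens until no new fact
    changed = False
    for head, body in rules:
        if head not in facts and all(goal in facts for goal in body):
            facts.add(head)      # body proven, so infer the head
            changed = True

print("playsAirGuitar(yolanda)" in facts)  # True
```

The second query needs two rounds: first listensToMusic(yolanda) is inferred from happy(yolanda), then that result feeds the playsAirGuitar(yolanda) rule.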



Knowledge Base 3
• happy(vincent). Fact

• listensToMusic(butch). Fact

• playsAirGuitar(vincent) :- listensToMusic(vincent) , happy(vincent). Rule


• Body has two Goals / Premises / Fact / Proposition

• , represents the conjunction of the two Propositions / Conditions / Premises / goals

• Similarly ; is or operator or represents a disjunction

• playsAirGuitar(butch):-happy(butch). Rule

• playsAirGuitar(butch):-listensToMusic(butch). Rule



Knowledge Base 3
• ?-playsAirGuitar(vincent).
• False

• Only related Fact in KB3

• happy(vincent).

• The goal / body / premises has a conjunction

• Both of the propositions / goals / antecedents / premises must be true

• listensToMusic(vincent) doesn't exist as a fact and also cannot be inferred

• Therefore Prolog returns false


Knowledge Base 3
• ?-playsAirGuitar(butch)
• True
• Since following exists in KB3
• playsAirGuitar(butch):-listensToMusic(butch)
• listensToMusic(butch)
• Therefore Modus Ponens helps Prolog infer the head / Consequent of rule
• Rules 1 & 2 can be represented as a single disjunction based (body) rule 3.
1. playsAirGuitar(butch):- listensToMusic(butch)
2. playsAirGuitar(butch):-happy(butch).
3. playsAirGuitar(butch):-happy(butch); listensToMusic(butch).
Disjunction



Knowledge Base 4
• woman(mia). Facts
• woman(jody). Facts
• woman(yolanda). Facts
• cares(vincent,mia). Relation / Predicate
• A Relation / Predicate as it returns True / false

• cares(marcellus,mia). Relation / Predicate


• cares(pumpkin,honey_bunny). Relation / Predicate
• cares(honey_bunny,pumpkin). Relation / Predicate



KB4: using Variables
• Usage of Upper-case letter in Prolog as variable : X
• it’s a “placeholder” for information.
• That is, the query
• ?-woman(X)
• essentially asks Prolog: tell me which of the individuals you know about is a woman.
• Prolog answers this query by working its way through KB4
• from top to bottom, trying
• to match (or unify) the expression woman(X) with the information KB4 contains.
• First item in the knowledge base is woman(mia).
• Prolog matches X to mia
• thus making the query agree perfectly with this first item.
• we can also say that Prolog instantiates X to mia,
• or that it binds X to mia.
• Prolog then reports back to us without just saying ‘true’, as follows:
• X = mia



KB4: using Variables
• If further matches of the query woman(X) exist
• Then that information can be accessed by typing ;
• (or by pressing the space bar on the keyboard)
• Remember that ; means or,
• so this query means: are there any more women?
• So Prolog begins working through the knowledge base again
• (it remembers where it got up to last time and starts from there)
• and sees that it matches X with jody, so it responds:
• X = jody ; and next
• X = yolanda.
• And stops, if no further match is found



KB4: cares(marcellus,X),woman(X).
• Select an individual X such that Marcellus cares about X and X is a woman
• X = mia.
• Since X = mia makes both Facts and Predicate True
• KB4:
• woman(mia).
• woman(jody).
• woman(yolanda).
• cares(vincent,mia).
• cares(marcellus,mia).
• cares(pumpkin,honey_bunny).
• cares(honey_bunny,pumpkin).
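The top-to-bottom matching that answers cares(marcellus,X), woman(X) can be approximated in Python (a sketch of the binding process only, not of Prolog's actual unification; the list encoding of the facts is mine):

```python
# KB4's facts as plain Python data, in the same top-to-bottom order.
woman = ["mia", "jody", "yolanda"]
cares = [("vincent", "mia"), ("marcellus", "mia"),
         ("pumpkin", "honey_bunny"), ("honey_bunny", "pumpkin")]

# cares(marcellus, X), woman(X): bind X from each cares fact in order,
# then check the second goal woman(X) with that binding.
answers = [x for (who, x) in cares if who == "marcellus" and x in woman]
print(answers)  # ['mia']
```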



Knowledge Base 5
• cares(vincent,mia). Relation

• cares(marcellus,mia). Relation

• cares(pumpkin,honey_bunny). Relation

• cares(honey_bunny,pumpkin). Relation

• jealous(X,Y) :- cares(X,Z),cares(Y,Z). Rule



KB5 jealous(X,Y) :- cares(X,Z),cares(Y,Z)
• Defines a concept of jealousy.

• It says that an individual X will be jealous of an individual Y

• if there is some individual Z that X cares about, and Y also cares about the
same individual Z

• A general Rule (statement)


• not stated in terms of a special person (mia / pumpkin / anyone)



KB5 jealous(X,Y) :- cares(X,Z),cares(Y,Z)
• ?- jealous(marcellus,Q).
• Q = vincent.

• Since cares(vincent,mia) is the first fact in KB5

• Even for ?- jealous(vincent,P).
• P = vincent.

• For ?- jealous(V,U).
• Since cares(vincent,mia) is the first fact in KB5
• V = U, U = vincent ;
• V = vincent, U = marcellus ;
• ...
• It keeps on displaying values of U and V that satisfy the rule
• Until . is pressed

• If . is pressed, Prolog stops

• If ; is pressed, Prolog displays the next value that fulfills the condition, i.e.,
• ?- jealous(vincent,Q).
• Q = vincent ;
• Q = marcellus.


Prolog Syntax
• Facts, Rules, and Queries built out of Terms
• Four kinds of terms in Prolog: atoms, numbers, variables, and complex terms
• Atoms and numbers are lumped together under the heading constants
• Constants and variables together make up the simple terms of Prolog.
• Basic characters (or symbols) available in Prolog
• The upper-case letters are A, B, ..., Z;
• The lower-case letters are a, b, ..., z;
• The digits are 0, 1, 2, ..., 9;
• The special characters are +, -, *, /, <, >, =, :, ., &, ~, and _.
• The blank space is also a character, but a rather unusual one, being invisible.
• A string is an unbroken sequence of characters.
Atom:
• An atom is either:
1. A string of characters made up of upper-case letters, lower-case letters,
digits, and the underscore character, that begins with a lower-case letter.
• For example: butch, big_kahuna_burger, and m_monroe2.

2. An arbitrary sequence of characters enclosed in single quotes.
• For example: ’Vincent’, ’The Gimp’, ’Five_Dollar_Shake’, ’&^%&#@$ &*’, and ’ ’.
• The characters between the single quotes are called the atom name.
• Note that we are allowed to use spaces in such atoms.

3. A string of special characters, e.g.,
• @=
• ====>
• ;
• :-


Numbers
• Integers (that is: ... -2, -1, 0, 1, 2, 3, ...) are useful for tasks

• counting the elements of a list

• Prolog syntax is the obvious one:

• 23, 1001, 0, -365, and so on.



Variables
• A variable is a string of upper-case letters, lower-case letters, digits, and
underscore characters that starts either with an upper-case letter or with an
underscore. For example,
• Example Prolog variables
• X, Y, Variable, _tag, X_526, and List, List24, _head, Tail, _input and Output
• The variable _ (that is, a single underscore character) is rather special.
• It’s called the anonymous variable



Complex terms
• Complex terms are built out of a Functor
• Followed by a sequence of arguments.
• Complex terms are often called structures
• Arguments are put in ordinary brackets, separated by commas,
• Placed after the Functor.
• Arguments can be any kind of term
• Functor must be an atom.
• That is, variables cannot be used as functors.
• Examples of complex terms in KB1 to KB5.
• playsAirGuitar(jody) is a complex term: functor = playsAirGuitar, argument = jody.
• cares(vincent,mia) is a complex term: functor = cares, arguments = vincent, mia
• jealous(marcellus,W) is a complex term: functor = jealous, arguments = marcellus, W



Complex terms
• A perfectly acceptable complex term: hide(X,father(father(father(butch))))
• Its functor is hide,
• it has two arguments:
• the variable X
• the complex term father(father(father(butch))).
• This complex term has father as its functor
• and a sole argument, i.e.,
• father(father(butch)),
• which again has the functor father
• and a sole argument which is again a complex term, i.e.,
• father(butch)
• whose functor is father
• and whose sole argument is the constant
• butch
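One way to picture this nesting is as (functor, arguments) pairs. A hypothetical Python representation (the tuple encoding and the `depth` helper are mine, purely for illustration):

```python
# A complex term is a (functor, [args]) pair; an atom or a variable is a string.
# This mirrors hide(X, father(father(father(butch)))).
def depth(term):
    if isinstance(term, str):          # atom or variable: no nesting
        return 0
    functor, args = term
    return 1 + max(depth(a) for a in args)

term = ("hide", ["X", ("father", [("father", [("father", ["butch"])])])])
print(term[0], depth(term))  # hide 4
```

The depth of 4 counts hide plus the three nested father functors, with butch as the innermost constant.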

