Ch 04- Knowledge and Reasoning

The document discusses knowledge-based agents, their architecture, and the levels of knowledge representation in artificial intelligence, including propositional and predicate logic. It covers various methods for representing knowledge, such as semantic networks and frame representation, and explains inference techniques like forward and backward chaining. Additionally, it addresses planning in AI, emphasizing the importance of decision-making and action sequences to achieve goals.


Ch 04

Knowledge and Reasoning


CSC 703

Prof. Poonam Narkhede


Knowledge, reasoning, and planning
 Knowledge-based agent:

- Inference engine: domain-independent algorithms
- Knowledge base: domain-specific content

 Knowledge level:
- Facts about the environment
 Implementation level:
- Uses the KB and algorithms
- E.g. logic
- Before taking any action, a KB agent uses its existing knowledge
along with the current input
- A single fact is a sentence, expressed in a formal
representation language
- TELL and ASK mechanism
- TELL informs the agent about its surroundings
- The agent ASKs itself what action to take
Architecture of KB agent
 Knowledge level:
- The agent describes what it knows
- Knowledge is defined at this level
- Specifies what the agent knows and the goal of the program
- E.g. a taxi-driving agent
 Logical level:
- Knowledge is encoded in sentences
- Languages used: propositional and predicate logic
- E.g. likes(ram, apple)
 Implementation level:
- Converts the logical level to the physical level
- States and actions, new percepts, internal representation,
properties, appropriate actions
Logic
 Also called reasoning
 Logical representation and reasoning are independent
of the form of logic
 It is advantageous when knowledge is represented on a
small scale
 Role of reasoning in AI:
- Propositional logic represents truth values 0 and 1;
predicate logic is the second level
- Logics are the basic building blocks of AI


Representation of knowledge using rules
 Knowledge can be represented at two levels:
- Knowledge level
- Symbol level
 Four methods can be used:
- Logical representation
- Production rule representation
- Semantic networks
- Frame representation
 Logical representation:
- Related to the truth of statements
- Used to infer new sentences from existing ones
- Several forms of logic are used:

1. Propositional logic:
- Uses propositions, each of which is either true or false
- e.g. Leaves are green, Violets are blue
2. Predicate logic:
- Makes use of variables, constants, predicates,
functions and quantifiers
3. Higher-order logic:
- Allows quantification over predicates and functions
4. Fuzzy logic:
- Truth values between TRUE and FALSE

5. Other logics:
- Multi-valued logic, modal logic, temporal logic
 Production rule representation:
- If-Then rules
- IF pressure is high, THEN volume is low
- IF road is slippery, THEN driving is dangerous
 Semantic Networks:
- Represent knowledge in the form of a graph
- Made up of nodes, which represent concepts, and links,
which represent relations
- Tom is a cat
- Tom caught a fish
- Tom is grey in colour
- Tom is owned by Sam
- Tom is a mammal
- Fish is an animal
- Cats love milk
- All mammals are animals
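The facts above can be sketched as a tiny semantic network in code. This is an illustrative sketch, not part of the chapter: the triple encoding and the `is_a` helper are assumptions made for the example.

```python
# A minimal semantic network: nodes are concepts, links are labelled
# relations stored as (subject, relation, object) triples.
triples = {
    ("Tom", "is_a", "cat"),
    ("Tom", "caught", "fish"),
    ("Tom", "colour", "grey"),
    ("Tom", "owned_by", "Sam"),
    ("cat", "is_a", "mammal"),
    ("fish", "is_a", "animal"),
    ("mammal", "is_a", "animal"),
    ("cat", "loves", "milk"),
}

def is_a(node, category):
    """Follow 'is_a' links transitively (property inheritance)."""
    if (node, "is_a", category) in triples:
        return True
    parents = [o for (s, r, o) in triples if s == node and r == "is_a"]
    return any(is_a(p, category) for p in parents)

print(is_a("Tom", "animal"))  # True: Tom -> cat -> mammal -> animal
```

Following `is_a` links transitively is how a semantic network supports inheritance: "Tom is a mammal" and "all mammals are animals" need not be stored explicitly.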
 Frame representation:
- Used when the task becomes complex and needs a more
structured representation
- Consists of slots (attributes) and their values
- Slots have names and values, e.g.
- (Ram)
- (AGE(value 50))
- (ADDRESS(STREET(value 4c gb road)))
- (CHILDREN(value luv, kush))
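A frame like the one above maps naturally onto a nested dictionary. The sketch below is illustrative only; the slot names follow the slide and the dictionary structure is an assumption for the example.

```python
# Frame representation sketch: a frame is a set of named slots whose
# values may be atomic, lists, or sub-frames (nested dictionaries).
ram_frame = {
    "NAME": "Ram",
    "AGE": 50,
    "ADDRESS": {"STREET": "4c gb road"},  # sub-frame
    "CHILDREN": ["luv", "kush"],
}

print(ram_frame["ADDRESS"]["STREET"])  # 4c gb road
```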
Propositional Logic (PL)
 Atomic logical formulas are propositions
 It is based on truth values

 Syntax:
- Symbols are represented as capital letters A, B, C…
- Constants have truth values 0 and 1
- Parentheses ‘(…)’ are used to group sentences
- A literal is a proposition or its negation, e.g. ~A
- Relations between two propositions: A ∧ B, A ∨ B
 Semantics:
- A set of facts is represented in the form of propositional
logic
- E.g. A means “It is hot”
B means “It is humid”
- A number of rules are used, e.g. (A ∧ B), (A ∨ B)
- In PL, (A ∧ B) and (B ∧ A) have the same meaning, but in
natural language they may not,
e.g. Radha started feeling feverish and Radha went to the doctor.
- PL relates the truth value of one statement to another
 PL sentences example:
- Symbols can be defined as,

1. A = “It is hot”
2. B = “It is humid”
3. C = “It is raining”
- We can select symbols that are easy to understand,
1. HT = “It is hot”
2. HM = “It is humid”
3. RN = “It is raining”
- “If it is humid then it is hot” = HM → HT
- “If it is humid and hot then it is raining” =
(HT ∧ HM) → RN
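The semantics of a sentence like (HT ∧ HM) → RN can be checked mechanically by enumerating all truth assignments. The `implies` helper and the enumeration below are an illustrative sketch, not part of the chapter.

```python
from itertools import product

def implies(p, q):
    """Material implication: p -> q is false only when p and not q."""
    return (not p) or q

# Enumerate all truth assignments for HT, HM, RN and evaluate
# the sentence "(HT and HM) -> RN".
rows = []
for HT, HM, RN in product([True, False], repeat=3):
    rows.append(((HT, HM, RN), implies(HT and HM, RN)))

# The sentence is false only when HT and HM are true but RN is false.
false_rows = [a for a, v in rows if not v]
print(false_rows)  # [(True, True, False)]
```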
 Advantages:
- Simple
- Efficient
- Foundation for higher logic
- Reasoning is decidable
- Inference is easy
 Disadvantages:
- Problems with complex queries
- Impractical
- Lengthy
- Weak representation language
- Cannot represent properties of entities
First order predicate logic
 Represents relations using predicates, variables and
quantifiers
 Gorillas are black = ∀x: Gorilla(x) → Black(x)
 Syntax:
- “x” is a domain value
- Constant: fixed value
- Variable: assigned a value
- Function: a function of any arguments
- Connectives are the same as in PL
 Quantifiers:
- Quantify the variables

1. Universal quantifier ∀:
- “for all”
- ∀x: A means A is true for all values of x
2. Existential quantifier ∃:
- “there exists”
- ∃x: A means A is true for at least one value of x
- E.g. there is a white dog = ∃x: (Dog(x) ∧ White(x))
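Over a finite domain, ∃ corresponds to `any()` and ∀ to `all()`. The domain and the `dog`/`white` predicates below are made up for illustration.

```python
# Hedged sketch: evaluating quantified sentences over a small
# finite domain. Predicates are modelled as membership in a set.
domain = ["rex", "snowy", "tom"]
dog = {"rex", "snowy"}
white = {"snowy", "tom"}

# ∃x: Dog(x) ∧ White(x)  -- "there is a white dog"
exists_white_dog = any(x in dog and x in white for x in domain)

# ∀x: Dog(x) → White(x)  -- "all dogs are white" (uses implication)
forall_dogs_white = all((x not in dog) or (x in white) for x in domain)

print(exists_white_dog, forall_dogs_white)  # True False
```

Note how the universal sentence uses an implication inside `all()`, while the existential one uses a conjunction inside `any()`; swapping these is a classic mistake.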
| Sr. No. | Propositional logic | Predicate logic |
|---------|---------------------|-----------------|
| 1 | Cannot represent small worlds | Represents them very well |
| 2 | Weak knowledge representation | Strong knowledge representation |
| 3 | Uses propositions | Uses predicates |
| 4 | Cannot represent properties | Represents properties |
| 5 | Cannot express specialization, generalization | Can express specialization, generalization |
| 6 | Foundation-level logic | Higher-level logic |
| 7 | Insufficiently expressive | Sufficiently expressive |
| 8 | Assumes the world contains facts | Assumes the world contains objects, relations, functions |
| 9 | Context independent | Context dependent |
| 10 | Declarative | Derivative |
Resolution
 Inference rule
 A new clause is generated from two clauses containing
complementary literals
 A literal is an atomic symbol or its negation
 It gives a sound and complete theorem-proving procedure

 Resolution procedure:
- Convert English statements to logical form
- Convert logical statements into CNF
- Negate the conclusion
- Infer from the complementary literals using a resolution tree
 Convert from logic to CNF (Conjunctive Normal Form)
- Eliminate implication (→)
e.g. P → Q = ~P V Q
- Eliminate and (∧) by splitting conjunctions into separate clauses
e.g. P ∧ Q = i) P
ii) Q
- Eliminate universal quantifiers (∀)
∀x ∀y ∀z: given(x,y,z) = given(x,y,z)
- Eliminate existential quantifiers (∃) by Skolemization
∃x: P1(x) V P2(x) = P1(a) V P2(a), where a is a new Skolem constant
(a Skolem function of the enclosing universal variables is used when x lies inside a ∀)
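The first conversion step, eliminating →, can be sketched on formulas encoded as nested tuples. The tuple encoding (`"->"`, `"~"`, `"v"`) is an assumption made for this example, not a standard library format.

```python
# Sketch of implication elimination, the first step of CNF conversion.
# A formula is either a symbol ("P") or a tuple (op, arg1, ...).
def eliminate_implications(f):
    if isinstance(f, str):          # atomic proposition
        return f
    op, *args = f
    args = [eliminate_implications(a) for a in args]
    if op == "->":                  # P -> Q  ==  ~P v Q
        p, q = args
        return ("v", ("~", p), q)
    return (op, *args)

print(eliminate_implications(("->", "P", "Q")))
# ('v', ('~', 'P'), 'Q')
```

The remaining steps (pushing ~ inwards, distributing ∨ over ∧, Skolemization) would follow the same recursive pattern.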
 Consider the following facts
1. Rimi is hungry
2. If Rimi is hungry she barks
3. If Rimi is barking then Raja is angry

Express the statements in propositional logic.
Convert them into CNF.
Prove the statement “Raja is angry” using resolution.
Step 1: converted into propositional logic
1. Hungry(Rimi)
2. Hungry(Rimi) → Barks(Rimi)
3. Barks(Rimi) → Angry(Raja)

Step 2: converted to CNF
1. Hungry(Rimi)
2. ~Hungry(Rimi) V Barks(Rimi)
3. ~Barks(Rimi) V Angry(Raja)

Step 3: Negate the conclusion
To prove: Angry(Raja)
Negation: ~Angry(Raja)
Resolving ~Angry(Raja) against the CNF clauses in the resolution
tree derives the empty clause; hence, “Raja is angry” is
proved to be true
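The Rimi proof can be automated with a small propositional resolution loop. This is a sketch under assumptions: clauses are frozensets of string literals, `~X` denotes the negation of `X`, and the saturation loop is naive (fine for examples this small).

```python
# Minimal propositional resolution: saturate the clause set with
# binary resolution and report whether the empty clause appears.
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve_all(clauses):
    """Return True if the empty clause (a contradiction) is derivable."""
    clauses = set(clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for lit in a:
                    if negate(lit) in b:
                        resolvent = (a - {lit}) | (b - {negate(lit)})
                        if not resolvent:
                            return True   # empty clause derived
                        new.add(frozenset(resolvent))
        if new <= clauses:
            return False                  # nothing new: no contradiction
        clauses |= new

kb = [
    frozenset({"Hungry"}),            # Rimi is hungry
    frozenset({"~Hungry", "Barks"}),  # hungry -> barks
    frozenset({"~Barks", "Angry"}),   # barks -> angry
    frozenset({"~Angry"}),            # negated conclusion
]
print(resolve_all(kb))  # True: so Angry(Raja) follows from the KB
```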
 Consider the following facts
1. If the maid stole the jewellery then the butler was not guilty
2. Either the maid stole the jewellery or she milked the cow
3. If the maid milked the cow then the butler got the cream
4. Therefore, if the butler was guilty then he got the cream
Prove that the conclusion (statement 4) is valid using resolution

Assumptions:
a = maid stole the jewellery
b = butler was guilty
c = maid milked the cow
d = butler got the cream
 Step 1: converted into propositional logic
1. a → ~b

2. a V c

3. c → d

To prove:
b → d
 Step 2: converted into CNF
1. ~a V ~b

2. a V c

3. ~c V d
 Step 3: Negate the conclusion

~(b → d) = ~(~b V d) = b ∧ ~d, giving two clauses: i) b ii) ~d
Resolving these with the CNF clauses derives the empty clause;
hence, the conclusion is proved to be true
 Consider the following facts
1. Ravi likes all kinds of food
2. Apple and chicken are food
3. Anything anyone eats and is not killed by is food
4. Ajay eats peanuts and is still alive
5. Rita eats everything that Ajay eats

Prove that “Ravi likes peanuts” using resolution.
What does Rita eat?
 Step 1: converted into predicate logic
1. ∀x: food(x) → likes(Ravi, x)
2. food(Apple) ∧ food(Chicken)
3. ∀x ∀y: eats(x,y) ∧ ~killed(x) → food(y)
4. eats(Ajay, peanuts) ∧ alive(Ajay)
5. ∀x: eats(Ajay, x) → eats(Rita, x)

 Add new predicates:
1. ∀x: ~killed(x) → alive(x)
2. ∀x: alive(x) → ~killed(x)
 Step 2: converted into CNF
1. ~food(x) V likes(Ravi, x)
2. food(Apple)
3. food(Chicken)
4. ~eats(x,y) V killed(x) V food(y)
5. eats(Ajay, peanuts)
6. alive(Ajay)
7. ~eats(Ajay, x) V eats(Rita, x)
8. killed(x) V alive(x)
9. ~alive(x) V ~killed(x)

 Step 3: Negate the conclusion

~likes(Ravi, peanuts)
Resolving in the tree: from 6 and 9, ~killed(Ajay); with 5 and 4,
food(peanuts); with 1, likes(Ravi, peanuts), contradicting the
negation. Hence, “Ravi likes peanuts” is proved to be true
 What does Rita eat? From clauses 5 and 7, Rita eats peanuts
 Consider the following facts
1. Ram went to the temple
2. The way to the temple is to walk till the post box and take
the left or right road
3. The left road has a ditch
4. The way to cross the ditch is to jump
5. A log is across the right road
6. One needs to jump across the log to go ahead

Prove that the statement “Ram did not jump” is false

Step 1: converted into predicate logic
1. At(Ram, temple)
2. ∀x: At(x, temple) => At(x, postbox) ∧ takeleft(x)
3. ∀x: At(x, temple) => At(x, postbox) ∧ takeright(x)
4. ∀x: takeleft(x) => cross(x, ditch)
5. ∀x: cross(x, ditch) => jump(x)
6. ∀x: takeright(x) => At(x, log)
7. ∀x: At(x, log) => jump(x)

Statement to disprove: ~jump(Ram)
Step 2: converted to CNF
1. At(Ram, temple)
2. ~At(x, temple) V At(x, postbox)
3. ~At(x, temple) V takeleft(x)
4. ~At(x, temple) V takeright(x)
5. ~takeleft(x) V cross(x, ditch)
6. ~cross(x, ditch) V jump(x)
7. ~takeright(x) V At(x, log)
8. ~At(x, log) V jump(x)

Step 3: To show ~jump(Ram) is false we prove jump(Ram), so its
negation ~jump(Ram) is added to the clause set
Step 4: Resolve using the resolution tree

Hence, “Ram did not jump” is proved to be false
 Consider the following facts
1. All people who are graduating are happy
2. All happy people smile
3. Someone is graduating
Prove that “Someone is smiling” using resolution

Step 1: Converted into predicate logic
1. ∀x: graduating(x) => happy(x)
2. ∀x: happy(x) => smile(x)
3. ∃x: graduating(x)
Step 2: Converted into CNF
1. ~graduating(x) V happy(x)
2. ~happy(x) V smile(x)
3. graduating(x2), where x2 is a Skolem constant

Step 3: Negate the conclusion
Someone is smiling: ∃x: smile(x); its negation is
∀x: ~smile(x), giving the clause ~smile(x3)

Step 4: Solve using the resolution tree

Hence, “Someone is
smiling” is proved to be
true
Inference (Forward and backward chaining)

 Forward Chaining:
- The decision is based on the available data
- Works from the initial state and continues until a goal
state is reached
- E.g. “if it is raining then we will take an umbrella” is
forward chaining
- Also called the data-driven inference technique
 E.g. human(a) => mortal(a)
if human(Mandela) is true then
mortal(Mandela) is also true

Fig. 1: Forward chaining
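The data-driven loop above can be sketched in a few lines: start from the known facts and fire every rule whose premises hold until nothing new is added. The rule/fact encoding below is an assumption for illustration (rules are already grounded).

```python
# Forward chaining sketch over simple grounded Horn rules:
# each rule is (set_of_premises, conclusion).
rules = [
    ({"human(Mandela)"}, "mortal(Mandela)"),  # human(a) => mortal(a)
    ({"raining"}, "take_umbrella"),
]
facts = {"human(Mandela)", "raining"}

changed = True
while changed:                     # repeat until a fixed point
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)  # fire the rule: data drives inference
            changed = True

print(sorted(facts))
```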
 Consider the following facts and prove that “Colonel West
is a criminal” is true
1. It is a crime for an American to sell weapons to
enemy nations
- American(x) ∧ weapon(y) ∧ sells(x,y,z) ∧ enemy(z, America)
=> criminal(x)
2. Country Nano is an enemy of America
- enemy(Nano, America)
3. Nano has some missiles
- owns(Nano, x)
- missile(x)
4. All the missiles were sold to Nano by Colonel West
- missile(x) ∧ owns(Nano, x) => sells(West, x, Nano)

5. A missile is a weapon
- missile(x) => weapon(x)

6. Colonel West is American

- American(West)

Step 2: proof by chaining

Hence, it is proved that “Colonel West
is a criminal” using forward chaining
 Backward chaining
- Starts from the decision (goal) and fetches the supporting data
- Starts from the final state and continues till one of the given
facts
- Also called goal-driven inference
- E.g. human(x) => mortal(x)
to prove mortal(Mandela), backward chaining tries to
establish human(Mandela)
Fig. 2: Backward chaining
 The same example using backward chaining
Planning
 Part of day-to-day activity
 We must learn about scheduling algorithms
 Plan for creating AI-based agents
 Also decide priorities
 Learning how machines can become more intelligent
 Planning: decision making by an intelligent system
 E.g. a robot, a program
 Planning: a sequence of actions to achieve a target
 Collect information about the problem
 Have information about the initial state, goal state, and
actions
 Aim: to achieve the goal state by a proper sequence of
actions
Simple planning agent

 E.g. a coffee maker, printer, and mailing system;
assume three people access the agent
 All 3 users of the agent give commands to execute tasks
 The agent then decides the sequence of actions
 Fig. 3 shows how a planning agent interacts with the
environment
Fig. 3: Planning agent
Planning Problem
 To achieve a goal, consider:
- What will be the effect of an action
- What kind of environment it is
 E.g. in a tic-tac-toe game, while taking the next step,
the agent has to consider old steps and imagine the
future actions of the opponent
 Assumptions about the task environment:
fully observable, deterministic, finite, static, discrete

 The aim is to find a sequence of actions
 A goal can be a union of sub-goals
 E.g. a ping-pong match is best 3 of 5 games: to win the
match you have to win 3 games, and each game must be
won by a margin of at least 2 points
Problem solving and planning
 Main difference between problem solving and planning:
the planning agent follows a logic-based representation
 A planning agent has states, a goal, and actions
 The agent has information about past actions, the present
action, and the effects of actions

 Planning = problem solving + logical representation
Partial order planning
 In POP, the ordering of actions is partial; it does not
specify a total action sequence
 The problem is decomposed
 E.g. wearing shoes

Fig. 4: POP of wearing shoes

POP as a search problem
 States are partial (unfinished) plans
 Every plan has 4 components:

1. Set of actions
2. Set of ordering constraints
3. Set of causal links
4. Set of open preconditions
1. Set of actions:
 The steps of the plan
 E.g. set of actions = { start, rightsock, rightshoe,
leftsock, leftshoe, finish }

2. Set of ordering constraints:
 Constraints on the order of actions, e.g. in the
wearing-shoes example
 If the constraints are cyclic, the plan is inconsistent
 A consistent plan must not have any cycle
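The cycle test on ordering constraints can be sketched as a DFS. The "A before B" graph below encodes the wearing-shoes example; the graph encoding is an assumption made for the example.

```python
# Consistency check for ordering constraints: a partial-order plan
# is inconsistent if "A before B" edges form a cycle. DFS cycle check.
orderings = {
    "start": ["rightsock", "leftsock"],
    "rightsock": ["rightshoe"],
    "leftsock": ["leftshoe"],
    "rightshoe": ["finish"],
    "leftshoe": ["finish"],
}

def has_cycle(graph):
    visiting, done = set(), set()
    def dfs(n):
        if n in visiting:          # back edge: cycle found
            return True
        if n in done:
            return False
        visiting.add(n)
        if any(dfs(m) for m in graph.get(n, [])):
            return True
        visiting.remove(n)
        done.add(n)
        return False
    return any(dfs(n) for n in list(graph))

print(has_cycle(orderings))  # False: the shoe plan is consistent
```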
3. Set of causal links:

 Action A achieves effect E for action B

Fig. 5: Causal links

 E.g. buying an apple achieves the effect of having the
apple, a precondition for eating it; cutting the apple is
another precondition for eating
 Conflict: some action C has effect ~E and may undo the link
 E.g. before eating, we may want to make a decorative apple swan
 For consistency there must be no conflicts

4. Set of open preconditions:

 A precondition is open if it is not achieved by some
action in the plan
 For consistency, there must be no open preconditions

 Consistent plan:
 Has no cycles in the ordering constraints
 Has no open preconditions
 Has no conflicts
Hierarchical Planning
 Also called plan decomposition
 Complex actions can be decomposed into more
primitive actions, represented by operator expansion

Move(X,Y,Z)
- Pickup(X,Y)
- Putdown(X,Z)
 To create a hierarchical plan to travel from source to
destination:
travel(source, dest)
- goto(train, source)
- buyticket(train), which expands to: goto(counter),
request(ticket), pay(ticket)
- catch(train)
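The expansions above can be sketched as a recursive lookup: a complex action expands via a table until only primitive actions remain. The expansion table and action names are hypothetical, taken loosely from the travel example.

```python
# Hierarchical plan decomposition sketch: recursively expand complex
# actions into primitive steps using a (hypothetical) expansion table.
expansions = {
    "travel(src,dst)": ["goto(train,src)", "buyticket(train)", "catch(train)"],
    "buyticket(train)": ["goto(counter)", "request(ticket)", "pay(ticket)"],
}

def expand(action):
    """Return the flat list of primitive actions for a complex action."""
    if action not in expansions:   # already primitive
        return [action]
    steps = []
    for sub in expansions[action]:
        steps.extend(expand(sub))
    return steps

print(expand("travel(src,dst)"))
```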


 POP is a one-level planner
- Consider the example of a trip: first decide the location
- Then open a browser and book a ticket
- Then do the hotel booking
- This is called one-level planning

Fig. 6: One-level planning


 Hierarchy of Actions
- Actions are divided into major and minor actions
- Minor activities make up major activities
- The problem can be complex
- For plan ordering we must try a large number of possibilities
- Here, more importance is given to major steps
 Planner
- Identify the hierarchy of major conditions
- Construct the plan in levels
- Patch the major levels
- Finally, demonstrate
 Conditional Planning
- The plan must work regardless of the outcome of actions
- Properties: fully observable, nondeterministic environment
- Check what is happening in the environment
- E.g. the vacuum-world problem
Prolog
 Uses the idea of logic programming
 Case-sensitive language
 Its syntax and semantics differ from C and Java
 It can build a database of facts and a KB of
rules
 Declarative language
 Implementations of PROLOG
 There is a set of implementations of Prolog
 Different implementations can have different syntax
and semantics
Basics of Prolog
 File extensions of PROLOG: “.pl”, “.pro”, “.p”
 When PERL is used along with PROLOG: “.pro” (since Perl also uses “.pl”)
Thank You
