AI - Module 3

The document discusses logical agents and their components including knowledge representation and inference. Logical agents use logical formalisms like propositional logic to represent knowledge and perform inference using techniques such as modus ponens and resolution to derive new information and select actions. The document also discusses specific examples like the Wumpus World and how propositional logic can be used to build a simple knowledge base for it.


MODULE 3

 Logical Agents
• Knowledge-Based Agents

• The Wumpus World

• Logic

• Propositional Logic: A Very Simple Logic

• Propositional Theorem Proving

• Effective Propositional Model Checking

• Agents Based on Propositional Logic


Logical Agents
• Knowledge Representation: Logical agents represent knowledge using logical formalisms such
as propositional logic, predicate logic, or first-order logic.

• Inference: Logical agents perform inference by applying logical rules and deduction to draw
conclusions from known facts and information.

• They use mechanisms such as modus ponens, resolution, and backward chaining to derive new
information from existing knowledge.

• In some cases, logical agents may employ search algorithms to explore the space of possible
solutions or states.

• Based on the conclusions drawn from logical inference, logical agents select actions to achieve
their goals or objectives.
Knowledge-Based Agents
• The idea is that an agent can represent knowledge of its world, its goals and the
current situation by sentences in logic, and decide what to do by inferring that a
certain action or course of action is appropriate to achieve its goals.
• Intelligent agents need knowledge about the world to choose good actions/decisions.

• Knowledge = {sentences} in a knowledge representation language (formal language).

• A sentence is an assertion about the world.

 A knowledge-based agent is composed of:

1. Knowledge base: domain-specific content.

2. Inference mechanism: domain-independent algorithms.


The agent must be able to:
– Represent states, actions, etc.

– Incorporate new percepts

– Update internal representations of the world

– Deduce hidden properties of the world

– Deduce appropriate actions

Declarative approach to building an agent:

– Add new sentences: Tell it what it needs to know

– Query what is known: Ask itself what to do - answers should follow from the KB
When a knowledge-based agent runs, it:
– Tells KB about its latest perception

• MAKE-PERCEPT-SENTENCE

– Asks the KB what to do next

• MAKE-ACTION-QUERY

– Executes action and tells the KB so

• MAKE-ACTION-SENTENCE
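
• A minimal sketch of this loop in Python is shown below; the KB object with tell/ask and the make_* sentence constructors are placeholders, not a particular library:

# A minimal sketch of the knowledge-based agent loop described above.
# The KB object and the make_* constructors are assumed placeholders,
# not part of any standard library.
class KBAgent:
    def __init__(self, kb, make_percept_sentence, make_action_query, make_action_sentence):
        self.kb = kb                      # knowledge base supporting tell() and ask()
        self.make_percept_sentence = make_percept_sentence
        self.make_action_query = make_action_query
        self.make_action_sentence = make_action_sentence
        self.t = 0                        # time step counter

    def __call__(self, percept):
        # 1. TELL the KB about the latest perception
        self.kb.tell(self.make_percept_sentence(percept, self.t))
        # 2. ASK the KB what to do next
        action = self.kb.ask(self.make_action_query(self.t))
        # 3. Execute the action and TELL the KB that it was done
        self.kb.tell(self.make_action_sentence(action, self.t))
        self.t += 1
        return action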
The Wumpus World
• 4 X 4 grid of rooms

• Squares adjacent to the Wumpus are smelly and squares adjacent to a pit are breezy

• Glitter iff gold is in the same square

• Shooting kills Wumpus if you are facing it

• Wumpus emits a horrible scream when it is killed that can be heard anywhere

• Shooting uses up the only arrow

• Grabbing picks up gold if in same square

• Releasing drops the gold in same square


Wumpus World PEAS
 Performance measure: +1000 for climbing out of the cave with the gold, –1000 for falling
into a pit or being eaten by the wumpus, –1 for each action taken, and –10 for using up the
arrow. The game ends either when the agent dies or when the agent climbs out of the cave.

 Environment:

– 4 X 4 grid of rooms

– Agent starts in square [1,1] facing to the right

– Locations of the gold and the Wumpus are chosen randomly, with a uniform distribution, from
all squares except [1,1]

– Each square other than the start can be a pit with probability 0.2
 Actuators:
– Left turn, Right turn, Forward, Grab, Release, Shoot

 Sensors:

– In the squares directly (not diagonally) adjacent to the wumpus, the agent will
perceive a Stench.

– In the squares directly adjacent to a pit, the agent will perceive a Breeze.

– In the square where the gold is, the agent will perceive a Glitter.

– When an agent walks into a wall, it will perceive a Bump.

– When the wumpus is killed, it emits a woeful Scream that can be perceived anywhere
in the cave.
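
• For concreteness, one common encoding (our assumption, not something the slides prescribe) represents a percept as a 5-element list [Stench, Breeze, Glitter, Bump, Scream], with None meaning nothing is sensed:

# Hypothetical percept encoding: [Stench, Breeze, Glitter, Bump, Scream]
percept = ["Stench", None, None, None, None]    # e.g. standing next to the wumpus
stench, breeze, glitter, bump, scream = percept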
Logic
• Knowledge bases consist of sentences.

• These sentences are expressed according to the syntax of the representation language,
which specifies all the sentences that are well formed.

• The notion of syntax is clear enough in ordinary arithmetic: “x + y = 4” is a well-formed
sentence, whereas “x4y+ =” is not.

• A logic must also define the semantics, or meaning, of sentences.

• The semantics for arithmetic specifies that the sentence “x+y=4” is true in a
world where x is 2 and y is 2, but false in a world where x is 1 and y is 1.
• The semantics defines the truth of each sentence with respect to each possible world.

• We use the term model in place of “possible world.”

• If a sentence α is true in model m, we say that m satisfies α or sometimes m is a model of α.

• We use the notation M(α) to mean the set of all models of α.

• This involves the relation of logical entailment between sentences: the idea that a sentence
follows logically from another sentence. We write α |= β to mean that the sentence α entails the
sentence β.

• The formal definition of entailment is this: α |= β if and only if, in every model in which α is
true, β is also true. Using the notation just introduced, we can write: α |= β if and only if
M(α) ⊆ M(β).

• If an inference algorithm i can derive α from KB, we write KB ⊢i α, which is pronounced “α
is derived from KB by i” or “i derives α from KB.”

• An inference algorithm that derives only entailed sentences is called sound or truth
preserving.

• The property of completeness is also desirable: an inference algorithm is complete if it can
derive any sentence that is entailed.

• The final issue to consider is grounding: the connection between logical reasoning
processes and the real environment in which the agent exists.
Propositional Logic
• Propositional logic (PL) is the simplest logic.

• Syntax of PL: defines the allowable sentences or propositions.

• Definition (Proposition): A proposition is a declarative statement that is either True or False.

• Atomic proposition: a single proposition symbol. Each symbol is a proposition.
Notation: uppercase letters, possibly with subscripts.

• Compound proposition: constructed from atomic propositions using parentheses and
logical connectives.
Atomic proposition
Examples of atomic propositions:

• 2+2=4 is a true proposition

• W1,3 is a proposition. It is true if there is a Wumpus in [1,3].

• “If there is a stench in [1,2] then there is a Wumpus in [1,3]” is a proposition.

• “How are you?” or “Hello!” are not propositions. In general, statements that are
questions, commands, or opinions are not propositions.

• Complex sentences are constructed from simpler sentences, using parentheses and
operators called logical connectives.
There are five connectives in common use:

• ¬ (not). A sentence such as ¬W1,3 is called the negation of W1,3. A literal is either an atomic sentence
(a positive literal) or a negated atomic sentence (a negative literal).

• ∧ (and). A sentence whose main connective is ∧, such as W1,3 ∧ P3,1, is called a conjunction; its
parts are the conjuncts. (The ∧ looks like an “A” for “And”.)

• ∨ (or). A sentence whose main connective is ∨, such as (W1,3 ∧ P3,1) ∨ W2,2, is a disjunction; its
parts are disjuncts—in this example, (W1,3 ∧P3,1) and W2,2.

• ⇒ (implies). A sentence such as (W1,3 ∧P3,1) ⇒ ¬W2,2 is called an implication (or conditional). Its
premise or antecedent is (W1,3 ∧P3,1), and its conclusion or consequent is ¬W2,2. Implications are
also known as rules or if–then statements. The implication symbol is sometimes written in other books
as ⊃ or →.

• ⇔ (if and only if). The sentence W1,3 ⇔ ¬W2,2 is a biconditional.
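
• As an illustration of the syntax (assuming the SymPy library is acceptable here; the slides themselves do not use it), the five connectives applied to the Wumpus-world symbols above can be built like this:

# A short sketch of building the compound sentences above with SymPy
# (SymPy is an assumption; any propositional-logic toolkit would do).
from sympy import symbols
from sympy.logic.boolalg import Not, And, Or, Implies, Equivalent

W13, W22, P31 = symbols("W13 W22 P31")

negation      = Not(W13)                          # ¬W1,3
conjunction   = And(W13, P31)                     # W1,3 ∧ P3,1
disjunction   = Or(And(W13, P31), W22)            # (W1,3 ∧ P3,1) ∨ W2,2
implication   = Implies(And(W13, P31), Not(W22))  # (W1,3 ∧ P3,1) ⇒ ¬W2,2
biconditional = Equivalent(W13, Not(W22))         # W1,3 ⇔ ¬W2,2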


Semantics
• The semantics define the rules to determine the truth of a sentence.

• Semantics can be specified by truth tables.

• Boolean values domain: T,F

• n-tuple: (x1, x2, ..., xn)

• Operator on n-tuples : g(x1 = v1, x2 = v2, ..., xn = vn)

• Definition: A truth table defines an operator g on n-tuples by specifying a Boolean
value for each tuple.

• Number of rows in a truth table: R = 2^n
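
• A tiny truth-table printer makes the 2^n row count concrete (an illustrative Python sketch, not code from the slides):

# Enumerate all 2**n rows of a truth table for an n-ary Boolean operator g.
from itertools import product

def truth_table(g, n):
    for row in product([True, False], repeat=n):
        print(row, "->", g(*row))

# Example: the 2**2 = 4 rows of the implication p => q (defined as (not p) or q)
truth_table(lambda p, q: (not p) or q, 2)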


A Simple Knowledge Base
• Now that we have defined the semantics for propositional logic, we can construct a knowledge base
for the wumpus world and check what it entails by enumerating models (truth-table enumeration).
• This enumeration algorithm directly implements the definition of entailment.

• It's sound because it adheres to the rules of logical entailment, ensuring that only valid
conclusions are drawn.

• However, the time complexity is exponential in the worst case, specifically O(2^n), where n
is the number of symbols in the knowledge base and α.

• Despite the exponential time complexity, the space complexity remains manageable at O(n)
due to the depth-first nature of the enumeration.

• While the enumeration algorithm provides a straightforward approach to logical inference, it
may not be practical for large knowledge bases.

• Efficient algorithms exist to mitigate the exponential complexity, offering better performance
in many cases.
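
• A minimal model-checking sketch shows why the cost is O(2^n): every assignment to the n symbols is examined. The helper name tt_entails and the lambda encoding of sentences below are our own illustrative choices:

# A minimal sketch of entailment by enumeration (assumed helper names, not from the slides).
from itertools import product

def tt_entails(symbols, kb, alpha):
    """Return True if kb entails alpha, by checking every model (2**n of them)."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):
            return False          # found a model of the KB where alpha is false
    return True

# Example: KB = (B11 <=> (P12 v P21)) and ~B11 ; query alpha = ~P12
symbols = ["B11", "P12", "P21"]
kb    = lambda m: (m["B11"] == (m["P12"] or m["P21"])) and not m["B11"]
alpha = lambda m: not m["P12"]
print(tt_entails(symbols, kb, alpha))   # True: there is no pit in [1,2]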
Propositional Theorem Proving
• The first concept is logical equivalence: two sentences α and β are logically equivalent if they are true
in the same set of models.

• An alternative definition of equivalence is as follows: any two sentences α and β are equivalent if and
only if each of them entails the other:

α ≡ β if and only if α |= β and β |= α

• The second concept we will need is validity. A sentence is valid if it is true in all models. For example,
the sentence P∨ ¬P is valid.

• Valid sentences are also known as tautologies—they are necessarily true. Because the sentence True is
true in all models, every valid sentence is logically equivalent to True.

• What good are valid sentences? From our definition of entailment, we can derive the deduction
theorem, which was known to the ancient Greeks: for any sentences α and β, α |= β if and only if
the sentence (α ⇒ β) is valid.
• Validity and satisfiability are of course connected: α is valid if ¬α is unsatisfiable;
contrapositively, α is satisfiable if ¬α is not valid. We also have the following useful result:
α |= β if and only if the sentence (α ∧ ¬β) is unsatisfiable.
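
• The last result gives a practical recipe for entailment checking: show that α ∧ ¬β has no model. A quick sketch using SymPy's satisfiable() (an assumed dependency, not mentioned in the slides):

# alpha |= beta  iff  (alpha & ~beta) is unsatisfiable.
from sympy import symbols
from sympy.logic.boolalg import And, Or, Not
from sympy.logic.inference import satisfiable

P, Q = symbols("P Q")
alpha = And(P, Q)        # P ∧ Q
beta  = Or(P, Q)         # P ∨ Q
entailed = not satisfiable(And(alpha, Not(beta)))   # no model of alpha ∧ ¬beta
print(entailed)          # True: P ∧ Q entails P ∨ Q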
Inference and Proofs
• This section covers inference rules that can be applied to derive a proof: a chain of
conclusions that leads to the desired goal. The best-known rule is called Modus Ponens
(Latin for mode that affirms) and is written

        α ⇒ β,   α
        -----------
             β
• The notation means that, whenever any sentences of the form α ⇒ β and α are given,
then the sentence β can be inferred. For example, if (WumpusAhead ∧ WumpusAlive)
⇒ Shoot and (WumpusAhead∧ WumpusAlive) are given, then Shoot can be inferred.
• Another useful inference rule is And-Elimination, which says that, from a conjunction, any of the
conjuncts can be inferred:

        α ∧ β
        ------
           α
• All of the logical equivalences in Figure 7.11 can be used as inference rules. For example, the
equivalence for biconditional elimination yields the two inference rules

        α ⇔ β                             (α ⇒ β) ∧ (β ⇒ α)
        -------------------     and       -------------------
        (α ⇒ β) ∧ (β ⇒ α)                       α ⇔ β
Proof by Resolution
• If we removed the biconditional elimination rule, the proof in the preceding section would
not go through. The current section introduces a single inference rule, resolution, that
yields a complete inference algorithm when coupled with any complete search algorithm.
Conjunctive normal form
• A sentence expressed as a conjunction of clauses is said to be in conjunctive normal form
or CNF (see Figure 7.12). We now describe a procedure for converting to CNF. We
illustrate the procedure by converting the sentence B1,1 ⇔ (P1,2 ∨ P2,1) into CNF. The
steps are as follows:

1. Eliminate ⇔, replacing α ⇔ β with (α ⇒ β) ∧ (β ⇒ α).

2. Eliminate ⇒, replacing α ⇒ β with ¬α ∨ β.

3. Move ¬ inwards by repeated application of double-negation elimination and De Morgan's rules.

4. Distribute ∨ over ∧ wherever possible, so the sentence becomes a conjunction of disjunctions of literals.
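
• The same conversion can be checked mechanically; the sketch below uses SymPy's to_cnf (again an assumed dependency) on the sentence from the slide:

# Convert B1,1 <=> (P1,2 v P2,1) to CNF automatically.
from sympy import symbols
from sympy.logic.boolalg import Or, Equivalent, to_cnf

B11, P12, P21 = symbols("B11 P12 P21")
sentence = Equivalent(B11, Or(P12, P21))
print(to_cnf(sentence))
# Prints an equivalent of (B11 | ~P12) & (B11 | ~P21) & (P12 | P21 | ~B11)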
A resolution algorithm
• The empty clause—a disjunction of no disjuncts—is equivalent to False
because a disjunction is true only if at least one of its disjuncts is true.
Moreover, the empty clause arises only from resolving two contradictory unit
clauses such as P and ¬P.

• We can apply the resolution procedure to a very simple inference in the wumpus world.
When the agent is in [1,1], there is no breeze, so there can be no pits in neighbouring
squares. The relevant knowledge base is KB = (B1,1 ⇔ (P1,2 ∨ P2,1)) ∧ ¬B1,1, and we wish
to prove α = ¬P1,2.
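
• A compact resolution sketch for exactly this query follows. Clauses are frozensets of string literals with '~' marking negation; all names are illustrative, not standard library code.

# Sketch of PL-RESOLUTION: derive the empty clause from KB plus the negated query.
from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(ci, cj):
    """All clauses obtainable by resolving ci with cj on one complementary pair."""
    resolvents = []
    for lit in ci:
        if negate(lit) in cj:
            resolvents.append((ci - {lit}) | (cj - {negate(lit)}))
    return resolvents

def pl_resolution(clauses, negated_query_clauses):
    """Return True if the clause set plus the negated query derives the empty clause."""
    clauses = set(clauses) | set(negated_query_clauses)
    new = set()
    while True:
        for ci, cj in combinations(clauses, 2):
            for r in resolve(ci, cj):
                if not r:                      # empty clause: contradiction found
                    return True
                new.add(frozenset(r))
        if new <= clauses:                     # nothing new: query not entailed
            return False
        clauses |= new

# KB in CNF: B11 <=> (P12 v P21) and ~B11 ;  query ~P12, negated to the unit clause P12
kb = [frozenset({"~B11", "P12", "P21"}),
      frozenset({"B11", "~P12"}),
      frozenset({"B11", "~P21"}),
      frozenset({"~B11"})]
print(pl_resolution(kb, [frozenset({"P12"})]))    # True: KB |= ~P1,2 (no pit in [1,2])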
Forward and backward chaining
• Horn Form (restricted)
– KB = conjunction of Horn clauses
• Horn clause =
– proposition symbol; or
– (conjunction of symbols) ⇒ symbol

• Can be used with forward chaining or backward chaining


• These algorithms are very natural and run in linear time; a minimal forward-chaining sketch follows below.
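
• A minimal forward-chaining sketch over Horn clauses (illustrative names only; each rule is a pair (premises, conclusion) and facts are symbols known to be true):

# Forward chaining for propositional Horn clauses.
from collections import deque

def pl_fc_entails(rules, facts, query):
    count    = {i: len(prem) for i, (prem, _) in enumerate(rules)}  # unsatisfied premises per rule
    inferred = set()
    agenda   = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (prem, concl) in enumerate(rules):
            if p in prem:
                count[i] -= 1
                if count[i] == 0:            # all premises known: fire the rule
                    agenda.append(concl)
    return False

# Example Horn KB: P => Q,  (L ^ M) => P,  (B ^ L) => M,  (A ^ P) => L,  (A ^ B) => L
rules = [(["P"], "Q"), (["L", "M"], "P"), (["B", "L"], "M"),
         (["A", "P"], "L"), (["A", "B"], "L")]
print(pl_fc_entails(rules, ["A", "B"], "Q"))   # True: A and B entail Q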
