Knowledge Representation

Knowledge representation (KR) is a core concept in AI that involves encoding information for computers to understand and reason about the world. It encompasses various techniques and components, such as objects, attributes, relationships, and rules, enabling AI systems to solve complex problems and learn from data. KR is essential for applications like expert systems, natural language processing, and robotics, facilitating informed decision-making and adaptability in intelligent systems.


Knowledge Representation

References:
1) Dan W. Patterson, Introduction to Artificial Intelligence & Expert
Systems, PHI, 2010
2) S. Kaushik, Artificial Intelligence, Cengage Learning,
1st ed., 2011
Definition
• Knowledge representation (KR) is a fundamental
concept in Artificial Intelligence (AI) concerned
with how to encode information and knowledge
about the world in a way that computers can
understand and reason about.
• It's essentially the bridge between the vast
amount of data available and the ability of AI
systems to use that data for intelligent tasks.
Key aspects of knowledge representation:
• Goals:
– Enable AI systems to solve complex problems.
– Facilitate reasoning and inference based on the knowledge.
– Allow for learning and knowledge acquisition from data or
experience.
• Key Considerations:
– What knowledge needs to be represented?
(facts, objects, events, relationships)
– How to structure the knowledge?
(logic, rules, frames, semantic networks)
– How to reason with the knowledge?
(deduction, induction, analogy)
• Key Components:
– Objects: Represent entities in the world (e.g., cars, houses, people).
– Attributes: Describe properties of objects (e.g., color, size, location).
– Relationships: Specify connections between objects (e.g., ownership,
part-of).
– Rules: Encode logical statements about the world (e.g., "birds can
fly").
• Common KR Techniques:
– Logic: Formal languages like propositional logic and predicate logic for
representing facts and rules.
– Semantic Networks: Nodes represent objects, and links represent
relationships between them.
– Frames: Structured representations of objects with attributes and
values.
– Production Rules: "If-then" rules that capture cause-and-effect
relationships. (Simple data-structure sketches of these four techniques follow below.)
Why is KR important?
• KR is crucial for building intelligent systems that can learn, reason,
and solve problems in the real world. By representing knowledge
effectively, AI systems can:
– Understand complex situations.
– Make informed decisions.
– Adapt to new information and environments.
– Explain their reasoning process (important for transparent AI).

Real-world applications of KR:


• Expert Systems: Capture and reason with knowledge from human
experts in a specific domain (e.g., medical diagnosis).
• Natural Language Processing: Understand the meaning of text by
representing knowledge about language structure and semantics.
• Robotics: Enable robots to plan actions, navigate their environment,
and interact with objects.
• A variety of ways of representing knowledge
(facts) have been exploited in AI programs.
• In all these varieties of knowledge representation, we
deal with two kinds of entities.
– Facts: truths in some relevant world. These are
the things we want to represent.
– Representations of facts in some chosen
formalism. These are the things we will actually be
able to manipulate.
• One way to think of structuring these entities is at
two levels :
– (a) the knowledge level, at which facts are described, and
– (b) the symbol level, at which representations of objects
at the knowledge level are defined in terms of symbols
that can be manipulated by programs.
• The facts and representations are linked with two-
way mappings.
• This link is called representation mappings.
– The forward representation mapping maps from facts to
representations.
– The backward representation mapping goes the other
way, from representations to facts.
• One common representation is natural
language (particularly English) sentences.
• Regardless of the representation for facts we
use in a program , we may also need to be
concerned with an English representation of
those facts in order to facilitate getting
information into and out of the system.
• We need mapping functions from English
sentences to the representation we actually
use and from it back to sentences.
Representations and Mappings
• In order to solve complex problems encountered in artificial
intelligence, one needs both a large amount of knowledge
and some mechanism for manipulating that knowledge to
create solutions.
• Knowledge and Representation are two distinct entities.
They play central but distinguishable roles in the intelligent
system.
• Knowledge is a description of the world. It determines a
system’s competence by what it knows.
• Moreover, Representation is the way knowledge is encoded.
It defines a system’s performance in doing something.
• Different types of knowledge require different kinds of
representation.
• The Knowledge Representation
models/mechanisms are often based on:
– Logic
– Rules
– Frames
– Semantic Nets
• Knowledge is categorized into two major types:
1. Tacit knowledge corresponds to "informal" or "implicit" knowledge
– Exists within a human being;
– It is embodied.
– Difficult to articulate formally.
– Difficult to communicate or share.
– Moreover, Hard to steal or copy.
– Drawn from experience, action, subjective insight
2. Explicit knowledge is the formal type of knowledge
– Exists outside a human being;
– It is embedded.
– Can be articulated formally.
– Also, Can be shared, copied, processed and stored.
– So, Easy to steal or copy
– Drawn from the artifact of some type as a principle, procedure,
process, concepts
• A variety of ways of representing knowledge have
been exploited in AI programs.
• There are two different kinds of entities we are
dealing with:
– 1. Facts: truths in some relevant world; the things we want to
represent.
– 2. Representations of facts in some chosen formalism;
the things we will actually be able to manipulate.
• These entities are structured at two levels:
– 1. The knowledge level, at which facts are described.
– 2. The symbol level, at which representations of
objects are defined in terms of symbols that can be manipulated by
programs.
Framework of Knowledge Representation
• The computer requires a well-defined problem description to
process and provide a well defined acceptable solution.
• To collect fragments of knowledge we first need to
formulate a description in our spoken language and then represent
it in a formal language so that the computer can understand it.
• Also, The computer can then use an algorithm to compute an
answer.

• The steps are:
– An informal description of the problem is formulated
first.
– It is then represented formally and the computer
produces an output.
– This output can then be translated back into an informally
described solution that the user understands or checks for
consistency.
• Problem solving therefore requires,
– formal knowledge representation, and
– conversion of informal knowledge to formal
knowledge, that is, the conversion of implicit
knowledge to explicit knowledge.
Mapping between Facts and Representation

• Knowledge is a collection of facts from some domain.


• We need a representation of "facts" that can be
manipulated by a program.
• Normal English is insufficient: it is currently too hard
for a computer program to draw inferences directly from
natural language.
• Thus some symbolic representation is necessary for
mapping.
A good knowledge representation enables fast and accurate
access to knowledge and understanding of the content.
A knowledge representation system should
have following properties.
• Representational Adequacy
– The ability to represent all kinds of knowledge that are needed in that
domain.
• Inferential Adequacy
– Also, The ability to manipulate the representational structures to derive
new structures corresponding to new knowledge inferred from old.
• Inferential Efficiency
– The ability to incorporate additional information into the knowledge
structure that can be used to focus the attention of the inference
mechanisms in the most promising direction.
• Acquisitional Efficiency
– The ability to acquire new knowledge using automatic
methods wherever possible, rather than relying on human intervention.
Knowledge Representation Schemes
• Relational Knowledge
• Inheritable Knowledge
• Inferential Knowledge
• Procedural Knowledge
Relational Knowledge

• The simplest way to represent declarative facts is a set of relations of the same sort
used in the database system.
• Provides a framework to compare two objects based on equivalent attributes.
• Any instance in which two different objects are compared is a relational type of knowledge.
• The table below shows a simple way to store facts.
– Also, The facts about a set of objects are put systematically in columns.
– This representation provides little opportunity for inference
• Given the facts, it is not possible to answer a simple question such as: “Who is the
heaviest player?”
• But if a procedure for finding the heaviest player is provided, then these facts will
enable that procedure to compute an answer.
• We can ask things like who "bats left" and "throws right".
• Each fact about a set of objects is set out systematically in columns
• This representation gives little opportunity for inference, but it can
be used as the knowledge basis for inference engines.
• Simple way to store facts.
• Little opportunity for inference.
• Knowledge basis for inference engines.

• We can ask things like:


• Who is dead?
• Who plays Jazz/Trumpet etc.?
• This sort of representation is popular in database systems.
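A small Python sketch of this idea (the player names and numbers are made up for illustration): the facts sit in flat, table-like records, and answering a question such as "who is the heaviest player?" needs a separate procedure supplied alongside the data.

    # Relational (table-style) facts: each row is one object, columns are attributes.
    players = [
        {"name": "Hank", "bats": "right", "throws": "right", "weight": 95},
        {"name": "Lou",  "bats": "left",  "throws": "right", "weight": 88},
    ]

    # The table itself supports no inference; a procedure must be provided.
    heaviest = max(players, key=lambda p: p["weight"])
    left_batters = [p["name"] for p in players if p["bats"] == "left"]

    print(heaviest["name"])   # Hank
    print(left_batters)       # ['Lou']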
Inheritable Knowledge

• Here the knowledge elements inherit attributes from their parents.
• In this approach, all data must be stored in a hierarchy of
classes.
• Boxed nodes are used to represent objects and their values.
• We use Arrows that point from objects to their values.
• Rather than starting from scratch, an AI system can inherit
knowledge from other systems, allowing it to learn faster and
avoid repeating mistakes that have already been made.
Inheritable knowledge also allows for knowledge transfer across
domains, allowing an AI system to apply knowledge learned in
one domain to another
• Here inference is faster than with relational knowledge.
• The classes are organized in a generalized hierarchy.
• Boxed nodes represent objects and the values of attributes of objects.
• Arrows point from an object to its value.
• This structure is known as a slot and filler structure,
semantic network or a collection of frames.
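A sketch of property inheritance over a slot-and-filler store in Python; the frame names and slots are illustrative assumptions (loosely following the usual baseball example), not a fixed format:

    # Slot-and-filler frames: each frame has local slots plus an optional "isa"/"instance" parent link.
    frames = {
        "person":             {"legs": 2},
        "baseball-player":    {"isa": "person", "bats": "right", "height": 6.0},
        "pitcher":            {"isa": "baseball-player"},
        "three-finger-brown": {"instance": "pitcher", "height": 5.9},
    }

    def get(frame, slot):
        """Look for a slot locally, then follow instance/isa links upward (property inheritance)."""
        while frame is not None:
            f = frames[frame]
            if slot in f:
                return f[slot]
            frame = f.get("instance") or f.get("isa")
        return None

    print(get("three-finger-brown", "height"))  # 5.9   (local value overrides the inherited one)
    print(get("three-finger-brown", "bats"))    # right (inherited from baseball-player)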
Slot and Filler Structure, Semantic Network
or Collection of Frames
Inferential Knowledge

• This knowledge generates new information from the given information.


• This is the precise way of using formal logic to guarantee accurate facts and decisions.
For instance, you can use this to deduce that if “All men are mortal” and “Socrates is a
man,” then “Socrates is mortal.”.
• Key Concepts in Formal Logic for Inferential Knowledge:
• Propositional Logic (Sentential Logic):
• Deals with simple statements (propositions) that are either true or false.
• Inferences are made based on logical connectives like AND, OR, NOT, IMPLIES, etc.
• Predicate Logic (First-order Logic):
• Extends propositional logic by including predicates, variables, and quantifiers (like ∀ for
"for all" and ∃ for "there exists").
• Allows for more expressive statements that can describe relationships between objects
and properties of objects.
• Modal Logic:
• Deals with necessity and possibility, often used for reasoning about knowledge, beliefs,
time, or obligations.
• Inference Rules:
• Rules like Modus Ponens, Modus Tollens are used to derive conclusions from premises.
1. Propositional Logic Example:
Knowledge Base:
– P→Q (If it rains, then the ground will be wet)
– P (It rains)
Inference:
– From P→Q (If it rains, then the ground will be wet) and P (It rains), we can infer Q
(The ground is wet) using Modus Ponens.
Conclusion: The ground is wet.
– Rule Used: Modus Ponens (If P→Q and P are true, then Q must be true).
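The same inference, sketched in Python as a single application of Modus Ponens over a tiny knowledge base (the symbols P and Q follow the example above; the encoding is an illustrative assumption):

    # Knowledge base: known facts and known implications P -> Q.
    facts = {"P"}                      # "It rains"
    implications = [("P", "Q")]        # "If it rains, then the ground will be wet"

    def modus_ponens(facts, implications):
        """If P -> Q and P are both known, add Q."""
        derived = set(facts)
        for p, q in implications:
            if p in derived:
                derived.add(q)
        return derived

    print(modus_ponens(facts, implications))   # {'P', 'Q'}  -> the ground is wet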
2. Predicate Logic Example:
Knowledge Base:
– ∀x(Human(x)→Mortal(x)) (All humans are mortal)
Human(Socrates)
Inference:
– From the universal statement ∀x(Human(x)→Mortal(x)), we know that if something
is a human, it is mortal.
– Since Human(Socrates) is true, we can infer Mortal(Socrates) (Socrates is mortal).
Conclusion: Socrates is mortal.
– Rule Used: Universal Instantiation (From ∀x(P(x)) , we can infer P(a) for any specific
a).
Reasoning about Knowledge (Modal Logic Example):
Hypothetical Syllogism Example (Propositional Logic):
Knowledge Base:
– P→Q (If I study, I will pass the exam)
– Q→R (If I pass the exam, I will graduate)
Inference:
– By applying Hypothetical Syllogism, we can infer:
– P→R (If I study, I will graduate).
Conclusion: If I study, I will graduate.
– Rule Used: Hypothetical Syllogism (If P→Q and Q→R then P→R
Modus Tollens Example (Propositional Logic):
Knowledge Base:
– P→Q (If the car is running, then the engine is on)
– ¬Q (The engine is not on)
Inference:
– From P→Q (If the car is running, then the engine is on)
– and ¬Q (The engine is not on), we can infer ¬P (The car is not running) using Modus
Tollens.
Conclusion: The car is not running.
Rule Used: Modus Tollens (If P→Q and ¬Q, then ¬P).
• Advantages:
– A set of strict rules.
– Can be used to derive more facts.
– Truths of new statements can be verified.
– Guaranteed correctness.
• Many inference procedures are available to implement
standard rules of logic, which makes this approach popular in
AI systems, e.g. automated theorem proving.
Procedural Knowledge

• A representation in which the control information needed to use the knowledge is
embedded in the knowledge itself. Examples include computer programs,
directions, and recipes; these indicate a specific use or implementation.
• Knowledge is encoded in procedures, small programs that
know how to do specific things and how to proceed.
• Advantages:
– Heuristic or domain-specific knowledge can be represented.
– Extended logical inferences, such as default reasoning, are facilitated.
– Side effects of actions may be modelled. Some rules may become false over time;
keeping track of this in large systems may be tricky.
• Disadvantages:
– Completeness — not all cases may be representable.
– Consistency — not all deductions may be correct. e.g. If we know that Fred is a
bird we might deduce that Fred can fly; later we might discover that Fred is an
emu.
– Modularity is sacrificed. Changes in the knowledge base might have far-reaching effects.
– Cumbersome control information.
USING PREDICATE LOGIC
• Representation of Simple Facts in Logic - Propositional
logic is useful because it is simple to deal with and a
decision procedure for it exists.
• Also, In order to draw conclusions, facts are
represented in a more convenient way as,
1. Marcus is a man.
– man(Marcus)
2. Plato is a man.
– man(Plato)
3. All men are mortal.
– mortal(men)
• But propositional logic fails to capture the
relationship between an individual being a
man and that individual being a mortal.
– How can these sentences be represented so that
we can infer the third sentence from the first two?
– Also, Propositional logic commits only to the
existence of facts that may or may not be the case
in the world being represented.
– Moreover, It has a simple syntax and simple
semantics. It suffices to illustrate the process of
inference.
– Propositional logic quickly becomes impractical,
even for very small worlds.
Predicate logic
• First-order Predicate logic (FOPL) models the world in terms of
– Objects, which are things with individual identities
– Properties of objects that distinguish them from other objects
– Relations that hold among sets of objects
– Functions, which are a subset of relations where there is only one “value” for
any given “input”
• First-order Predicate logic (FOPL) provides
– Constants: a, b, dog33. Name a specific object.
– Variables: X, Y. Refer to an object without naming it.
– Functions: Mapping from objects to objects.
– Terms: Refer to objects
– Atomic Sentences: in(dad-of(X), food6) Can be true or false, Correspond to
propositional symbols P, Q.
• A well-formed formula (wff) is a sentence containing no “free” variables.
So, That is, all variables are “bound” by universal or existential quantifiers.
(∀x)P(x, y) has x bound as a universally quantified variable, but y is free.
Quantifiers
• Universal quantification
– (∀x)P(x) means that P holds for all values of x in the domain associated with that
variable (p(x) is true for all values of x in the domain)
– E.g., (∀x) dolphin(x) → mammal(x)
• Existential quantification
– (∃ x)P(x) means that P holds for some value of x in the domain associated with that
variable [p(x) is true there exists some values of x in the domain]
– E.g., (∃ x) mammal(x) ∧ lays-eggs(x)
• Also, Consider the following example that shows the use of predicate logic as a
way of representing knowledge.
– 1. Marcus was a man.
– 2. Marcus was a Pompeian.
– 3. All Pompeians were Romans.
– 4. Caesar was a ruler.
– 5. All Romans were either loyal to Caesar or hated him.
– 6. Everyone is loyal to someone.
– 7. People only try to assassinate rulers they are not loyal to.
– 8. Marcus tried to assassinate Caesar.
• The facts described by these sentences can be represented as a set of well-
formed formulas (wffs) as follows:
1. Marcus was a man.
– man(Marcus)
2. Marcus was a Pompeian.
– Pompeian(Marcus)
3. All Pompeians were Romans.
– ∀x: Pompeian(x) → Roman(x)
4. Caesar was a ruler.
– ruler(Caesar)
5. All Romans were either loyal to Caesar or hated him.
• inclusive-or ∀x: Roman(x) → loyalto(x, Caesar) ∨ hate(x, Caesar)
• exclusive-or
– ∀x: Roman(x) → (loyalto(x, Caesar) ∧¬ hate(x, Caesar)) ∨
(¬loyalto(x, Caesar) ∧ hate(x, Caesar))
6. Everyone is loyal to someone.
– ∀x: ∃y: loyalto(x, y)
7. People only try to assassinate rulers they are not loyal to.
– ∀x: ∀y: person(x) ∧ ruler(y) ∧ tryassassinate(x, y) →¬loyalto(x, y)
8. Marcus tried to assassinate Caesar.
– tryassassinate(Marcus, Caesar)
• Now suppose we want to use these statements to answer the question: Was Marcus
loyal to Caesar?
• Now let's try to produce a formal proof, reasoning backward from the desired goal: ¬
loyalto(Marcus, Caesar)
• In order to prove the goal, we need to use the rules of inference to transform it into
another goal (or possibly a set of goals) that can, in turn, be transformed, and so on, until
there are no unsatisfied goals remaining.
• The problem is that, although we know that Marcus was a man, we do not have any way
to conclude from that that Marcus was a person.
• We need to add the representation of another fact to our system, namely:
∀x: man(x) → person(x)
• Now we can satisfy the last goal and produce a proof that Marcus was not loyal to Caesar
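Written out, the backward chain uses the slides' statements 7, 4, 8 and 1 plus the added man→person rule (a reconstruction of the standard textbook proof):

    Goal:  ¬loyalto(Marcus, Caesar)
      reduces by (7) to:  person(Marcus) ∧ ruler(Caesar) ∧ tryassassinate(Marcus, Caesar)
      ruler(Caesar) is given by (4); tryassassinate(Marcus, Caesar) is given by (8)
      remaining subgoal:  person(Marcus)
      person(Marcus) follows from man(Marcus) (1) and the added rule ∀x: man(x) → person(x)
      so ¬loyalto(Marcus, Caesar) is proved.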
• Moreover, From this simple example, we see that three
important issues must be addressed in the process of
converting English sentences into logical statements and then
using those statements to deduce new ones:
1. Many English sentences are ambiguous (for example, 5, 6,
and 7 above). Choosing the correct interpretation may be
difficult.
2. Also, There is often a choice of how to represent the
knowledge. Simple representations are desirable, but they
may exclude certain kinds of reasoning.
3. Similarly, even in very simple situations, a set of sentences is
unlikely to contain all the information necessary to reason
about the topic at hand. In order to use a set of
statements effectively, it is usually necessary to
have access to another set of statements that represent facts
that people consider too obvious to mention.
Representing Instance and ISA Relationships
• Specific attributes instance and isa play an important role
particularly in a useful form of reasoning called property
inheritance.
• The predicates instance and isa explicitly capture the relationships
they are used to express, namely class membership and class inclusion.
• Figure in next slide shows the first five sentences of the last section
represented in logic in three different ways.
• The first part of the figure contains the representations we have
already discussed. In these representations, class membership is
represented with unary predicates (such as Roman), each of which
corresponds to a class.
• The second part of the figure contains representations that use the
instance predicate explicitly. Here class membership is represented
with binary predicates.
Unary predicates represent class membership directly; instance and IS-A are binary relationships.
• The predicate instance is a binary one, whose first
argument is an object and whose second argument
is a class to which the object belongs.
• But these representations do not use an explicit isa
predicate.
• Instead, subclass relationships, such as that
between Pompeians and Romans, described as
shown in sentence 3.
• The implication rule states that if an object is an
instance of the subclass Pompeian then it is an
instance of the superclass Roman.
• Note that this rule is equivalent to the standard
set-theoretic definition of the subclass-superclass
relationship.
• The third part contains representations that use
both the instance and isa predicates explicitly.
• The use of the isa predicate simplifies the
representation of sentence 3, but it requires that
one additional axiom (shown here as number 6) be
provided.
• Example 2: Unary predicate representation
• John is a doctor.
doctor(John)
• John lives in New York.
lives(John,NewYork)
• All doctors treat patients.
∀x:doctor(x)→treatsPatients(x)
• Hospitals employ only doctors or nurses.
∀x:employedBy(x,hospital)→doctor(x)∨nurse(x)
• Sarah is a nurse.
nurse(Sarah)
• Everyone who lives in New York works in some place.
∀x:lives(x,NewYork)→∃y:worksAt(x,y)
• If someone treats patients, they are considered essential workers.
∀x:treatsPatients(x)→essentialWorker(x)
• John is employed by CityHospital.
employedBy(John,CityHospital)
Instance representation
It focuses on a binary representation with an explicit instance predicate for each wff,
rather than keeping it generalized.

• Instance representation of above


• instance(John,doctor)
• instance(John,livesInNewYork)
• ∀x:instance(x,doctor)→treatsPatients(x)
• ∀x:instance(x,employedByHospital)→instance(x,doctor) ∨
instance(x,nurse)
• instance(Sarah,nurse)
• ∀x:instance(x,livesInNewYork)→∃y:worksAt(x,y)
• ∀x:treatsPatients(x)→instance(x,essentialWorker)
• employedBy(John,CityHospital)
• Is John considered an essential worker? [From statements 8, 4, 3 and 7; a simpler chain uses 1, 3 and 7, as sketched below.]
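One way to check this mechanically is a short forward-chaining sketch in Python over the instance-style facts above. Only the Horn-clause part is modelled (the doctor-or-nurse disjunction in statement 4 is beyond this simple sketch), so the chain actually used is statements 1, 3 and 7; the tuple encoding is an illustrative assumption.

    facts = {("instance", "John", "doctor"),
             ("instance", "Sarah", "nurse"),
             ("employedBy", "John", "CityHospital")}

    def step(facts):
        new = set()
        for (pred, a, b) in list(facts):
            if pred == "instance" and b == "doctor":   # statement 3: instance(x, doctor) -> treatsPatients(x)
                new.add(("treatsPatients", a, None))
            if pred == "treatsPatients":               # statement 7: treatsPatients(x) -> instance(x, essentialWorker)
                new.add(("instance", a, "essentialWorker"))
        return facts | new

    while True:
        nxt = step(facts)
        if nxt == facts:
            break
        facts = nxt

    print(("instance", "John", "essentialWorker") in facts)   # True: John is an essential worker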
IS-A representation
• John was a doctor.
instance(John, doctor)
• John lived in New York.
instance(John, livesInNewYork)
• All doctors treat patients.
is-a(doctor, treatsPatients)
• All hospital employees were either doctors or nurses.
is-a(employedByHospital, doctor) ∨ is-a(employedByHospital, nurse)
• Sarah was a nurse.
instance(Sarah, nurse)
• All people who lived in New York worked somewhere.
∀x: is-a(x, livesInNewYork) → ∃y: worksAt(x, y)
• All people who treated patients were essential workers.
is-a(treatsPatients, essentialWorker)
• John was employed by City Hospital.
employedBy(John, CityHospital)
Differences among Predicate, instance and
IS-A representation
Unary Predicate Representation
• Represents properties or categories using a single-argument predicate.
• doctor(John) (John is a doctor)
• Simple and direct for class membership
• Less expressive for relationships between entities
Binary Instance Representation
• Represents relationships between an instance and a class using a binary predicate instance(x, Class).
• instance(John, doctor)
• Clearly defines class-instance relationships
• Cannot express hierarchy between classes
IS-A Representation
• Represents hierarchical relationships between concepts (subclass relationships).
• is-a(doctor, essentialWorker) (Doctors are essential workers)
• Expresses generalization and inheritance
• For complex AI systems, IS-A representation is generally the most useful, as it allows inheritance and
structured reasoning, making it ideal for ontologies and knowledge bases.
NOTE: An ontology is a formal representation of knowledge as a structured set of concepts,
their attributes, relationships, and the rules governing these relationships within a
specific domain.
Computable Functions and Predicates
• To express simple facts, such as the following
greater-than and less-than relationships: gt(1,0)
lt(0,1) gt(2,1) lt(1,2) gt(3,2) lt(2,3)
• It is often also useful to have computable
functions as well as computable predicates. Thus
we might want to be able to evaluate the truth of
gt(2 + 3,1)
• To do so requires that we first compute the value
of the plus function given the arguments 2 and 3,
and then send the arguments 5 and 1 to gt.
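In a program this simply means letting the representation call ordinary code; a tiny Python sketch mirroring the slide's gt:

    def gt(a, b):        # computable predicate: evaluated by running code, not by table lookup
        return a > b

    print(gt(2 + 3, 1))  # the plus function is computed first (giving 5), then gt(5, 1) -> True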
Consider the following set of facts, again involving Marcus:

1) Marcus was a man. man(Marcus)


2) Marcus was a Pompeian. Pompeian(Marcus)
3) Marcus was born in 40 A.D. born(Marcus, 40)
4) All men are mortal. ∀x: man(x) → mortal(x)
5) All Pompeians died when the volcano erupted in 79 A.D.
- erupted(volcano, 79) ∧ ∀ x : [Pompeian(x) → died(x, 79)]
6) No mortal lives longer than 150 years.
- ∀x: ∀t1: ∀t2: mortal(x) ∧ born(x, t1) ∧ gt(t2 − t1, 150) → dead(x, t2)
7) It is now 2025. now = 2025
Idea: https://www.youtube.com/shorts/kanFHRBCZUQ?feature=share
• Above example shows how these ideas of computable functions and
predicates can be useful. It also makes use of the notion of equality and
allows equal objects to be substituted for each other whenever it appears
helpful to do so during a proof.
• So, Now suppose we want to answer the question “Is Marcus alive?”
– The statements suggested here, there may be two ways of deducing an answer.
– Either we can show that Marcus is dead because he was killed by the volcano or we
can show that he must be dead because he would otherwise be more than 150
years old, which we know is not possible.
– Also, As soon as we attempt to follow either of those paths rigorously, however, we
discover, just as we did in the last example, that we need some additional
knowledge. For example, our statements talk about dying, but they say nothing that
relates to being alive, which is what the question is asking.
• So we add the following facts:
8) Alive means not dead.
– ∀x: ∀t: [alive(x, t) → ¬dead(x, t)] ∧ [¬dead(x, t) → alive(x, t)]
9) If someone dies, then he is dead at all later times.
– ∀x: ∀t1: ∀t2: died(x, t1) ∧ gt(t2, t1) → dead(x, t2)
• So, Now let’s attempt to answer the question “Is Marcus alive?” by proving:
¬ alive(Marcus, now)
• Answer: either via facts 3, 6 and 7 (Marcus would now be over 150 years old) or via facts 2 and 5 (he died in the eruption of 79 A.D.), finishing with facts 8 and 9 as needed. The age-based path is written out below.
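A reconstruction of the age-based argument, using the reconstructed form of statement 6 above (statement numbers refer to the list beginning with man(Marcus)):

    Goal:  ¬alive(Marcus, now)
      by (8), it suffices to show dead(Marcus, now)
      by (6), this needs mortal(Marcus), born(Marcus, t1) and gt(now − t1, 150)
      mortal(Marcus) follows from man(Marcus) (1) and ∀x: man(x) → mortal(x) (4)
      born(Marcus, 40) is (3); now = 2025 is (7); 2025 − 40 = 1985 and gt(1985, 150) holds
      so dead(Marcus, 2025), hence ¬alive(Marcus, now): Marcus is not alive.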
Resolution
Resolution is a powerful inference rule used in propositional logic for automated theorem proving. It essentially allows
you to derive new logical statements based on existing ones. Here's a breakdown of the concept:
• The Idea Behind Resolution:
Imagine you have two statements:
• p OR q (This means either p is true or q is true, or possibly both)
• NOT p OR r (This means either p is false or r is true)
• Looking at these statements together, we can see that if p is true, then statement 2 forces r to be true. On the other
hand, if p is false, then statement 1 guarantees q to be true. Since p has to be either true or false, we can logically
conclude that q OR r must be true. This is the essence of the resolution rule.
Resolution in Action:
• Resolution works by manipulating propositions in a specific form called Conjunctive Normal Form (CNF). Any
propositional logic statement can be converted into CNF, which is a combination of clauses (OR of literals) connected
by AND. The resolution rule then identifies complementary literals (a literal and its negation) in different clauses.
These complementary literals are essentially "canceled out," resulting in a new clause that combines the remaining
information.
• By systematically applying the resolution rule to pairs of clauses from your knowledge base and the negation of the
statement you want to prove (treated as another clause), you can derive new clauses. If you end up deriving a clause
containing only one literal and its negation (always false), it indicates that the original statements lead to a
contradiction. This means the original statements cannot all be true together, making the desired statement true (by
proof by contradiction).
Key Points about Resolution:
• Resolution is sound and complete for propositional logic. This means any valid logical consequence of a set of
statements can be derived using resolution, and conversely, anything derived using resolution is a valid
consequence.
• Resolution forms the basis for automated theorem provers, which can efficiently check if a formula is satisfiable (has
a truth assignment that makes it true) or derive logical consequences.
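A compact propositional-resolution step in Python, assuming clauses are frozensets of literal strings with a leading "~" marking negation (an illustrative encoding); it derives q OR r from the p OR q and NOT p OR r example above:

    def resolve(c1, c2):
        """Return all resolvents of two clauses (sets of literals like 'p' or '~p')."""
        out = []
        for lit in c1:
            comp = lit[1:] if lit.startswith("~") else "~" + lit
            if comp in c2:
                out.append(frozenset((c1 - {lit}) | (c2 - {comp})))
        return out

    c1 = frozenset({"p", "q"})     # p OR q
    c2 = frozenset({"~p", "r"})    # NOT p OR r
    print(resolve(c1, c2))         # [frozenset({'q', 'r'})]  -> q OR r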
• Resolution is a theorem proving technique that proceeds
by building refutation proofs, i.e., proofs by
contradiction. It was invented by the mathematician John
Alan Robinson in 1965.
• Resolution is used when various statements are
given and we need to prove a conclusion from those
statements. Unification is a key concept in proofs by
resolution. Resolution is a single inference rule which can
efficiently operate on the conjunctive normal form or
clausal form.
• Clause: a disjunction of literals (atomic sentences) is
called a clause. A clause containing a single literal is
known as a unit clause.
• Conjunctive Normal Form: a sentence represented as a
conjunction of clauses is said to be in conjunctive normal
form or CNF.
Steps for Resolution:
• Conversion of facts into first-order logic.
• Convert FOL statements into CNF
• Negate the statement to be proved (proof by
contradiction).
• Draw resolution graph (unification).
Example:
• John likes all kind of food.
• Apple and vegetable are food
• Anything anyone eats and not killed is food.
• Anil eats peanuts and still alive
• Harry eats everything that Anil eats.

Prove by resolution that:


• John likes peanuts.
• Step-1: Conversion of Facts into FOL
• In the first step we convert all the given
statements into first-order logic.
• Step-2: Conversion of FOL into CNF
• In first-order logic resolution, the FOL statements must be
converted into CNF, because CNF makes resolution
proofs easier.
• Eliminate existential quantifiers (Skolemization).
In this step we eliminate the existential
quantifier ∃; this process is known
as Skolemization. In this example problem
there is no existential quantifier, so all
the statements remain the same in this step.
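The converted statements themselves appear only as figures in the original slides; one standard encoding, given here as an assumption consistent with the resolution-graph steps described in the next slide, is:

    First-order logic:
      ∀x: food(x) → likes(John, x)
      food(Apple) ∧ food(vegetables)
      ∀x ∀y: eats(x, y) ∧ ¬killed(x) → food(y)
      eats(Anil, Peanuts) ∧ alive(Anil)
      ∀x: eats(Anil, x) → eats(Harry, x)
      ∀x: ¬killed(x) → alive(x)   and   ∀x: alive(x) → ¬killed(x)   (implicit background facts)
      Goal: likes(John, Peanuts)

    Clause (CNF) form, after eliminating → and dropping the universal quantifiers:
      ¬food(x) ∨ likes(John, x)
      food(Apple)                food(vegetables)
      ¬eats(y, z) ∨ killed(y) ∨ food(z)
      eats(Anil, Peanuts)        alive(Anil)
      ¬eats(Anil, w) ∨ eats(Harry, w)
      killed(k) ∨ alive(k)       ¬alive(k) ∨ ¬killed(k)
      Negated goal: ¬likes(John, Peanuts)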
Explanation of Resolution graph:
• In the first step of resolution graph, ¬likes(John, Peanuts) ,
and likes(John, x) get resolved(canceled) by substitution
of {Peanuts/x}, and we are left with ¬ food(Peanuts)
• In the second step of the resolution graph, ¬ food(Peanuts) ,
and food(z) get resolved (canceled) by substitution of { Peanuts/z},
and we are left with ¬ eats(y, Peanuts) V killed(y) .
• In the third step of the resolution graph, ¬ eats(y, Peanuts) and eats
(Anil, Peanuts) get resolved by substitution {Anil/y}, and we are left
with Killed(Anil) .
• In the fourth step of the resolution graph, killed(Anil) and ¬
killed(k) get resolved by substitution {Anil/k}, and we are left with ¬
alive(Anil).
• In the last step of the resolution graph ¬alive(Anil) and alive(Anil) get
resolved.
Resolution: Propositional Resolution
• 1. Convert all the propositions of F to clause form.
• 2. Negate P and convert the result to clause form. Add it to
the set of clauses obtained in step 1.
• 3. Repeat until either a contradiction is found or no
progress can be made:
– 1. Select two clauses. Call these the parent clauses.
– 2. Resolve them together. The resulting clause, called the
resolvent, will be the disjunction of all of the literals of both of
the parent clauses with the following exception: If there are any
pairs of literals L and ¬ L such that one of the parent clauses
contains L and the other contains ¬L, then select one such pair
and eliminate both L and ¬ L from the resolvent.
– 3. If the resolvent is the empty clause, then a contradiction has
been found. If it is not, then add it to the set of clauses available
to the procedure.
The Unification Algorithm
• In propositional logic, it is easy to determine that two literals
cannot both be true at the same time.
• Simply look for L and ¬L. In predicate logic, this matching
process is more complicated, since the arguments of the
predicates must be considered.
• For example, man(John) and ¬man(John) is a contradiction,
while man(John) and ¬man(Spot) is not.
• Thus, in order to determine contradictions, we need a
matching procedure that compares two literals and discovers
whether there exists a set of substitutions that makes them
identical.
• There is a straightforward recursive procedure, called the
unification algorithm, that does it.
Unification Examples
1. UNIFY (Knows(John,x),Knows(John,Jane))
• Predicates are same here (Knows)
• Θ={x/Jane} this is MGU( Most General unifier)
2. UNIFY (Knows(John,x),Knows(y,Bill))
Θ={x/Bill, y/John} –MGU
3. UNIFY (Knows(John,x),Knows(y,Mother(y)))
Θ={ y/John,x/Mother(John)} –MGU
4. UNIFY (Knows(John,x), Knows(x,Jane))
Θ = FAIL (the two sentences must not share a variable); rename x to y in the second
sentence, giving Knows(John,x), Knows(y,Jane).
This renaming is called "standardizing apart".
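A minimal unification sketch in Python, assuming terms are either strings (a lower-case first letter marks a variable, anything else is a constant) or tuples for compound terms such as Knows(John, x); the occurs check is omitted for brevity:

    def is_var(t):
        return isinstance(t, str) and t[0].islower()

    def substitute(t, s):
        """Apply substitution s to term t, following chained bindings."""
        if is_var(t):
            return substitute(s[t], s) if t in s else t
        if isinstance(t, tuple):
            return tuple(substitute(a, s) for a in t)
        return t

    def unify(a, b, s=None):
        """Return the most general unifier of two terms, or None on failure."""
        s = dict(s or {})
        a, b = substitute(a, s), substitute(b, s)
        if a == b:
            return s
        if is_var(a):
            s[a] = b
            return s
        if is_var(b):
            s[b] = a
            return s
        if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
            for x, y in zip(a, b):
                s = unify(x, y, s)
                if s is None:
                    return None
            return s
        return None

    print(unify(('Knows', 'John', 'x'), ('Knows', 'John', 'Jane')))        # {'x': 'Jane'}
    print(unify(('Knows', 'John', 'x'), ('Knows', 'y', ('Mother', 'y'))))  # {'y': 'John', 'x': ('Mother', 'John')}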
Resolution in Predicate Logic
• We can now state the resolution algorithm for
predicate logic as follows, assuming a set of
given statements F and a statement to be
proved P:
Resolution Procedure
• Resolution is a procedure, which gains its efficiency from the fact that it operates on statements that
have been converted to a very convenient standard form.
• Resolution produces proofs by refutation.
• In other words, to prove a statement (i.e., to show that it is valid), resolution attempts to show that the
negation of the statement produces a contradiction with the known statements (i.e., that it is
unsatisfiable).
• The resolution procedure is a simple iterative process: at each step, two clauses, called the parent
clauses, are compared (resolved), resulting in a new clause that is inferred from them. The new clause
represents ways that the two parent clauses interact with each other.
• Suppose that there are two clauses in the system:
– winter V summer
– ¬ winter V cold
• Now we observe that precisely one of winter and ¬ winter will be true at any point.
• If winter is true, then cold must be true to guarantee the truth of the second clause. If ¬ winter is true,
then summer must be true to guarantee the truth of the first clause.
• Thus we see that from these two clauses we can deduce summer V cold
• This is the deduction that the resolution procedure will make.
• Resolution operates by taking two clauses that each contain the same literal, in this example, winter.
• The literal must occur in positive form in one clause and in negative form in the other.
The resolvent is obtained by combining all of the literals of the two parent clauses except the ones that
cancel.
• If the clause that is produced is the empty clause, then a contradiction has been found. For example, the two
clauses winter and ¬winter will produce the empty clause.
Natural Deduction Using Rules
• Natural deduction uses a formal language with symbols representing
propositions (statements) and logical connectives (AND, OR, NOT, etc.).
• A set of inference rules dictates how to manipulate these symbols to
arrive at valid conclusions based on given premises.
• Testing whether a proposition is a tautology by testing every possible
truth assignment is expensive—there are exponentially many.
• We need a deductive system, which will allow us to construct proofs of
tautologies in a step-by-step fashion.
• This system is known as natural deduction.
• The system consists of a set of rules of inference for deriving
consequences from premises.
• One builds a proof tree whose root is the proposition to be proved and
whose leaves are the initial assumptions or axioms (for proof trees, we
usually draw the root at the bottom and the leaves at the top)
• For example, one rule of our system is known as
modus ponens. Intuitively, this says that if we know
P is true, and we know that P implies Q, then we can
conclude Q.
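The rule itself, written in the usual premises-over-conclusion form (shown here because the slide's figure is not reproduced in this text):

    P        P ⇒ Q
    ----------------   (⇒-elimination, modus ponens)
           Q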

• The propositions above the line are called premises;
the proposition below the line is the conclusion.
• Both the premises and the conclusion may contain
metavariables (in this case, P and Q) representing
arbitrary propositions.
• Most rules come in one of two flavors:
introduction or elimination rules.
• Introduction rules introduce the use of a
logical operator, and elimination rules
eliminate it.
• Modus ponens is an elimination rule for ⇒.
• On the right-hand side of a rule, we often
write the name of the rule. This is helpful
when reading proofs.
Rules
Proofs
• A proof of proposition P in natural deduction
starts from axioms and assumptions and
derives P with all assumptions discharged.
• Every step in the proof is an instance of an
inference rule with metavariables substituted
consistently with expressions of the
appropriate syntactic class.
Example
Forward and Backward Reasoning
• In AI planning, forward chaining and backward chaining are strategies used in
rule-based systems.
• Forward Representation
• Definition: Starting from the initial state and progressing toward the goal by
applying actions or rules step-by-step.
• Ex: Autonomous Vehicle Navigation
A self-driving car uses forward representation to plan routes. It begins with real-
time sensor data (current location, obstacles, traffic) and incrementally
simulates possible paths forward to reach the destination.
• Each decision (e.g., turning left, accelerating) is based on the immediate state,
updating dynamically as new data arrives.
• Recommendation Systems: Analyzing user behavior (initial data) to predict and
suggest products (goal).
• Manufacturing Robots: Assembling parts sequentially from raw materials to a
finished product.
• Backward Reasoning
– Starting from the goal state and working backward to determine the necessary
steps or conditions to achieve it.
• Ex: Medical Diagnosis Systems (expert systems)
An AI diagnostic tool (e.g., IBM Watson Health) uses backward
representation by hypothesizing a disease (goal) and validating it against
symptoms, lab results, and patient history.
• Ex: if the system considers "pneumonia" as a possible diagnosis, it checks
backward for supporting evidence like fever, cough, or X-ray findings
• Ex: Chess Engines
Advanced chess AIs use retrograde analysis for endgame scenarios. They
precompute optimal moves by starting from checkmate positions (goal)
and working backward to evaluate current move sequences.
• This backward approach ensures efficient calculation of winning strategies.
• Theorem Provers: Breaking down complex theorems into sub-goals and
axioms.
• Customer Support Chatbots: Identifying a user’s intent (goal) and
backtracking to gather required information.
• E.g.: facts x = 1, y = 2
• Rule 1: if x == 1 and y == 3
– then z = 3
• Rule 2: if z == 3
– then class is over
• Forward reasoning starts from the known x and y and fires whatever rules match
(here none, since y is 2, not 3); backward reasoning starts from the goal "class is
over" and works back through z == 3 to the x/y condition (see the sketch below).
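A sketch of both directions on this toy rule base in Python (the string encoding of conditions is an illustrative assumption):

    facts = {"x == 1", "y == 2"}
    rules = [({"x == 1", "y == 3"}, "z == 3"),
             ({"z == 3"}, "class is over")]

    def forward(facts, rules):
        """Data-driven: fire any rule whose conditions all hold, until nothing changes."""
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for cond, concl in rules:
                if cond <= facts and concl not in facts:
                    facts.add(concl)
                    changed = True
        return facts

    def backward(goal, facts, rules):
        """Goal-driven: a goal holds if it is a known fact, or some rule concludes it
        and all of that rule's conditions can in turn be established."""
        if goal in facts:
            return True
        return any(concl == goal and all(backward(c, facts, rules) for c in cond)
                   for cond, concl in rules)

    print(forward(facts, rules))                       # {'x == 1', 'y == 2'}: y == 3 never holds, so nothing fires
    print(backward("class is over", facts, rules))     # False, for the same reason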
• There are four factors influencing the direction of reasoning. They
are:
1. Are there more possible start states or goal states? We
move from the smaller set of states toward the larger set.
2. In what direction is the branching factor greater?
We proceed in the direction with the lower
branching factor.
3. Will the program be asked to justify its reasoning
process to a user? If so, reason in the direction that
corresponds more closely to the way the user thinks.
4. What kind of event is going to trigger a problem-
solving episode? If it is the arrival of a new fact,
forward reasoning makes sense. If it is a query to which
a response is desired, backward reasoning is more
natural.
Examples for Forward Reasoning:
• Autonomous Navigation:
• Spam Filtering:
• Recommendation Systems:
• Fraud Detection:
• Scientific Discovery:
Examples for Backward Reasoning:
• Medical Diagnosis:
• Game Playing (Chess, Go):
• Theorem Proving:
• Robot Path Planning:
• Natural Language Question Answering:
THANK YOU
