
UNIT – II

Knowledge representation
Knowledge Representation in AI refers to the way in which artificial intelligence systems store,
organize, and utilize knowledge to solve complex problems. It is a crucial aspect of AI, enabling
machines to mimic human understanding and reasoning. Knowledge representation involves
the creation of data structures and models that can efficiently capture information about the
world, making it accessible and usable by AI algorithms for decision-making, inference, and
learning.

Relationship between Knowledge and Intelligence


 Knowledge as a Foundation: Knowledge provides the necessary information, facts, and
skills that intelligence uses to solve problems and make decisions.
 Intelligence as Application: Intelligence is the ability to learn, reason, and adapt, using
knowledge to perform tasks and solve complex problems.
 Interdependence: Knowledge without intelligence is static, while intelligence without
knowledge lacks the raw material to function effectively.
 Synergy: Effective AI systems require a balance of both knowledge (the “what”) and
intelligence (the “how”) to operate successfully.

The Different Kinds of Knowledge: What to Represent

 Object: The AI needs to know all the facts about the objects in our world domain. E.g., A
keyboard has keys, a guitar has strings, etc.
 Events: The actions which occur in our world are called events.
 Performance: It describes a behavior involving knowledge about how to do things.
 Meta-knowledge: The knowledge about what we know is called meta-knowledge.
 Facts: The things in the real world that are known and proven true.
 Knowledge Base: A knowledge base in artificial intelligence aims to capture human
expert knowledge to support decision-making, problem-solving, and more.

Cycle of Knowledge Representation in Artificial Intelligence


The AI Knowledge Cycle is an ongoing process where AI systems continually acquire, process,
utilize, and refine knowledge to enhance performance.
It consists of these key stages:
1. Knowledge Acquisition: Gathering data and information from various sources, including
databases, sensors, and human input.
2. Knowledge Representation: Organizing and structuring this knowledge using techniques
like ontologies and semantic networks for effective processing.
3. Knowledge Utilization: Applying the structured knowledge to perform tasks, make
decisions, and solve problems through reasoning and inference.
4. Knowledge Learning: Continuously updating the knowledge base by learning from new
data and outcomes using machine learning algorithms.
5. Knowledge Validation and Verification: Ensuring the accuracy, consistency, and reliability
of the knowledge through validation against real-world outcomes.
6. Knowledge Maintenance: Regularly updating the knowledge base to stay relevant and
accurate as the environment or information changes.
7. Knowledge Sharing: Distributing the knowledge to other systems or users, making it
accessible and usable beyond the original AI system.
This cycle repeats itself, with each stage feeding into the next, allowing AI systems to
continually improve and adapt.

Types of Knowledge in AI
1. Declarative Knowledge
 Declarative knowledge refers to facts and information that describe the world, answering
the “what” type of questions.
 Example: Knowing that Paris is the capital of France.
 This knowledge is often stored in databases or knowledge bases and expressed in logical
statements, forming the foundation for more complex reasoning and problem-solving in AI
systems.
2. Procedural Knowledge
 Procedural knowledge is the knowledge of how to perform tasks or processes, answering
the “how” type of questions.
 Example: Steps to solve a mathematical problem or the procedure to start a car.
 This knowledge is embedded in algorithms or control structures, enabling AI systems to
execute tasks, perform actions, and solve problems step-by-step.
3. Meta-Knowledge
 Meta-knowledge is knowledge about knowledge, understanding which types of knowledge
to apply in different situations.
 Example: Knowing when to use a specific algorithm based on the problem at hand.
 Crucial for systems that need to adapt or optimize their performance, meta-knowledge
helps in selecting the most appropriate strategy or knowledge base for a given problem.

4. Heuristic Knowledge
 Heuristic knowledge includes rules of thumb, educated guesses, and intuitive judgments
derived from experience.
 Example: Using an educated guess to approximate a solution when time is limited.
 Often used in problem-solving and decision-making processes where exact solutions are
not feasible, helping AI systems to arrive at good-enough solutions quickly.
5. Structural Knowledge
 Structural knowledge refers to the understanding of how different pieces of knowledge are
organized and related to each other.
 Example: Understanding the hierarchy of concepts in a taxonomy or the relationships
between different entities in a semantic network.
 This knowledge is essential for organizing information within AI systems, allowing for
efficient retrieval, reasoning, and inferencing based on the relationships and structures
defined.
Approaches to Knowledge Representation in AI
There are mainly four approaches to knowledge representation, which are given below:

1. Simple relational knowledge:

 It is the simplest way of storing facts: it uses the relational method, and each fact about a set of objects is set out systematically in columns.
 This approach to knowledge representation is common in database systems, where the relationships between different entities are represented.
 This approach offers little opportunity for inference.
Example: The facts can be laid out as a small relational table, as sketched below.
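A minimal sketch of this idea in Python, using hypothetical player facts stored as relational tuples (the names and numbers are illustrative only):

# Hypothetical relational facts: each row sets out the attributes of one object.
players = [
    # (name, weight_kg, age)
    ("Player1", 65, 23),
    ("Player2", 58, 18),
    ("Player3", 75, 24),
]

# Simple lookups and filters are easy, but the representation supports
# little inference beyond selecting rows.
heavier_than_60 = [name for name, weight, age in players if weight > 60]
print(heavier_than_60)   # ['Player1', 'Player3']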

2. Inheritable knowledge:

 In the inheritable knowledge approach, all data is stored in a hierarchy of classes.
 Classes are arranged in a generalized, hierarchical manner.
 In this approach, we apply the inheritance property: elements inherit values from other members of their class.
 This approach contains inheritable knowledge that shows the relation between an instance and its class, called the instance relation.
 Every individual frame can represent a collection of attributes and their values.
 In this approach, objects and values are represented in boxed nodes, and arrows point from objects to their values.
Example:
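A minimal Python sketch of the idea, assuming a hypothetical class hierarchy in which instances inherit attribute values from their classes:

# Hypothetical hierarchy: CricketPlayer -> AdultMale -> Person.
class Person:
    is_person = True

class AdultMale(Person):
    handed = "right"          # default value inherited by members of this class

class CricketPlayer(AdultMale):
    plays = "cricket"

jadeja = CricketPlayer()      # an instance linked to its class (instance relation)
jadeja.handed = "left"        # an instance may override an inherited value

print(jadeja.is_person)       # True    -- inherited from Person
print(jadeja.plays)           # cricket -- inherited from CricketPlayer
print(jadeja.handed)          # left    -- overridden at the instance level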
3. Inferential knowledge:

 The inferential knowledge approach represents knowledge in the form of formal logic.
 This approach can be used to derive more facts.
 It guarantees correctness.
Example: Suppose there are two statements:

 Marcus is a man.
 All men are mortal.
Then they can be represented as:

man(Marcus)
∀x man(x) → mortal(x)

Applying universal instantiation and modus ponens to these two sentences lets the system derive the new fact mortal(Marcus).

4. Procedural knowledge:

 The procedural knowledge approach uses small programs and code that describe how to do specific things and how to proceed.
 In this approach, one important rule is used: the If-Then rule.
 This knowledge can be expressed in various programming languages, such as LISP and Prolog.
 Heuristic or domain-specific knowledge can be represented easily using this approach.
 However, not all cases can be represented in this approach.

Requirements for knowledge Representation system:


A good knowledge representation system must possess the following properties.

1. Representational Accuracy:
   The KR system should be able to represent all kinds of required knowledge.
2. Inferential Adequacy:
   The KR system should be able to manipulate the representational structures to produce new knowledge corresponding to the existing structures.
3. Inferential Efficiency:
   The ability to direct the inference mechanism in the most productive directions by storing appropriate guides.
4. Acquisitional Efficiency:
   The ability to acquire new knowledge easily using automatic methods.
Key Techniques in Knowledge Representation
1. First-Order Logic (FOL)
First-Order Logic is a formal system used in mathematics, philosophy, and computer science to
represent and reason about propositions involving objects, their properties, and their
relationships. Unlike propositional logic, FOL allows the use of quantifiers (like “forall” and
“exists”) to express more complex statements.
FOL is widely used in AI for knowledge representation and reasoning because it allows for
expressing general rules and facts about the world. For example, FOL can be used to represent
statements like “All humans are mortal” and “Socrates is a human,” enabling AI systems to infer
that “Socrates is mortal.” It provides a powerful and flexible framework for representing
structured knowledge and supports various forms of logical reasoning.
2. Fuzzy Logic
Fuzzy Logic is an approach to knowledge representation that deals with reasoning that is
approximate rather than exact. It allows for the representation of concepts that are not black
and white, but rather fall along a continuum, with degrees of truth ranging from 0 to 1.
Fuzzy Logic is particularly useful in domains where precise information is unavailable or
impractical, such as control systems, decision-making, and natural language processing. For
example, in a climate control system, fuzzy logic can be used to represent concepts like
“warm,” “hot,” or “cold,” and make decisions based on the degree to which these conditions
are met, rather than relying on strict numerical thresholds.
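As a minimal sketch of the climate-control idea, assume a hypothetical triangular membership function for the fuzzy set "warm"; the temperature break-points below are illustrative only:

def warm_membership(temp_c):
    # Degree (in [0, 1]) to which temp_c counts as "warm":
    # 0 below 15 C, rising to 1 at 25 C, falling back to 0 at 35 C.
    if temp_c <= 15 or temp_c >= 35:
        return 0.0
    if temp_c <= 25:
        return (temp_c - 15) / 10.0
    return (35 - temp_c) / 10.0

for t in (10, 20, 25, 30):
    print(t, warm_membership(t))    # 0.0, 0.5, 1.0, 0.5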

3. Description Logics
Description Logics are a family of formal knowledge representation languages used to describe
and reason about the concepts and relationships within a domain. They are more expressive
than propositional logic but less complex than full first-order logic, making them well-suited for
representing structured knowledge.
Description Logics form the foundation of ontologies used in the Semantic Web and are key to
building knowledge-based systems that require classification, consistency checking, and
inferencing. For example, they can be used to define and categorize different types of products
in an e-commerce system, allowing for automated reasoning about product features,
relationships, and hierarchies.

4. Semantic Web Technologies


Semantic Web Technologies refer to a set of standards and tools designed to enable machines
to understand and interpret data on the web in a meaningful way. Key technologies include
Resource Description Framework (RDF), Web Ontology Language (OWL), and SPARQL, which are
used to represent, query, and reason about knowledge on the web.
These technologies are essential for building intelligent applications that can access, share, and
integrate data across different domains and systems. For example, Semantic Web Technologies
are used in search engines, recommendation systems, and data integration platforms to
provide more relevant and accurate results by understanding the context and meaning of the
data. They enable AI systems to perform tasks like semantic search, data linking, and
automated reasoning over distributed knowledge bases.

Challenges in Knowledge Representation


While knowledge representation is fundamental to AI, it comes with several challenges:
1. Complexity: Representing all possible knowledge about a domain can be highly complex,
requiring sophisticated methods to manage and process this information efficiently.
2. Ambiguity and Vagueness: Human language and concepts are often ambiguous or vague,
making it difficult to create precise representations.
3. Scalability: As the amount of knowledge grows, AI systems must scale accordingly, which
can be challenging both in terms of storage and processing power.
4. Knowledge Acquisition: Gathering and encoding knowledge into a machine-readable
format is a significant hurdle, particularly in dynamic or specialized domains.
5. Reasoning and Inference: AI systems must not only store knowledge but also use it to infer
new information, make decisions, and solve problems. This requires sophisticated
reasoning algorithms that can operate efficiently over large knowledge bases.

Applications of Knowledge Representation in AI


Knowledge representation is applied across various domains in AI, enabling systems to perform
tasks that require human-like understanding and reasoning. Some notable applications include:
1. Expert Systems: These systems use knowledge representation to provide advice or make
decisions in specific domains, such as medical diagnosis or financial planning.
2. Natural Language Processing (NLP): Knowledge representation is used to understand and
generate human language, enabling applications like chatbots, translation systems, and
sentiment analysis.
3. Robotics: Robots use knowledge representation to navigate, interact with environments,
and perform tasks autonomously.
4. Semantic Web: The Semantic Web relies on ontologies and other knowledge
representation techniques to enable machines to understand and process web content
meaningfully.
5. Cognitive Computing: Systems like IBM’s Watson use knowledge representation to process
vast amounts of information, reason about it, and provide insights in fields like healthcare
and research.
Propositional Logic in Artificial Intelligence


Propositional logic (PL) is the simplest form of logic, where all statements are made up of propositions. A proposition is a declarative statement which is either true or false. It is a technique of representing knowledge in logical and mathematical form.

Example:
a) It is Sunday.
b) The Sun rises from the West. (False proposition)
c) 3 + 3 = 7 (False proposition)
d) 5 is a prime number.
Following are some basic facts about propositional logic:

o Propositional logic is also called Boolean logic, as it works on 0 and 1.
o In propositional logic, we use symbolic variables to represent the logic, and we can use any symbol for representing a proposition, such as A, B, C, P, Q, R, etc.
o Propositions can be either true or false, but not both.
o Propositional logic consists of propositions and logical connectives.
o These connectives are also called logical operators.
o The propositions and connectives are the basic elements of propositional logic.
o A connective is a logical operator which connects two sentences.
o A proposition formula which is always true is called a tautology; it is also called a valid sentence.
o A proposition formula which is always false is called a contradiction.
o A proposition formula which can take both true and false values is called a contingency.
o Statements which are questions, commands, or opinions, such as "Where is Rohini?", "How are you?", and "What is your name?", are not propositions.

Syntax of propositional logic:


The syntax of propositional logic defines the allowable sentences for the
knowledge representation. There are two types of Propositions:

1. Atomic Propositions
2. Compound propositions

o Atomic Proposition: Atomic propositions are simple propositions. An atomic proposition consists of a single proposition symbol. These are sentences which must be either true or false.
Example:

a) "2 + 2 is 4" is an atomic proposition, as it is a true fact.
b) "The Sun is cold" is also an atomic proposition, as it is a false fact.

o Compound proposition: Compound propositions are constructed by combining simpler or atomic propositions, using parentheses and logical connectives.
Example:

a) "It is raining today, and the street is wet."
b) "Ankit is a doctor, and his clinic is in Mumbai."

Logical Connectives:
Logical connectives are used to connect two simpler propositions or
representing a sentence logically. We can create compound propositions with
the help of logical connectives. There are mainly five connectives, which are
given as follows:
1. Negation: A sentence such as ¬P is called the negation of P. A literal can be either a positive literal or a negative literal.
2. Conjunction: A sentence which has the ∧ connective, such as P ∧ Q, is called a conjunction.
Example: "Rohan is intelligent and hardworking." With P = Rohan is intelligent and Q = Rohan is hardworking, it can be written as P ∧ Q.
3. Disjunction: A sentence which has the ∨ connective, such as P ∨ Q, is called a disjunction, where P and Q are propositions.
Example: "Ritika is a doctor or an engineer."
Here P = Ritika is a doctor and Q = Ritika is an engineer, so we can write it as P ∨ Q.
4. Implication: A sentence such as P → Q is called an implication. Implications are also known as if-then rules. For example:
"If it is raining, then the street is wet."
Let P = It is raining and Q = The street is wet, so it is represented as P → Q.
5. Biconditional: A sentence such as P ⇔ Q is a biconditional sentence.
Example: "I am breathing if and only if I am alive."
P = I am breathing, Q = I am alive; it can be represented as P ⇔ Q.

The propositional logic connectives can be summarized in a table listing, for each connective, its symbol, an example sentence, and its meaning.

Truth Table:
In propositional logic, we need to know the truth values of propositions in all possible scenarios. We can combine all the possible combinations with logical connectives, and the representation of these combinations in a tabular format is called a truth table. A truth table can be written for each of the logical connectives.
Truth table with three propositions:
We can build a compound proposition from three propositions P, Q, and R. Its truth table is made up of 8 rows, since three proposition symbols give 2³ combinations.
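A small sketch of how such an enumeration can be done programmatically; the compound sentence (P ∧ Q) → R is used purely as an illustration:

from itertools import product

# Enumerate all 2^3 = 8 truth assignments for P, Q, R and evaluate
# the illustrative compound sentence (P AND Q) -> R for each row.
print("P      Q      R      (P ∧ Q) → R")
for P, Q, R in product([True, False], repeat=3):
    value = (not (P and Q)) or R        # material implication
    print(P, Q, R, value)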

Precedence of connectives:
Just like arithmetic operators, there is a precedence order for propositional
connectors or logical operators. This order should be followed while
evaluating a propositional problem. Following is the list of the precedence
order for operators:

Precedence          Operators
First precedence    Parentheses
Second precedence   Negation
Third precedence    Conjunction (AND)
Fourth precedence   Disjunction (OR)
Fifth precedence    Implication
Sixth precedence    Biconditional

Note: For better understanding use parenthesis to make sure of the correct
interpretations. Such as ¬R∨ Q, It can be interpreted as (¬R) ∨ Q.

Logical equivalence:
Logical equivalence is one of the features of propositional logic. Two
propositions are said to be logically equivalent if and only if the columns in the
truth table are identical to each other.

Let's take two propositions A and B. A is logically equivalent to B if and only if the truth-table columns for A and B are identical, i.e. A ⇔ B is a tautology. For example, the columns for ¬A ∨ B and A → B are identical, hence ¬A ∨ B is equivalent to A → B (a quick mechanical check follows).
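A minimal Python sketch of that check, comparing the two truth-table columns:

from itertools import product

def implies(a, b):
    return b if a else True                        # truth table of A → B

rows = list(product([True, False], repeat=2))      # the four assignments of A, B
col_disjunction = [(not a) or b for a, b in rows]  # column for ¬A ∨ B
col_implication = [implies(a, b) for a, b in rows] # column for A → B
print(col_disjunction == col_implication)          # True: the columns are identical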

Properties of Operators:
o Commutativity:
o P∧ Q= Q ∧ P, or
o P ∨ Q = Q ∨ P.
o Associativity:
o (P ∧ Q) ∧ R= P ∧ (Q ∧ R),
o (P ∨ Q) ∨ R= P ∨ (Q ∨ R)
o Identity element:
o P ∧ True = P,
o P ∨ True= True.
o Distributive:
o P∧ (Q ∨ R) = (P ∧ Q) ∨ (P ∧ R).
o P ∨ (Q ∧ R) = (P ∨ Q) ∧ (P ∨ R).
o DE Morgan's Law:
o ¬ (P ∧ Q) = (¬P) ∨ (¬Q)
o ¬ (P ∨ Q) = (¬ P) ∧ (¬Q).
o Double-negation elimination:
o ¬ (¬P) = P.

Limitations of Propositional logic:


o We cannot represent relations like ALL, some, or none with
propositional logic. Example:
o All the girls are intelligent.
o Some apples are sweet.
o Propositional logic has limited expressive power.
o In propositional logic, we cannot describe statements in terms of
their properties or logical relationships.
Knowledge Representation in First-Order
Logic
First-order logic (FOL), also known as predicate logic, is a
powerful formalism used for knowledge representation
in artificial intelligence and computer science. It extends
propositional logic by allowing the use of quantifiers and
predicates, enabling the representation of complex statements
about objects and their relationships. Here are the key
components and concepts of knowledge representation in first-
order logic:
Key Components of First-Order Logic
1. Constants:
 Definition: Constants are symbols that represent
specific objects in the domain.
 Examples: If a, b, and c are constants, they might
represent specific individuals like Alice, Bob, and
Charlie.
2. Variables:
 Definition: Variables are symbols that can represent
any object in the domain.
 Examples: Variables such as x, y, and z can represent
any object in the domain.
3. Predicates:
 Definition: Predicates represent properties of objects or
relationships between objects.
 Examples: P(x) could mean “x is a person”, while Q(x,
y) could mean “x is friends with y”.
4. Functions:
 Definition: Functions map objects to other objects.
 Examples: f(x) could represent a function that maps an
object x to another object, like “the father of x”.
5. Quantifiers:
 Universal Quantifier (∀): Indicates that a statement
applies to all objects in the domain. For example, ∀x
P(x) means “P(x) is true for all x”.
 Existential Quantifier (∃): Indicates that there exists at
least one object in the domain for which the statement is
true. For example, ∃x P(x) means “There exists an x
such that P(x) is true”.
6. Logical Connectives:
 Definition: These

include ∧ (and), ∨ (or), ¬ (not), → (implies), and ↔ (if and


only if).
 Examples: P(x) ∧ Q(x, y) means “P(x) and Q(x, y) are
both true”.
7. Equality:
 Definition: States that two objects are the same.
 Examples: x = y asserts that x and y refer to the same
object.
Syntax of First-Order Logic
The syntax of FOL defines the rules for constructing well-
formed formulas:
 Atomic Formulas: The simplest formulas, which can be
predicates applied to terms (e.g., P(a), Q(x, y)).
 Complex Formulas: Formed by combining atomic formulas using logical connectives and quantifiers (e.g., ∀x (P(x) ∨ ¬Q(x, f(y)))).
Semantics of First-Order Logic
The semantics define the meaning of FOL statements:
 Domain: A non-empty set of objects over which the
variables range.
 Interpretation: Assigns meanings to the constants,
functions, and predicates, specifying which objects the
constants refer to, which function the function symbols
denote, and which relations the predicate symbols denote.
 Truth Assignment: Determines the truth value of each
formula based on the interpretation.
Examples of Knowledge Representation in FOL
1. Facts: Simple statements about objects.
 P(a) (Object a has property P).
 Q(a, b) (Objects a and b are related by Q ).
2. Rules: Implications that describe general relationships.
 ∀x (P(x) → Q(x)) (If x has property P , then x also has
property Q).
3. Existential Statements: Indicate the existence of objects
with certain properties.
 ∃x P(x) (There exists an x such that P(x) is true).
4. Universal Statements: Apply to all objects in the domain.
 ∀x (P(x) ∨ ¬Q(x)) (For all x, either P(x) is true or Q(x) is not true).
Example Knowledge Base in FOL
Consider a knowledge base representing a simple family
relationship:
1. Constants:
 John, Mary
2. Predicates:
 Parent(x, y): x is a parent of y.
 Male(x): x is male.
 Female(x): x is female.
3. Statements:
 Parent(John, Mary)
 Male(John)
 Female(Mary)
 ∀x ∀y (Parent(x, y) → ¬(x = y)) (No one is their own parent).
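A minimal Python sketch of this knowledge base, storing the ground facts as tuples and checking the "no one is their own parent" rule over the facts listed above:

# Ground facts from the example knowledge base.
parent = {("John", "Mary")}        # Parent(John, Mary)
male = {"John"}                    # Male(John)
female = {"Mary"}                  # Female(Mary)

# ∀x ∀y (Parent(x, y) → ¬(x = y)): no one is their own parent.
rule_holds = all(x != y for (x, y) in parent)
print(rule_holds)                  # True for this knowledge base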
Applications of First-Order Logic in
Knowledge Representation
1. Expert Systems: FOL is used to represent expert
knowledge in various domains such as medicine, finance,
and engineering, enabling systems to reason and make
decisions based on logical rules.
2. Natural Language Processing: FOL provides a formal
framework for representing the meaning of natural
language sentences, facilitating semantic analysis and
understanding in NLP tasks.
3. Semantic Web: FOL is foundational to ontologies and
knowledge graphs on the Semantic Web, enabling precise
and machine-interpretable representations of knowledge.
4. Robotics: FOL is employed in robotic systems to represent
spatial relationships, object properties, and task constraints,
aiding in robot planning, navigation, and manipulation.
5. Database Systems: FOL-based query languages such as
SQL enable expressive querying and manipulation of
relational databases, allowing for complex data retrieval
and manipulation.
Challenges & Limitations of First-Order
Logic in Knowledge Representation
Challenges of First-Order Logic in Knowledge
Representation
1. Complexity: Representing certain real-world domains
accurately in FOL can lead to complex and unwieldy
formulas, making reasoning and inference computationally
expensive.
2. Expressiveness Limitations: FOL has limitations in
representing uncertainty, vagueness, and probabilistic
relationships, which are common in many AI applications.
3. Knowledge Acquisition: Encoding knowledge into FOL
requires expertise and manual effort, making it challenging
to scale and maintain large knowledge bases.
4. Inference Scalability: Reasoning in FOL can be
computationally intensive, especially in large knowledge
bases, requiring efficient inference algorithms
and optimization techniques.
5. Handling Incomplete Information: FOL struggles with
representing and reasoning with incomplete or uncertain
information, which is common in real-world applications.
Limitations of First-Order Logic in Knowledge
Representation
1. Inability to Represent Recursive Structures: FOL cannot
directly represent recursive structures, limiting its ability to
model certain types of relationships and processes.
2. Lack of Higher-Order Reasoning: FOL lacks support for
higher-order logic, preventing it from representing and
reasoning about properties of predicates or functions.
3. Difficulty in Representing Context and Dynamics: FOL
struggles with representing dynamic or context-dependent
knowledge, such as temporal relationships or changes over
time.
4. Limited Representation of Non-binary Relations: FOL
primarily deals with binary relations, making it less suitable
for representing complex relationships involving multiple
entities.
5. Difficulty in Handling Non-monotonic Reasoning: FOL
is not well-suited for non-monotonic reasoning, where new
information can lead to retraction or modification of
previously inferred conclusions.
First-Order Logic
o First-order logic is another way of knowledge representation in artificial intelligence. It is
an extension to propositional logic.
o FOL is sufficiently expressive to represent the natural language statements in a concise
way.
o First-order logic is also known as Predicate logic or First-order predicate logic. First-order logic is a powerful language that expresses information about objects in a natural way and can also express the relationships between those objects.
o First-order logic (like natural language) does not only assume that the world contains facts
like propositional logic but also assumes the following things in the world:
o Objects: A, B, people, numbers, colors, wars, theories, squares, pits, wumpus, ......
o Relations: It can be a unary relation such as red, round, or is adjacent, or an n-ary relation such as the sister of, brother of, has color, comes between.
o Function: Father of, best friend, third inning of, end of, ......

o As a natural language, first-order logic also has two main parts:


o Syntax
o Semantics

Syntax of First-Order logic:

The syntax of FOL determines which collection of symbols is a logical expression in first-order
logic. The basic syntactic elements of first-order logic are symbols. We write statements in short-
hand notation in FOL.

Basic Elements of First-order logic:

The basic elements of FOL syntax are: constant symbols, variables, predicate symbols, function symbols, the logical connectives (∧, ∨, ¬, →, ⇔), the equality symbol, and the quantifiers (∀, ∃).


Atomic sentences:

o Atomic sentences are the most basic sentences of first-order logic. These sentences are
formed from a predicate symbol followed by a parenthesis with a sequence of terms.
o We can represent atomic sentences as Predicate (term1, term2, ......, term n).

Example: Ravi and Ajay are brothers: => Brothers(Ravi, Ajay).


Chinky is a cat: => cat (Chinky).

Complex Sentences:

o Complex sentences are made by combining atomic sentences using connectives.


First-order logic statements can be divided into two parts:
o Subject: Subject is the main part of the statement.
o Predicate: A predicate can be defined as a relation, which binds two atoms together in a
statement.

Consider the statement: "x is an integer.", it consists of two parts, the first part x is the subject of
the statement and second part "is an integer," is known as a predicate.
Quantifiers in First-order logic:

o A quantifier is a language element which generates quantification, and quantification specifies the quantity of specimens in the universe of discourse.
o Quantifiers are the symbols that permit us to determine or identify the range and scope of a variable in a logical expression. There are two types of quantifier:
o Universal Quantifier (for all, everyone, everything)
o Existential Quantifier (for some, at least one).

Universal Quantifier:

Universal quantifier is a symbol of logical representation, which specifies that the statement
within its range is true for everything or every instance of a particular thing.

The Universal quantifier is represented by a symbol ∀, which resembles an inverted A.

Note: In universal quantifier we use implication "→".

If x is a variable, then ∀x is read as:

o For all x

o For each x
o For every x.

Example:

All men drink coffee.

Let x be a variable that refers to a man, so the statement can be represented in the universe of discourse as:

∀x man(x) → drink(x, coffee).

It will be read as: For every x, if x is a man, then x drinks coffee.

Existential Quantifier:

Existential quantifiers are the type of quantifiers, which express that the statement within its
scope is true for at least one instance of something.

It is denoted by the logical operator ∃, which resembles an inverted E. When it is used with a
predicate variable then it is called as an existential quantifier.

Note: In Existential quantifier we always use AND or Conjunction symbol ( ∧).

If x is a variable, then existential quantifier will be ∃x or ∃(x). And it will be read as:

o There exists a 'x.'


o For some 'x.'
o For at least one 'x.'

Example:

Some boys are intelligent.

∃x: boys(x) ∧ intelligent(x)

It will be read as: There are some x where x is a boy who is intelligent.

Points to remember:

o The main connective for universal quantifier ∀ is implication →.


o The main connective for existential quantifier ∃ is and ∧.

Properties of Quantifiers:

o In universal quantifier, ∀x∀y is similar to ∀y∀x.


o In Existential quantifier, ∃x∃y is similar to ∃y∃x.
o ∃x∀y is not similar to ∀y∃x.
Some Examples of FOL using quantifier:

1. All birds fly.


In this question, the predicate is "fly(x)", where x is a bird.
Since all birds fly, it is represented as follows:
∀x bird(x) → fly(x).

2. Every man respects his parent.


In this question, the predicate is "respect(x, y)," where x=man, and y= parent.
Since there is every man so will use ∀, and it will be represented as follows:
∀x man(x) → respects (x, parent).

3. Some boys play cricket.


In this question, the predicate is "play(x, y)," where x= boys, and y= game. Since there are some
boys so we will use ∃, and it will be represented as:
∃x boys(x) → play(x, cricket).

4. Not all students like both Mathematics and Science.


In this question, the predicate is "like(x, y)," where x= student, and y= subject.
Since there are not all students, so we will use ∀ with negation, so following representation for
this:
¬∀ (x) [ student(x) → like(x, Mathematics) ∧ like(x, Science)].

5. Only one student failed in Mathematics.


In this question, the predicate is "failed(x, y)," where x= student, and y= subject.
Since there is only one student who failed in Mathematics, so we will use following representation
for this:
∃(x) [ student(x) → failed (x, Mathematics) ∧∀ (y) [¬(x==y) ∧ student(y) → ¬failed (x,
Mathematics)].

Free and Bound Variables:

The quantifiers interact with variables which appear in a suitable way. There are two types of
variables in First-order logic which are given below:

Free Variable: A variable is said to be a free variable in a formula if it occurs outside the scope of
the quantifier.
Example: ∀x ∃(y)[P (x, y, z)], where z is a free variable.

Bound Variable: A variable is said to be a bound variable in a formula if it occurs within the scope
of the quantifier.

Example: ∀x ∃y [A(x) ∧ B(y)], here x and y are both bound variables.


Inference in First-Order Logic
In First-Order Logic, inference is used to derive new facts or sentences
from existing ones. Before we get into the FOL inference rule, it's
important to understand some basic FOL terminology.

Substitution:

Substitution is a basic procedure that is applied to terms and


formulations. It can be found in all first-order logic inference systems.
When there are quantifiers in FOL, the substitution becomes more
complicated. When we write F[a/x], we are referring to the substitution
of a constant "a" for the variable "x."

[ Note: first-order logic can convey facts about some or all of the
universe's objects. ]

Equality:

In First-Order Logic, atomic sentences are formed not only through the use of predicates and terms, but also through the application of equality. We can do this by using equality symbols, which indicate that two terms refer to the same object.

Example: Brother (John) = Smith.

In the above example, the object referred to by Brother(John) is the same as the object referred to by Smith. The equality symbol can also be used with negation to state that two terms are not the same object.

Example: ¬(x=y) which is equivalent to x ≠y.

FOL inference rules for quantifier:


First-order logic has inference rules similar to propositional logic,
therefore here are some basic inference rules in FOL:

 Universal Generalization
 Universal Instantiation
 Existential Instantiation
 Existential introduction

1. Universal Generalization:

 Universal generalization is a valid inference rule which states that if premise P(c) is true for any arbitrary element c in the universe of discourse, then we can arrive at the conclusion ∀x P(x).
 It can be represented as: P(c), for an arbitrary c, therefore ∀x P(x).
 We can apply this rule if we want to prove that every element has a similar property.
 x must not be used as a free variable in this rule.

Example: Let P(c) be "A byte contains 8 bits." If this holds for an arbitrary byte c, then ∀x P(x), "All bytes contain 8 bits," is also true.

2. Universal Instantiation:

 Universal instantiation, also known as universal elimination or UI, is a valid inference rule. It can be applied multiple times to add new sentences.
 The new knowledge base is logically equivalent to the previous knowledge base.
 According to UI, we can infer any sentence obtained by substituting a ground term for the variable.
 The UI rule says that from ∀x P(x) we can infer any sentence P(c), by substituting a ground term c (a constant within the domain of x) for the variable, for any object in the universe of discourse.
 It can be represented as: from ∀x P(x), infer P(c).

Example 1: From "Every person likes ice-cream", ∀x P(x), we can infer that "John likes ice-cream", P(John).

Example 2: Let's take a famous example,
"All kings who are greedy are evil." Let our knowledge base contain this detail in the form of FOL: ∀x king(x) ∧ greedy(x) → Evil(x).
We can infer any of the following statements from this information using Universal Instantiation:

 King(John) ∧ Greedy(John) → Evil(John),
 King(Richard) ∧ Greedy(Richard) → Evil(Richard),
 King(Father(John)) ∧ Greedy(Father(John)) → Evil(Father(John)).

3. Existential Instantiation:

 Existential instantiation, also known as existential elimination, is a valid first-order logic inference rule.
 It can be applied only once to replace an existential sentence.
 Although the new KB is not logically equivalent to the old KB, it will be satisfiable if the old KB was satisfiable.
 This rule states that, for a new constant symbol c, one can deduce P(c) from a formula of the form ∃x P(x).
 The only constraint with this rule is that c must be a new constant symbol that does not appear elsewhere in the knowledge base.
 It can be represented as: from ∃x P(x), infer P(c), where c is a fresh constant.

Example: 1

From the given sentence: ∃x Crown(x) ∧ OnHead(x, John),

So we can infer: Crown(K) ∧ OnHead( K, John), as long as K does not


appear in the knowledge base.

 The above used K is a constant symbol, which is known as Skolem


constant.
 The Existential instantiation is a special case of Skolemization
process.

4. Existential introduction

 Existential generalization, also known as existential introduction, is a valid inference rule in first-order logic.
 This rule states that if some element c in the universe of discourse has the property P, we can infer that something in the universe has the property P.
 It can be represented as: from P(c), infer ∃x P(x).
 Example: Let's say that,
"Priyanka got good marks in English."
"Therefore, someone got good marks in English."

Generalized Modus Ponens Rule:

In FOL, we use a single inference rule called Generalized Modus Ponens for the inference process. It is a lifted (variable-containing) version of Modus Ponens.

"P implies Q, and P is declared to be true, therefore Q must be true" summarizes Modus Ponens.

Generalized Modus Ponens states that for atomic sentences pi, pi', and q, where there is a substitution θ such that SUBST(θ, pi') = SUBST(θ, pi) for all i:

p1', p2', ..., pn', (p1 ∧ p2 ∧ ... ∧ pn → q)
--------------------------------------------
                SUBST(θ, q)

Example: We will use this rule for "greedy kings are evil": we find some x such that x is a king and x is greedy, and we can then infer that x is evil.
Unification

 Unification is the process of finding a substitute that makes two


separate logical atomic expressions identical. The substitution
process is necessary for unification.
 It accepts two literals as input and uses substitution to make them
identical.
 Let Ψ1 and Ψ2 be two atomic sentences and σ be a unifier such that Ψ1σ = Ψ2σ; then we write UNIFY(Ψ1, Ψ2) = σ.
 Example: Find the MGU for Unify{King(x), King(John)}
Let Ψ1 = King(x), Ψ2 = King(John),

Substitution θ = {John/x} is a unifier for these atoms, and both


equations will be equivalent if this substitution is used.

 For unification, the UNIFY algorithm is employed, which takes two


atomic statements and returns a unifier for each of them (If any
exist).
 All first-order inference techniques rely heavily on unification.
 If the expressions do not match, the result is failure.
 The resulting substitution is referred to as the MGU (Most General Unifier).
E.g. Let's say there are two different expressions, P(x, y), and
P(a, f(z)).
In this case, we must make both of the preceding assertions
identical. For this, we'll make a substitute.
P(x, y)......... (i)
P(a, f(z))......... (ii)
Substitute x with a, and y with f(z) in the first expression, and
it will be represented as a/x and f(z)/y.
The first expression will be identical to the second expression
with both replacements, and the substitution set will be: [a/x ,
f(z)/y].

Conditions for Unification


The following are some fundamental requirements for unification:

 Atoms or expressions with different predicate symbols can never be unified.
 Both expressions must have the same number of arguments.
 Unification fails if a variable would have to be unified with a term that contains that same variable (the occurs check).

Unification Algorithm:
Algorithm: Unify(Ψ1, Ψ2)
Implementation of the Algorithm

Step 1: Begin with an empty substitution set.

Step 2: Unify the atomic sentences recursively:
a. If the two expressions are identical, return the current substitution set.
b. If one expression is a variable vi and the other is a term ti that does not contain the variable vi, then:
   - apply ti/vi to the existing substitutions, and
   - add ti/vi to the substitution set.
c. If both expressions are functions (or predicates), then the function name must be the same and the number of arguments must be the same in both expressions; unify the arguments pairwise.
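The following is a minimal Python sketch of such a unification procedure (a simplified version with an occurs check, not the pseudocode above verbatim). Terms are nested tuples like ('p', ('f', 'a'), 'Y'); the names listed in VARS are treated as variables, everything else as constants or function/predicate symbols:

VARS = {"x", "y", "z", "X", "Y", "Z"}     # assumed variable names

def walk(term, s):
    # Follow variable bindings in substitution s until an unbound term is reached.
    while isinstance(term, str) and term in s:
        term = s[term]
    return term

def substitute(term, s):
    # Apply substitution s to a term, recursing into compound terms.
    term = walk(term, s)
    if isinstance(term, tuple):
        return (term[0],) + tuple(substitute(a, s) for a in term[1:])
    return term

def occurs(v, term, s):
    # Occurs check: does variable v appear anywhere inside term (under s)?
    term = substitute(term, s)
    if term == v:
        return True
    return isinstance(term, tuple) and any(occurs(v, a, s) for a in term[1:])

def unify(t1, t2, s=None):
    # Return a most general unifier (a dict) for t1 and t2, or None on failure.
    if s is None:
        s = {}
    t1, t2 = substitute(t1, s), substitute(t2, s)
    if t1 == t2:
        return s
    if isinstance(t1, str) and t1 in VARS:
        return None if occurs(t1, t2, s) else {**s, t1: t2}
    if isinstance(t2, str) and t2 in VARS:
        return None if occurs(t2, t1, s) else {**s, t2: t1}
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        for a, b in zip(t1[1:], t2[1:]):
            s = unify(a, b, s)
            if s is None:
                return None
        return s
    return None    # different predicate/function symbols or different arity

# Example (see problem 6 below): UNIFY(knows(Richard, x), knows(Richard, John))
print(unify(("knows", "Richard", "x"), ("knows", "Richard", "John")))  # {'x': 'John'}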
Find the most general unifier for each of the following pairs of atomic statements (if one exists).
1. Find the MGU of {p(f(a), g(Y)) and p(X, X)}
Sol: S0 => Here, Ψ1 = p(f(a), g(Y)), and Ψ2 = p(X, X)

SUBST θ = {f(a)/X}
S1 => Ψ1 = p(f(a), g(Y)), and Ψ2 = p(f(a), f(a))

Now g(Y) must be unified with f(a); their function symbols differ, so unification fails.
Unification is not possible for these expressions.
2. Find the MGU of {p(b, X, f(g(Z))) and p(Z, f(Y), f(Y))}
S0 => { p(b, X, f(g(Z))); p(Z, f(Y), f(Y))}
SUBST θ={b/Z}

S1 => { p(b, X, f(g(b))); p(b, f(Y), f(Y))}


SUBST θ={f(Y) /X}

S2 => { p(b, f(Y), f(g(b))); p(b, f(Y), f(Y))}

SUBST θ = {g(b)/Y}
S3 => { p(b, f(g(b)), f(g(b))); p(b, f(g(b)), f(g(b)))}, Unified Successfully.
And Unifier = { b/Z, f(Y)/X, g(b)/Y }.
3. Find the MGU of {p (X, X), and p (Z, f(Z))}
Here, Ψ1 = {p (X, X), and Ψ2 = p (Z, f(Z))
S0 => {p (X, X), p (Z, f(Z))}
SUBST θ= {X/Z}
S 1 => {p (Z, Z), p (Z, f(Z))}

SUBST θ= {f(Z) / Z}, Unification Failed.


Therefore, unification is not possible for these expressions.
5. Find the MGU of {Q(a, g(x, a), f(y)) and Q(a, g(f(b), a), x)}
Here, Ψ1 = Q(a, g(x, a), f(y)), and Ψ2 = Q(a, g(f(b), a), x)
S0 => {Q(a, g(x, a), f(y)); Q(a, g(f(b), a), x)}
SUBST θ = {f(b)/x}
S1 => {Q(a, g(f(b), a), f(y)); Q(a, g(f(b), a), f(b))}

SUBST θ = {b/y}
S2 => {Q(a, g(f(b), a), f(b)); Q(a, g(f(b), a), f(b))}, Successfully Unified.
Unifier: {f(b)/x, b/y}.
6. UNIFY(knows(Richard, x), knows(Richard, John))
Here, Ψ1 = knows(Richard, x), and Ψ2 = knows(Richard, John)
S0 => { knows(Richard, x); knows(Richard, John)}
SUBST θ = {John/x}
S1 => { knows(Richard, John); knows(Richard, John)}, Successfully
Unified.
Unifier: {John/x}.
Resolution
Resolution is a method of theorem proving that involves constructing refutation proofs, i.e., proofs by contradiction. It was introduced in 1965 by the mathematician John Alan Robinson.

When several statements are supplied and we need to prove a


conclusion from those claims, we employ resolution. In proofs by
resolutions, unification is a crucial idea. Resolution is a single inference
rule that can work on either the conjunctive normal form or the
clausal form efficiently.

Clause: A clause is a disjunction of literals (atomic sentences or their negations). A clause containing exactly one literal is referred to as a unit clause.

Conjunctive Normal Form (CNF): Conjunctive normal form (CNF) is a


sentence that is represented as a conjunction of clauses.


The resolution inference rule:

The resolution rule for first-order logic is simply a lifted version of the propositional resolution rule. Resolution can resolve two clauses if they contain complementary literals; the clauses are assumed to be standardized apart so that they share no variables.

The rule resolves clauses l1 ∨ ... ∨ lk and m1 ∨ ... ∨ mn, where some li and mj are complementary literals (unifiable with unifier θ), producing as resolvent the disjunction of the remaining literals with θ applied.

Because it resolves exactly one pair of literals at a time, this rule is also known as the binary resolution rule.
Example:

Consider the two clauses given below:

[Animal(g(x)) V Loves(f(x), x)] and [¬Loves(a, b) V ¬Kills(a, b)]

The two complementary literals are Loves(f(x), x) and ¬Loves(a, b).

These literals can be unified with the unifier θ = [a/f(x), b/x], which produces the resolvent clause:

[Animal(g(x)) V ¬Kills(f(x), x)].
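For intuition, here is a minimal propositional-level sketch of a single binary resolution step, with clauses represented as sets of string literals and "¬" marking negation; the literals are ground, so no unification is needed, and the clause contents are illustrative only:

def negate(lit):
    # ¬P for P, and P for ¬P.
    return lit[1:] if lit.startswith("¬") else "¬" + lit

def resolve(c1, c2):
    # Return every resolvent obtainable from clauses c1 and c2 (sets of literals).
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

# Example: {P, Q} and {¬Q, R} resolve on the pair Q / ¬Q, giving {P, R}.
print(resolve({"P", "Q"}, {"¬Q", "R"}))    # [{'P', 'R'}]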

Steps for Resolution:

1. Conversion of facts into first-order logic


2. Convert FOL statements into CNF
3. Negate the statement which needs to prove (proof by
contradiction)
4. Draw resolution graph (unification)

To better comprehend all of the preceding phases, we shall use


resolution as an example.

Example:

 Anything anyone eats and not killed is food.


 Anil eats peanuts and still alive
 Harry eats everything that Anil eats.
 John likes all kind of food.
 Apple and vegetable are food
 Prove by resolution that:
 John likes peanuts.

Step-1: Conversion of Facts into FOL

We'll start by converting all of the given propositions to first-order logic.

Step-2: Conversion of FOL into CNF

Converting FOL to CNF is essential in first-order logic resolution because


CNF makes resolution proofs easier.

 Eliminate all implication (→) and rewrite:


1. ∀x ¬ food(x) V likes(John, x)
2. food(Apple) Λ food(vegetables)
3. ∀x ∀y ¬ [eats(x, y) Λ ¬ killed(x)] V food(y)
4. eats (Anil, Peanuts) Λ alive(Anil)
5. ∀x ¬ eats(Anil, x) V eats(Harry, x)
6. ∀x ¬(¬killed(x)) V alive(x)
7. ∀x ¬ alive(x) V ¬ killed(x)
8. likes(John, Peanuts).
 Move negation (¬)inwards and rewrite
1. ∀x ¬ food(x) V likes(John, x)
2. food(Apple) Λ food(vegetables)
3. ∀x ∀y ¬ eats(x, y) V killed(x) V food(y)
4. eats (Anil, Peanuts) Λ alive(Anil)
5. ∀x ¬ eats(Anil, x) V eats(Harry, x)
6. ∀x killed(x) V alive(x)
7. ∀x ¬ alive(x) V ¬ killed(x)
8. likes(John, Peanuts).
 Rename variables or standardize variables
1. ∀x ¬ food(x) V likes(John, x)
2. food(Apple) Λ food(vegetables)
3. ∀y ∀z ¬ eats(y, z) V killed(y) V food(z)
4. eats (Anil, Peanuts) Λ alive(Anil)
5. ∀x ¬ eats(Anil, x) V eats(Harry, x)
6. ∀x killed(x) V alive(x)
7. ∀x ¬ alive(x) V ¬ killed(x)
8. likes(John, Peanuts).
 Eliminate existential quantifiers (Skolemization). In this step we would eliminate any existential quantifier ∃ by introducing Skolem constants or functions. However, because there is no existential quantifier in this example problem, all of the statements in this phase remain the same.
 Drop universal quantifiers. We remove all universal quantifiers ∀ in this phase, because the remaining variables are implicitly universally quantified, so the quantifiers are no longer needed explicitly.
1. ¬ food(x) V likes(John, x)
2. food(Apple)
3. food(vegetables)
4. ¬ eats(y, z) V killed(y) V food(z)
5. eats (Anil, Peanuts)
6. alive(Anil)
7. ¬ eats(Anil, w) V eats(Harry, w)
8. killed(g) V alive(g)
9. ¬ alive(k) V ¬ killed(k)
10. likes(John, Peanuts).
[ Note: Statements "food(Apple) Λ food(vegetables)" and
"eats (Anil, Peanuts) Λ alive(Anil)" can be written in two
independent statements. ]
 Distribute conjunction ∧ over disjunction ∨. This step makes no change in this problem.

Step 3: Negate the statement to be proved.

To use proof by contradiction, we negate the conclusion, which is written as ¬likes(John, Peanuts).

If the resolution procedure then derives the empty clause, the negated conclusion contradicts the given collection of facts, which proves the original statement.
Step 4: Create a graph of resolution:

In this stage, we'll use a resolution tree and substitution to solve the
problem. It will be as follows for the aforesaid problem:

Explanation of Resolution graph:

 First step: ¬likes(John, Peanuts) , and likes(John, x) get


resolved(canceled) by substitution of {Peanuts/x}, and we are left
with ¬ food(Peanuts)
 Second step: ¬ food(Peanuts) , and food(z) get resolved
(canceled) by substitution of { Peanuts/z}, and we are left with¬
eats(y, Peanuts) V killed(y) .
 Third step: ¬ eats(y, Peanuts) and eats (Anil, Peanuts) get
resolved by substitution {Anil/y}, and we are left with Killed(Anil).
 Fourth step: Killed(Anil) and ¬killed(k) get resolved by substitution {Anil/k}, and we are left with ¬alive(Anil).
 Last step: ¬alive(Anil) and alive(Anil) get resolved, producing the empty clause. This contradiction completes the refutation, so "John likes peanuts" is proved.
Natural Deduction

A proof of proposition P in natural deduction starts from axioms and


assumptions and derives P with all assumptions discharged. Every step in the
proof is an instance of an inference rule with metavariables substituted
consistently with expressions of the appropriate syntactic class.

Example

For example, here is a proof of the proposition (A ⇒ B ⇒ C) ⇒ (A ∧ B ⇒ C).

The final step in the proof is to derive (A ⇒ B ⇒ C) ⇒ (A ∧ B ⇒ C) from (A ∧


B ⇒ C), which is done using the rule (⇒-intro), discharging the assumption [x
: A ⇒ B ⇒ C]. To see how this rule generates the proof step, substitute for the
metavariables P, Q, x in the rule as follows: P = (A ⇒ B ⇒ C), Q = (A ∧ B ⇒ C),
and x = x. The immediately previous step uses the same rule, but with a
different substitution: P = A ∧ B, Q = C, x = y.
The proof tree for this example has the following form, with the proved
proposition at the root and axioms and assumptions at the leaves.
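Written out linearly, one such derivation proceeds as follows (using standard ∧-elimination, ⇒-elimination, and ⇒-introduction, and reading A ⇒ B ⇒ C as A ⇒ (B ⇒ C)):

1. x : A ⇒ B ⇒ C                       assumption
2. y : A ∧ B                           assumption
3. A                                   ∧-elim on 2
4. B                                   ∧-elim on 2
5. B ⇒ C                               ⇒-elim on 1 and 3
6. C                                   ⇒-elim on 5 and 4
7. A ∧ B ⇒ C                           ⇒-intro, discharging assumption y (2)
8. (A ⇒ B ⇒ C) ⇒ (A ∧ B ⇒ C)           ⇒-intro, discharging assumption x (1)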

A proposition that has a complete proof in a deductive system is called


a theorem of that system.
Soundness and Completeness

A measure of a deductive system's power is whether it is powerful enough to


prove all true statements. A deductive system is said to be complete if all true
statements are theorems (have proofs in the system). For propositional logic
and natural deduction, this means that all tautologies must have natural
deduction proofs. Conversely, a deductive system is called sound if all
theorems are true. The proof rules we have given above are in fact sound and
complete for propositional logic: every theorem is a tautology, and every
tautology is a theorem.
Finding a proof for a given tautology can be difficult. But once the proof is
found, checking that it is indeed a proof is completely mechanical, requiring no
intelligence or insight whatsoever. It is therefore a very strong argument that
the thing proved is in fact true.
We can also make writing proofs less tedious by adding more rules that
provide reasoning shortcuts. These rules are sound if there is a way to convert
a proof using them into a proof using the original rules. Such added rules are
called admissible.
Procedural versus declarative knowledge
Procedural Knowledge:
Procedural Knowledge, also known as Interpretive knowledge, is the type of knowledge that clarifies how a particular thing can be accomplished. It is less popular than declarative knowledge because it is generally not stated explicitly.
It emphasizes how to do something to solve a given problem.
Let's see it with an example:
// Procedural style: spell out step by step HOW to copy the array.
var a = [1, 2, 3, 4, 5];
var b = [];
for (var i = 0; i < a.length; i++) {
  b.push(a[i]);
}
console.log(b);

Output is:
[1, 2, 3, 4, 5]

Declarative Knowledge:
Declarative Knowledge, also known as Descriptive knowledge, is the type of knowledge that states basic facts about something; it is more popular than Procedural Knowledge.
It emphasizes what is known about a problem rather than how to solve it.
Let's see it with an example:
// Declarative style: describe WHAT the result is (a mapping of a),
// leaving the iteration details to map().
var a = [1, 2, 3, 4, 5];
var b = a.map(function (number) {
  return number * 1;
});
console.log(b);

Output is:
[1, 2, 3, 4, 5]

In both examples the output is the same; the two pieces of code differ only in the method used to arrive at the solution.

Difference between Procedural and Declarative Knowledge:

S.NO | Procedural Knowledge                                    | Declarative Knowledge
1    | Also known as Interpretive knowledge.                   | Also known as Descriptive knowledge.
2    | Describes how a particular thing can be accomplished.   | Describes basic knowledge about something.
3    | Generally not stated explicitly, so it is less popular. | More popular.
4    | Cannot be easily communicated.                          | Can be easily communicated.
5    | Generally process-oriented in nature.                   | Data-oriented in nature.
6    | Debugging and validation are not easy.                  | Debugging and validation are easy.
7    | Less effective in competitive programming.              | More effective in competitive programming.
Forward Chaining and backward chaining
in AI
In artificial intelligence, forward and backward chaining are important topics, but before understanding forward and backward chaining, let's first understand where these two terms come from.

Inference engine:
The inference engine is the component of the intelligent system in artificial
intelligence, which applies logical rules to the knowledge base to infer new
information from known facts. The first inference engine was part of the
expert systems. An inference engine commonly proceeds in two modes, which are:

1. Forward chaining
2. Backward chaining

Horn Clause and Definite clause:

Horn clauses and definite clauses are forms of sentences that enable a knowledge base to use a more restricted and efficient inference algorithm. Logical inference algorithms use forward and backward chaining approaches, which require the KB to be in the form of first-order definite clauses.

Definite clause: A clause which is a disjunction of literals with exactly one


positive literal is known as a definite clause or strict horn clause.

Horn clause: A clause which is a disjunction of literals with at most one


positive literal is known as horn clause. Hence all the definite clauses are
horn clauses.

Example: (¬ p V ¬ q V k). It has only one positive literal k.

It is equivalent to p ∧ q → k.

Forward Chaining
Forward chaining is also known as a forward deduction or forward reasoning
method when using an inference engine. Forward chaining is a form of
reasoning which start with atomic sentences in the knowledge base and
applies inference rules (Modus Ponens) in the forward direction to extract
more data until a goal is reached.

The Forward-chaining algorithm starts from known facts, triggers all rules
whose premises are satisfied, and add their conclusion to the known facts.
This process repeats until the problem is solved.
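A minimal propositional sketch of this loop, assuming rules are stored as (premises, conclusion) pairs; the fact and rule names below are a simplified illustration loosely modeled on the crime example used later, not the full first-order version:

# Illustrative rules: each rule is (set of premises, conclusion).
rules = [
    ({"missile"}, "weapon"),
    ({"american", "weapon", "sells", "hostile"}, "criminal"),
    ({"enemy"}, "hostile"),
    ({"missile", "owns"}, "sells"),
]
facts = {"american", "missile", "owns", "enemy"}     # known facts (the data)

# Repeatedly fire every rule whose premises are all known, adding its
# conclusion to the known facts, until nothing new can be derived.
changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print("criminal" in facts)    # True: the goal is reached from the data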

Properties of Forward-Chaining:

o It is a bottom-up approach, as it moves from the facts (bottom) toward the goal (top).
o It is a process of drawing conclusions from known facts or data, starting from the initial state and working toward the goal state.
o The forward-chaining approach is also called data-driven, as we reach the goal using the available data.
o The forward-chaining approach is commonly used in expert systems, such as CLIPS, and in business and production rule systems.

Consider the following famous example which we will use in both


approaches:

Example:
"As per the law, it is a crime for an American to sell weapons to hostile
nations. Country A, an enemy of America, has some missiles, and all
the missiles were sold to it by Robert, who is an American citizen."

Prove that "Robert is criminal."


To solve the above problem, first, we will convert all the above facts into first-
order definite clauses, and then we will use a forward-chaining algorithm to
reach the goal.

Facts Conversion into FOL:


o It is a crime for an American to sell weapons to hostile nations.
(Let's say p, q, and r are variables)
American (p) ∧ weapon(q) ∧ sells (p, q, r) ∧ hostile(r) →
Criminal(p) ...(1)
o Country A has some missiles. ∃p Owns(A, p) ∧ Missile(p). It can
be written in two definite clauses by using Existential Instantiation,
introducing a new constant T1.
Owns(A, T1) ......(2)
Missile(T1) .......(3)
o All of the missiles were sold to country A by Robert.
∀p Missile(p) ∧ Owns(A, p) → Sells(Robert, p, A) ......(4)
o Missiles are weapons.
Missile(p) → Weapons (p) .......(5)
o Enemy of America is known as hostile.
Enemy(p, America) →Hostile(p) ........(6)
o Country A is an enemy of America.
Enemy (A, America) .........(7)
o Robert is American
American(Robert). ..........(8)

Forward chaining proof:


Step-1:

In the first step we will start with the known facts and will choose the
sentences which do not have implications, such as: American(Robert),
Enemy(A, America), Owns(A, T1), and Missile(T1). All these facts will be
represented as below.
Step-2:

At the second step, we will see those facts which infer from available facts
and with satisfied premises.

Rule-(1) does not have its premises satisfied, so it will not be added in the first iteration.

Rule-(2) and (3) are already added.

Rule-(4) is satisfied with the substitution {p/T1}, so Sells(Robert, T1, A) is added, which is inferred from the conjunction of Rule-(2) and (3).

Rule-(6) is satisfied with the substitution {p/A}, so Hostile(A) is added, which is inferred from Rule-(7).

Step-3:

At step-3, we can check that Rule-(1) is satisfied with the substitution {p/Robert, q/T1, r/A}, so we can add Criminal(Robert), which is inferred from all the available facts. Hence we have reached our goal statement, and it is proved that Robert is a criminal using the forward-chaining approach.

Backward Chaining:
Backward-chaining is also known as a backward deduction or backward
reasoning method when using an inference engine. A backward chaining
algorithm is a form of reasoning, which starts with the goal and works
backward, chaining through rules to find known facts that support the goal.

Properties of backward chaining:

o It is known as a top-down approach.


o Backward-chaining is based on modus ponens inference rule.
o In backward chaining, the goal is broken into sub-goal or sub-goals
to prove the facts true.
o It is called a goal-driven approach, as a list of goals decides which
rules are selected and used.
o The backward-chaining algorithm is used in game theory, automated theorem-proving tools, inference engines, proof assistants, and various other AI applications.
o The backward-chaining method mostly uses a depth-first search strategy for proofs, as sketched below.
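A minimal propositional sketch of this goal-driven, depth-first search over the same style of (premises, conclusion) rules; the names are illustrative only:

rules = [
    ({"missile"}, "weapon"),
    ({"american", "weapon", "sells", "hostile"}, "criminal"),
    ({"enemy"}, "hostile"),
    ({"missile", "owns"}, "sells"),
]
facts = {"american", "missile", "owns", "enemy"}

def prove(goal, seen=frozenset()):
    # Try to prove goal by chaining backward through the rules (depth-first).
    if goal in facts:
        return True
    if goal in seen:                      # avoid looping on a repeated sub-goal
        return False
    return any(all(prove(p, seen | {goal}) for p in premises)
               for premises, conclusion in rules if conclusion == goal)

print(prove("criminal"))                  # True: every sub-goal bottoms out in a fact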
Example:
In backward-chaining, we will use the same above example, and will rewrite
all the rules.

o American(p) ∧ weapon(q) ∧ sells(p, q, r) ∧ hostile(r) → Criminal(p) ...(1)
o Owns(A, T1) ........(2)
o Missile(T1) .........(3)
o ∀p Missile(p) ∧ Owns(A, p) → Sells(Robert, p, A) ......(4)
o Missile(p) → Weapons(p) .......(5)
o Enemy(p, America) → Hostile(p) ........(6)
o Enemy(A, America) .........(7)
o American(Robert). ..........(8)

Backward-Chaining proof:
In Backward chaining, we will start with our goal predicate, which
is Criminal(Robert), and then infer further rules.

Step-1:

At the first step, we will take the goal fact. And from the goal fact, we will infer
other facts, and at last, we will prove those facts true. So our goal fact is
"Robert is Criminal," so following is the predicate of it.

Step-2:

At the second step, we will infer other facts form goal fact which satisfies the
rules. So as we can see in Rule-1, the goal predicate Criminal (Robert) is
present with substitution {Robert/P}. So we will add all the conjunctive facts
below the first level and will replace p with Robert.

Here we can see American(Robert) is a fact, so it is proved here.

Step-3: At step-3, we will extract the further fact Missile(q), which is inferred from Weapon(q), as it satisfies Rule-(5). Weapon(q) is also true with the substitution of the constant T1 for q.

Step-4:
At step-4, we can infer the facts Missile(T1) and Owns(A, T1) from Sells(Robert, T1, r), which satisfies Rule-(4) with the substitution of A in place of r. So these two statements are proved here.

Step-5:

At step-5, we can infer the fact Enemy(A, America) from Hostile(A) which
satisfies Rule- 6. And hence all the statements are proved true using
backward chaining.
Difference between backward chaining
and forward chaining
Following is the difference between the forward chaining and backward
chaining:

o Forward chaining as the name suggests, start from the known


facts and move forward by applying inference rules to extract more
data, and it continues until it reaches to the goal, whereas
backward chaining starts from the goal, move backward by using
inference rules to determine the facts that satisfy the goal.
o Forward chaining is called a data-driven inference technique,
whereas backward chaining is called a goal-driven inference
technique.
o Forward chaining is known as a bottom-up approach, whereas
backward chaining is known as a top-down approach.
o Forward chaining uses breadth-first search strategy, whereas
backward chaining uses depth-first search strategy.
o Forward and backward chaining both applies Modus
ponens inference rule.
o Forward chaining can be used for tasks such as planning, design
process monitoring, diagnosis, and classification, whereas
backward chaining can be used for classification and diagnosis
tasks.
o Forward chaining can be like an exhaustive search, whereas
backward chaining tries to avoid the unnecessary path of
reasoning.
o In forward-chaining there can be various ASK questions from the
knowledge base, whereas in backward chaining there can be
fewer ASK questions.
o Forward chaining is slow as it checks for all the rules, whereas
backward chaining is fast as it checks few required rules only.
