UNIT - II
Knowledge representation
Knowledge Representation in AI refers to the way in which artificial intelligence systems store,
organize, and utilize knowledge to solve complex problems. It is a crucial aspect of AI, enabling
machines to mimic human understanding and reasoning. Knowledge representation involves
the creation of data structures and models that can efficiently capture information about the
world, making it accessible and usable by AI algorithms for decision-making, inference, and
learning.
An AI system needs to represent several kinds of knowledge:
Object: The AI needs to know the facts about the objects in its world domain. E.g., a keyboard has keys, a guitar has strings, etc.
Events: The actions which occur in our world are called events.
Performance: It describes a behavior involving knowledge about how to do things.
Meta-knowledge: The knowledge about what we know is called meta-knowledge.
Facts: The things in the real world that are known and proven true.
Knowledge Base: A knowledge base in artificial intelligence aims to capture human
expert knowledge to support decision-making, problem-solving, and more.
Types of Knowledge in AI
1. Declarative Knowledge
Declarative knowledge refers to facts and information that describe the world, answering
the “what” type of questions.
Example: Knowing that Paris is the capital of France.
This knowledge is often stored in databases or knowledge bases and expressed in logical
statements, forming the foundation for more complex reasoning and problem-solving in AI
systems.
2. Procedural Knowledge
Procedural knowledge is the knowledge of how to perform tasks or processes, answering
the “how” type of questions.
Example: Steps to solve a mathematical problem or the procedure to start a car.
This knowledge is embedded in algorithms or control structures, enabling AI systems to
execute tasks, perform actions, and solve problems step-by-step.
3. Meta-Knowledge
Meta-knowledge is knowledge about knowledge, understanding which types of knowledge
to apply in different situations.
Example: Knowing when to use a specific algorithm based on the problem at hand.
Crucial for systems that need to adapt or optimize their performance, meta-knowledge
helps in selecting the most appropriate strategy or knowledge base for a given problem.
4. Heuristic Knowledge
Heuristic knowledge includes rules of thumb, educated guesses, and intuitive judgments
derived from experience.
Example: Using an educated guess to approximate a solution when time is limited.
Often used in problem-solving and decision-making processes where exact solutions are
not feasible, helping AI systems to arrive at good-enough solutions quickly.
5. Structural Knowledge
Structural knowledge refers to the understanding of how different pieces of knowledge are
organized and related to each other.
Example: Understanding the hierarchy of concepts in a taxonomy or the relationships
between different entities in a semantic network.
This knowledge is essential for organizing information within AI systems, allowing for
efficient retrieval, reasoning, and inferencing based on the relationships and structures
defined.
Approaches to Knowledge Representation in AI
There are mainly four approaches to knowledge representation, which are given below:
1. Simple relational knowledge:
It is the simplest way of storing facts, using the relational method: each fact about a set of objects is set out systematically in columns.
This approach of knowledge representation is famous in database systems where the
relationship between different entities is represented.
This approach has little opportunity for inference.
Example: a table in which each row describes one object and each column one attribute (e.g., a player's name, weight, and age) is a simple relational knowledge representation.
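A relational representation can be sketched as a table of rows (objects) and columns (attributes). The following minimal Python sketch uses an illustrative player table that is an assumption of this example, not taken from the text:

```python
# Simple relational knowledge: one fact per object (row), one attribute
# per column. The table contents are illustrative only.
players = [
    {"name": "Pele",     "height_cm": 173, "position": "Forward"},
    {"name": "Maradona", "height_cm": 165, "position": "Forward"},
    {"name": "Beckham",  "height_cm": 183, "position": "Midfielder"},
]

def lookup(name, attribute):
    """Retrieve a stored attribute of an object; returns None if absent."""
    for row in players:
        if row["name"] == name:
            return row[attribute]
    return None

# Lookup is easy, but the table cannot answer questions it does not
# explicitly store -- which is why this approach offers little inference.
print(lookup("Beckham", "position"))
```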
2. Inheritable knowledge:
In the inheritable knowledge approach, all data is stored in a hierarchy of classes.
All classes are arranged in a generalized, hierarchical manner.
In this approach, the inheritance property is applied: elements inherit values from other members of their class.
This approach captures the relation between an instance and its class, called the instance relation.
Every individual frame can represent a collection of attributes and their values.
In this approach, objects and values are represented in boxed nodes, and arrows point from objects to their values.
Example:
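A class hierarchy with inherited attribute values can be sketched directly with Python classes. The person/player hierarchy below is an illustrative assumption:

```python
# Inheritable knowledge: classes arranged in a hierarchy; instances
# inherit attribute values from their classes unless overridden.
class Person:
    legs = 2                  # default value for every Person

class Adult(Person):
    height_cm = 178           # inherited by all subclasses of Adult

class Player(Adult):
    bats = "right"

class LeftHandedPlayer(Player):
    bats = "left"             # overrides the value inherited from Player

# Instance relation: 'john' is-a LeftHandedPlayer; he inherits legs and
# height_cm from ancestors up the hierarchy, but bats is overridden.
john = LeftHandedPlayer()
print(john.legs, john.height_cm, john.bats)
```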
3. Inferential knowledge:
In the inferential knowledge approach, knowledge is represented in formal logic, from which new facts can be derived. For example:
Marcus is a man.
All men are mortal.
This can be represented as:
man(Marcus)
∀x: man(x) → mortal(x)
from which mortal(Marcus) can be inferred.
4. Procedural knowledge:
The procedural knowledge approach uses small programs and code that describe how to do specific things and how to proceed.
In this approach, one important rule is the If-Then rule.
This knowledge can be expressed in programming languages such as LISP and Prolog.
Heuristic or domain-specific knowledge is easy to represent in this approach.
However, not all cases can be represented this way.
Requirements for a good Knowledge Representation system:
1. Representational Accuracy:
The KR system should have the ability to represent all kinds of required knowledge.
2. Inferential Adequacy:
The KR system should have the ability to manipulate the representational structures to produce new knowledge corresponding to existing structures.
3. Inferential Efficiency:
The ability to direct the inferential mechanism into the most productive directions by storing appropriate guides.
4. Acquisitional Efficiency:
The ability to acquire new knowledge easily using automatic methods.
Key Techniques in Knowledge Representation
1. First-Order Logic (FOL)
First-Order Logic is a formal system used in mathematics, philosophy, and computer science to
represent and reason about propositions involving objects, their properties, and their
relationships. Unlike propositional logic, FOL allows the use of quantifiers (like “forall” and
“exists”) to express more complex statements.
FOL is widely used in AI for knowledge representation and reasoning because it allows for
expressing general rules and facts about the world. For example, FOL can be used to represent
statements like “All humans are mortal” and “Socrates is a human,” enabling AI systems to infer
that “Socrates is mortal.” It provides a powerful and flexible framework for representing
structured knowledge and supports various forms of logical reasoning.
2. Fuzzy Logic
Fuzzy Logic is an approach to knowledge representation that deals with reasoning that is
approximate rather than exact. It allows for the representation of concepts that are not black
and white, but rather fall along a continuum, with degrees of truth ranging from 0 to 1.
Fuzzy Logic is particularly useful in domains where precise information is unavailable or
impractical, such as control systems, decision-making, and natural language processing. For
example, in a climate control system, fuzzy logic can be used to represent concepts like
“warm,” “hot,” or “cold,” and make decisions based on the degree to which these conditions
are met, rather than relying on strict numerical thresholds.
3. Description Logics
Description Logics are a family of formal knowledge representation languages used to describe
and reason about the concepts and relationships within a domain. They are more expressive
than propositional logic but less complex than full first-order logic, making them well-suited for
representing structured knowledge.
Description Logics form the foundation of ontologies used in the Semantic Web and are key to
building knowledge-based systems that require classification, consistency checking, and
inferencing. For example, they can be used to define and categorize different types of products
in an e-commerce system, allowing for automated reasoning about product features,
relationships, and hierarchies.
Propositional Logic
Propositional logic is the simplest form of logic, in which every statement (proposition) is either true or false.
Example:
a) It is Sunday.
b) The Sun rises in the West. (false proposition)
c) 3 + 3 = 7 (false proposition)
d) 5 is a prime number.
Propositions are of two types:
1. Atomic propositions: single propositions that cannot be broken down further.
2. Compound propositions: propositions built from atomic propositions using logical connectives.
Logical Connectives:
Logical connectives are used to connect two simpler propositions or
representing a sentence logically. We can create compound propositions with
the help of logical connectives. There are mainly five connectives, which are
given as follows:
1. Negation: A sentence such as ¬P is called the negation of P. A literal can be
either a positive literal or a negative literal.
2. Conjunction: A sentence which has ∧ connective such as, P ∧ Q is called a
conjunction.
Example: Rohan is intelligent and hardworking. It can be written as,
P= Rohan is intelligent,
Q= Rohan is hardworking. → P∧ Q.
3. Disjunction: A sentence which has ∨ connective, such as P ∨ Q. is called
disjunction, where P and Q are the propositions.
Example: "Ritika is a doctor or an engineer."
Here P = Ritika is a doctor, Q = Ritika is an engineer, so we can write it as P ∨ Q.
4. Implication: A sentence such as P → Q, is called an implication. Implications
are also known as if-then rules. It can be represented as
If it is raining, then the street is wet.
Let P= It is raining, and Q= Street is wet, so it is represented as P → Q
5. Biconditional: A sentence such as P ⇔ Q is a biconditional sentence.
Example: "I am breathing if and only if I am alive."
P = I am breathing, Q = I am alive; it can be represented as P ⇔ Q.
Truth Table:
In propositional logic, we need to know the truth values of propositions in all
possible scenarios. We can combine all the possible combinations with logical
connectives, and the representation of these combinations in tabular format
is called a truth table. A truth table can be built for each logical connective.
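The truth tables for the five connectives are easiest to see generated mechanically. A small Python sketch (the `implies` helper is defined here for illustration):

```python
from itertools import product

# P -> Q is false only when P is true and Q is false,
# which is equivalent to (not P) or Q.
def implies(p, q):
    return (not p) or q

# Print the truth table for all five connectives over P and Q.
print("P      Q      ¬P     P∧Q    P∨Q    P→Q    P⇔Q")
for p, q in product([True, False], repeat=2):
    row = [p, q, not p, p and q, p or q, implies(p, q), p == q]
    print("  ".join(f"{v!s:5}" for v in row))
```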
Truth table with three propositions:
We can build a proposition composing three propositions P, Q, and R. Since each
of the three propositions can independently be true or false, this truth table
has 2³ = 8 rows.
Precedence of connectives:
Just like arithmetic operators, there is a precedence order for propositional
connectors or logical operators. This order should be followed while
evaluating a propositional problem. Following is the list of the precedence
order for operators:
Precedence          Operators
First precedence    Parenthesis
Second precedence   Negation (¬)
Third precedence    Conjunction (∧)
Fourth precedence   Disjunction (∨)
Fifth precedence    Implication (→)
Sixth precedence    Biconditional (⇔)
Note: For better understanding, use parentheses to make the intended
interpretation explicit. For example, ¬R ∨ Q is interpreted as (¬R) ∨ Q,
because negation binds more tightly than disjunction.
Logical equivalence:
Logical equivalence is one of the features of propositional logic. Two
propositions are said to be logically equivalent if and only if the columns in the
truth table are identical to each other.
Let's take two propositions A and B; their logical equivalence can be written
as A ⇔ B or A ≡ B. For example, the truth-table columns for ¬A ∨ B and A → B
are identical, hence ¬A ∨ B is equivalent to A → B.
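The equivalence of ¬A ∨ B and A → B can be checked mechanically by comparing truth-table columns. A minimal Python sketch:

```python
from itertools import product

# Two propositions are logically equivalent iff their truth-table
# columns agree on every assignment of truth values.
def equivalent(f, g):
    return all(f(a, b) == g(a, b) for a, b in product([True, False], repeat=2))

not_a_or_b  = lambda a, b: (not a) or b
a_implies_b = lambda a, b: b if a else True   # A -> B by its truth table

print(equivalent(not_a_or_b, a_implies_b))
```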
Properties of Operators:
o Commutativity:
o P∧ Q= Q ∧ P, or
o P ∨ Q = Q ∨ P.
o Associativity:
o (P ∧ Q) ∧ R= P ∧ (Q ∧ R),
o (P ∨ Q) ∨ R= P ∨ (Q ∨ R)
o Identity element:
o P ∧ True = P,
o P ∨ True= True.
o Distributive:
o P∧ (Q ∨ R) = (P ∧ Q) ∨ (P ∧ R).
o P ∨ (Q ∧ R) = (P ∨ Q) ∧ (P ∨ R).
o De Morgan's Laws:
o ¬ (P ∧ Q) = (¬P) ∨ (¬Q)
o ¬ (P ∨ Q) = (¬ P) ∧ (¬Q).
o Double-negation elimination:
o ¬ (¬P) = P.
Syntax of First-Order Logic:
The syntax of FOL determines which collections of symbols form logical expressions in
first-order logic. The basic syntactic elements of first-order logic are symbols, and
statements are written in a short-hand symbolic notation.
Atomic Sentences:
o Atomic sentences are the most basic sentences of first-order logic. These sentences are
formed from a predicate symbol followed by a parenthesized sequence of terms.
o We can represent atomic sentences as Predicate(term1, term2, ..., termN).
Complex Sentences:
Complex sentences are made by combining atomic sentences using connectives.
First-order logic statements can be divided into two parts: a subject and a predicate.
Consider the statement "x is an integer": the first part, x, is the subject of the
statement, and the second part, "is an integer," is the predicate.
Quantifiers in First-order logic:
Universal Quantifier:
The universal quantifier is a symbol of logical representation which specifies that the
statement within its range is true for everything, or for every instance, of a particular
thing. It is denoted by the symbol ∀. If x is a variable, then ∀x is read as:
o For all x
o For each x
o For every x
Example: "All men drink coffee."
This can be written as ∀x man(x) → drink(x, coffee), read as: for all x, if x is a man,
then x drinks coffee.
Existential Quantifier:
Existential quantifiers express that the statement within their scope is true for at
least one instance of something.
It is denoted by the logical operator ∃, which resembles an inverted E. When it is used
with a predicate variable, it is called an existential quantifier.
If x is a variable, then the existential quantifier is written ∃x or ∃(x), and it is read as:
o There exists an x
o For some x
o For at least one x
Example: "Some boys are intelligent."
This can be written as ∃x boys(x) ∧ intelligent(x), read as: there are some x where x is
a boy who is intelligent.
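Over a finite domain, the two quantifiers behave like Python's `all()` and `any()`. The domain of boys and the `intelligent` predicate below are illustrative assumptions:

```python
# Quantifier semantics over a finite domain. The domain and the
# truth values assigned by the predicate are illustrative only.
domain = ["ram", "shyam", "mohan"]
intelligent = {"ram": True, "shyam": False, "mohan": True}

# ∃x boys(x) ∧ intelligent(x): at least one boy is intelligent.
exists = any(intelligent[x] for x in domain)

# ∀x boys(x) → intelligent(x): every boy is intelligent.
forall = all(intelligent[x] for x in domain)

print(exists, forall)
```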
Points to remember:
o The main connective with the universal quantifier ∀ is implication (→).
o The main connective with the existential quantifier ∃ is conjunction (∧).
Properties of Quantifiers:
o In universal quantifiers, ∀x∀y is similar to ∀y∀x.
o In existential quantifiers, ∃x∃y is similar to ∃y∃x.
o ∃x∀y is not similar to ∀y∃x.
The quantifiers interact with variables which appear in a suitable way. There are two types of
variables in First-order logic which are given below:
Free Variable: A variable is said to be a free variable in a formula if it occurs outside the scope of
the quantifier.
Example: ∀x ∃y [P(x, y, z)], where z is a free variable.
Bound Variable: A variable is said to be a bound variable in a formula if it occurs within the scope
of the quantifier.
Substitution:
Substitution is a fundamental operation performed on terms and formulas, and it occurs
in all inference systems of first-order logic. The notation F[a/x] means that the
constant a is substituted for the variable x in formula F.
[ Note: first-order logic can convey facts about some or all of the
universe's objects. ]
Equality:
In first-order logic, atomic sentences are formed not only through the use of
predicates and terms, but also through the application of equality. The
equality symbol indicates that two terms refer to the same object.
Example: Brother(John) = Smith.
Here the object referred to by Brother(John) is the same as the object referred
to by Smith. The equality symbol can also be combined with negation to state
that two terms are not the same object, e.g., ¬(x = y).
Inference rules for quantifiers:
1. Universal Generalization
2. Universal Instantiation
3. Existential Instantiation
4. Existential Introduction
1. Universal Generalization:
Universal generalization states that if P(c) holds for an arbitrary element c of the
universe of discourse, then we may conclude ∀x P(x).
Example: Let P(c) be "A byte contains 8 bits." If this is true for an arbitrary
byte c, then "All bytes contain 8 bits," i.e., ∀x P(x), is also true.
2. Universal Instantiation:
Universal instantiation states that from ∀x P(x) we can infer P(c) for any constant c
in the universe of discourse.
3. Existential Instantiation:
Existential instantiation states that from ∃x P(x) we can infer P(c) for a new constant
c that does not appear elsewhere in the knowledge base.
4. Existential Introduction:
Existential introduction states that from P(c) for some constant c we can infer ∃x P(x).
Generalized Modus Ponens states that for atomic sentences pi, pi′, and q, where
there is a substitution θ such that SUBST(θ, pi′) = SUBST(θ, pi), we can infer
SUBST(θ, q) from the implication p1 ∧ p2 ∧ ... ∧ pn → q together with
p1′, ..., pn′.
Example: From the rule King(x) ∧ Greedy(x) → Evil(x), if we can find
some x such that x is a king and x is greedy, we can infer that x is
evil.
Unification
Unification is the process of finding a substitution θ that makes two different
logical expressions look identical: Unify(Ψ1, Ψ2) returns θ such that
SUBST(θ, Ψ1) = SUBST(θ, Ψ2), or failure if no such substitution exists.
Examples:
o Unify(p(X, g(Y)), p(f(a), g(b))) succeeds with θ = {f(a)/X, b/Y}, making both
expressions p(f(a), g(b)).
o Unify(p(f(a), g(Y)), p(f(a), f(a))) fails, because g(Y) and f(a) use different
function symbols.
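A compact implementation of the unification algorithm can be sketched as follows. The term encoding (tuples for compound terms, uppercase strings for variables) and the omission of the occurs-check are simplifying assumptions of this sketch:

```python
# Terms: constants are lowercase strings, variables are uppercase strings,
# compound terms are tuples like ('f', 'a') for f(a).
def is_var(t):
    return isinstance(t, str) and t[0].isupper()

def substitute(t, theta):
    """Apply substitution theta to term t, chasing variable bindings."""
    if is_var(t):
        return substitute(theta[t], theta) if t in theta else t
    if isinstance(t, tuple):
        return tuple(substitute(a, theta) for a in t)
    return t

def unify(x, y, theta=None):
    """Return a substitution unifying x and y, or None on failure."""
    if theta is None:
        theta = {}
    x, y = substitute(x, theta), substitute(y, theta)
    if x == y:
        return theta
    if is_var(x):
        theta[x] = y            # occurs-check omitted for brevity
        return theta
    if is_var(y):
        theta[y] = x
        return theta
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for a, b in zip(x, y):  # unify argument lists pairwise
            theta = unify(a, b, theta)
            if theta is None:
                return None
        return theta
    return None                 # failure: symbols or arities differ

# Unify p(X, g(Y)) with p(f(a), g(b)):
print(unify(('p', 'X', ('g', 'Y')), ('p', ('f', 'a'), ('g', 'b'))))
```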
Resolution in FOL:
The resolution rule for first-order logic is a lifted version of the
propositional resolution rule. Resolution can resolve two clauses if they
contain complementary literals; the clauses are assumed to be standardized
apart so that they share no variables.
The complementary literals are unified with a unifier, for example
θ = [a/f(x), b/x], and resolution then produces a resolvent clause consisting
of the remaining literals with θ applied.
Example:
In this stage, we use a resolution tree and substitution to solve the
problem: pairs of clauses containing complementary literals are resolved
repeatedly until the empty clause is derived, which proves the negated goal
false and hence the original goal true.
Procedural Knowledge:
Procedural knowledge, also known as imperative knowledge, describes how a
particular thing can be accomplished. It emphasizes how to do something to
solve a given problem.
Let's see it with an example:
var a=[1, 2, 3, 4, 5];
var b=[];
for(var i=0; i<a.length; i++)
{
b.push(a[i]);
}
console.log(b);
Output is:
[1, 2, 3, 4, 5]
Declarative Knowledge:
Declarative knowledge, also known as descriptive knowledge, expresses basic
facts about something. It emphasizes what is known about a problem rather
than how to solve it.
Let's see it with an example:
var a=[1, 2, 3, 4, 5];
var b=a.map(function(number)
{
return number*1;
});
console.log(b);
Output is:
[1, 2, 3, 4, 5]
In short: procedural knowledge describes how a particular thing can be
accomplished, while declarative knowledge describes basic facts about
something.
Inference engine:
The inference engine is the component of an intelligent system in artificial
intelligence which applies logical rules to the knowledge base to infer new
information from known facts. The first inference engines were components of
expert systems. An inference engine commonly proceeds in two modes:
1. Forward chaining
2. Backward chaining
Horn clauses and definite clauses are forms of sentences that enable the
knowledge base to use a more restricted and efficient inference algorithm.
Logical inference algorithms use forward and backward chaining approaches,
which require the KB to be in the form of first-order definite clauses.
A definite clause is a disjunction of literals of which exactly one is
positive; a Horn clause is a disjunction of literals of which at most one is
positive. Hence all definite clauses are Horn clauses.
Example: (¬p ∨ ¬q ∨ k) has only one positive literal, k. It is equivalent
to p ∧ q → k.
Forward Chaining
Forward chaining is also known as forward deduction or forward reasoning when
using an inference engine. It is a form of reasoning which starts with the
atomic sentences in the knowledge base and applies inference rules (Modus
Ponens) in the forward direction to extract more data until a goal is reached.
The forward-chaining algorithm starts from known facts, triggers all rules
whose premises are satisfied, and adds their conclusions to the known facts.
This process repeats until the problem is solved.
Properties of Forward-Chaining:
o It is a down-up approach, moving from the bottom (facts) to the top (goal).
o It is a data-driven inference technique: we reach the goal using the
available data.
o It is commonly used in expert systems and in production-rule systems.
Example:
"As per the law, it is a crime for an American to sell weapons to hostile
nations. Country A, an enemy of America, has some missiles, and all
the missiles were sold to it by Robert, who is an American citizen."
Prove that "Robert is criminal." We first convert these facts into
first-order definite clauses:
1. American(p) ∧ Weapon(q) ∧ Sells(p, q, r) ∧ Hostile(r) → Criminal(p)
2. Owns(A, T1)   (Country A has some missiles; T1 names a missile owned by A)
3. Missile(T1)
4. Missile(p) ∧ Owns(A, p) → Sells(Robert, p, A)   (all of A's missiles were sold to it by Robert)
5. Missile(p) → Weapon(p)   (missiles are weapons)
6. Enemy(p, America) → Hostile(p)   (an enemy of America counts as hostile)
7. American(Robert)
8. Enemy(A, America)
Step-1:
In the first step, we start with the known facts and choose the sentences
which do not have implications, such as American(Robert), Enemy(A, America),
Owns(A, T1), and Missile(T1).
Step-2:
In the second step, we see which new facts can be inferred from the available
facts with satisfied premises. Rule-(1) does not yet have its premises
satisfied, so it is not fired in this iteration. Rule-(4) fires with {p/T1},
adding Sells(Robert, T1, A); Rule-(5) fires with {p/T1}, adding Weapon(T1);
and Rule-(6) fires with {p/A}, adding Hostile(A).
Step-3:
In the third step, Rule-(1) is satisfied with the substitution
{p/Robert, q/T1, r/A}, so Criminal(Robert) is inferred and the goal is
reached.
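The forward-chaining steps above can be sketched in Python. To keep the sketch short, the rules are pre-instantiated with the constants Robert, T1, and A rather than unified at run time (a simplifying assumption of this sketch):

```python
# Definite clauses encoded as (set of premises, conclusion) pairs,
# pre-instantiated with constants so the sketch stays propositional.
rules = [
    ({"American(Robert)", "Weapon(T1)", "Sells(Robert,T1,A)", "Hostile(A)"},
     "Criminal(Robert)"),
    ({"Missile(T1)", "Owns(A,T1)"}, "Sells(Robert,T1,A)"),
    ({"Missile(T1)"}, "Weapon(T1)"),
    ({"Enemy(A,America)"}, "Hostile(A)"),
]
facts = {"American(Robert)", "Missile(T1)", "Owns(A,T1)", "Enemy(A,America)"}

def forward_chain(rules, facts):
    """Fire every rule whose premises are satisfied until nothing changes."""
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)   # add the new conclusion as a fact
                changed = True
    return facts

print("Criminal(Robert)" in forward_chain(rules, set(facts)))
```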
B. Backward Chaining:
Backward-chaining is also known as a backward deduction or backward
reasoning method when using an inference engine. A backward chaining
algorithm is a form of reasoning, which starts with the goal and works
backward, chaining through rules to find known facts that support the goal.
Backward-Chaining proof:
In Backward chaining, we will start with our goal predicate, which
is Criminal(Robert), and then infer further rules.
Step-1:
In the first step, we take the goal fact, infer other facts from it, and
finally prove those facts true. Our goal fact is "Robert is criminal,"
represented by the predicate Criminal(Robert).
Step-2:
In the second step, we infer other facts from the goal fact which satisfy the
rules. As we can see in Rule-(1), the goal predicate Criminal(Robert) is
present with substitution {Robert/p}, so we add all the conjunctive premises
below the first level and replace p with Robert.
Step-3:
In step-3, we extract the further fact Missile(q), which is inferred from
Weapon(q), as it satisfies Rule-(5). Weapon(q) is also true with the
substitution of the constant T1 for q.
Step-4:
In step-4, we can infer the facts Missile(T1) and Owns(A, T1) from
Sells(Robert, T1, r), which satisfies Rule-(4), with the substitution of A in
place of r. So these two statements are proved here.
Step-5:
In step-5, we can infer the fact Enemy(A, America) from Hostile(A), which
satisfies Rule-(6). Hence all the statements are proved true using backward
chaining.
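The backward-chaining proof can be sketched similarly; the rules below are pre-instantiated with the constants Robert, T1, and A (a simplifying assumption of this sketch):

```python
# The same propositional encoding of the crime example as
# (set of premises, conclusion) pairs.
rules = [
    ({"American(Robert)", "Weapon(T1)", "Sells(Robert,T1,A)", "Hostile(A)"},
     "Criminal(Robert)"),
    ({"Missile(T1)", "Owns(A,T1)"}, "Sells(Robert,T1,A)"),
    ({"Missile(T1)"}, "Weapon(T1)"),
    ({"Enemy(A,America)"}, "Hostile(A)"),
]
facts = {"American(Robert)", "Missile(T1)", "Owns(A,T1)", "Enemy(A,America)"}

def backward_chain(goal, rules, facts, seen=frozenset()):
    """Prove the goal by recursively proving the premises of matching rules."""
    if goal in facts:
        return True                   # a known fact supports the goal
    if goal in seen:
        return False                  # avoid cycling on the same goal
    for premises, conclusion in rules:
        if conclusion == goal and all(
            backward_chain(p, rules, facts, seen | {goal}) for p in premises
        ):
            return True               # every premise of this rule was proved
    return False

print(backward_chain("Criminal(Robert)", rules, facts))
```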
Difference between backward chaining
and forward chaining
Following are the differences between forward chaining and backward chaining:
o Forward chaining starts from known facts and moves toward the goal, while
backward chaining starts from the goal and works back toward known facts.
o Forward chaining is a bottom-up, data-driven technique; backward chaining is
a top-down, goal-driven technique.
o Forward chaining may derive many facts that are irrelevant to the goal,
whereas backward chaining explores only facts relevant to the goal.
o Forward chaining is used in production-rule and expert systems; backward
chaining is used in goal-driven systems such as Prolog.