Artificial Intelligence Knowledge & Reasoning
Knowledge Representation
Reference: Sections 10.1, 10.2, 10.6 & 10.7 of the R&N textbook
How do we represent aspects of the real world?
What content to put in the Knowledge Base?
By Bal Krishna Subedi
Knowledge Representation
There are various types of knowledge that need to be represented in
a computer:
- declarative and procedural knowledge,
- commonsense knowledge, scientific knowledge, mathematical knowledge.
There is always a relationship between the form in which
knowledge is represented and the way in which the knowledge is
used.
Domain-specific knowledge is often necessary for many AI tasks; logic-based representation is the foundation of various kinds of knowledge representation.
What is a representation?
A representation consists of two sets and a mapping between them. The elements
of each set are objects, relations, classes, laws, actions. The first set is called the
represented domain, and the second one is called the representation domain.
This mapping allows the agent to reason about the represented domain by
performing reasoning processes in the representation domain, and transferring the
conclusions back into the represented domain.
Example (figure): in the represented domain we have real objects (a cup on top of a book, which is on top of a table) and the law "If an object is on top of another object that is itself on top of a third object, then the first object is on top of the third object." The representation domain contains an ontology (the class OBJECT with SUBCLASS-OF links to CUP, BOOK, and TABLE), the instances CUP1, BOOK1, and TABLE1 (connected by INSTANCE-OF links) together with the facts (ON CUP1 BOOK1) and (ON BOOK1 TABLE1), and the rule
∀x,y,z ∈ OBJECT: (ON x y) ∧ (ON y z) → (ON x z)
The 'represents' mapping connects the represented domain to the representation domain.
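As a minimal sketch (not from the slides), the following shows how an agent could apply the rule above in the representation domain and transfer the conclusion back to the represented domain; the fact layout and function name are illustrative.

```python
# Minimal sketch: reasoning in the representation domain with the ON rule.
# The facts mirror the figure; the closure function is an illustrative choice.

facts = {("CUP1", "BOOK1"), ("BOOK1", "TABLE1")}   # (ON x y) facts

def on_closure(facts):
    """Apply (ON x y) & (ON y z) -> (ON x z) until no new fact is derived."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for (x, y1) in list(derived):
            for (y2, z) in list(derived):
                if y1 == y2 and (x, z) not in derived:
                    derived.add((x, z))
                    changed = True
    return derived

print(on_closure(facts))
# Derives (ON CUP1 TABLE1); transferred back to the represented domain,
# this is the conclusion that the cup is on top of the table.
```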
What is an ontology?
Every knowledge-based agent has a conceptualization or a model of
its world which consists of representations of the objects, concepts,
and other entities that are assumed to exist, and the relationships that
hold among them.
An ontology is a specification of the terms that are used to represent
the agent’s world.
An ontology defines the terminology about the objects and their relationships in a systematic way
– closely related to taxonomies and classifications
• ontologies do not have to be hierarchical
• the emphasis is on the terms used to describe objects and relationships, not on the properties of objects or the specific relationships between particular objects
Why ontological representation?
Enables an agent to communicate with other agents,
because they share a common vocabulary (terms) which
they both understand.
Enables knowledge sharing and reuse among agents.
– Ontological commitment:
Agreement among several agents to use a shared vocabulary in a
coherent and consistent manner.
Object ontology
We define an object ontology as a hierarchical description of the objects in the domain, specifying their properties and relationships.
In the military domain the object ontology will include descriptions of military
units and of military equipment. These descriptions are most likely needed in
almost any specific military application.
Because building the object ontology is a very complex task, it makes sense to
reuse these descriptions when developing a knowledge base for another military
application, rather than starting from scratch.
It includes both descriptions of types of objects (called concepts)
and descriptions of specific objects (called instances).
Instances, concepts and generalization
A concept is a representation of a set of instances.
Example (figure): the concept BIRD represents the set of all birds (which includes CANARY and OSTRICH); the nodes CANARY and OSTRICH are linked to BIRD by INSTANCE-OF, and the node OSTRICH represents the particular entity called OSTRICH.
An instance is a representation of a particular entity in
the application domain.
Generalization
• Generalization is a fundamental relation between concepts.
• Intuitively, a concept P is said to be more general than (or
a generalization of) another concept Q if and only if the set
of instances represented by P includes the set of instances
represented by Q.
Example (figure): the set of instances represented by ANIMAL includes those represented by BIRD and FISH, and the set represented by BIRD includes those represented by CANARY and OSTRICH; so ANIMAL is more general than BIRD and FISH, and BIRD is more general than CANARY and OSTRICH.
Possible relationships between two concepts P and Q:
- P is more general than Q
- Q is more general than P
- there is no generalization relationship between P and Q
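A minimal sketch in Python (not from the slides) of deciding this relationship from the instance sets of two concepts; the example sets are illustrative:

```python
# Minimal sketch: deciding the generalization relationship between concepts
# P and Q from the sets of instances they represent.

def generalization(p_instances, q_instances):
    p, q = set(p_instances), set(q_instances)
    if q <= p:
        return "P is more general than (or equal to) Q"
    if p <= q:
        return "Q is more general than (or equal to) P"
    return "no generalization relationship between P and Q"

bird = {"canary", "ostrich", "wren"}
canary = {"canary"}
fish = {"salmon", "trout"}

print(generalization(bird, canary))  # P is more general than (or equal to) Q
print(generalization(bird, fish))    # no generalization relationship between P and Q
```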
Basic representation unit
concept_k ISA concept_i
FEATURE_1 value_1
...
FEATURE_n value_n
This is a necessary definition of 'concept_k'.
It defines 'concept_k' as being a subconcept of 'concept_i' and having additional features.
This means that if 'concept_i' represents the set C_i of instances, then 'concept_k' represents a subset C_k of C_i.
The elements of C_k have the features 'FEATURE_1', ..., 'FEATURE_n' with the values 'value_1', ..., 'value_n', respectively.
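As a minimal sketch (the dictionary layout and concept names are illustrative, not part of the slides), such a unit can be stored as a concept name mapped to its superconcept and its additional features:

```python
# Minimal sketch: basic representation units stored as concept -> (ISA, features).

concepts = {
    "BIRD":    {"ISA": "ANIMAL", "covering": "feathers", "reproduction": "lays-eggs"},
    "OSTRICH": {"ISA": "BIRD",   "size": "large"},
}

# OSTRICH is defined as a subconcept of BIRD with additional features, so the
# set of OSTRICH instances is a subset of the set of BIRD instances.
```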
General features of a representation
• Representational adequacy
The ability to represent all of the kinds of knowledge that are needed in a
certain domain.
• Inferential adequacy
The ability to represent all of the kinds of inferential procedures (procedures
that manipulate the representational structures in such a way as to derive new
structures corresponding to new knowledge inferred from old).
• Inferential efficiency
The ability to represent efficient inference procedures (for instance, by
incorporating into the knowledge structure additional information that can be
used to focus the attention of the inference mechanisms in the most promising
directions).
• Acquisitional efficiency
The ability to acquire new information easily.
Representing knowledge in semantic networks
The underlying idea of semantic networks is to represent knowledge
in the form of a graph in which the nodes represent objects, situations,
or events, and the arcs represent the relationships between them.
Example (figure): Persons is a SubsetOf Mammals and has the property Legs = 2; Female and Male are SubsetOf Persons; Mary is a MemberOf Female and John is a MemberOf Male; Mary HasHeight 62 and is a FriendOf John.
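A minimal sketch of this network in Python (the triple representation is an assumption, not something the slides prescribe):

```python
# Minimal sketch: a semantic network stored as (node, relation, node) triples.

network = [
    ("Persons", "SubsetOf",  "Mammals"),
    ("Female",  "SubsetOf",  "Persons"),
    ("Male",    "SubsetOf",  "Persons"),
    ("Mary",    "MemberOf",  "Female"),
    ("John",    "MemberOf",  "Male"),
    ("Persons", "Legs",      2),
    ("Mary",    "HasHeight", 62),
    ("Mary",    "FriendOf",  "John"),
]

def targets(node, relation):
    """Follow arcs labelled `relation` leaving `node`."""
    return [t for (s, r, t) in network if s == node and r == relation]

print(targets("Mary", "FriendOf"))    # ['John']
print(targets("Female", "SubsetOf"))  # ['Persons']
```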
A second example (figure): a semantic network representing knowledge about our solar system. The Sun has color yellow, mass MSun, and temperature TSun (whose value is 3500); the Earth has mass MEarth and temperature TEarth; the Sun attracts the Earth and the Earth revolves around the Sun; MSun is greater-than MEarth, and TSun is greater-than TEarth.
Reasoning with semantic networks
The transitivity of the ISA and INSTANCE-OF relations
The relationships between ISA and INSTANCE-OF:
∀x ∀y ∀z  INSTANCE-OF(x, y) ∧ ISA(y, z) → INSTANCE-OF(x, z)
that is, if x is an instance of y and y is a subconcept of z, then x is an instance of z.
∀x ∀y ∀z  ISA(x, y) ∧ ISA(y, z) → ISA(x, z)
i.e., if x is a subconcept of y and y is a subconcept of z, then x is a subconcept of z.
Example (figure): Clyde is linked by instance-of to robin, robin by isa to bird, and bird by isa to animal. Hence Clyde is an instance of robin, an instance of bird, and an instance of animal; robin is a subconcept (subclass) of bird, and also a subconcept of animal.
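A minimal sketch of this reasoning (the dictionaries and function names are illustrative):

```python
# Minimal sketch: exploiting the transitivity of ISA and INSTANCE-OF.

isa = {"robin": "bird", "bird": "animal"}   # ISA(subconcept, superconcept)
instance_of = {"Clyde": "robin"}            # INSTANCE-OF(instance, concept)

def superconcepts(concept):
    """All concepts reachable from `concept` by following ISA links."""
    result = []
    while concept in isa:
        concept = isa[concept]
        result.append(concept)
    return result

def is_instance_of(instance, concept):
    """INSTANCE-OF(x, y) & ISA(y, z) -> INSTANCE-OF(x, z), applied transitively."""
    direct = instance_of.get(instance)
    return direct is not None and (concept == direct or concept in superconcepts(direct))

print(superconcepts("robin"))             # ['bird', 'animal']
print(is_instance_of("Clyde", "animal"))  # True
```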
Reasoning with semantic networks
Inheritance
One kind of knowledge that is implicitly represented in a semantic network is the inheritance of properties from a more general concept to a less general one.
For example:
ISA(ACETONE, SOLVENT) ∧ REMOVES(SOLVENT, SURPLUS-ADHESIVE) → REMOVES(ACETONE, SURPLUS-ADHESIVE)
i.e. the fact that SOLVENT REMOVES SURPLUS-ADHESIVE and
ACETONE is a SOLVENT implies that ACETONE REMOVES SURPLUS-
ADHESIVE.
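A minimal sketch of this inference as property lookup up the ISA links (the data layout is an assumption; the names follow the example):

```python
# Minimal sketch: inheriting a property from a more general concept.

isa = {"ACETONE": "SOLVENT"}
properties = {"SOLVENT": {"REMOVES": "SURPLUS-ADHESIVE"}}

def get_property(concept, prop):
    """Look for `prop` locally, then climb the ISA links until it is found."""
    while concept is not None:
        if prop in properties.get(concept, {}):
            return properties[concept][prop]
        concept = isa.get(concept)
    return None

print(get_property("ACETONE", "REMOVES"))  # SURPLUS-ADHESIVE, inherited from SOLVENT
```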
Reasoning with semantic networks
Multiple inheritance
An object (instance or concept) may inherit properties from several
super-concepts.
How can we deal with the inheritance of contradictory properties?
In such a case, the system would use some strategy in selecting one of the values.
The simplest strategy is to use the first value found.
A better solution is to detect such conflicts when the semantic network is built or
updated, and to directly associate the correct property value with each node that
would inherit conflicting values.
Example (figure): Opus is an instance-of penguin (whose habitat is the south pole) and also an instance-of cartoon character (whose habitat is the funny papers), so Opus would inherit two conflicting values for habitat.
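A minimal sketch of the simplest strategy (use the first value found, searching the parents in order); the data layout is illustrative:

```python
# Minimal sketch: multiple inheritance resolved by "first value found".

parents = {"Opus": ["penguin", "cartoon-character"]}
properties = {"penguin": {"habitat": "south pole"},
              "cartoon-character": {"habitat": "funny papers"}}

def get_property(node, prop):
    """Return the node's own value, or the first value found among its parents."""
    if prop in properties.get(node, {}):
        return properties[node][prop]
    for parent in parents.get(node, []):
        value = get_property(parent, prop)
        if value is not None:
            return value
    return None

print(get_property("Opus", "habitat"))  # 'south pole' (penguin is listed first)
```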
Reasoning with semantic networks
Default Inheritance
Properties associated with concepts in a hierarchy are assumed to be true of all
subclasses and instances.
How can we deal with exceptions (i.e. sub-concepts or instances that do not
have the inherited property)?
Explicitly override the inherited property.
Example (figure): mammal and bird are subclasses of animal; bird carries the default property fly = true; woodpecker, wren, and ostrich are subclasses of bird, and ostrich explicitly overrides the default with fly = false.
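A minimal sketch: because the lookup checks a node's own properties before climbing the hierarchy, an explicitly stored exception overrides the inherited default (names follow the figure; the layout is illustrative):

```python
# Minimal sketch: default inheritance with an explicit exception.

isa = {"bird": "animal", "wren": "bird", "woodpecker": "bird", "ostrich": "bird"}
properties = {"bird": {"fly": True},       # default for all birds
              "ostrich": {"fly": False}}   # explicit override

def get_property(node, prop):
    """Local values are found first, so exceptions override inherited defaults."""
    while node is not None:
        if prop in properties.get(node, {}):
            return properties[node][prop]
        node = isa.get(node)
    return None

print(get_property("wren", "fly"))     # True  (inherited default)
print(get_property("ostrich", "fly"))  # False (explicit exception)
```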
Reasoning with semantic networks
Network matching
A network fragment is constructed, representing a sought-for object or a
query, and then matched against the network database to see if such an
object exists.
Network matching allows us to ask questions about objects in the network.
Variable nodes in the fragment are bound during the matching process to the values with which they match.
Example (figure): the network contains the fact John -height-> 72. The question "What is the height of John?" is represented by the fragment John -height-> x, where x is a variable node; matching binds x to 72, so the answer is "The height of John is 72."
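A minimal sketch of fragment matching with variable binding over a triple-based network (the Var marker and the simplified matcher are assumptions, not from the slides):

```python
# Minimal sketch: matching a query fragment containing variables against a
# network of (source, relation, target) triples.

network = [("John", "height", 72), ("John", "FriendOf", "Mary")]

class Var:
    """Marks a variable node in a query fragment."""
    def __init__(self, name):
        self.name = name

def match(fragment, network):
    """Return a binding of variable names to values, or None if no match.
    Simplified: bindings are not checked for consistency across triples."""
    bindings = {}
    for (s, r, t) in fragment:
        for (ns, nr, nt) in network:
            if r == nr and (isinstance(s, Var) or s == ns) \
                       and (isinstance(t, Var) or t == nt):
                if isinstance(s, Var):
                    bindings[s.name] = ns
                if isinstance(t, Var):
                    bindings[t.name] = nt
                break
        else:
            return None  # this part of the fragment cannot be matched
    return bindings

print(match([("John", "height", Var("x"))], network))  # {'x': 72}
```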
Reasoning with semantic networks
Network matching: Example
Network database (figure): Clyde is an instance-of robin and robin isa bird; own1 is an instance-of ownership, with an owner link to Clyde and an ownee link to nest-1; nest-1 is an instance-of nest.
Query: "What does Clyde own?"
We might construct a network fragment that represents an instance of ownership in which Clyde is the owner. This fragment is then matched against the network database, looking for an own node that has an owner link to Clyde.
When it is found, the node that the ownee link points to is bound in the partial match and is the answer to the question.
If no match had been found, the answer would have been "Clyde does not own anything".
Reasoning with semantic networks
Network matching: Example
Fragment (figure): own-? is an instance-of ownership, with an owner link to Clyde and an ownee link to the variable node ?.
This is a semantic network fragment representing the question "What does Clyde own?".
Fragment (figure): bird-? is an instance-of bird, own-? is an instance-of ownership with an owner link to bird-? and an ownee link to nest-?, and nest-? is an instance-of nest.
This is a semantic network fragment representing the question "Is there a bird who owns a nest?".
Here, the bird-?, nest-?, and own-? nodes represent the yet-to-be-determined bird-owning-nest relation. The answer to the question would be "Yes, Clyde".
Conceptual Graphs
A conceptual graph is a finite, connected, bipartite graph.
The nodes of the graph are either concepts or conceptual relations (there are no labeled arcs).
Conceptual relation nodes indicate a relation involving one or more concepts.
A relation of arity n is represented by a conceptual relation node having n arcs.
Concept types and individual labels are separated by a colon (e.g. Person: Mary).
The marker # is used to refer to an unnamed individual (e.g. dog: #1352).
Graphs may be arbitrarily complex but must be finite.
A typical knowledge base contains a number of such graphs.
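A minimal sketch of this structure in Python (the classes are illustrative, not a standard conceptual-graph API):

```python
# Minimal sketch: a conceptual graph as concept nodes and relation nodes.

class Concept:
    def __init__(self, type_, referent=None):  # referent: 'Mary', '#1352', '*', '*X', ...
        self.type = type_
        self.referent = referent

class Relation:
    def __init__(self, name, *concepts):
        self.name = name          # e.g. 'agent', 'object', 'color'
        self.arcs = concepts      # the n concepts related (arity = number of arcs)

# "Mary gave John the book"
mary, john = Concept("Person", "Mary"), Concept("Person", "John")
give, book = Concept("give"), Concept("book")
graph = [Relation("agent", give, mary),
         Relation("object", give, book),
         Relation("recipient", give, john)]
```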
Conceptual Graphs
Proposition: “A dog has a color of brown”
[dog] -> (color) -> [brown]
Proposition: “Mary gave John the book”
the concept [give] is linked by (agent) to [Person: Mary], by (object) to [book], and by (recipient) to [Person: John]
Proposition: “A dog named emma is brown”
[dog: #1352] -> (color) -> [brown], and [dog: #1352] -> (name) -> ["emma"]
Conceptual Graphs
We can use the generic marker * to indicate an unspecified individual: [dog] is equivalent to [dog: *].
The generic marker also allows the use of named variables, e.g. [dog: *X].
Proposition: “The dog scratches its ear with its paw”
the concept [scratch] is linked by (agent) to [dog: *X], by (object) to [ear], and by (instrument) to [paw]; both [ear] and [paw] are linked by (part) to [dog: *X].
Conceptual Graphs & Logic
Conjunction – easy to represent in a conceptual graph.
Negation – represented by using a unary “neg” operation.
Disjunction – using negation and conjunction we may form graphs that represent disjunction.
In conceptual graphs, generic concepts are assumed to be existentially quantified.
Using negation and existential quantification we can represent universal quantification.
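For instance (a standard first-order equivalence, not stated on the slide), universal quantification can be encoded with negation and an existential quantifier:
∀X p(X) ≡ ¬∃X ¬p(X)
so a universally quantified statement becomes a negated graph that contains another negated graph.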
Conceptual Graphs & Logic
Proposition: “There are no pink dogs”
the “neg” relation is applied to the graph [dog] -> (color) -> [pink]
This graph corresponds to the logical expression
¬∃X ∃Y (dog(X) ∧ color(X, Y) ∧ pink(Y)),
which is equivalent to
∀X ∀Y (¬(dog(X) ∧ color(X, Y) ∧ pink(Y))).
Homework
1. Exercises 10.5, 10.7, 10.22 & 10.23 from the textbook.
2. Represent the following sentences in a semantic network.
Birds are animals.
Birds have feathers, fly, and lay eggs.
Albatross is a bird.
Donald is a bird.
Tracy is an albatross.
3. Consider the following network fragment (figure): Pompeian isa Roman, and man isa person; Marcus is an instance-of Pompeian and an instance-of man; Caesar is an instance-of ruler; Marcus has height 72 and is linked to Caesar by tryassassinate.
Explain how a semantic network system would answer the questions:
What is the height of Marcus?
Is there a person who tried to assassinate Caesar?
Homework
4. Translate each of the following sentences into predicate calculus and conceptual graphs:
“Jane gave Tom an ice cream cone”
“Basketball players are tall”
“Paul cut down the tree with an axe”
“Place all the ingredients in a bowl and mix thoroughly”
5. Translate the following graph into predicate calculus:
the concept [eat] is linked by (agent) to [Person: John], by (object) to [soup], and by (instrument) to [hand]; [hand] is linked by (part) to [Person: John].