
IV VII
20CSPC702 / CS8082
MACHINE LEARNING TECHNIQUES

UNIT-V
ADVANCED LEARNING

5.2.1 First Order Rules

Learning Rule Sets

● Design Space Dimensions
● Rule Performance Measures
● Learning First-order Rules (First-order Horn Clauses and Terminologies)

Design Space Dimensions

(i) Sequential vs. Simultaneous:

● Sequential covering algorithms learn one rule at a time, remove the examples that rule covers, and repeat.
● Decision tree learners can be seen as simultaneous covering algorithms: each node test is shared by all of the rules (paths) that pass through it.
● Sequential covering algorithms perform n·k primitive search steps to learn n rules each containing k attribute-value tests. A complete binary decision tree with n paths (i.e., rules) makes only (n−1) primitive search steps, one per internal node.
● Because sequential covering makes many more independent choices, it must be supported by more training data, but it has the advantage of allowing each rule to contain different attribute tests.
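The sequential covering loop described above can be sketched in Python. This is a minimal illustration, not any specific published algorithm: the rule representation (a dict of attribute-value tests) and the greedy `learn_one_rule` search are assumptions chosen for brevity.

```python
# Minimal sketch of sequential covering. Examples are (features, label)
# pairs; a rule is a dict of attribute-value tests, all of which must match.

def rule_covers(rule, features):
    """A rule covers an example if every attribute test matches."""
    return all(features.get(attr) == val for attr, val in rule.items())

def accuracy(covered, target_label):
    """Fraction of covered examples that have the target label."""
    if not covered:
        return 0.0
    return sum(1 for _, lbl in covered if lbl == target_label) / len(covered)

def learn_one_rule(examples, target_label):
    """Hypothetical single-rule learner: greedily add the attribute-value
    test that most improves accuracy (a crude general-to-specific search)."""
    rule = {}  # maximally general rule: covers every example
    candidates = {(a, v) for feats, _ in examples for a, v in feats.items()}
    while True:
        covered = [(f, l) for f, l in examples if rule_covers(rule, f)]
        if all(l == target_label for _, l in covered):
            return rule
        best, best_acc = None, accuracy(covered, target_label)
        for attr, val in candidates:
            if attr in rule:
                continue
            trial = dict(rule, **{attr: val})
            trial_cov = [(f, l) for f, l in covered if rule_covers(trial, f)]
            if trial_cov and accuracy(trial_cov, target_label) > best_acc:
                best, best_acc = (attr, val), accuracy(trial_cov, target_label)
        if best is None:
            return rule  # cannot improve further
        rule[best[0]] = best[1]

def sequential_covering(examples, target_label):
    """Learn one rule at a time, remove the covered examples, repeat."""
    rules, remaining = [], list(examples)
    while any(lbl == target_label for _, lbl in remaining):
        rule = learn_one_rule(remaining, target_label)
        covered = [(f, l) for f, l in remaining if rule_covers(rule, f)]
        if not covered:
            break
        rules.append(rule)
        remaining = [(f, l) for f, l in remaining if not rule_covers(rule, f)]
    return rules
```

Note how each learned rule may test different attributes, which is exactly the flexibility the slide contrasts with decision trees.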

Design Space Dimensions

(ii) General-to-Specific vs. Specific-to-General:

● General-to-specific search starts at the single maximally general hypothesis.
● In specific-to-general search there are many maximally specific hypotheses (the training examples themselves).
● Golem chooses several positive examples at random and keeps the best hypothesis learned from them.

Design Space Dimensions

(iii) Generate-then-test or Example-driven:

● In generate-then-test search, each hypothesis's performance is evaluated over many training examples, so the effect of noisy data is minimized.

(iv) Post-pruning:

● In either approach, post-pruning can be used to increase the effectiveness of the rules, measured on a validation set.

Rule Performance Measures

● Relative frequency (used by AQ):

relative frequency = nc / n

n – number of examples the rule matches
nc – number of those examples that it classifies correctly

Rule Performance Measures

● m-estimate of accuracy (used by CN2):

m-estimate = (nc + m·p) / (n + m)

p – prior probability that a randomly drawn example from the entire data set will have the classification assigned by the rule
m – weight, or equivalent number of examples, for weighting this prior p

Rule Performance Measures

● Entropy (used by CN2):

Entropy(S) = − Σ (i = 1 to c) pi log2 pi

c – number of distinct values the target function may take on
pi – proportion of examples from S for which the target function takes on the i-th value

Learning First Order Rules

● Inductive Logic Programming (ILP): automatically inferring Prolog programs from examples.

● Why not propositional rules? Because they are much less expressive than first-order Horn theories.

Learning First Order Rules – First-Order Horn Clauses

Example:

Target concept: Daughter(x, y), defined over pairs of people x and y.
The value of Daughter(x, y) is True when x is the daughter of y, and False otherwise.
Target attribute (in propositional form): Daughter1,2

The subscript on each attribute name indicates which of the two persons is being described.

Learning First Order Rules – First-Order Horn Clauses

We could collect a number of such training examples for the target concept Daughter1,2 and provide them to a propositional rule learner such as CN2 or C4.5.

The problem is that propositional representations offer no general way to describe the essential relations among the values of the attributes. Hence, a first-order learner can instead express the concept as

IF Father(y, x) ∧ Female(x)
THEN Daughter(x, y)

where x and y are variables that can be bound to any person.

Learning First Order Rules – First-Order Horn Clauses

● Whenever such a variable occurs only in the preconditions, it is assumed to be existentially quantified; that is, the rule preconditions are satisfied as long as there exists at least one binding of the variable that satisfies the corresponding literal.

IF Parents(x, z) ∧ Ancestor(z, y)
THEN Ancestor(x, y)

● First-order Horn clauses can also represent (and learn!) recursive functions.
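The recursive Ancestor rule above can be read as a program. A minimal Python sketch follows, with two assumptions not on the slide: the base case Ancestor(x, y) ← Parents(x, y), and an illustrative set of Parents facts (the names are invented for the example).

```python
# Illustrative Parents(x, y) facts: x is a parent of y.
PARENTS = {("Tom", "Bob"), ("Bob", "Ann"), ("Ann", "Sue")}

def ancestor(x, y):
    """Ancestor(x, y) <- Parents(x, y)                  (assumed base case)
       Ancestor(x, y) <- Parents(x, z) ^ Ancestor(z, y) (recursive clause)"""
    if (x, y) in PARENTS:
        return True
    # z occurs only in the preconditions, so it is existentially
    # quantified: the rule fires if ANY binding of z satisfies it.
    return any(ancestor(z, y) for (p, z) in PARENTS if p == x)
```

Here `ancestor("Tom", "Sue")` is True via the chain Tom → Bob → Ann → Sue, exactly the kind of recursive inference a propositional rule set cannot express.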



Learning First Order Rules – Terminologies

Terminology I:
● Every well-formed expression is composed of constants (e.g., Mary, 23, or Joe), variables (e.g., x), predicates (e.g., Female, as in Female(Mary)), and functions (e.g., age, as in age(Mary)).
● A term is any constant, any variable, or any function applied to any term. Examples: Mary, x, age(Mary), age(x).
● A literal is any predicate (or its negation) applied to any set of terms. Examples: Female(Mary), ¬Female(x), Greater_than(age(Mary), 20).
● A ground literal is a literal that does not contain any variables (e.g., ¬Female(Joe)).

Learning First Order Rules – Terminologies

Terminology II:
● A negative literal is a literal containing a negated predicate (e.g., ¬Female(Joe)).
● A positive literal is a literal with no negation sign (e.g., Female(Mary)).
● A clause is any disjunction of literals M1 ∨ … ∨ Mn whose variables are universally quantified.
● A Horn clause is an expression of the form H ← (L1 ∧ … ∧ Ln), where H, L1, …, Ln are positive literals. H is called the head or consequent of the Horn clause. The conjunction of literals L1 ∧ L2 ∧ … ∧ Ln is called the body or antecedents of the Horn clause.



Learning First Order Rules – Terminologies

Terminology III:
● For any literals A and B, the expression (A ← B) is equivalent to (A ∨ ¬B), and the expression ¬(A ∧ B) is equivalent to (¬A ∨ ¬B). Therefore a Horn clause can equivalently be written as the disjunction H ∨ ¬L1 ∨ … ∨ ¬Ln.
● A substitution is any function that replaces variables by terms. For example, the substitution {x/3, y/z} replaces the variable x by the term 3 and replaces the variable y by the term z. Given a substitution θ and a literal L, we write Lθ to denote the result of applying substitution θ to L.
● A unifying substitution for two literals L1 and L2 is any substitution θ such that L1θ = L2θ.
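The substitution definitions above can be made concrete with a small sketch. The encoding is an assumption chosen here for illustration: a literal is a tuple whose first element is the predicate or function name, variables are lowercase strings, and constants are anything else.

```python
def apply_subst(theta, term):
    """Compute L(theta): replace each variable in the term/literal by the
    term that the substitution theta assigns to it."""
    if isinstance(term, tuple):          # compound: predicate/function + args
        return (term[0],) + tuple(apply_subst(theta, t) for t in term[1:])
    return theta.get(term, term)         # variable (if bound) or constant

# The substitution {x/3, y/z} from the text:
theta = {"x": 3, "y": "z"}
L = ("Greater_than", "x", "y")
# apply_subst(theta, L) is Greater_than(3, z)

# A unifying substitution for L1 = Female(x) and L2 = Female(Mary):
L1, L2 = ("Female", "x"), ("Female", "Mary")
sigma = {"x": "Mary"}
# L1(sigma) == L2(sigma), so sigma unifies L1 and L2.
```

Applying θ leaves constants untouched and rewrites only the variables it mentions, which is exactly the definition of a substitution given above.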
