
Learning Sets of Rules

• Sequential covering algorithms
• FOIL
• Induction as the inverse of deduction
• Inductive Logic Programming

Learning Disjunctive Sets of Rules
Method 1: Learn decision tree, convert to rules
Method 2: Sequential covering algorithm
1. Learn one rule with high accuracy, any coverage
2. Remove positive examples covered by this rule
3. Repeat

Sequential Covering Algorithm
SEQUENTIAL-COVERING(Target_attr, Attrs, Examples, Thresh)
  Learned_rules ← {}
  Rule ← LEARN-ONE-RULE(Target_attr, Attrs, Examples)
  while PERFORMANCE(Rule, Examples) > Thresh do
    – Learned_rules ← Learned_rules + Rule
    – Examples ← Examples - {examples correctly classified by Rule}
    – Rule ← LEARN-ONE-RULE(Target_attr, Attrs, Examples)
  Learned_rules ← sort Learned_rules according to PERFORMANCE over Examples
  return Learned_rules
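
A minimal Python sketch of this loop, assuming the caller supplies learn_one_rule, performance, and covers (hypothetical helper names, not a fixed API):

def sequential_covering(target_attr, attrs, examples,
                        learn_one_rule, performance, covers, thresh):
    """Greedily learn a disjunctive set of rules, one rule at a time."""
    learned_rules = []
    rule = learn_one_rule(target_attr, attrs, examples)
    while performance(rule, examples) > thresh:
        learned_rules.append(rule)
        # Remove the examples this rule already classifies correctly.
        examples = [e for e in examples if not covers(rule, e)]
        rule = learn_one_rule(target_attr, attrs, examples)
    # Consult the highest-performing rules first when classifying.
    learned_rules.sort(key=lambda r: performance(r, examples), reverse=True)
    return learned_rules

Because each iteration is greedy, the result is not guaranteed to be the smallest rule set covering the data; Thresh trades rule quality against coverage.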

Learn-One-Rule
General-to-specific search over preconditions (each child adds one condition):

IF THEN CoolCar=Yes                    (most general rule, empty precondition)
├─ IF Type = SUV THEN CoolCar=Yes
│   ├─ IF Type = SUV AND Doors = 2 THEN CoolCar=Yes
│   ├─ IF Type = SUV AND Color = Red THEN CoolCar=Yes
│   └─ IF Type = SUV AND Doors = 4 THEN CoolCar=Yes
├─ IF Doors = 4 THEN CoolCar=Yes
└─ IF Type = Car THEN CoolCar=Yes
Covering Rules
Pos ← positive Examples
Neg ← negative Examples
while Pos do (Learn a New Rule)
  NewRule ← most general rule possible
  NegExamplesCovered ← Neg
  while NegExamplesCovered do
    Add a new literal to specialize NewRule
    1. Candidate_literals ← generate candidates
    2. Best_literal ← argmax over L in Candidate_literals of
       PERFORMANCE(SPECIALIZE-RULE(NewRule, L))
    3. Add Best_literal to NewRule preconditions
    4. NegExamplesCovered ← subset of NegExamplesCovered that
       satisfies NewRule preconditions
  Learned_rules ← Learned_rules + NewRule
  Pos ← Pos - {members of Pos covered by NewRule}
Return Learned_rules
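
The same loop with the greedy general-to-specific inner search spelled out in Python (a sketch; generate_candidates, specialize_rule, performance, and covers are assumed helpers):

def covering_rules(pos, neg, generate_candidates, specialize_rule,
                   performance, covers):
    """Learn rules until all positives are covered; specialize each
    rule until it covers no negatives."""
    learned_rules = []
    while pos:
        new_rule = []                 # empty precondition list = most general rule
        neg_covered = list(neg)
        while neg_covered:
            # Greedily add the literal whose specialization scores best.
            best = max(generate_candidates(new_rule),
                       key=lambda lit: performance(specialize_rule(new_rule, lit)))
            new_rule = specialize_rule(new_rule, best)
            neg_covered = [e for e in neg_covered if covers(new_rule, e)]
        learned_rules.append(new_rule)
        pos = [e for e in pos if not covers(new_rule, e)]
    return learned_rules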
Subtleties: Learning One Rule
1. May use beam search
2. Easily generalize to multi-valued target functions
3. Choose evaluation function to guide search:
– Entropy (i.e., information gain)
– Sample accuracy: n_c / n
  where n_c = correct predictions, n = all predictions
– m-estimate: (n_c + m·p) / (n + m)
  where p = prior probability of the class and m = weight (equivalent sample size) given to that prior
Variants of Rule Learning Programs
• Sequential or simultaneous covering of data?
• General → specific, or specific → general?
• Generate-and-test, or example-driven?
• Whether and how to post-prune?
• What statistical evaluation functions?

Learning First Order Rules
Why do that?
• Can learn sets of rules such as
Ancestor(x,y) ← Parent(x,y)
Ancestor(x,y) ← Parent(x,z) ∧ Ancestor(z,y)
• General-purpose programming language:
PROLOG programs are sets of such rules

First Order Rule for Classifying Web Pages
From (Slattery, 1997)

course(A) ←
  has-word(A,instructor),
  NOT has-word(A,good),
  link-from(A,B),
  has-word(B,assignment),
  NOT link-from(B,C)

Train: 31/31, Test 31/34


FOIL
FOIL(Target_predicate, Predicates, Examples)
  Pos ← positive Examples
  Neg ← negative Examples
  while Pos do (Learn a New Rule)
    NewRule ← most general rule possible
    NegExamplesCovered ← Neg
    while NegExamplesCovered do
      Add a new literal to specialize NewRule
      1. Candidate_literals ← generate candidates
      2. Best_literal ← argmax over L in Candidate_literals of FOIL_GAIN(L, NewRule)
      3. Add Best_literal to NewRule preconditions
      4. NegExamplesCovered ← subset of NegExamplesCovered that
         satisfies NewRule preconditions
    Learned_rules ← Learned_rules + NewRule
    Pos ← Pos - {members of Pos covered by NewRule}
  Return Learned_rules
Specializing Rules in FOIL
Learning rule: P(x1,x2,…,xk) ← L1 ∧ … ∧ Ln
Candidate specializations add a new literal of one of these forms (a generation sketch follows the list):
• Q(v1,…,vr), where at least one of the vi in the created literal must already exist as a variable in the rule
• Equal(xj,xk), where xj and xk are variables already present in the rule
• The negation of either of the above forms of literals
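
A rough Python sketch of candidate generation under these constraints; the predicate table, fresh-variable names, and tuple encoding of literals are illustrative assumptions:

from itertools import combinations, product

def candidate_literals(rule_vars, predicates):
    """Yield candidate literals for specializing a rule.

    rule_vars:  variables already in the rule, e.g. ['x', 'y']
    predicates: mapping of predicate name to arity, e.g. {'Father': 2}
    """
    pool = list(rule_vars) + ['v_new1', 'v_new2']   # a few fresh variables
    for name, arity in predicates.items():
        for args in product(pool, repeat=arity):
            # FOIL's constraint: at least one variable must already exist.
            if any(a in rule_vars for a in args):
                yield (name, args)                   # Q(v1,...,vr)
                yield ('not', (name, args))          # its negation
    for xj, xk in combinations(rule_vars, 2):
        yield ('Equal', (xj, xk))                    # Equal(xj, xk)
        yield ('not', ('Equal', (xj, xk)))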

Information Gain in FOIL
FOIL_GAIN(L, R) ≡ t · ( log2( p1 / (p1 + n1) ) − log2( p0 / (p0 + n0) ) )
Where
• L is the candidate literal to add to rule R
• p0 = number of positive bindings of R
• n0 = number of negative bindings of R
• p1 = number of positive bindings of R + L
• n1 = number of negative bindings of R + L
• t = number of positive bindings of R also covered by R + L
Note
• −log2( p0 / (p0 + n0) ) is the optimal number of bits to indicate the class of a positive binding covered by R
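
The gain is a direct transcription into Python; the five binding counts are assumed to be computed elsewhere:

import math

def foil_gain(p0, n0, p1, n1, t):
    """Information gained about positive bindings when literal L is added to rule R.

    p0, n0: positive/negative bindings of R
    p1, n1: positive/negative bindings of R + L
    t:      positive bindings of R still covered by R + L
    """
    before = math.log2(p0 / (p0 + n0))   # bits to encode a positive binding of R
    after = math.log2(p1 / (p1 + n1))    # bits after adding L
    return t * (after - before)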
Induction as Inverted Deduction
Induction is finding h such that
(∀ <xi, f(xi)> ∈ D)  (B ∧ h ∧ xi) ⊢ f(xi)
where
• xi is the ith training instance
• f(xi) is the target function value for xi
• B is other background knowledge

So let’s design inductive algorithms by inverting operators for automated deduction!

Induction as Inverted Deduction
“pairs of people, <u,v>, such that the child of u is v”

f(xi) : Child(Bob, Sharon)
xi : Male(Bob), Female(Sharon), Father(Sharon, Bob)
B : Parent(u,v) ← Father(u,v)

What satisfies (∀ <xi, f(xi)> ∈ D)  (B ∧ h ∧ xi) ⊢ f(xi)?

h1 : Child(u,v) ← Father(v,u)
h2 : Child(u,v) ← Parent(v,u)
Induction and Deduction
Induction is, in fact, the inverse operation of
deduction, and cannot be conceived to exist
without the corresponding operation, so that the
question of relative importance cannot arise. Who
thinks of asking whether addition or subtraction is
the more important process in arithmetic? But at
the same time much difference in difficulty may
exist between a direct and inverse operation; … it
must be allowed that inductive investigations are
of a far higher degree of difficulty and complexity
than any question of deduction … (Jevons, 1874)

Induction as Inverted Deduction
We have mechanical deductive operators
F(A, B) = C, where A ∧ B ⊢ C

We need inductive operators
O(B, D) = h, where
(∀ <xi, f(xi)> ∈ D)  (B ∧ h ∧ xi) ⊢ f(xi)

Induction as Inverted Deduction
Positives:
• Subsumes earlier idea of finding h that “fits” training data
• Domain theory B helps define meaning of “fit” the data
B ∧ h ∧ xi ⊢ f(xi)
• Suggests algorithms that search H guided by B
Negatives:
• Doesn’t allow for noisy data. Consider
(∀ <xi, f(xi)> ∈ D)  (B ∧ h ∧ xi) ⊢ f(xi)
• First order logic gives a huge hypothesis space H
– overfitting…
– intractability of calculating all acceptable h’s
Deduction: Resolution Rule
P ∨ L
¬L ∨ R
──────
P ∨ R
1. Given initial clauses C1 and C2, find a literal L from clause C1 such that ¬L occurs in clause C2.
2. Form the resolvent C by including all literals from C1 and C2, except for L and ¬L. More precisely, the set of literals occurring in the conclusion C is
   C = (C1 − {L}) ∪ (C2 − {¬L})
   where ∪ denotes set union, and “−” set difference.
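
A small Python sketch of the propositional case, representing clauses as frozensets of string literals with '~' marking negation (both conventions are assumptions for illustration):

def negate(lit):
    """~P for P, and P for ~P."""
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolve(c1, c2):
    """Return all resolvents of two propositional clauses."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            # C = (C1 - {L}) union (C2 - {~L})
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

# From (PassExam v ~KnowMaterial) and (KnowMaterial v ~Study),
# resolving on KnowMaterial yields (PassExam v ~Study).
print(resolve(frozenset({'PassExam', '~KnowMaterial'}),
              frozenset({'KnowMaterial', '~Study'})))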
Inverting Resolution
C1: PassExam ∨ ¬KnowMaterial        C2: KnowMaterial ∨ ¬Study

                C: PassExam ∨ ¬Study

(Resolution runs the diagram downward, deriving C from C1 and C2; inverse resolution runs it upward, recovering C2 from C1 and C.)
Inverted Resolution (Propositional)
1. Given initial clauses C1 and C, find a literal L that
occurs in clause C1, but not in clause C.
2. Form the second clause C2 by including the
following literals
C2 = (C − (C1 − {L})) ∪ {¬L}
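
The inverse operator in the same representation as the resolution sketch above (it is nondeterministic: one candidate C2 per choice of L):

def negate(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def inverse_resolve(c, c1):
    """Candidate second clauses C2 = (C - (C1 - {L})) union {~L},
    one for each literal L of C1 that does not occur in C."""
    return [(c - (c1 - {lit})) | {negate(lit)}
            for lit in c1 if lit not in c]

# Recover (KnowMaterial v ~Study) from C and C1 of the previous slide.
print(inverse_resolve(frozenset({'PassExam', '~Study'}),
                      frozenset({'PassExam', '~KnowMaterial'})))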

First Order Resolution
1. Find a literal L1 from clause C1, a literal L2 from clause C2, and a substitution θ such that
   L1θ = ¬L2θ
2. Form the resolvent C by including all literals from C1θ and C2θ, except for L1θ and ¬L2θ. More precisely, the set of literals occurring in the conclusion is
   C = (C1 − {L1})θ ∪ (C2 − {L2})θ
Inverting:
   C2 = (C − (C1 − {L1})θ1)θ2⁻¹ ∪ {¬L1θ1θ2⁻¹}
Cigol
Father(Tom,Bob)          GrandChild(y,x) ∨ ¬Father(x,z) ∨ ¬Father(z,y)
                  {Bob/y, Tom/z}
Father(Shannon,Tom)      GrandChild(Bob,x) ∨ ¬Father(x,Tom)
                  {Shannon/x}
GrandChild(Bob,Shannon)

(Each step resolves the two clauses above it under the given substitution; CIGOL applies the steps in reverse, bottom-up, to induce the top clause.)

Progol
PROGOL: Reduce combinatorial explosion by
generating the most specific acceptable h
1. User specifies H by stating predicates, functions, and
forms of arguments allowed for each
2. PROGOL uses a sequential covering algorithm.
   For each <xi, f(xi)>
   – Find the most specific hypothesis hi s.t.
     B ∧ hi ∧ xi ⊢ f(xi)
   – actually, only considers k-step entailment
3. Conduct general-to-specific search bounded by
specific hypothesis hi, choosing hypothesis with
minimum description length
Learning Rules Summary
• Rules: easy to understand
• Sequential covering algorithm
  – generate one rule at a time
  – general to specific: add antecedents
  – specific to general: delete antecedents
  – Q: how to evaluate/stop?
• First order logic and covering
  – how to connect variables
  – FOIL
Learning Rules Summary (cont)
• Induction as inverted deduction
– what background rule would allow deduction?
– resolution
– inverting resolution
– and first order logic
• Cigol, Progol

