4.2 Learning Sets of Rules



Mohammed Miskeen Ali


Introduction
• What are Rules? Rules are essentially "if-then" statements. They
describe conditions (the "if" part) that lead to a specific outcome (the
"then" part). Think of a recipe: the ingredients and steps (conditions)
are the "if" part, and the finished dish (outcome) is the "then" part.
• Learning Rules from Data: In machine learning, a computer can learn
these rules by analyzing data. The data serves as examples, and the
computer identifies patterns within those examples to create rules for
making predictions in new situations.
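A single "if-then" rule can be sketched as a small Python function. The attribute names (outlook, humidity) and the rule itself are invented here purely for illustration:

```python
# A minimal sketch of an "if-then" rule as a Python function.
# The attributes and the rule are hypothetical, not from a real dataset.

def rule_play_tennis(example):
    """IF outlook = sunny AND humidity = normal THEN play = yes."""
    if example["outlook"] == "sunny" and example["humidity"] == "normal":
        return "yes"
    return "unknown"  # the rule does not fire for this example

print(rule_play_tennis({"outlook": "sunny", "humidity": "normal"}))  # yes
print(rule_play_tennis({"outlook": "rain", "humidity": "high"}))     # unknown
```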
Sequential Covering Algorithms
These algorithms learn rules one at a time, iteratively removing the instances covered by each
rule from the dataset. The process continues until all instances are covered or a stopping
criterion is met.
Steps:
1. Initialize: Start with an empty set of rules.
2. Learn a Rule: Find a rule that covers a subset of the instances.
3. Remove Covered Instances: Remove instances covered by the learned rule from the
dataset.
4. Repeat: Repeat steps 2 and 3 until all instances are covered or another stopping criterion is
met.
This process is similar to building a decision tree, but instead of creating the entire tree at once,
you build it one branch (rule) at a time.
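The four steps above can be sketched in a few lines of Python. This is a simplified illustration, not a production algorithm: `learn_one_rule` is a stand-in that greedily picks the single (attribute, value) test with the highest accuracy on the remaining data, and the toy dataset is invented.

```python
# A hedged sketch of the sequential-covering loop described above.

def learn_one_rule(examples):
    """Return ((attribute, value), label) for the best single-test rule."""
    best, best_acc = None, -1.0
    for ex in examples:
        for attr, val in ex.items():
            if attr == "label":
                continue
            covered = [e for e in examples if e[attr] == val]
            acc = sum(e["label"] == ex["label"] for e in covered) / len(covered)
            if acc > best_acc:
                best, best_acc = ((attr, val), ex["label"]), acc
    return best

def sequential_covering(examples):
    rules = []
    remaining = list(examples)
    while remaining:                     # step 4: repeat until all covered
        (attr, val), label = learn_one_rule(remaining)  # step 2
        rules.append((attr, val, label))
        # Step 3: remove the instances covered by the new rule.
        remaining = [e for e in remaining if e[attr] != val]
    return rules

data = [
    {"outlook": "sunny", "wind": "weak", "label": "yes"},
    {"outlook": "sunny", "wind": "strong", "label": "yes"},
    {"outlook": "rain", "wind": "strong", "label": "no"},
]
print(sequential_covering(data))
# [('outlook', 'sunny', 'yes'), ('outlook', 'rain', 'no')]
```

Each learned rule always covers at least the example it was built from, so the remaining set shrinks every iteration and the loop terminates.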
Example Algorithm: RIPPER (Repeated Incremental Pruning to Produce Error Reduction)
• A popular sequential covering algorithm that grows each rule and then prunes it to reduce error.
Learning Rule Sets: Summary
• The goal is to create a set of rules that collectively cover the entire dataset.
• Each rule should cover a distinct subset of the data.
• The rule set should be as small and accurate as possible.
Key Points:
Accuracy: Each rule should correctly classify most of the instances it
covers.
Coverage: The set of rules should cover all instances in the dataset.
Simplicity: The rule set should be as simple as possible, avoiding
unnecessary complexity.
Learning First-Order Rules
First-Order Rules:
• Learning first-order rules involves discovering rules that relate
predicates, quantifiers, variables, and constants in a structured, relational
way.
• Unlike propositional logic, which deals with facts as individual,
disconnected pieces of data, first-order logic (FOL) allows for the
representation of relationships and more complex structures within data.
Example:
If parent(X, Y) and parent(Y, Z), then grandparent(X, Z).
This rule uses variables (‘X’, ‘Y’, ‘Z’) and relations (‘parent’, ‘grandparent’)
to define more complex relationships.
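The grandparent rule above can be evaluated over a set of ground facts. Here is a small sketch in Python; the people's names are invented for illustration:

```python
# Facts as (parent, child) pairs -- a hand-made, hypothetical family.
parent = {("alice", "bob"), ("bob", "carol"), ("bob", "dave")}

def grandparents(parent_facts):
    """grandparent(X, Z) :- parent(X, Y), parent(Y, Z)."""
    return {(x, z)
            for (x, y1) in parent_facts
            for (y2, z) in parent_facts
            if y1 == y2}  # the shared variable Y must match

print(sorted(grandparents(parent)))
# [('alice', 'carol'), ('alice', 'dave')]
```

Note how the variable Y in the rule becomes a join condition between the two `parent` facts; this relational joining is exactly what propositional rules cannot express.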
Learning Sets of First-Order Rules: FOIL
FOIL (First-Order Inductive Learner): A well-known algorithm for learning first-order rules.
• FOIL learns rules in a top-down manner, starting with a general rule and specializing it by
adding conditions until it sufficiently covers the positive examples and excludes the negative ones.
• The algorithm constructs rules for each target predicate (goal) and then refines these rules to
improve their accuracy and coverage.
Steps:
1.Initialize: Start with the most general rule: the target predicate with an empty body, which covers every example.
2.Specialize: Add conditions to the rule to improve accuracy.
3.Evaluate: Use a measure like information gain to evaluate the new rule.
4.Iterate: Continue specializing the rule until it meets a stopping criterion.
5.Remove Covered Instances: Remove instances covered by the rule and repeat the process.
Example: Learning a rule for determining if someone is a grandparent:
1.Start with: grandparent(X, Z).
2.Add conditions: parent(X, Y) and parent(Y, Z).
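The evaluation measure in step 3 is FOIL's information gain. The sketch below uses a common simplified form; p0/n0 are the positive/negative bindings covered before adding a literal, p1/n1 after, and the weight t is approximated by p1 (the full FOIL definition counts positive bindings covered by both rules). The example numbers are invented:

```python
import math

# A sketch of the FOIL information-gain measure (simplified: t = p1).
def foil_gain(p0, n0, p1, n1):
    t = p1  # positives that remain covered after adding the literal
    return t * (math.log2(p1 / (p1 + n1)) - math.log2(p0 / (p0 + n0)))

# Suppose adding parent(X, Y) narrows 10+/10- bindings to 5+/1-:
print(round(foil_gain(10, 10, 5, 1), 3))  # 3.685
```

A positive gain means the new literal makes the rule purer with respect to positive examples; FOIL greedily adds the literal with the highest gain at each specialization step.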
Induction as Inverted Deduction:
• Induction involves generalizing from specific instances to broader rules.
• It can be seen as the reverse of deduction, which applies general rules to
specific instances to make predictions.
Deduction:
Given a general rule and specific facts, derive specific conclusions.
Example: From "All humans are mortal" (general rule) and "Socrates is a
human" (specific fact), deduce "Socrates is mortal".
Induction:
Given specific instances, derive a general rule.
Example: From observing that "Socrates is mortal", "Plato is mortal", and
other humans are mortal, induce the rule "All humans are mortal".
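The two directions can be contrasted in a few lines of Python. This is a toy sketch mirroring the Socrates example; the naive induction here simply proposes a rule when every observed instance agrees:

```python
# Deduction vs. (naive) induction, mirroring the Socrates example.
facts = {"socrates": "human", "plato": "human"}
rule = ("human", "mortal")  # general rule: all humans are mortal

# Deduction: apply the general rule to a specific fact.
def deduce(name):
    if facts.get(name) == rule[0]:
        return f"{name} is {rule[1]}"

# Induction (naively): every observed human was mortal, so propose the rule.
observations = [("socrates", "human", "mortal"), ("plato", "human", "mortal")]
induced = ("human", "mortal") if all(o[2] == "mortal" for o in observations
                                     if o[1] == "human") else None

print(deduce("socrates"))  # socrates is mortal
print(induced)             # ('human', 'mortal')
```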
Inverting Resolution:
• Resolution is a method used in logic and automated theorem proving to infer conclusions
from known facts and rules.
• Inverting resolution involves using resolution in reverse to derive general rules from specific
instances.
Resolution: Combines pairs of clauses to produce new clauses, ultimately aiming to derive a
contradiction or a desired conclusion.
Example: From "A or B" and "not A", resolve to "B".
We show that B must be true if A∨B is true and A is false.
Inverting Resolution:
Inverse resolution is the process of generating a more general hypothesis (clause) from specific
observations (clauses) by effectively reversing the steps of the resolution process. It is often
used to generate possible new rules or clauses that could explain observed data.
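A single forward resolution step, like the "A or B" with "not A" example above, can be sketched as follows. Clauses are represented as frozensets of literals, where a literal is a (symbol, sign) pair; this encoding is one choice among many:

```python
# A sketch of one propositional resolution step.
# A clause is a frozenset of literals; ("A", True) means A, ("A", False) means not A.

def resolve(c1, c2):
    """Return the resolvents of two clauses, one per complementary literal pair."""
    out = []
    for (sym, sign) in c1:
        if (sym, not sign) in c2:
            # Drop the complementary pair and union the remaining literals.
            out.append((c1 - {(sym, sign)}) | (c2 - {(sym, not sign)}))
    return out

a_or_b = frozenset({("A", True), ("B", True)})
not_a = frozenset({("A", False)})
print(resolve(a_or_b, not_a))  # [frozenset({('B', True)})]
```

Inverse resolution runs this step backwards: given the resolvent (e.g., the clause B) and one parent clause (e.g., not A), it reconstructs a candidate for the other parent (A or B), which is how more general hypotheses are proposed from observed data.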
