MODULE 4
CHAPTER 3 - BASICS OF LEARNING THEORY
3.1 INTRODUCTION TO LEARNING AND ITS TYPES
Learning is a process by which one acquires knowledge and constructs new ideas or
concepts based on experience.
The standard definition of learning proposed by Tom Mitchell is: a program is said to
learn from experience E with respect to a class of tasks T and a performance measure P,
if its performance at the tasks in T, as measured by P, improves with experience E.
There are two kinds of problems: well-posed and ill-posed. Computers can solve only
well-posed problems, as these have well-defined specifications and have the following
components inherent to them.
1. Class of learning tasks (T)
2. A measure of performance (P)
3. A source of experience (E)
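As an illustration (an assumed example, not taken from the text), a spam filter can be cast into this framework: T is the task of classifying emails as spam or not spam, P is the fraction of emails classified correctly, and E is a collection of emails already labelled by users. The short Python sketch below shows how such a performance measure P could be computed; the function name and the toy data are hypothetical.

def accuracy(predictions, labels):
    # Performance measure P: proportion of predictions that match the true labels.
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

# Experience E: labelled emails seen so far; task T: predict spam (1) or not spam (0).
predicted = [1, 0, 1, 1]
actual = [1, 0, 0, 1]
print(accuracy(predicted, actual))   # 0.75

As the learner is exposed to more experience E, the value returned by this measure on the task T should increase.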
Let x be an input and χ the input space, the set of all possible inputs. Let Y be the
output space, which is the set of all possible outputs, for example yes/no.
Let D be the dataset of n inputs. Consider the unknown target function f: χ -> Y, which
maps each input to an output.
Objective: To pick a function g: χ -> Y, the hypothesis, that approximates the target function f.
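A minimal sketch of this objective, under assumed toy definitions (a finite hypothesis space H of threshold rules and a small dataset D of (input, output) pairs, both invented for illustration), is given below: the learner picks the hypothesis g in H that makes the fewest mistakes on D, as an approximation to the unknown target function f.

def pick_hypothesis(H, D):
    # Choose the candidate hypothesis with the fewest errors on the dataset D.
    def errors(h):
        return sum(1 for x, y in D if h(x) != y)
    return min(H, key=errors)

# Toy setting: inputs are integers, outputs are True/False.
H = [lambda x, t=t: x > t for t in range(10)]        # candidate threshold rules
D = [(2, False), (4, False), (7, True), (9, True)]   # observed (input, output) pairs
g = pick_hypothesis(H, D)                            # g approximates the target f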
Classical machine learning models examine input data according to a predetermined set of
rules, finding patterns and relationships that can be used to generate predictions or
decisions. Support vector machines, decision trees, and logistic regression are some of
the most widely used classical machine learning techniques.
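The fragment below is a hedged illustration of one such classical technique, logistic regression, assuming the scikit-learn library is available; the toy feature values and labels are invented for demonstration.

from sklearn.linear_model import LogisticRegression

# Toy training data: one numeric feature per example and a binary class label.
X = [[0.5], [1.0], [1.5], [3.0], [3.5], [4.0]]
y = [0, 0, 0, 1, 1, 1]

clf = LogisticRegression()
clf.fit(X, y)                 # learn a decision rule from the labelled data
print(clf.predict([[2.8]]))   # predict the class of a new, unseen input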
Adaptive ML is the next generation of traditional ML: a newer and improved approach,
even though traditional ML has itself witnessed significant progress.
Learning Types
Questions such as how a machine learns and how well it can learn are the basis of a
field called Computational Learning Theory, or COLT for short.
The hypothesis space is the set of all possible hypotheses that may approximate the
target function f.
The subset of the hypothesis space that is consistent with all observed training
instances is called the version space.
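As a rough sketch (with an assumed finite hypothesis space H and a dataset D of (input, output) pairs, as in the earlier toy example), the version space can be computed by keeping only those hypotheses that agree with every training instance.

def version_space(H, D):
    # Keep every hypothesis that classifies all observed training instances correctly.
    return [h for h in H if all(h(x) == y for x, y in D)]

Any hypothesis returned here is consistent with the training data; the algorithms that follow describe how such consistent hypotheses are found and maintained.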
There are two ways of learning a hypothesis that is consistent with all training
instances from the large hypothesis space.
List-Then-Eliminate Algorithm