Week 1 (1)
[Figure: points C and G plotted in the plane with the weight vector w]
(a) True
(b) False
3. Suppose that we multiply the weight vector w by −1. Then the same points G and
C will be classified as?
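To see the effect of negating w numerically, here is a minimal Python sketch; the weight vector and the coordinates of G and C below are illustrative assumptions, not values read from the figure, and the 1/−1 decision rule is the one given in the NOTE of question 10. Whenever w^T x ≠ 0, the sign of (−w)^T x is the opposite, so every such prediction flips.

```python
import numpy as np

# Illustrative (assumed) weight vector and point coordinates.
w = np.array([-1.0, 1.0])
points = {"G": np.array([2.0, 1.0]), "C": np.array([-1.0, 2.0])}

def predict(w, x):
    # Decision rule: class 1 if w.x > 0, else class -1.
    return 1 if w @ x > 0 else -1

for name, x in points.items():
    # The prediction flips when w is replaced by -w (for points off the boundary).
    print(name, predict(w, x), predict(-w, x))
```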
4. Which of the following can be achieved using the perceptron algorithm in machine
learning?
(a) Grouping similar data points into clusters, such as organizing customers based
on purchasing behavior.
(b) Solving optimization problems, such as finding the maximum profit in a business
scenario.
(c) Classifying data, such as determining whether an email is spam or not.
(d) Finding the shortest path in a graph, such as determining the quickest route
between two cities.
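To make option (c) concrete, the sketch below trains a perceptron as a binary classifier in the spirit of the spam example; the feature names, data, and labels are illustrative assumptions, not part of the question.

```python
import numpy as np

# Toy spam-vs-not-spam classification (assumed illustrative features and labels).
# Features: [count of suspicious words, count of exclamation marks]; label 1 = spam, 0 = not spam.
X = np.array([[5, 3], [4, 4], [0, 1], [1, 0]], dtype=float)
y = np.array([1, 1, 0, 0])

w = np.zeros(2)
b = 0.0
for _ in range(10):                       # a few passes over the data
    for xi, yi in zip(X, y):
        pred = 1 if w @ xi + b > 0 else 0
        w += (yi - pred) * xi             # standard perceptron update
        b += (yi - pred)

print([1 if w @ xi + b > 0 else 0 for xi in X])   # matches y once the data are separated
```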
6. We know from the lecture that the decision boundary learned by the perceptron is a
line in R^2. We also observed that it divides the entire space of R^2 into two regions.
Suppose that the input vector x ∈ R^4; into how many regions will the perceptron
decision boundary then divide the whole of R^4?
(a) It depends on whether the data points are linearly separable or not.
(b) 3
(c) 4
(d) 2
(e) 5
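For reference, a sketch of the standard argument (not part of the original question): in R^n the perceptron decision boundary is the hyperplane w^T x = 0, and it splits R^n into exactly two half-spaces regardless of n:

```latex
\mathbb{R}^n \;=\; \{\,x : w^{\top}x > 0\,\} \;\cup\; \{\,x : w^{\top}x \le 0\,\},
\qquad
\{\,x : w^{\top}x > 0\,\} \cap \{\,x : w^{\top}x \le 0\,\} \;=\; \varnothing .
```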
8. Consider the following table, where x1 and x2 are features (packed into a single vector
x = [x1, x2]^T) and y is a label:
x1 x2 y
0 0 0
0 1 1
1 0 1
1 1 1
Suppose that the perceptron model is used to classify the data points. Suppose
further that the weights w are initialized to w = [1, 1]^T. The following rule is used for
classification:
y = 1 if w^T x > 0
y = 0 if w^T x ≤ 0
The perceptron learning algorithm is used to update the weight vector w. How many
times will the weight vector w be updated during the entire training process?
(a) 0
(b) 1
(c) 2
(d) Not possible to determine
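As a quick numerical check, here is a minimal simulation sketch; it assumes the standard perceptron update w ← w + (y − ŷ)x (the question does not spell out the update rule) together with the initialization and decision rule given above, and it counts how many updates occur before training converges.

```python
import numpy as np

# Rows of the table: features (x1, x2) and label y.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1])

w = np.array([1.0, 1.0])   # initialization given in the question
updates = 0

changed = True
while changed:             # cycle until a full pass makes no update
    changed = False
    for xi, yi in zip(X, y):
        pred = 1 if w @ xi > 0 else 0
        if pred != yi:
            w += (yi - pred) * xi   # assumed standard perceptron update
            updates += 1
            changed = True

print(updates)             # number of weight updates during training
```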
9. What should be the value of the threshold θ for a McCulloch-Pitts neuron with three
inputs to implement the AND function?
(a) 1
(b) 2
(c) 3
(d) 4
(e) 5
Correct Answer: (c)
Solution: Suppose we set θ = 4; then the sum of the three inputs never exceeds 3, so the
neuron never fires. If instead we set θ < 3, the neuron also fires when some input is 0,
so it does not implement the AND operator. Hence θ = 3.
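The same check can be run exhaustively; the sketch below assumes the McCulloch-Pitts firing rule (output 1 iff the sum of the binary inputs is at least θ) and tests which thresholds reproduce the 3-input AND.

```python
from itertools import product

def mp_neuron(inputs, theta):
    # McCulloch-Pitts unit: fires (outputs 1) iff the input sum reaches the threshold.
    return 1 if sum(inputs) >= theta else 0

for theta in range(1, 6):
    # The neuron implements 3-input AND iff it fires exactly on (1, 1, 1).
    is_and = all(mp_neuron(x, theta) == int(all(x)) for x in product([0, 1], repeat=3))
    print(theta, is_and)
```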
10. Consider the points shown in the picture. The weight vector is w = [−1, 1]^T. As per
this weight vector, which classes will the Perceptron algorithm predict for the data
points x1 and x2?
NOTE:
y = 1 if w^T x > 0
y = −1 if w^T x ≤ 0
[Figure: point x1 at (1.5, 2), point x2 at (−2.5, −2), and the weight vector w]
(a) x1 = −1
(b) x1 = 1
(c) x2 = −1
(d) x2 = 1
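For a numerical check, a minimal sketch using the coordinates and weight vector stated above: compute w^T x for each point and apply the rule from the NOTE.

```python
import numpy as np

w = np.array([-1.0, 1.0])                      # weight vector from the question
points = {"x1": np.array([1.5, 2.0]),          # coordinates read from the figure
          "x2": np.array([-2.5, -2.0])}

for name, x in points.items():
    score = w @ x
    label = 1 if score > 0 else -1             # rule from the NOTE above
    print(name, score, label)
```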