
NPTEL Online Certification Courses

Indian Institute of Technology Kharagpur

Deep Learning
Assignment - Week 2
TYPE OF QUESTION: MCQ/MSQ
Number of questions: 10; Total marks: 10 × 2 = 20
______________________________________________________________________________

QUESTION 1:

Choose the correct option regarding discriminant functions gi(x) for multiclass classification (x is the
feature vector to be classified).

Statement i: The risk value R(αi|x) in the Bayes minimum-risk classifier can be used as a discriminant function.
Statement ii: The negative of the risk value R(αi|x) in the Bayes minimum-risk classifier can be used as a discriminant function.
Statement iii: The a posteriori probability P(ωi|x) in the Bayes minimum-error classifier can be used as a discriminant function.
Statement iv: The negative of the a posteriori probability P(ωi|x) in the Bayes minimum-error classifier can be used as a discriminant function.

a. Only Statement i is true
b. Both Statements ii and iii are true
c. Both Statements i and iv are true
d. Both Statements ii and iv are true

Correct Answer: b

Detailed Solution:

The classifier assigns x to the class whose discriminant function is maximum. The Bayes minimum-risk rule minimizes the risk R(αi|x), so its negative −R(αi|x) can serve as a discriminant function, while the minimum-error rule directly maximizes the posterior P(ωi|x). Follow Lecture 06 for a detailed explanation.

QUESTION 2:

Which of the following is true regarding functions of discriminant functions gi(x), i.e., f(gi(x))?

a. We cannot use functions of discriminant functions, f(gi(x)), as discriminant functions for
multiclass classification.
b. We can use functions of discriminant functions, f(gi(x)), as discriminant functions for
multiclass classification provided they are constant functions, i.e., f(gi(x)) = C where C is a
constant.

c. We can use functions of discriminant functions, f(gi(x)), as discriminant functions for
multiclass classification provided they are monotonically increasing functions.
d. None of the above is true.

Correct Answer: c

Detailed Solution:

A monotonically increasing f preserves the ordering of the gi(x), so the class with the maximum discriminant is unchanged; a constant f discards all class information. Follow Lecture 06 for a detailed explanation.
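As a quick illustration (my own sketch, not part of the official solution), the following Python snippet shows that a monotonically increasing f preserves the argmax decision while a constant f destroys it; the discriminant values are made up for the example.

import numpy as np

# Hypothetical discriminant values g_i(x) for three classes
g = np.array([0.2, 0.5, 0.3])

# A monotonically increasing function (log) keeps the ordering, so the
# decision (index of the maximum) is unchanged.
print(np.argmax(g), np.argmax(np.log(g)))  # both print 1

# A constant function maps every class to the same score, so no decision
# can be made from f(g_i(x)) = C.
print(np.full_like(g, 7.0))  # [7. 7. 7.]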

QUESTION 3:
The class-conditional probability density function for class ωi, i.e., p(x|ωi), for a multivariate
normal (Gaussian) distribution (where x is a d-dimensional feature vector) is given by

a. $p(x|\omega_i) = \frac{1}{(2\pi)^{d/2}|\Sigma_i|^{1/2}}\exp\left(-\frac{1}{2}(x-\mu_i)^T\Sigma_i^{-1}(x-\mu_i)\right)$

b. $p(x|\omega_i) = \frac{1}{(2\pi)^{d/2}}\exp\left(-\frac{1}{2}(x-\mu_i)^T\Sigma_i^{-1}(x-\mu_i)\right)$

c. $p(x|\omega_i) = \frac{1}{(2\pi)^{d/2}}\exp\left(-\frac{1}{2}(x-\mu_i)^T(x-\mu_i)\right)$

d. None of the above

Correct Answer: a

Detailed Solution:

Refer to the lecture videos. Option a is the only form with both the correct normalization constant, including the $|\Sigma_i|^{1/2}$ factor, and the Mahalanobis distance in the exponent.
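For concreteness, here is a small Python check (my own, assuming NumPy and SciPy are available) that evaluates the density in option a directly and compares it with scipy.stats.multivariate_normal; the test values of x, mu, and Sigma are arbitrary.

import numpy as np
from scipy.stats import multivariate_normal

def gaussian_pdf(x, mu, Sigma):
    # Option (a): normalization (2*pi)^(d/2) * |Sigma|^(1/2),
    # Mahalanobis distance in the exponent
    d = len(mu)
    diff = x - mu
    norm = (2 * np.pi) ** (d / 2) * np.sqrt(np.linalg.det(Sigma))
    return np.exp(-0.5 * diff @ np.linalg.inv(Sigma) @ diff) / norm

x = np.array([1.0, 2.0])
mu = np.array([0.0, 0.0])
Sigma = np.array([[2.0, 0.3], [0.3, 1.0]])
print(gaussian_pdf(x, mu, Sigma))             # formula from option (a)
print(multivariate_normal(mu, Sigma).pdf(x))  # same value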

QUESTION 4:
There are some data points for two different classes given below.
Class 1 points: {(2, 6), (3, 4), (3, 8), (4, 6)}
Class 2 points: {(3, 0), (1, −2), (5, −2), (3, −4)}
Compute the mean vectors 𝜇1 and 𝜇2 for these two classes and choose the correct option.
a. $\mu_1 = \begin{bmatrix} 2 \\ 6 \end{bmatrix}$ and $\mu_2 = \begin{bmatrix} 3 \\ -1 \end{bmatrix}$

b. $\mu_1 = \begin{bmatrix} 3 \\ 6 \end{bmatrix}$ and $\mu_2 = \begin{bmatrix} 2 \\ -2 \end{bmatrix}$

c. $\mu_1 = \begin{bmatrix} 3 \\ 6 \end{bmatrix}$ and $\mu_2 = \begin{bmatrix} 3 \\ -2 \end{bmatrix}$

d. $\mu_1 = \begin{bmatrix} 3 \\ 5 \end{bmatrix}$ and $\mu_2 = \begin{bmatrix} 2 \\ -3 \end{bmatrix}$

Correct Answer: c

Detailed Solution:

Sum the points in each class and divide by the number of points (4): $\mu_1 = \begin{bmatrix} 3 \\ 6 \end{bmatrix}$ and $\mu_2 = \begin{bmatrix} 3 \\ -2 \end{bmatrix}$.
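A minimal NumPy check of this computation (my own, not part of the official solution):

import numpy as np

c1 = np.array([[2, 6], [3, 4], [3, 8], [4, 6]])     # Class 1 points
c2 = np.array([[3, 0], [1, -2], [5, -2], [3, -4]])  # Class 2 points
print(c1.mean(axis=0))  # [3. 6.]  -> mu_1
print(c2.mean(axis=0))  # [ 3. -2.] -> mu_2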

QUESTION 5:
There are some data points for two different classes given below.
Class 1 points: {(2, 6), (3, 4), (3, 8), (4, 6)}
Class 2 points: {(3, 0), (1, −2), (5, −2), (3, −4)}
Compute the covariance matrices Σ1 and Σ2 and choose the correct option.

a. $\Sigma_1 = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}$ and $\Sigma_2 = \begin{bmatrix} 2 & 0 \\ 0 & 0.5 \end{bmatrix}$

b. $\Sigma_1 = \begin{bmatrix} 0.5 & 0 \\ 0 & 2 \end{bmatrix}$ and $\Sigma_2 = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}$

c. $\Sigma_1 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$ and $\Sigma_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

d. $\Sigma_1 = \begin{bmatrix} 0.5 & 0 \\ 0 & 0.5 \end{bmatrix}$ and $\Sigma_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

Correct Answer: b

Detailed Solution:

Subtract the class mean from each point and average the outer products of the deviations (dividing by N = 4): $\Sigma_1 = \begin{bmatrix} 0.5 & 0 \\ 0 & 2 \end{bmatrix}$ and $\Sigma_2 = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}$. Follow the steps mentioned in the lecture video.
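A minimal NumPy check (my own sketch; note the solution uses the population covariance, i.e., division by N, which corresponds to bias=True in np.cov):

import numpy as np

c1 = np.array([[2, 6], [3, 4], [3, 8], [4, 6]])
c2 = np.array([[3, 0], [1, -2], [5, -2], [3, -4]])
# np.cov expects variables in rows, hence the transpose
print(np.cov(c1.T, bias=True))  # [[0.5 0.] [0. 2.]] -> Sigma_1
print(np.cov(c2.T, bias=True))  # [[2. 0.] [0. 2.]]  -> Sigma_2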

QUESTION 6:
There are some data points for two different classes given below.
Class 1 points: {(2, 6), (3, 4), (3, 8), (4, 6)}
Class 2 points: {(3, 0), (1, −2), (5, −2), (3, −4)}
What will be the expression of the decision boundary between these two classes if both classes have
equal prior probability 0.5? For the input sample $x = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}$, consider $g_i(x) = x^T A_i x + B_i^T x + C_i$, where

$A_i = -\frac{1}{2}\Sigma_i^{-1}$, $B_i = \Sigma_i^{-1}\mu_i$, $C_i = -\frac{1}{2}\mu_i^T \Sigma_i^{-1} \mu_i - \frac{1}{2}\ln|\Sigma_i| + \ln P(\omega_i)$

a. $x_1 = 0.09x_2^2 - 1.12x_2 + 0.5$
b. $x_2 = 0.19x_1^2 - 1.12x_1 + 3.5$
c. $x_1 = 0.09x_1^2 - 1.12x_2 + 0.5$
d. $x_2 = 0.19x_2^2 - 1.12x_1 + 3.5$

Correct Answer: b

Detailed Solution:

This is the most general case of the discriminant function for a normal density. The inverse
covariance matrices are

$\Sigma_1^{-1} = \begin{bmatrix} 2 & 0 \\ 0 & 1/2 \end{bmatrix}$ and $\Sigma_2^{-1} = \begin{bmatrix} 1/2 & 0 \\ 0 & 1/2 \end{bmatrix}$

Setting $g_1(x) = g_2(x)$ gives $-0.75x_1^2 + 4.5x_1 + 4x_2 - 14.06 = 0$, i.e., the decision boundary $x_2 = 0.19x_1^2 - 1.12x_1 + 3.5$.
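The boundary can also be verified numerically. The sketch below (my own check, using the $A_i$, $B_i$, $C_i$ defined in the question) forms the coefficients of $g_1(x) - g_2(x)$ and recovers the quadratic above.

import numpy as np

mus = [np.array([3.0, 6.0]), np.array([3.0, -2.0])]
Sigmas = [np.array([[0.5, 0.0], [0.0, 2.0]]),
          np.array([[2.0, 0.0], [0.0, 2.0]])]
priors = [0.5, 0.5]

A, B, C = [], [], []
for mu, Sigma, p in zip(mus, Sigmas, priors):
    Sinv = np.linalg.inv(Sigma)
    A.append(-0.5 * Sinv)
    B.append(Sinv @ mu)
    C.append(-0.5 * mu @ Sinv @ mu - 0.5 * np.log(np.linalg.det(Sigma)) + np.log(p))

# On the boundary: g1 - g2 = x^T dA x + dB^T x + dC = 0
dA, dB, dC = A[0] - A[1], B[0] - B[1], C[0] - C[1]
print(dA)  # diag(-0.75, 0): only an x1^2 quadratic term survives
print(dB)  # [4.5 4.0]
# Solving for x2: x2 = (0.75*x1^2 - 4.5*x1 - dC) / 4
print(0.75 / 4, -4.5 / 4, -dC / 4)  # ~0.19, ~-1.12, ~3.51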

QUESTION 7:

Let Σi represent the covariance matrix for the i-th class. Assume that the classes have the same covariance
matrix. Also assume that the features are statistically independent and have the same variance. Which of
the following is true?

a. Σ𝑖 = Σ, (diagonal elements of Σ⁡are zero)

b. Σ𝑖 = Σ, (diagonal elements of Σ are non-zero and different from each other, rest of the elements
are zero)

c. Σ𝑖 = Σ, (diagonal elements of Σ are non-zero and equal to each other, rest of the elements are
zero)

d. None of these

Correct Answer: c.

Detailed Solution:

If the classes have the same covariance matrix and the features are statistically independent
with the same variance σ², then $\Sigma_i = \Sigma = \sigma^2 I$: every diagonal element is $\sigma^2$ and all off-diagonal elements are 0.

Hence option c is correct



QUESTION 8:
The decision surface between two normally distributed classes ω1 and ω2 is shown in the figure.
Which of the following is true?

a. 𝑝(𝜔1 ) = 𝑝(𝜔2 )

b. 𝑝(𝜔2 ) > 𝑝(𝜔1 )

c. 𝑝(𝜔1 ) > 𝑝(𝜔2 )

d. None of the above.

Correct Answer: c

Detailed Solution:

If the prior probabilities are not equal, the optimal boundary hyperplane is shifted away
from the mean of the more likely class, enlarging its decision region. Since the boundary here lies closer to the mean of ω2, we conclude p(ω1) > p(ω2).
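To make the shift concrete, here is a 1D illustration (my own, not from the lecture): for two equal-variance Gaussians, setting the posteriors equal gives the boundary $x^* = \frac{\mu_1+\mu_2}{2} + \frac{\sigma^2}{\mu_2-\mu_1}\ln\frac{P(\omega_1)}{P(\omega_2)}$, which moves away from the more likely mean as that class's prior grows.

import numpy as np

mu1, mu2, var = 0.0, 4.0, 1.0
for p1 in (0.5, 0.8):   # prior of class 1
    p2 = 1.0 - p1
    x_star = (mu1 + mu2) / 2 + var / (mu2 - mu1) * np.log(p1 / p2)
    print(p1, x_star)    # 0.5 -> 2.0; 0.8 -> ~2.35 (shifts away from mu1)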

QUESTION 9:
You are given some data points for two different classes.
Class 1 points: {(11, 11), (13, 11), (8, 10), (9, 9), (7, 7), (7, 5), (15, 3)}
Class 2 points: {(7, 11), (15, 9), (15, 7), (13, 5), (14, 4), (9, 3), (11, 3)}
Compute the covariance matrices and choose the correct option.

a. $\Sigma_1 = \begin{bmatrix} 1 & 0 \\ 0 & 5.65 \end{bmatrix}$ and $\Sigma_2 = \begin{bmatrix} 3.65 & -1.0 \\ -1.0 & 3.65 \end{bmatrix}$

b. $\Sigma_1 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$ and $\Sigma_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

c. $\Sigma_1 = \begin{bmatrix} 3.65 & -1.0 \\ -1.0 & 3.65 \end{bmatrix}$ and $\Sigma_2 = \begin{bmatrix} 9.67 & -1.0 \\ -1.0 & 9.67 \end{bmatrix}$

d. $\Sigma_1 = \begin{bmatrix} 8.29 & -0.85 \\ -0.85 & 8.29 \end{bmatrix}$ and $\Sigma_2 = \begin{bmatrix} 8.29 & -0.85 \\ -0.85 & 8.29 \end{bmatrix}$

Correct Answer: d

Detailed Solution:

$\mu_1 = \begin{bmatrix} 10 \\ 8 \end{bmatrix}$, $\mu_2 = \begin{bmatrix} 12 \\ 6 \end{bmatrix}$, $\Sigma_1 = \Sigma_2 = \Sigma = \begin{bmatrix} 8.29 & -0.85 \\ -0.85 & 8.29 \end{bmatrix}$
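A minimal NumPy check (my own; population covariance with bias=True, as in Question 5):

import numpy as np

c1 = np.array([[11, 11], [13, 11], [8, 10], [9, 9], [7, 7], [7, 5], [15, 3]])
c2 = np.array([[7, 11], [15, 9], [15, 7], [13, 5], [14, 4], [9, 3], [11, 3]])
print(c1.mean(axis=0), c2.mean(axis=0))  # [10. 8.] [12. 6.]
print(np.cov(c1.T, bias=True))  # [[8.2857 -0.8571] ...] = (58/7, -6/7)
print(np.cov(c2.T, bias=True))  # identical matrix -> option d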

QUESTION 10:
You are given some data points for two different classes.
Class 1 points: {(11, 11), (13, 11), (8, 10), (9, 9), (7, 7), (7, 5), (15, 3)}
Class 2 points: {(7, 11), (15, 9), (15, 7), (13, 5), (14, 4), (9, 3), (11, 3)}
Assume that the points are samples from normal distributions and a two-class Bayesian classifier
is used to classify them. Also assume that the prior probabilities of the classes are equal, i.e.,
$p(\omega_1) = p(\omega_2)$.
Which of the following is true about the corresponding decision boundary used in the classifier?
(Choose the correct option regarding the given statements.)

Statement i: The decision boundary passes through the midpoint of the line segment joining the
means of the two classes.
Statement ii: The decision boundary will be the orthogonal bisector of the line joining the means
of the two classes.

a. Only Statement i is true

b. Only Statement ii is true

c. Both Statement i and ii are true

d. None of the statements are true

Correct Answer: a

Detailed Solution:

$\Sigma_1 = \Sigma_2 = \Sigma = \begin{bmatrix} 8.29 & -0.85 \\ -0.85 & 8.29 \end{bmatrix}$. With equal priors and a shared covariance, the linear decision boundary always passes through the midpoint of the means, so statement i holds. Since $\Sigma$ is not a scalar multiple of the identity matrix, the boundary is not in general the orthogonal bisector of the line joining the means. So option a is correct.
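As a sanity check on statement i, the sketch below (my own, not from the lecture) verifies that with equal priors and a shared covariance the two linear discriminants agree at the midpoint of the means:

import numpy as np

mu1, mu2 = np.array([10.0, 8.0]), np.array([12.0, 6.0])
Sinv = np.linalg.inv(np.array([[8.29, -0.85], [-0.85, 8.29]]))

def g(x, mu):
    # Linear discriminant for a shared covariance and equal priors
    return mu @ Sinv @ x - 0.5 * mu @ Sinv @ mu

m = (mu1 + mu2) / 2                      # midpoint of the means
print(np.isclose(g(m, mu1), g(m, mu2)))  # True: midpoint lies on the boundary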
