Endsem ML All Pyq
Uploaded by saptarshimaity01

Total No. of Questions : 8]                              SEAT No. :
PA-913                                                   [Total No. of Pages : 3
                         [5927]-343
                       B.E. (Computer)
                    MACHINE LEARNING
          (2019 Pattern) (Semester - VII) (410242)
Time : 2½ Hours]                                         [Max. Marks : 70
Instructions to the candidates:
1) Solve Q.1 or Q.2, Q.3 or Q.4, Q.5 or Q.6, Q.7 or Q.8.
2) Neat diagrams must be drawn wherever necessary.
3) Figures to the right indicate full marks.
4) Make suitable assumptions whenever necessary.

Q1) a) Explain in brief techniques to reduce underfitting and overfitting. [6]
    b) Find the equation of the linear regression line using the following data : [6]
       X   Y
       1   3
       2   4
       3   5
       4   7
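A quick way to check the least-squares line asked for in Q1(b) is to solve the normal equations directly; the sketch below does this in plain Python (variable names are illustrative):

```python
# Least-squares fit y = a + b*x for the Q1(b) data.
xs = [1, 2, 3, 4]
ys = [3, 4, 5, 7]

n = len(xs)
sx, sy = sum(xs), sum(ys)
sxy = sum(x * y for x, y in zip(xs, ys))
sxx = sum(x * x for x in xs)

b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
a = (sy - b * sx) / n                          # intercept

print(f"y = {a:.2f} + {b:.2f}x")  # → y = 1.50 + 1.30x
```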

    c) Write short note on : [6]
       i)   MAE
       ii)  RMSE
       iii) R²
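The three metrics in Q1(c) can be written out in a few lines; the arrays below are illustrative values, not from the paper:

```python
import math

# Illustrative actual/predicted values (not from the question paper).
y_true = [3.0, 4.0, 5.0, 7.0]
y_pred = [2.8, 4.3, 5.1, 6.6]

n = len(y_true)
mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
rmse = math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

# R² = 1 - (residual sum of squares / total sum of squares)
mean = sum(y_true) / n
ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
ss_tot = sum((t - mean) ** 2 for t in y_true)
r2 = 1 - ss_res / ss_tot

print(round(mae, 3), round(rmse, 3), round(r2, 3))
```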

                               OR
Q2) a) Explain in brief Lasso and Ridge regression. [6]
    b) What is the bias-variance trade-off for a machine learning model? [6]
    c) Write short note on evaluation metrics. [6]

Q3) a) Explain in brief methods used for evaluating classification models. [5]
    b) Consider the following data to predict whether a student passes or fails
       using the K-Nearest Neighbour (KNN) algorithm, for the values
       Physics = 6 marks, Chemistry = 8 marks, with number of neighbours K = 3. [6]
       Physics (marks)   Chemistry (marks)   Result
       4                 3                   Fail
       6                 7                   Pass
       7                 8                   Pass
       5                 5                   Fail
       8                 8                   Pass
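For Q3(b), the three nearest neighbours of the query (6, 8) under Euclidean distance can be checked with a short majority-vote sketch in plain Python:

```python
import math
from collections import Counter

# Training data from the question: (physics, chemistry) -> result.
data = [((4, 3), "Fail"), ((6, 7), "Pass"), ((7, 8), "Pass"),
        ((5, 5), "Fail"), ((8, 8), "Pass")]
query = (6, 8)
k = 3

# Sort by Euclidean distance to the query, then vote among the k nearest.
by_dist = sorted(data, key=lambda item: math.dist(item[0], query))
votes = Counter(label for _, label in by_dist[:k])
prediction = votes.most_common(1)[0][0]

print(prediction)  # → Pass  (nearest three are (6,7), (7,8), (8,8): all Pass)
```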
    c) Write short note on ensemble learning methods : [6]
       i)  Simple
       ii) Advanced
                               OR
Q4) a) Explain the Random Forest algorithm with an example. [5]
    b) Write short note on the importance of the confusion matrix. [6]
    c) Define the following terms with reference to SVM. [6]
       i)  Separating hyperplane
       ii) Margin

Q5) a) Explain density-based clustering with reference to DBSCAN, OPTICS
       and DENCLUE. [6]
    b) What is K-means clustering? Explain with an example. [6]
    c) Write short note on the following hierarchical clustering methods : [6]
       i)  Agglomerative
       ii) Dendrogram
                               OR
Q6) a) What is LOF? Explain it with its advantages and disadvantages. [6]
    b) Explain graph-based clustering. [6]
    c) Define the following terms : [6]
       i)  Elbow method
       ii) Extrinsic and intrinsic methods

Q7) a) Explain ANN with its architecture. [5]
    b) Obtain the output of neuron Y for the network shown in the following
       figure, using the activation function as : [6]
       i)  Binary sigmoid
       ii) Bipolar sigmoid
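The network figure for Q7(b) did not survive extraction, so the inputs, weights and bias below are assumed purely for illustration; the sketch only shows how the two activation functions map the same net input:

```python
import math

# Hypothetical network (the actual figure is missing from this copy).
inputs  = [0.8, 0.6, 0.4]    # assumed inputs x1..x3
weights = [0.1, 0.3, -0.2]   # assumed weights w1..w3
bias    = 0.35               # assumed bias

# Net input to neuron Y.
y_in = bias + sum(x * w for x, w in zip(inputs, weights))

binary_sigmoid  = 1 / (1 + math.exp(-y_in))                      # range (0, 1)
bipolar_sigmoid = (1 - math.exp(-y_in)) / (1 + math.exp(-y_in))  # range (-1, 1)

print(round(y_in, 3), round(binary_sigmoid, 3), round(bipolar_sigmoid, 3))
```

Note the identity bipolar_sigmoid(x) = tanh(x/2), which is a handy cross-check in the exam.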
       [Figure: network with weighted inputs to neuron Y — not reproduced]

    c) Write short note on back propagation networks. [6]
                               OR
Q8) a) Explain in brief types of ANN based on layers. [5]
    b) What is a Recurrent Neural Network? Explain with a suitable example. [6]
    c) Write short note with reference to CNN : [6]
       i)  Convolution layer
       ii) Hidden layer
Total No. of Questions : 8]                              SEAT No. :
P546                                                     [Total No. of Pages : 2
                         [6004]-481
                 B.E. (Computer Engineering)
                    MACHINE LEARNING
          (2019 Pattern) (Semester - VII) (410242)
Time : 2½ Hours]                                         [Max. Marks : 70
Instructions to the candidates:
1) Attempt Q.1 or Q.2, Q.3 or Q.4, Q.5 or Q.6 and Q.7 or Q.8.
2) Figures to the right indicate full marks.
3) Neat diagrams must be drawn wherever necessary.
4) Assume suitable data, if necessary.

Q1) a) Explain the following terms with suitable examples. [6]
       i)   Bias
       ii)  Variance
       iii) Underfitting and overfitting
    b) Differentiate between Lasso regression and Ridge regression. [6]
    c) Explain the gradient descent algorithm with an example. [6]
                               OR
Q2) a) What do you mean by regression? Explain with a suitable example. [6]
    b) Write a short note on : [6]
       i)   MAE
       ii)  RMSE
       iii) R²
    c) What is gradient descent? Compare batch gradient descent and stochastic
       gradient descent. [6]
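The batch/stochastic contrast in Q2(c) can be made concrete on a one-parameter least-squares problem; this is a minimal illustration, with the data and learning rate assumed:

```python
import random

# Fit y = w*x by gradient descent on illustrative data (true w = 2).
data = [(x, 2.0 * x) for x in range(1, 6)]
lr = 0.01

# Batch: each update uses the gradient averaged over the whole dataset.
w = 0.0
for _ in range(200):
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad
w_batch = w

# Stochastic: each update uses the gradient of one randomly chosen sample.
random.seed(0)
w = 0.0
for _ in range(1000):
    x, y = random.choice(data)
    w -= lr * 2 * (w * x - y) * x
w_sgd = w

print(round(w_batch, 3), round(w_sgd, 3))  # both approach 2.0
```

Batch descent moves smoothly but costs a full pass per step; SGD is noisy per step but cheap, which is the trade-off the question is after.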
Q3) a) Explain with an example the variant of SVM known as support vector
       regression. [5]
    b) What do you mean by ensemble learning? Differentiate between bagging
       and boosting. [6]
    c) What are the different variants of multi-class classification? Explain
       them with suitable examples. [6]
                               OR
Q4) a) Calculate the macro-average precision, macro-average recall and
       macro-average F-score for the following confusion matrix of a
       multi-class classification. [6]
                          Predictions →
                          A     B     C     D
       Actual      A      100   80    10    10
       values      B      0     9     0     1
       ↓           C      0     1     8     1
                   D      0     1     0     9
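The Q4(a) numbers can be verified mechanically; the sketch below computes per-class precision and recall from the matrix (rows = actual, columns = predicted) and macro-averages them:

```python
# Confusion matrix from Q4(a): rows are actual classes, columns predictions.
labels = ["A", "B", "C", "D"]
cm = [
    [100, 80, 10, 10],
    [0,    9,  0,  1],
    [0,    1,  8,  1],
    [0,    1,  0,  9],
]
n = len(labels)

precisions, recalls, fscores = [], [], []
for i in range(n):
    tp = cm[i][i]
    col = sum(cm[r][i] for r in range(n))   # everything predicted as class i
    row = sum(cm[i])                        # everything actually class i
    p = tp / col if col else 0.0
    r = tp / row if row else 0.0
    precisions.append(p)
    recalls.append(r)
    fscores.append(2 * p * r / (p + r) if p + r else 0.0)

macro_p = sum(precisions) / n
macro_r = sum(recalls) / n
macro_f = sum(fscores) / n
print(round(macro_p, 3), round(macro_r, 3), round(macro_f, 3))  # → 0.493 0.775 0.499
```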

    b) Write a short note on : [6]
       i)  Random Forest
       ii) AdaBoost
    c) Discuss the K-nearest neighbour algorithm with a suitable example. [5]

Q5) a) With reference to clustering, explain the issue of "optimization of
       clusters". [6]
    b) Compare hierarchical clustering and K-means clustering. [6]
    c) Explain how a cluster is formed in a density-based clustering
       algorithm. [6]
                               OR
Q6) a) How would you choose the number of clusters when designing a
       K-medoid clustering algorithm? [6]
    b) Write a short note on outlier analysis with respect to clustering. [6]
    c) Differentiate between K-means and spectral clustering. [6]

Q7) a) What are the building blocks of a neural network? Elaborate. [5]
    b) Describe the characteristics of the back propagation algorithm. [6]
    c) Write a short note on recurrent neural networks and convolutional
       neural networks. [6]
                               OR
Q8) a) Explain an artificial neural network based on the perceptron concept,
       with a diagram. [6]
    b) Describe a multi-layer neural network. Explain why the back propagation
       algorithm is required. [6]
    c) Discuss any two activation functions with examples. [5]
Total No. of Questions : 8]                              SEAT No. :
P-6552                                                   [Total No. of Pages : 2
                         [6181]-102
                 B.E. (Computer Engineering)
                    MACHINE LEARNING
          (2019 Pattern) (Semester - VII) (410242)
Time : 2½ Hours]                                         [Max. Marks : 70
Instructions to the candidates:
1) Answer Q1 or Q2, Q3 or Q4, Q5 or Q6, Q7 or Q8.
2) Neat diagrams must be drawn wherever necessary.
3) Figures to the right side indicate full marks.
4) Assume suitable data if necessary.

Q1) a) Differentiate between overfitting and underfitting. [6]
    b) The table below shows the number of grams of carbohydrates, X, and
       the number of calories, Y, of six different foods. Find the linear
       regression equation for this dataset. [8]
       Carbohydrates (X)   8    9.5   10    6    7    4
       Calories (Y)        12   138   147   88   108  62
       Also find the value of Y for X = 12.
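A least-squares sketch for Q1(b), using the table exactly as printed (the first calorie value reads 12 in this copy, which may be a scan artifact); the slope and intercept follow the same normal equations as any simple linear regression:

```python
# Simple linear regression y = a + b*x over the printed Q1(b) table.
xs = [8, 9.5, 10, 6, 7, 4]
ys = [12, 138, 147, 88, 108, 62]   # first value as printed; possibly a scan typo

n = len(xs)
sx, sy = sum(xs), sum(ys)
b = (n * sum(x * y for x, y in zip(xs, ys)) - sx * sy) / (n * sum(x * x for x in xs) - sx ** 2)
a = (sy - b * sx) / n

y_at_12 = a + b * 12
print(f"y = {a:.2f} + {b:.2f}x, prediction at X = 12: {y_at_12:.2f}")
```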
    c) Explain the bias-variance trade-off. [4]
                               OR
Q2) a) What is linear regression? Explain the concept of Ridge regression. [9]
    b) Explain the following evaluation metrics : [9]
       i)   MAE
       ii)  RMSE
       iii) R²

Q3) a) Differentiate between bagging and boosting. [4]
    b) What is ensemble learning? Explain the concept of Random Forest
       ensemble learning. [9]
    c) What is the relation between precision and recall? Explain with an
       example. [4]
                               OR
Q4) a) What is K-fold cross-validation? In K-fold cross-validation, comment
       on the following situations : [9]
       i)  when the value of K is too large
       ii) when the value of K is too small.
       How do you decide the value of K in K-fold cross-validation?
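The splitting step behind Q4(a) can be made concrete with a tiny index-partition sketch (plain Python; the dataset size and K are assumed for illustration):

```python
# Partition sample indices into K folds; each fold serves once as validation.
n_samples, k = 10, 5                 # assumed sizes for illustration
indices = list(range(n_samples))

folds = [indices[i::k] for i in range(k)]   # round-robin assignment

for fold_id, val_idx in enumerate(folds):
    train_idx = [i for i in indices if i not in val_idx]
    print(fold_id, val_idx, len(train_idx))
```

A large K (e.g. K = n, leave-one-out) means tiny validation sets and expensive training; a small K means fewer, larger folds and a more pessimistic estimate, which is the trade-off the question asks about.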

    b) Explain : i) Accuracy, ii) Precision, iii) Recall, and iv) F-score. [8]

Q5) a) Explain K-means clustering in detail with a suitable example. [8]
    b) What is outlier analysis? How is the Local Outlier Factor detected? [5]
    c) Explain the spectral clustering algorithm. [5]
                               OR
Q6) a) Explain hierarchical and density-based clustering approaches. [9]
    b) Write short note on : [9]
       i)   Optimization of clusters
       ii)  K-medoids
       iii) Evaluation metrics

Q7) a) Write a note on single layer neural networks. [4]
    b) Explain Radial Basis Function networks in detail. [8]
    c) Explain Recurrent Neural Networks and their applications in brief. [5]
                               OR
Q8) a) Explain the concept of back propagation in ANN with an example. [8]
    b) What is a Functional Link Artificial Neural Network (FLANN)? Explain
       its merits over other ANNs. [5]
    c) What is an activation function? Explain with a suitable example. [4]
Total No. of Questions : 8]                              SEAT No. :
PB-2244                                                  [Total No. of Pages : 2
                         [6263]-82
                 B.E. (Computer Engineering)
                    MACHINE LEARNING
          (2019 Pattern) (Semester - VII) (410242)
Time : 2½ Hours]                                         [Max. Marks : 70
Instructions to the candidates :
1) Answer Q.1 or Q.2, Q.3 or Q.4, Q.5 or Q.6, Q.7 or Q.8.
2) Figures to the right side indicate full marks.
3) Draw neat diagrams wherever necessary.
4) Assume suitable data, if necessary.

Q1) a) Define different regression models. [6]
    b) What are the different techniques to reduce underfitting? [6]
    c) The following data shows a company's expenditure. [6]
       x (month)        1    2    3    4    5
       y (expenditure)  12   19   29   37   45
       Using a regression model, predict the expenditure of the 6th month.
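Q1(c) yields to the same least-squares machinery; a short sketch in plain Python:

```python
# Fit y = a + b*x to the expenditure data, then extrapolate to month 6.
xs = [1, 2, 3, 4, 5]
ys = [12, 19, 29, 37, 45]

n = len(xs)
sx, sy = sum(xs), sum(ys)
b = (n * sum(x * y for x, y in zip(xs, ys)) - sx * sy) / (n * sum(x * x for x in xs) - sx ** 2)
a = (sy - b * sx) / n

month6 = a + b * 6
print(f"y = {a:.1f} + {b:.1f}x, month 6 ≈ {month6:.1f}")  # → y = 3.2 + 8.4x, month 6 ≈ 53.6
```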

                               OR
Q2) a) What is the R² measure of evaluation? [6]
    b) What do you mean by the least squares method? Explain the least
       squares method in the context of linear regression. [6]
    c) Write a short note on stochastic gradient descent algorithms. [6]

Q3) a) Why is ensemble learning used in ML? [5]
    b) What are the advantages and disadvantages of K-NN? [6]
    c) What are the different distance metrics used in K-NN? [6]
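The usual K-NN distance metrics asked for in Q3(c) can be written out directly; a small sketch comparing Euclidean, Manhattan and Minkowski distances on one illustrative point pair:

```python
# Common K-NN distance metrics, evaluated on an illustrative point pair.
p, q = (1.0, 2.0, 3.0), (4.0, 6.0, 3.0)

euclidean = sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5   # L2 norm
manhattan = sum(abs(a - b) for a, b in zip(p, q))            # L1 norm

def minkowski(p, q, r):
    """Generalises both: r = 1 gives Manhattan, r = 2 gives Euclidean."""
    return sum(abs(a - b) ** r for a, b in zip(p, q)) ** (1 / r)

print(euclidean, manhattan, minkowski(p, q, 3))  # 5.0, 7.0, ...
```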


                               OR
Q4) a) What is multiclass classification? Explain the variants of multiclass
       classification. [5]
    b) Explain kernel methods which are suitable for SVM. [6]
    c) What are the different techniques used for outlier handling? [6]

Q5) a) Why is K-medoid used? Explain the K-medoid algorithm. [5]
    b) Why is density-based clustering used? Explain any one method. [6]
    c) What is outlier analysis? [6]
                               OR
Q6) a) What is the isolation factor model? [5]
    b) Explain the K-means algorithm. [6]
    c) Explain hierarchical clustering with an example. [6]

Q7) a) What is a multilayer perceptron? Describe with a diagram. [6]
    b) What are the different activation functions used in NN? [6]
    c) Explain Convolutional Neural Networks (CNN) with a suitable
       example. [6]
                               OR
Q8) a) Explain the building blocks of RBF networks. [6]
    b) What is personalized recommendation? What is content-based
       recommendation? [6]
    c) Explain Recurrent Neural Networks with an example. [6]