Deep Learning QP
1. Inputs X arrive through the preconnected path.
2. The input is modelled using real weights W. The weights are usually initialised randomly.
3. Calculate the output of every neuron from the input layer, through the hidden layers, to the
output layer.
4. Calculate the error in the outputs:
Error = Actual Output – Desired Output
5. Travel back from the output layer to the hidden layers and adjust the weights so that the
error is decreased (a minimal code sketch of these steps follows the list).
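The following is a minimal numerical sketch of the five steps above, assuming a tiny 2-2-1 network with sigmoid activations; the array names (X, y, W1, W2) and the learning rate are illustrative choices, not part of the question paper.

```python
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0.05, 0.10]])    # step 1: inputs X arrive
y = np.array([[0.01]])          # desired output

W1 = rng.normal(size=(2, 2))    # step 2: weights chosen randomly
W2 = rng.normal(size=(2, 1))
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(1000):
    # step 3: forward pass from input layer through hidden layer to output layer
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # step 4: error in the outputs (actual output minus desired output)
    error = out - y

    # step 5: travel back and adjust the weights so the error decreases
    d_out = error * out * (1 - out)        # gradient at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)     # gradient at the hidden layer
    W2 -= lr * h.T @ d_out
    W1 -= lr * X.T @ d_h

print("final output:", out, "desired:", y)
```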
Why Do We Need Backpropagation?
The most prominent advantages of backpropagation are:
OR
11. (b) Describe any two methods of regularization in deep learning.
https://www.analyticsvidhya.com/blog/2018/04/fundamentals-deep-learning-regularization-techniques/
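For 11. (b), a hedged sketch of two regularization methods covered in the linked article, L2 weight decay and (inverted) dropout; the arrays, the lambda value and the keep probability below are placeholders chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# L2 regularization (weight decay): add lambda * W to the data gradient,
# which pulls the weights toward zero at every update.
W = rng.normal(size=(4, 3))
grad_data = rng.normal(size=(4, 3))      # stand-in for dL/dW from the data term
lam, lr = 1e-2, 0.1
W -= lr * (grad_data + lam * W)

# Dropout (inverted dropout, applied only at training time): randomly silence
# hidden units and rescale so the expected activation is unchanged.
h = rng.normal(size=(1, 8))              # a hidden-layer activation
p_keep = 0.8
mask = (rng.random(h.shape) < p_keep) / p_keep
h_dropped = h * mask
print(h_dropped)
```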
12. (a) Explain how to fix the vanishing gradient problem using ReLU.
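A rough sketch of the point behind 12. (a): the sigmoid derivative is at most 0.25, so gradients shrink as they are multiplied through many layers, whereas the ReLU derivative is 1 for positive inputs; the pre-activation value and the depth below are arbitrary illustrative choices.

```python
import numpy as np

def sigmoid_grad(z):
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1 - s)

def relu_grad(z):
    return float(z > 0)

z, depth = 1.5, 20                                   # arbitrary pre-activation and depth
print("sigmoid chain:", sigmoid_grad(z) ** depth)    # shrinks toward zero
print("ReLU chain:   ", relu_grad(z) ** depth)       # stays at 1 for positive inputs
```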
OR
12. (b) Illustrate gradient descent with Nesterov momentum.
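For 12. (b), a minimal sketch of gradient descent with Nesterov momentum on the toy objective f(x) = x^2, whose gradient is 2x; the momentum coefficient, learning rate and starting point are arbitrary. The defining step is that the gradient is evaluated at the look-ahead point x + mu*v rather than at x.

```python
def grad(x):
    return 2.0 * x                     # gradient of f(x) = x**2

x, v = 5.0, 0.0                        # start far from the minimum at 0
mu, lr = 0.9, 0.1
for step in range(100):
    lookahead = x + mu * v             # peek where the momentum is about to carry x
    v = mu * v - lr * grad(lookahead)  # correct the velocity with the look-ahead gradient
    x = x + v
print("x after 100 steps:", x)         # close to 0
```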
13. (a) Explain transfer learning for multi-class image classification using a deep convolutional
neural network.
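For 13. (a), a hedged sketch of the usual transfer-learning recipe, assuming PyTorch and torchvision (>= 0.13) are available and using an ImageNet-pretrained ResNet-18 as a frozen feature extractor; num_classes, the random batch and all hyper-parameters are placeholders standing in for a real dataset.

```python
import torch
import torch.nn as nn
from torchvision import models

num_classes = 5                                  # hypothetical number of target classes

model = models.resnet18(weights="DEFAULT")       # ImageNet-pretrained backbone
for param in model.parameters():                 # freeze the convolutional features
    param.requires_grad = False

# replace the final fully connected layer with a fresh multi-class head
model.fc = nn.Linear(model.fc.in_features, num_classes)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# one illustrative training step on a random batch (stand-in for real images/labels)
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```

Once the new head converges, it is common to unfreeze some of the later backbone layers and fine-tune them with a smaller learning rate.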
OR
13. (b) Write a short note on layer patterns and layer size patterns.
14. (a) Illustrate the working of Generative Adversarial Networks in detail.
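For 14. (a), a minimal hedged sketch of the GAN training loop: the discriminator D learns to separate real samples from the generator G's fakes, and G is updated to fool D. The tiny MLPs, the one-dimensional "real" data and the hyper-parameters are illustrative assumptions only (PyTorch assumed available).

```python
import torch
import torch.nn as nn

latent_dim = 8
G = nn.Sequential(nn.Linear(latent_dim, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCELoss()

for step in range(200):
    real = torch.randn(32, 1) * 0.5 + 3.0          # "real" data drawn from N(3, 0.5)
    z = torch.randn(32, latent_dim)
    fake = G(z)

    # train the discriminator: label real samples 1 and fakes 0
    opt_d.zero_grad()
    d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
    d_loss.backward()
    opt_d.step()

    # train the generator: make D label the fakes as real
    opt_g.zero_grad()
    g_loss = bce(D(fake), torch.ones(32, 1))
    g_loss.backward()
    opt_g.step()

print("generated mean:", G(torch.randn(256, latent_dim)).mean().item())
```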
OR
14. (b) i. Compare Denoising Autoencoders with Contractive Autoencoders.
ii. Write the similarities and differences between the Gated Recurrent Unit and the RNN. With a
neat sketch, explain the GRU.
https://medium.com/analytics-vidhya/rnn-vs-gru-vs-lstm-863b0b7b1573
https://www.analyticsvidhya.com/blog/2021/03/introduction-to-gated-recurrent-unit-gru/
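For 14. (b) ii, a hedged sketch contrasting a plain RNN cell (a single tanh update) with a GRU cell (reset and update gates), following the standard formulation discussed in the linked posts; the random weight matrices and the input/hidden sizes are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
nx, nh = 3, 4                          # input size and hidden size (arbitrary)
x, h = rng.normal(size=nx), np.zeros(nh)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# plain RNN: one tanh update of the hidden state, no gates
Wx, Wh = rng.normal(size=(nh, nx)), rng.normal(size=(nh, nh))
h_rnn = np.tanh(Wx @ x + Wh @ h)

# GRU: update gate z, reset gate r, candidate state h_tilde
Wz, Uz = rng.normal(size=(nh, nx)), rng.normal(size=(nh, nh))
Wr, Ur = rng.normal(size=(nh, nx)), rng.normal(size=(nh, nh))
Wc, Uc = rng.normal(size=(nh, nx)), rng.normal(size=(nh, nh))

z = sigmoid(Wz @ x + Uz @ h)           # how much of the old state to overwrite
r = sigmoid(Wr @ x + Ur @ h)           # how much of the old state feeds the candidate
h_tilde = np.tanh(Wc @ x + Uc @ (r * h))
h_gru = (1 - z) * h + z * h_tilde      # gated blend of old state and candidate

print(h_rnn, h_gru, sep="\n")
```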
15. (a) Explain SSD and YOLO for object detection. Point out how these approaches differ from
Faster R-CNN, and describe a suitable scenario for each.
https://cv-tricks.com/object-detection/faster-r-cnn-yolo-ssd/
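For 15. (a), a hedged inference-only sketch using the two detector families shipped with torchvision (assumed >= 0.13): SSD as a one-stage detector and Faster R-CNN as a two-stage detector. YOLO is not part of torchvision and is usually run from separate packages, so it is omitted here; the random tensor stands in for a real image.

```python
import torch
from torchvision.models.detection import ssd300_vgg16, fasterrcnn_resnet50_fpn

image = torch.rand(3, 300, 300)                      # placeholder image tensor

for name, builder in [("SSD", ssd300_vgg16), ("Faster R-CNN", fasterrcnn_resnet50_fpn)]:
    model = builder(weights="DEFAULT")               # COCO-pretrained weights
    model.eval()
    with torch.no_grad():
        pred = model([image])[0]                     # dict with boxes, labels, scores
    keep = pred["scores"] > 0.5
    print(name, "detections above 0.5:", int(keep.sum()))
```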
OR
15. (b) Explain how deep learning techniques enhance image processing.
******************