
Deep Learning Course Syllabus Overview

The document outlines the syllabus for a Deep Learning course (MCSE603L) and its corresponding lab (MCSE603P), detailing course objectives, outcomes, and modules covering various aspects of deep learning including neural networks, convolutional networks, recurrent networks, and advanced architectures. It includes a breakdown of lecture and lab hours, recommended textbooks, and evaluation methods. The course aims to equip students with practical skills in implementing and applying deep learning frameworks and models.


Item 67/31 - Annexure - 35

Course Code     Course Title    L T P C
MCSE603L        Deep Learning   2 0 0 2
Pre-requisite   Nil             Syllabus version: 1.0
Course Objectives
1. To introduce major deep neural network frameworks and issues in basic neural
networks
2. To solve real-world applications using deep learning
3. To provide insight into recent deep learning architectures

Course Outcomes
At the end of this course, students will be able to:
1. Understand the methods and terminologies involved in deep neural networks and
differentiate the learning methods used in deep nets.
2. Identify and improve hyperparameters for better deep network performance.
3. Understand and visualize convolutional neural networks for real-world applications.
4. Demonstrate the use of recurrent neural networks and Transformer-based models for
language modeling.
5. Distinguish different types of advanced neural networks.

Module:1 Neural Networks 3 hours


The Neuron – Expressing Linear Perceptrons as Neurons – Feed-Forward Neural Networks
– Linear Neurons and their Limitations – Sigmoid, Tanh and ReLU Functions – Softmax
Output Layers
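The activations and softmax output layer listed above can be sketched in a few lines of NumPy. This is an illustration only, not part of the syllabus; the layer sizes and random weights are arbitrary choices:

```python
import numpy as np

# Activation functions covered in this module (Tanh is np.tanh).
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - z.max())          # subtract max for numerical stability
    return e / e.sum()

# One feed-forward (dense) layer: linear combination, then a nonlinearity.
def dense(x, W, b, activation):
    return activation(W @ x + b)

rng = np.random.default_rng(0)
x = rng.normal(size=3)                           # input features
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)    # hidden layer weights
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)    # output layer weights

hidden = dense(x, W1, b1, relu)                  # hidden layer with ReLU
output = dense(hidden, W2, b2, softmax)          # softmax output layer
print(output)                                    # probabilities summing to 1
```

The softmax output illustrates why it is used as an output layer: its entries are non-negative and sum to one, so they can be read as class probabilities.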
Module:2 Neural Learning 4 hours
Measuring Errors – Gradient Descent – Delta Rule and Learning Rate – Backpropagation –
Stochastic and Minibatch Gradient Descent – Test Sets, Validation Sets and Overfitting –
Preventing Overfitting in Deep Neural Networks – Other Optimization Algorithms: Adagrad,
RMSProp, Adadelta, Adam
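The delta rule with minibatch gradient descent can be sketched as follows, assuming NumPy. The toy target y = 2x + 1, the learning rate, and the batch size are illustrative choices, not values prescribed by the syllabus:

```python
import numpy as np

# Fit y = 2x + 1 with a single linear neuron trained by minibatch
# gradient descent on squared error (the delta rule).
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=200)
y = 2.0 * X + 1.0

w, b, lr = 0.0, 0.0, 0.1
for epoch in range(200):
    idx = rng.permutation(len(X))            # reshuffle each epoch
    for start in range(0, len(X), 32):       # minibatches of 32
        batch = idx[start:start + 32]
        err = (w * X[batch] + b) - y[batch]  # prediction error
        # Delta rule: step against the gradient, scaled by the learning rate.
        w -= lr * (err * X[batch]).mean()
        b -= lr * err.mean()

print(w, b)   # approaches the true parameters 2 and 1
```

Optimizers such as Adagrad, RMSProp, Adadelta and Adam replace the fixed learning rate above with per-parameter, history-dependent step sizes.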
Module:3 Convolutional Neural Networks 5 hours
Neurons in Human Vision – Shortcomings of Feature Selection – Scaling Problem in Vanilla
Deep Neural Networks – Filters and Feature Maps – Description of the Convolutional Layer –
Max Pooling – Convolutional Network Architecture – Image Classification
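Filters, feature maps and max pooling can be written out explicitly in NumPy. A minimal sketch with a toy 6x6 image and a hypothetical horizontal-difference filter (real CNN layers compute the same operation over many filters and channels at once):

```python
import numpy as np

# 2-D "valid" convolution (cross-correlation, as in CNN layers).
def conv2d(image, kernel):
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out

# 2x2 max pooling: keep the largest activation in each 2x2 block.
def maxpool2x2(fmap):
    H, W = fmap.shape
    cropped = fmap[:H // 2 * 2, :W // 2 * 2]
    return cropped.reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))

image = np.arange(36, dtype=float).reshape(6, 6)   # toy 6x6 "image"
edge = np.array([[1.0, -1.0]])                     # horizontal-difference filter
fmap = conv2d(image, edge)        # feature map, shape (6, 5)
pooled = maxpool2x2(fmap)         # shape (3, 2) after pooling
print(fmap.shape, pooled.shape)
```

Pooling shrinks the feature map while keeping the strongest responses, which is one answer to the scaling problem of vanilla deep networks mentioned above.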
Module:4 Pre-Trained Models 3 hours
Self-Supervised Pretraining – AlexNet, VGG, NiN, GoogLeNet, Residual Networks (ResNet),
DenseNet, Region-Based CNNs (R-CNNs) – Transfer Learning – Few-Shot Learning (FSL)
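The transfer-learning recipe, freeze a pretrained feature extractor and train only a new head, can be shown in miniature. In practice the extractor would be a network such as ResNet loaded from a model zoo; here it is a tiny fixed ReLU feature map, purely a stand-in so the sketch is self-contained:

```python
import numpy as np

rng = np.random.default_rng(2)

def features(x):
    # Frozen "pretrained" extractor: two ReLU features of a scalar input.
    return np.stack([np.maximum(0.0, x), np.maximum(0.0, -x)], axis=1)

# Target task: two well-separated classes on the real line.
X = np.concatenate([rng.uniform(-2.5, -1.5, 50), rng.uniform(1.5, 2.5, 50)])
y = np.array([0] * 50 + [1] * 50)

F = features(X)                      # extractor output, never updated
w, b, lr = np.zeros(2), 0.0, 0.5
for _ in range(300):                 # train only the logistic head
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))
    grad = p - y
    w -= lr * F.T @ grad / len(y)
    b -= lr * grad.mean()

pred = (1.0 / (1.0 + np.exp(-(F @ w + b))) > 0.5).astype(int)
acc = (pred == y).mean()
print("head-only training accuracy:", acc)
```

Only `w` and `b` change during training; the extractor's parameters stay frozen, which is what makes transfer learning cheap on small target datasets.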
Module:5 Recurrent Neural Networks 6 hours
Sequence-to-Sequence Modeling – Embeddings – Recurrent Neural Networks (RNN) –
Hidden States – Analyzing Variable-Length Inputs – Tackling the seq2seq Problem – Beam
Search and Global Normalization – Perplexity – Character-Level Language Models –
Modern RNNs: Gated Recurrent Units (GRU), Long Short-Term Memory (LSTM),
Bidirectional LSTM (BLSTM), Deep Recurrent Neural Networks, Bidirectional RNNs
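A character-level language model reduces to an RNN cell that carries a hidden state and a softmax head that predicts the next character; perplexity is the exponential of the average negative log-likelihood per character. A NumPy sketch with untrained random weights and an arbitrary 4-character vocabulary (an untrained model should score close to the vocabulary size):

```python
import numpy as np

rng = np.random.default_rng(3)
vocab, hidden_size = 4, 8                    # tiny vocabulary, e.g. "abcd"

Wxh = rng.normal(scale=0.1, size=(hidden_size, vocab))
Whh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
Why = rng.normal(scale=0.1, size=(vocab, hidden_size))

def step(h, x_onehot):
    # Hidden state summarizes the sequence seen so far.
    h = np.tanh(Wxh @ x_onehot + Whh @ h)
    logits = Why @ h
    e = np.exp(logits - logits.max())
    return h, e / e.sum()                    # next-character distribution

seq = [0, 1, 2, 3, 0, 1]                     # character indices
h = np.zeros(hidden_size)
log_probs = []
for t in range(len(seq) - 1):
    h, p = step(h, np.eye(vocab)[seq[t]])
    log_probs.append(np.log(p[seq[t + 1]]))  # prob of the true next char

# Perplexity: exp of the average negative log-likelihood per character.
perplexity = np.exp(-np.mean(log_probs))
print("perplexity:", perplexity)
```

GRU and LSTM cells replace the plain `tanh` update with gated updates so gradients survive over long sequences; bidirectional variants run a second cell over the sequence in reverse.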
Module:6 Attention Models and Transformers 4 hours
Attention Mechanisms: Attention Cues, Attention Pooling, Scoring Functions, Self-Attention
and Positional Encoding – Bidirectional Encoder Representations from Transformers (BERT)
– Generative Pre-trained Transformers (GPT)
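Scaled dot-product self-attention, the scoring and pooling mechanism underlying BERT and GPT, fits in a few lines. A single-head NumPy sketch with arbitrary dimensions and random weights:

```python
import numpy as np

def softmax(z, axis=-1):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Scaled dot-product self-attention for one sequence (single head).
def self_attention(X, Wq, Wk, Wv):
    Q, K, V = X @ Wq, X @ Wk, X @ Wv          # queries, keys, values
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)           # scoring function
    weights = softmax(scores, axis=-1)        # attention pooling weights
    return weights @ V, weights

rng = np.random.default_rng(4)
seq_len, d_model = 5, 8
X = rng.normal(size=(seq_len, d_model))       # token representations
Wq, Wk, Wv = (rng.normal(scale=0.1, size=(d_model, d_model)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape)                              # one output per token
```

Each row of `weights` is a distribution over all positions, which is why self-attention needs positional encodings: the pooling itself is order-invariant.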
Module:7 Advanced Neural Networks 4 hours
Generative Adversarial Networks: Generator, Discriminator, Training, GAN Variants –
Autoencoders: Architecture, Denoising and Sparsity – DALL-E, DALL-E 2 and Imagen
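The autoencoder idea can be sketched with a linear encoder-decoder pair trained on reconstruction error, assuming NumPy; the 2-D subspace data and hyperparameters are arbitrary. Denoising variants corrupt the input, sparse variants penalize the code, and deep versions use nonlinear layers:

```python
import numpy as np

# Linear autoencoder: 4-D inputs through a 2-D bottleneck and back.
# The data lie in a 2-D subspace, so a 2-D code can reconstruct them.
rng = np.random.default_rng(5)
basis = rng.normal(size=(2, 4))
X = rng.normal(size=(200, 2)) @ basis        # 4-D points on a 2-D plane

W_enc = rng.normal(scale=0.1, size=(4, 2))   # encoder weights
W_dec = rng.normal(scale=0.1, size=(2, 4))   # decoder weights

def mse():
    return float(((X @ W_enc @ W_dec - X) ** 2).mean())

mse_before = mse()
lr = 0.01
for _ in range(3000):
    Z = X @ W_enc                            # bottleneck code
    err = Z @ W_dec - X                      # reconstruction error
    grad_dec = Z.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

mse_after = mse()
print(mse_before, "->", mse_after)
```

A GAN replaces the reconstruction objective with an adversarial one: a generator is trained to fool a discriminator rather than to reproduce its input.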

Proceedings of the 67th Academic Council (08.08.2022) 1695



Module:8 Contemporary Issues 1 hour

Total Lecture Hours: 30

Text Book(s)
1. Fundamentals of Deep Learning, Nikhil Buduma and Nicholas Locascio, O'Reilly,
2017
2. Dive into Deep Learning, Aston Zhang, Zachary C. Lipton, Mu Li, and Alexander J.
Smola (Amazon senior scientists), open-source and free book, March 2022
Reference Books
1. Deep Learning, Ian Goodfellow, Yoshua Bengio and Aaron Courville, MIT Press, 2017
2. Deep Learning: A Practitioner's Approach, Josh Patterson and Adam Gibson, O'Reilly
Media, 2017
Mode of Evaluation: CAT / Written Assignment / Quiz / FAT

Recommended by Board of Studies 26-07-2022


Approved by Academic Council No. 67 Date 08-08-2022




Course Code     Course Title        L T P C
MCSE603P        Deep Learning Lab   0 0 2 1
Pre-requisite   NIL                 Syllabus version: 1.0
Course Objectives
1. To understand deep neural network frameworks and learn to implement them
2. To learn to use pretrained models effectively and use them to build potential solutions
Course Outcomes
At the end of this course, students will be able to:
1. Understand the methods and terminologies involved in deep neural networks and
differentiate the learning methods used in deep nets.
2. Identify and apply suitable deep learning approaches for a given application.
3. Design and develop custom deep nets for human-intuitive applications.
4. Design test procedures to assess the efficiency of the developed model.
5. Apply and evaluate pre-trained models to improve model performance.
Indicative Experiments
1. Python Primer 6 hours
Revisiting Data Preprocessing
Setting up Deep-Learning workstations
Working with different data types and file formats
2. Simple Classification Tasks 4 hours
Working with MNIST – IMDB Datasets
3. Training a CNN from Scratch 6 hours
Using pretrained CNNs
4. Visualizing what CNNs are Learning – Intermediate Activations, Convnet 2 hours
Filters, Heatmaps
5. Exploring Multi-Input, Multi-output Models 2 hours
Hyper-parameter Tuning
6. Language Modeling using RNN 3 hours
Practicing Stacking Layers in Bidirectional RNNs
7. Transfer Learning models for classification problems 2 hours
Exploring the Hugging Face API
8. Text Generation Using LSTM 2 hours
9. Image generation from Text using GAN 3 hours
Total Laboratory Hours 30 hours
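In the spirit of the simple classification exercises above, the core training loop can be sketched as softmax regression trained with gradient descent. This NumPy illustration uses synthetic 2-D blobs as stand-in "images" so it runs without downloading a dataset; in the lab, flattened MNIST images would be fed through the same loop:

```python
import numpy as np

# Three well-separated Gaussian blobs as a toy 3-class dataset.
rng = np.random.default_rng(6)
centers = np.array([[0, 4], [-4, -2], [4, -2]], dtype=float)
X = np.vstack([c + rng.normal(0, 0.7, size=(60, 2)) for c in centers])
y = np.repeat(np.arange(3), 60)

W, b, lr = np.zeros((2, 3)), np.zeros(3), 0.1
onehot = np.eye(3)[y]
for _ in range(300):
    logits = X @ W + b
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    p = e / e.sum(axis=1, keepdims=True)         # softmax probabilities
    grad = (p - onehot) / len(y)                 # cross-entropy gradient
    W -= lr * X.T @ grad
    b -= lr * grad.sum(axis=0)

acc = (np.argmax(X @ W + b, axis=1) == y).mean()
print("train accuracy:", acc)
```

Swapping the linear map `X @ W + b` for a deep network and the hand-written gradient for automatic differentiation turns this into the framework-based versions built in the lab.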
Text Book(s)
1. Deep Learning Step by Step with Python, N. D. Lewis, 2016
2. Neural Networks and Deep Learning, Michael Nielsen, Determination Press
Reference Books
1. Deep Learning: A Practitioner's Approach, Josh Patterson, Adam Gibson, O'Reilly
Media, 2017
2. Applied Deep Learning. A Case-based Approach to Understanding Deep Neural
Networks, Umberto Michelucci, Apress, 2018.
3. Deep Learning with TensorFlow: Explore Neural Networks with Python, Giancarlo
Zaccone, Md. Rezaul Karim and Ahmed Menshawy, Packt Publishing, 2017
Mode of Evaluation: CAT / Mid-Term Lab/ FAT
Recommended by Board of Studies 26-07-2022
Approved by Academic Council No. 67 Date 08-08-2022

