CS-3-LESSON PLAN

The document outlines the course details for Natural Language Processing (NLP) IT-3035 at KIIT for Spring 2023, including class timings, instructor information, and grading policy. It specifies the topics to be covered, the number of lectures for each topic, and the expected course outcomes for students. Additionally, it lists required textbooks and reference materials for the course.


School of Computer Engineering

Kalinga Institute of Industrial Technology (KIIT)
Deemed to be University
Bhubaneswar-751024

Spring 2023
Natural Language Processing (NLP)-IT-3035 (L-T-P:2-1-0)

Instructor Name: Dr. Mainak Bandyopadhyay
Email: [email protected]

Time Table:-

NLP CS-3 Class Timing
Tuesday (4:00-5:00 PM)
Wednesday (11:00 AM-12:00 PM)
Friday (1:00-2:00 PM)

Course Objectives:- This is a core course, open to 3rd-year B.Tech.
(CSE) students. The objective of the course (IT-3035) is to help
students acquire skills in, and understand the concepts of, Natural
Language Processing (NLP) for various fundamental problems.

Grading Policy:-

 Assignments/quizzes/activities: 30% (in-class: 20%)
 Mid-semester exam: 20% (syllabus up to Hidden Markov Models &
Tagging (tentative), i.e. the 21st class)
 End-semester exam: 50%

We will have 6 short assignments/quizzes/activities over the
semester, one at the end of every topic. All examinations will be
closed notes and closed book. There will be no make-up exams, except
in cases of emergency. Any form of cheating during the tests will not
be tolerated.

Lecture-wise plan:-

Topics to be covered (No. of lectures):
Introduction (8 lectures)
Basic Probability & Information Theory: Introduction to NLP, Main Issues,
Various Types of Ambiguity in NLP, Basics of Probability Theory, Elements of
Information Theory, Language Modeling in General and the Noisy Channel Model,
Smoothing and the EM Algorithm, Basics of Spelling Correction.
Tutorials / Activity
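As a preview of the language-modeling and smoothing material above, the following is a minimal sketch of a bigram language model with add-one (Laplace) smoothing. The toy corpus, function names, and sentence-boundary markers are illustrative assumptions, not course-supplied code.

```python
from collections import Counter

def train_bigram_lm(sentences):
    """Count unigrams and bigrams over tokenized sentences."""
    unigrams, bigrams = Counter(), Counter()
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]  # sentence boundary markers
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def bigram_prob(w_prev, w, unigrams, bigrams):
    """Add-one (Laplace) smoothed bigram probability P(w | w_prev)."""
    V = len(unigrams)  # vocabulary size (including boundary markers)
    return (bigrams[(w_prev, w)] + 1) / (unigrams[w_prev] + V)

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
uni, bi = train_bigram_lm(corpus)
p = bigram_prob("the", "cat", uni, bi)  # (1 + 1) / (2 + 6) = 0.25
```

With smoothing, unseen bigrams such as ("cat", "dog") still receive a small non-zero probability, which is the point of the technique.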
Linguistics (4 lectures)
Phonology and Morphology, Syntax (Phrase Structure vs. Dependency).
Words & Lexicon (6 lectures)
Bilingual Lexicon, Word Boundary, Sentence Boundary, Words and Vectors
(tf-idf).
Naive Bayes and Sentiment Classification (4 lectures)
Naive Bayes Classifiers, Training the Naive Bayes Classifier, Worked
Example, Optimizing for Sentiment Analysis, Naive Bayes for Other Text
Classification Tasks, Naive Bayes as a Language Model, Evaluation:
Precision, Recall, F-measure.
Tutorials / Activity
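The Naive Bayes material above can be sketched in a few lines: a multinomial Naive Bayes classifier with add-one smoothing, applied to toy sentiment data. The training set and function names are illustrative assumptions.

```python
import math
from collections import Counter

def train_nb(docs, labels):
    """Multinomial Naive Bayes training: priors and per-class word counts."""
    classes = set(labels)
    prior = {c: labels.count(c) / len(labels) for c in classes}
    word_counts = {c: Counter() for c in classes}
    for doc, label in zip(docs, labels):
        word_counts[label].update(doc)
    vocab = {w for doc in docs for w in doc}
    return prior, word_counts, vocab

def predict_nb(doc, prior, word_counts, vocab):
    """Pick the class maximizing log P(c) + sum log P(w|c), add-one smoothed."""
    V = len(vocab)
    scores = {}
    for c in prior:
        total = sum(word_counts[c].values())
        score = math.log(prior[c])
        for w in doc:
            if w in vocab:  # ignore unknown words
                score += math.log((word_counts[c][w] + 1) / (total + V))
        scores[c] = score
    return max(scores, key=scores.get)

docs = [["great", "film"], ["awful", "film"], ["great", "acting"]]
labels = ["pos", "neg", "pos"]
prior, counts, vocab = train_nb(docs, labels)
pred = predict_nb(["great", "movie"], prior, counts, vocab)  # "pos"
```

Unknown test words ("movie") are simply skipped, one standard choice among several covered under optimizing Naive Bayes for sentiment.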
Hidden Markov Models & Tagging (10 lectures)
Markov Models, Hidden Markov Models (HMMs), Trellis Algorithm, Viterbi
Algorithm, Estimating the Parameters of HMMs, The Forward-Backward
Algorithm, Implementation Issues, the Task of Tagging, Tag Sets,
Morphology, Lemmatization, Tagging Methods, Manually Designed Rules and
Grammars, Statistical Methods, Various Tagging Methods (Rule-based,
Stochastic, Transformation-based), HMM Tagging, Evaluation Methodology,
Precision, Recall, Accuracy, Maximum Entropy, Various Natural Languages.
Tutorials / Activity
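The Viterbi algorithm listed above can be previewed with a compact sketch for a two-state toy tagger. All probabilities and the N/V tag set are invented for illustration; a real course example would estimate them from a tagged corpus.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most probable hidden state sequence for an observation sequence."""
    # V[t][s] = (best probability of reaching state s at time t, backpointer)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            best_prev = max(states, key=lambda p: V[t - 1][p][0] * trans_p[p][s])
            V[t][s] = (V[t - 1][best_prev][0] * trans_p[best_prev][s]
                       * emit_p[s][obs[t]], best_prev)
    # Backtrace from the best final state.
    last = max(states, key=lambda s: V[-1][s][0])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.insert(0, V[t][path[0]][1])
    return path

states = ("N", "V")
start_p = {"N": 0.8, "V": 0.2}
trans_p = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.6, "V": 0.4}}
emit_p = {"N": {"dog": 0.7, "runs": 0.1}, "V": {"dog": 0.1, "runs": 0.6}}
tags = viterbi(["dog", "runs"], states, start_p, trans_p, emit_p)  # ["N", "V"]
```

In practice the products are replaced by sums of log probabilities to avoid underflow, one of the implementation issues the topic lists.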
Grammars & Parsing Algorithms (8 lectures)
Introduction to Parsing, Generative Grammars, Properties of Regular and
Context-free Grammars, Overview of Non-statistical Parsing Algorithms,
Leftmost Derivation, Rightmost Derivation, Derivation Tree, Ambiguous
Grammars, Simple Top-Down Parser with Backtracking.
Tutorials / Activity
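The simple top-down parser with backtracking named above can be sketched as a recursive-descent recognizer over a context-free grammar. The toy grammar and the representation (nonterminals map to lists of alternative productions; anything else is a terminal) are illustrative assumptions.

```python
def parse(grammar, symbol, tokens, pos=0):
    """Top-down parse with backtracking: return all positions reachable
    after deriving `symbol` starting at `pos` (empty list = failure)."""
    if symbol not in grammar:            # terminal: must match the next token
        if pos < len(tokens) and tokens[pos] == symbol:
            return [pos + 1]
        return []
    results = []
    for production in grammar[symbol]:   # try each alternative (backtracking)
        positions = [pos]
        for sym in production:           # thread positions through the RHS
            positions = [q for p in positions
                         for q in parse(grammar, sym, tokens, p)]
        results.extend(positions)
    return results

grammar = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"], ["V"]],
    "N":  [["dog"], ["cat"]],
    "V":  [["chased"]],
}
tokens = "the dog chased the cat".split()
accepted = len(tokens) in parse(grammar, "S", tokens)  # True
```

Returning all reachable positions rather than the first makes the backtracking explicit; a left-recursive grammar would loop forever, one classic limitation of this parsing strategy.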
Machine Translation (8 lectures)
Statistical Machine Translation (MT), IBM Model 1, IBM Model 2,
Alignment and Parameter Estimation for MT.
Tutorials / Activity
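Parameter estimation for IBM Model 1 can be previewed with a minimal EM sketch that learns lexical translation probabilities t(f|e) from sentence pairs. The toy parallel corpus and the absence of the NULL word are simplifying assumptions for illustration.

```python
from collections import defaultdict

def ibm_model1(corpus, iterations=10):
    """EM training of IBM Model 1 translation probabilities t(f | e)."""
    f_vocab = {f for _, fs in corpus for f in fs}
    t = defaultdict(lambda: 1.0 / len(f_vocab))  # uniform initialization
    for _ in range(iterations):
        count = defaultdict(float)   # expected counts c(e, f)
        total = defaultdict(float)   # expected counts c(e)
        for es, fs in corpus:        # E-step: collect expected alignment counts
            for f in fs:
                z = sum(t[(e, f)] for e in es)   # normalizer for this f
                for e in es:
                    c = t[(e, f)] / z
                    count[(e, f)] += c
                    total[e] += c
        for (e, f) in count:         # M-step: renormalize
            t[(e, f)] = count[(e, f)] / total[e]
    return t

corpus = [(["the", "house"], ["la", "maison"]),
          (["the", "book"], ["le", "livre"]),
          (["a", "house"], ["une", "maison"])]
t = ibm_model1(corpus)
# "maison" co-occurs with "house" twice but with "the" only once,
# so EM drives t("maison" | "house") above t("maison" | "the").
```

Even on three sentence pairs, the co-occurrence signal is enough for EM to separate content-word translations from the ambiguous determiners.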

Course Outcomes:

At the end of the course, the students will be able to:

CO1: Understand the concepts of NLP and its algorithms.
CO2: Evaluate different computing architectures for natural language
processing against various parameters.
CO3: Apply the different modelling and tagging concepts for language
processing.
CO4: Analyze the various grammars & parsing algorithms.
CO5: Explore the role of statistical parsing & machine translation.
CO6: Implement natural language processing in real-life applications.

Practice Problem Sets:-

 The practice questions will come from your textbook and other
reference books.

Text Books:-
1. Speech and Language Processing, Jurafsky, D. and J. H. Martin,
Prentice-Hall.

Reference Books:-
1. Foundations of Statistical Natural Language Processing, Manning,
C. D. and H. Schutze, The MIT Press.
2. Natural Language Understanding, Allen, J., The
Benjamins/Cummings Publishing Company Inc.
3. Elements of Information Theory, Cover, T. M. and J. A. Thomas,
Wiley.
4. Statistical Language Learning, Charniak, E., The MIT Press.
5. Statistical Methods for Speech Recognition, Jelinek, F., The MIT
Press.
