
Natural Language Processing

Assignment- 6
TYPE OF QUESTION: MCQ
Number of questions: 10. Total marks: 10 × 1 = 10

Question 1.

With respect to a Dependency Structure, which of the following are valid criteria for a
syntactic relation between a head H and a dependent D in a construction C?

a. The form of D depends on H.
b. H gives semantic specification of C.
c. D selects H and determines whether H is obligatory.
d. D specifies H.

Answer: a, b, d
Solution: Refer to Week 6 Lecture 1.

Question 2:

Consider the sentence: “Zidane scored a brilliant goal”. What is the type of the following
two relations?
a) goal -> brilliant
b) scored -> Zidane

a. Both Endocentric
b. Both Exocentric
c. Exocentric, Endocentric
d. Endocentric, Exocentric

Answer: d
Solution: Check the dependency parse of the sentence using spaCy at
https://spacy.io/usage/linguistic-features#dependency-parse
Question 3:

Which of the following is not a necessary condition for Dependency Parsing?

a. Connectedness.
b. Acyclicity.
c. Single-head.
d. Projectivity.

Answer: d
Solution: The projectivity constraint disallows crossing dependencies, so it is not required for a well-formed dependency graph. Students interested in non-projective dependency parsing may refer to the paper at the following link:
https://www.seas.upenn.edu/~strctlrn/bib/PDF/nonprojectiveHLT-EMNLP2005.pdf

Question 4:

The correct sequence of actions that generates the following parse tree of the sentence “he
sent her a funny meme today” using Arc-Eager parsing is:

NOTE: Initial configuration -> Stack: [root], Buffer: [entire sentence], Arcs: {}

a. SH -> LA -> RA -> RA -> RE -> SH -> SH -> LA -> LA -> RA -> RE -> RA
b. SH -> LA -> RA -> RA -> SH -> SH -> LA -> LA -> RE -> RA -> RE -> RA
c. SH -> LA -> RA -> RA -> SH -> SH -> LA -> LA -> RE -> RA -> RE -> RA -> RE
d. SH -> LA -> RA -> RA -> SH -> SH -> LA -> LA -> RE -> RA -> RE -> RA -> RE -> RA

Answer: b
Solution: Option (a) fails to generate the edge between ‘a’ and ‘meme’. Option (c) is incorrect
because parsing stops as soon as the buffer becomes empty. Option (d) would have been correct
only if the sentence ended with a full stop and a right-arc existed from ‘sent’ to ‘.’.
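
Sequences like these can be checked mechanically. Below is a minimal sketch of the arc-eager transition system (unlabeled arcs, the standard SH/LA/RA/RE moves from the lecture); running option (b)'s sequence produces exactly one arc per word of the sentence.

```python
# Minimal arc-eager transition system (a sketch; unlabeled arcs only).

def arc_eager(words, transitions):
    """Run a transition sequence; return the set of (head, dependent) arcs."""
    stack, buffer, arcs = ["root"], list(words), set()
    head = {}                       # dependent -> head; used to check RE legality
    for t in transitions:
        if t == "SH":               # shift: move buffer front onto the stack
            stack.append(buffer.pop(0))
        elif t == "LA":             # left-arc: buffer front -> stack top, pop stack
            dep = stack.pop()
            arcs.add((buffer[0], dep)); head[dep] = buffer[0]
        elif t == "RA":             # right-arc: stack top -> buffer front, shift
            dep = buffer.pop(0)
            arcs.add((stack[-1], dep)); head[dep] = stack[-1]
            stack.append(dep)
        elif t == "RE":             # reduce: pop stack top, which must have a head
            assert stack[-1] in head, "RE on a word with no head"
            stack.pop()
    return arcs

words = "he sent her a funny meme today".split()
seq_b = ["SH", "LA", "RA", "RA", "SH", "SH", "LA", "LA", "RE", "RA", "RE", "RA"]
print(sorted(arc_eager(words, seq_b)))   # 7 arcs, including ('meme', 'a')
```

Running option (a)'s sequence instead shows it never creates the ‘a’ → ‘meme’ attachment, as the solution notes.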
Question 5:

The spaCy dependency parser generates the following tree upon parsing the sentence “She
went to the hospital”:

TEXT       DEP     HEAD TEXT   HEAD POS   CHILDREN
She        nsubj   went        VERB       []
went       ROOT    went        VERB       [She, to]
to         prep    went        VERB       [hospital]
the        det     hospital    NOUN       []
hospital   pobj    to          ADP        [the]

Here, the dependent “She” is connected with the head word “went” (a verb) with an “nsubj”
relation. Further, no word is dependent on “She”.

Obtain the dependency parse of the sentence: “She sells sea shells on the sea shore”.
What is the head word for “shore”? Also, how many words are dependent on “shore”?

Note: Access the parser at: https://spacy.io/usage/linguistic-features#dependency-parse

a. the, 3
b. sea, 2
c. on, 2
d. on, 3

Answer: c
Solution: In the spaCy parse, “shore” attaches to “on” with a pobj relation, and its dependents
(children) are “the” and “sea”, so the head word is “on” and the number of dependents is 2.
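
The head/children query itself is easy to reproduce without spaCy. The sketch below hand-encodes the head index of each token as spaCy's parser returns it (an assumption; run the linked demo to confirm) and computes the two quantities asked for.

```python
# Head/children lookup over a hand-encoded parse of
# "She sells sea shells on the sea shore" (assumed to match spaCy's output).

tokens = ["She", "sells", "sea", "shells", "on", "the", "sea", "shore"]
heads  = [1,     -1,      3,     1,        1,    7,     7,     4]   # -1 = root

def head_word(i):
    """Return the head word of token i ('ROOT' for the sentence root)."""
    return tokens[heads[i]] if heads[i] >= 0 else "ROOT"

def n_dependents(i):
    """Count tokens whose head is token i."""
    return sum(1 for h in heads if h == i)

i_shore = 7
print(head_word(i_shore), n_dependents(i_shore))   # on 2 -> option (c)
```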

Question 6:

While learning a classifier for data-driven deterministic parsing, the size of the
feature vector for any configuration depends on:

a. Length of the sentence being parsed.
b. No. of distinct POS tags appearing in the sentence.
c. No. of conditions (features) defined.
d. No. of conditions (features) defined and no. of possible oracle transitions.

Answer: d
Solution: Refer to Week 6 Lecture 3.
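
A quick illustration of option (d): with one weight per (condition, transition) pair, the size of the parameter vector is fixed by the feature and transition inventories, not by the sentence. For instance, the 6 conditions of Question 7 combined with the 4 arc-eager transitions give the 24 weights that appear in its answer options.

```python
# One weight per (feature condition, transition) pair: the vector has
# n_conditions * n_transitions entries, regardless of sentence length.

n_conditions = 6                         # conditions 1-6 as in Question 7
transitions = ["SH", "LA", "RA", "RE"]   # arc-eager transition inventory

weights = {(c, t): 5.0
           for c in range(1, n_conditions + 1)
           for t in transitions}
print(len(weights))                      # 24, matching Question 7's options
```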
Question 7:

Assume that you are learning a classifier for data-driven deterministic parsing and that
the sentence ‘Alice met Bob in Houston’ occurs with its gold-standard parse in your training
data. You are also given that ‘Alice’, ‘Bob’, and ‘Houston’ are NOUNs, ‘met’ is a VERB, and
‘in’ is a PREPOSITION. Obtain the dependency graph for this sentence on your own. Assume
that your features correspond to the following conditions:

1. The stack is empty.
2. Top of stack is Noun and top of buffer is Verb.
3. Top of stack is Verb and top of buffer is Noun.
4. Top of stack is Verb and top of buffer is Preposition.
5. Top of stack is Preposition and top of buffer is Noun.
6. Top of stack is Noun and top of buffer is Preposition.

Initialize the weights of all your features to 5.0, except that in all of the above cases, you
give a weight of 6.0 to Right-Arc. Use this gold standard parse during online learning.
The sequence of predicted transitions and the final weights after completing one full
iteration of Arc-Eager parsing over this sentence are:

a. SH->LA->SH->RA->RE->RA->RA ; [5,6,5,5,5,5 | 5,5,6,6,6,5 | 5,5,5,5,5,6 | 6,5,5,5,5,5]
b. RA->RA->RA->RA->RA->RA->RA ; [5,6,5,5,5,5 | 5,5,6,6,6,5 | 5,5,5,5,5,6 | 6,5,5,5,5,5]
c. RA->RA->SH->RA->RA->RA->RA ; [5,6,5,5,5,5 | 5,5,6,6,6,5 | 5,5,5,5,5,6 | 6,5,5,5,5,5]
d. RA->RA->SH->RA->RA->RA->RA ; [6,5,5,5,5,5 | 5,6,6,6,6,5 | 5,5,5,5,5,6 | 6,5,5,5,5,5]

Answer: c
Solution: Refer to Week 6 Lecture 3.

Question 8:

Let V be the set of nodes and E the set of directed edges. The running time of the
Chu-Liu-Edmonds algorithm for dense graphs is:

a. O(V²)
b. O(EV)
c. O(E log V)
d. O(logVE)

Answer: a
Solution: Refer to this link: https://en.wikipedia.org/wiki/Edmonds%27_algorithm
Question 9:

For the below weighted directed graph G, which of the following are true with respect to
the maximum spanning tree, MST(G) ?

a. MST(G) does not contain an edge between “Einstein” and “played”.
b. MST(G): c:5 -> i:12 -> g:7
c. Weight of MST(G) = 26
d. f ∈ MST(G)

Answer: a, c
Solution: Solve by applying Chu-Liu-Edmonds Algorithm
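
Since the graph figure is not reproduced here, the sketch below uses a made-up toy graph, not the graph G of the question. It finds the maximum spanning arborescence by brute force over head assignments, which is feasible only for tiny graphs but is a handy way to check an answer obtained with Chu-Liu-Edmonds (which does the same job in O(V²) for dense graphs).

```python
from itertools import product

# Brute-force maximum spanning arborescence: try every head assignment and
# keep the highest-scoring one that forms a tree rooted at "root".
# The edge weights below are a made-up example, NOT Question 9's graph.

edges = {("root", "B"): 10, ("root", "A"): 9,
         ("A", "B"): 20, ("B", "A"): 30,
         ("A", "C"): 11, ("B", "C"): 3}
nodes = ["root", "A", "B", "C"]

def is_tree(assign, root):
    """A head assignment is a tree iff every node's head chain reaches root."""
    for start in assign:
        seen, cur = set(), start
        while cur != root:
            if cur in seen:                 # hit a cycle
                return False
            seen.add(cur)
            cur = assign[cur]
    return True

best, best_score = None, float("-inf")
non_root = [n for n in nodes if n != "root"]
for heads in product(nodes, repeat=len(non_root)):
    assign = dict(zip(non_root, heads))     # dependent -> head
    if any((h, d) not in edges for d, h in assign.items()):
        continue                            # uses a nonexistent edge
    if not is_tree(assign, "root"):
        continue
    score = sum(edges[(h, d)] for d, h in assign.items())
    if score > best_score:
        best, best_score = assign, score

print(best, best_score)
```

Note that greedily taking each node's best incoming edge here would pick the A↔B cycle (weights 30 and 20); breaking such cycles is exactly what Chu-Liu-Edmonds' contraction step handles.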
Question 10:

Suppose you are training an MST parser for dependency parsing, and the sentence “Einstein
played violin” occurs in the training set, where ‘Einstein’ and ‘violin’ are NOUNs and
‘played’ is a VERB. For simplicity, assume that there is only one dependency relation, “rel”.
Thus, for every arc from word wi to wj, your features may be simplified to depend only on the
words wi and wj and not on the relation label.

Below is the set of features:

● f1: POS(wi) = Noun and POS(wj) = Noun
● f2: POS(wi) = Verb and POS(wj) = Noun
● f3: wi = Root and POS(wj) = Verb
● f4: wi = Root and POS(wj) = Noun
● f5: wi = Root and wj occurs at the end of the sentence
● f6: wi occurs before wj in the sentence
● f7: POS(wi) = Noun and POS(wj) = Verb

The feature weights before the start of the iteration are: {10, 15, 8, 18, 2, 6, 20}.
Obtain the gold-standard MST from Question 9. Determine the weights after an iteration
over this example.

a. [11, 14, 8, 19, 3, 4, 21]
b. [11, 14, 8, 18, 3, 4, 20]
c. [11, 14, 8, 18, 3, 5, 20]
d. [11, 14, 8, 19, 3, 5, 21]

Answer: b
Solution: Refer to Week 6 Lecture 5
Predicted tree: root -> Einstein, Einstein -> played, played -> violin.
Gold-standard tree: root -> violin, violin -> played, violin -> Einstein.
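
The structured-perceptron update w ← w + Φ(gold) − Φ(predicted) can be verified directly. The sketch below assumes, as a convention, that the artificial root counts as preceding every word for feature f6.

```python
# Perceptron update for Question 10: w += phi(gold) - phi(predicted), where
# phi(T) counts how often each feature fires over the arcs (h, d) of tree T.
# Assumption: the artificial root precedes every word for feature f6.

pos = {"Einstein": "Noun", "played": "Verb", "violin": "Noun", "Root": "Root"}
order = {"Root": 0, "Einstein": 1, "played": 2, "violin": 3}
last = "violin"                                # word at the end of the sentence

features = [
    lambda h, d: pos[h] == "Noun" and pos[d] == "Noun",   # f1
    lambda h, d: pos[h] == "Verb" and pos[d] == "Noun",   # f2
    lambda h, d: h == "Root" and pos[d] == "Verb",        # f3
    lambda h, d: h == "Root" and pos[d] == "Noun",        # f4
    lambda h, d: h == "Root" and d == last,               # f5
    lambda h, d: order[h] < order[d],                     # f6
    lambda h, d: pos[h] == "Noun" and pos[d] == "Verb",   # f7
]

def phi(tree):
    return [sum(f(h, d) for h, d in tree) for f in features]

predicted = [("Root", "Einstein"), ("Einstein", "played"), ("played", "violin")]
gold = [("Root", "violin"), ("violin", "played"), ("violin", "Einstein")]

w = [10, 15, 8, 18, 2, 6, 20]
w = [wi + g - p for wi, g, p in zip(w, phi(gold), phi(predicted))]
print(w)                        # [11, 14, 8, 18, 3, 4, 20] -> option (b)
```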
