Assignment 6
TYPE OF QUESTION: MCQ
Number of questions: 10; Total marks: 10 × 1 = 10
Question 1:
With respect to a Dependency Structure, which of the following are valid criteria for a
syntactic relation between a head H and a dependent D in a construction C?
Answer: a, b, d
Solution: Refer to Week 6 Lecture 1.
Question 2:
Consider the sentence: “Zidane scored a brilliant goal”. What is the type of the following
two relations?
a) goal -> brilliant
b) scored -> Zidane
a. Both Endocentric
b. Both Exocentric
c. Exocentric, Endocentric
d. Endocentric, Exocentric
Answer: d
Solution: Check the dependency parse of the sentence using spaCy at
https://spacy.io/usage/linguistic-features#dependency-parse
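A minimal spaCy sketch to check these relations yourself (assuming the en_core_web_sm model is installed; exact labels may vary across model versions):

```python
# Print head -> dependent pairs and relation labels with spaCy.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Zidane scored a brilliant goal")

for token in doc:
    # token.head is the head word, token.dep_ is the relation label
    print(f"{token.head.text:>8} -> {token.text:<10} {token.dep_}")
```

In the parse, 'brilliant' should attach to 'goal' as amod (an endocentric modifier) and 'Zidane' to 'scored' as nsubj (exocentric), which matches option (d).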
Question 3:
a. Connectedness.
b. Acyclicity.
c. Single-head.
d. Projectivity.
Answer: d
Solution: The projectivity constraint does not allow crossing dependencies. Students interested
in non-projective dependency parsing may refer to the paper at the following link:
https://www.seas.upenn.edu/~strctlrn/bib/PDF/nonprojectiveHLT-EMNLP2005.pdf
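As an aside, a minimal sketch of a projectivity check over a head array (the representation and examples below are illustrative, not from the lecture):

```python
# heads[i-1] gives the head of word i; 0 denotes the artificial root.
def is_projective(heads):
    arcs = [(min(h, d), max(h, d)) for d, h in enumerate(heads, start=1)]
    for l1, r1 in arcs:
        for l2, r2 in arcs:
            # two arcs cross iff their spans interleave
            if l1 < l2 < r1 < r2:
                return False
    return True

print(is_projective([2, 0, 4, 2]))  # True: no crossing arcs
print(is_projective([3, 4, 0, 3]))  # False: arcs (1,3) and (2,4) cross
```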
Question 4:
The correct sequence of actions that generates the following parse tree of the sentence “he
sent her a funny meme today” using arc-eager parsing is:
NOTE: Initial configuration -> Stack: [root], Buffer: [entire sentence], Arcs: {}
a. SH -> LA -> RA -> RA -> RE -> SH -> SH -> LA -> LA -> RA -> RE -> RA
b. SH -> LA -> RA -> RA -> SH -> SH -> LA -> LA -> RE -> RA -> RE -> RA
c. SH -> LA -> RA -> RA -> SH -> SH -> LA -> LA -> RE -> RA -> RE -> RA -> RE
d. SH -> LA -> RA -> RA -> SH -> SH -> LA -> LA -> RE -> RA -> RE -> RA -> RE -> RA
Answer: b
Solution: Option (a) fails to generate the edge between ‘a’ and ‘meme’. Option (c) is incorrect
because parsing stops as soon as the buffer becomes empty. Option (d) would have been correct
only if the sentence ended with a full stop and a right arc existed from ‘sent’ to ‘.’.
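A minimal sketch that replays option (b) and prints the resulting (unlabelled) arcs; the simulator and word indexing below are my own illustration of the arc-eager system described in the lecture:

```python
# Replay option (b) on "he sent her a funny meme today".
words = ["root", "he", "sent", "her", "a", "funny", "meme", "today"]
stack, buffer, arcs = [0], list(range(1, len(words))), []

def apply(action):
    if action == "SH":                        # shift: move buffer front onto the stack
        stack.append(buffer.pop(0))
    elif action == "LA":                      # left-arc: buffer front becomes head of stack top
        arcs.append((buffer[0], stack.pop()))
    elif action == "RA":                      # right-arc: stack top becomes head of buffer front
        arcs.append((stack[-1], buffer[0]))
        stack.append(buffer.pop(0))
    elif action == "RE":                      # reduce: pop a stack top that already has a head
        stack.pop()

for a in "SH LA RA RA SH SH LA LA RE RA RE RA".split():
    apply(a)

for head, dep in arcs:
    print(f"{words[head]} -> {words[dep]}")
```

This prints root -> sent, sent -> he, sent -> her, sent -> meme, sent -> today, meme -> a, and meme -> funny, which should match the tree shown in the question.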
Question 5:
The spaCy dependency parser generates the following tree upon parsing the sentence “She
went to the hospital”.
Here, the dependent “She” is connected to the head word “went” (a verb) via an “nsubj”
relation. Further, no word is dependent on “She”.
Obtain the dependency parse of the sentence: “She sells sea shells on the sea shore”.
What is the head word for “shore”? Also, how many words are dependent on “shore”?
a. the, 3
b. sea, 2
c. on, 2
d. on, 3
Answer: c
Solution: In the dependency parse, “shore” is the object of the preposition “on” (pobj), so its head word is “on”. Two words depend on “shore”: “the” (det) and “sea” (compound).
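A minimal spaCy sketch for obtaining this parse (assumes en_core_web_sm is installed; the exact labels can vary with the model version):

```python
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("She sells sea shells on the sea shore")

shore = [t for t in doc if t.text == "shore"][0]
print("head of 'shore':", shore.head.text)                         # expected: on
print("dependents of 'shore':", [t.text for t in shore.children])  # expected: ['the', 'sea']
```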
Question 6:
While learning a classifier for data-driven deterministic parsing, the size of the
feature vector for any configuration depends on:
Answer: d
Solution: Refer to Week 6 Lecture 3.
Question 7:
Assume that you are learning a classifier for data-driven deterministic parsing and that
the sentence ‘Alice met Bob in Houston’ occurs in your training data with its gold-standard parse.
You are also given that 'Alice', ‘Bob’, and ‘Houston’ are NOUNs, ‘met’ is a VERB, and ‘in’
is a PREPOSITION. Obtain the dependency graph for this sentence on your own. Assume
that your features correspond to the following conditions:
Initialize the weights of all your features to 5.0, except that in all of the above cases, you
give a weight of 6.0 to Right-Arc. Use this gold standard parse during online learning.
The sequence of predicted transitions and the final weights after completing one full
iteration of Arc-Eager parsing over this sentence are:
Answer: c
Solution: Refer to Week 6 Lecture 3.
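For reference, a minimal sketch of the kind of online perceptron update used in this setting; the feature function phi is hypothetical, since the feature conditions from the question are not reproduced above:

```python
# One online perceptron step for a transition classifier.
def perceptron_step(weights, config, gold_action, actions, phi):
    # choose the highest-scoring action under the current weights
    predicted = max(actions, key=lambda a: sum(weights.get(f, 0.0) for f in phi(config, a)))
    if predicted != gold_action:
        # reward the gold action's features and penalize the predicted action's
        for f in phi(config, gold_action):
            weights[f] = weights.get(f, 0.0) + 1.0
        for f in phi(config, predicted):
            weights[f] = weights.get(f, 0.0) - 1.0
    return predicted
```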
Question 8:
Let V be the set of nodes and E be the set of directed edges. The running time of the
Chu-Liu-Edmonds algorithm for dense graphs is:
a. O(V²)
b. O(EV)
c. O(E log V)
d. O(log VE)
Answer: a
Solution: Refer to this link https://en.wikipedia.org/wiki/Edmonds%27_algorithm
Question 9:
For the weighted directed graph G below, which of the following are true with respect to
the maximum spanning tree MST(G)?
Answer: a, c
Solution: Solve by applying the Chu-Liu-Edmonds algorithm.
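For checking your work, a minimal sketch that computes a maximum spanning arborescence with networkx; the edge weights below are placeholders, not the weights of the graph G in the question:

```python
import networkx as nx

# Placeholder graph: replace the edges and weights with those from the figure.
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("root", "A", 10), ("root", "B", 5),
    ("A", "B", 8), ("B", "A", 7),
])

mst = nx.maximum_spanning_arborescence(G, attr="weight")
print(sorted(mst.edges()))   # arcs of the maximum spanning tree rooted at "root"
```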
Question 10:
Suppose you are training an MST parser for dependency parsing, and the sentence “Einstein
played violin” occurs in the training set, where ‘Einstein’ and ‘violin’ are NOUNs and ‘played’ is a
VERB. For simplicity, assume that there is only one dependency relation, “rel”. Thus, for
every arc from word wi to wj, your features may be simplified to depend only on the
words wi and wj and not on the relation label.
The feature weights before the start of the iteration are: {10, 15, 8, 18, 2, 6, 20}.
Obtain the gold-standard MST from Question 9. Determine the weights after an iteration
over this example.
Answer: b
Solution: Refer to Week 6 Lecture 5.
Predicted tree: root -> Einstein -> played -> violin.
Gold-standard tree: root -> violin -> played, violin -> Einstein.
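A minimal sketch of the structured-perceptron update behind this solution; the per-arc feature mapping and zero-initialized weights are simplifications, not the seven-feature vector from the question:

```python
from collections import Counter

# w <- w + phi(gold tree) - phi(predicted tree), with one feature per directed arc.
def update(weights, gold_arcs, predicted_arcs, lr=1.0):
    delta = Counter(gold_arcs)
    delta.subtract(Counter(predicted_arcs))
    for arc, count in delta.items():
        weights[arc] = weights.get(arc, 0.0) + lr * count
    return weights

predicted = [("root", "Einstein"), ("Einstein", "played"), ("played", "violin")]
gold      = [("root", "violin"), ("violin", "played"), ("violin", "Einstein")]
print(update({}, gold, predicted))
```

Arcs shared by both trees keep their weights; arcs appearing only in the gold tree gain lr, and arcs appearing only in the predicted tree lose lr.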