Pustejovsky 2006
James Pustejovsky
Department of Computer Science
Brandeis University, Waltham, MA 02454
[email protected]
Abstract
In this paper, I explore the relation between methods of lexical
representation involving decomposition and the theory of types as
used in linguistics and programming semantics. I identify two major
approaches to lexical decomposition in grammar, what I call paramet-
ric and predicative strategies. I demonstrate how expressions formed
with one technique can be translated into expressions of the other.
I then discuss argument selection within a type theoretic approach
to semantics, and show how the predicative approach to decompo-
sition can be modeled within a type theory with richer selectional
mechanisms. In particular, I show how classic Generative Lexicon
representations and operations can be viewed in terms of types and
selection.
1 Introduction
In this paper, I examine the relation between the type of an argument as
selected by a predicate, and the role this argument subsequently plays in
the computation of the sentence meaning. The thesis that I will put forth
* I would like to thank Nicholas Asher, with whom I have been developing the Type Composition Logic adopted here as the type-theoretic interpretation of GL. I would also like to thank Ray Jackendoff, Jose Castano, Roser Sauri, Patrick Hanks, and Chungmin Lee for useful critical remarks. All errors and misrepresentations are, of course, my own.
is that there is an important connection between the nature of the type that
a predicate selects for as its argument, and the subsequent interpretation
of the predicate in the model. In order to understand this connection, I
explore the logical structure of decomposition as used in linguistic theory.
Two basic models of word meaning are discussed, parametric and predica-
tive decomposition. These are then compared to selection within a rich
type theory.
Type-theoretic selection can be viewed as partial decomposition. The advantage over a full decomposition model such as the predicative one is that, in defining a predicate, one is not forced to identify the distinguishing features (as in Katz and Fodor) in the model. However, the types used as
assignments to the arguments of the predicate are a recognizable and dis-
tinguished subset of possible predications over individuals.
In the first two sections, I explore the relation between methods of
lexical representation involving decomposition and the theory of types
as used in linguistic semantics and programming semantics. I first dis-
tinguish two approaches to lexical decomposition in language, paramet-
ric and predicative decomposition. I demonstrate how expressions formed
with one technique can be translated into expressions of the other. I then
discuss argument selection within a type theoretic approach to semantics,
and show how type theory can be mapped to the predicative approach of
lexical decomposition. I argue that a type theoretic framework results in
an interpretative mechanism that is computationally more tractable than
with either atomic expressions or simple parametric decomposition. In
the final three sections, Generative Lexicon (GL) is illustrated as a con-
strained model of type selection and predicative decomposition. I outline
three basic mechanisms of argument selection for semantic composition,
and demonstrate how these mechanisms interact with the type system in
GL.
the past ten years has focused on the development of type structures and
typed feature structures. The selectional behavior of verbal predicates,
on this view, follows from the type associated with the verb’s arguments.
There is, however, a distinction in the way that verbs select their argu-
ments that has not been noticed, or if it has, has not been exploited for-
mally within linguistic theories; namely, argument structure and decom-
position are intimately connected and typically inversely related to one
another.
Before we examine the various models of lexical decomposition, we
need to address the more general question of what selection in the gram-
mar is, and what exactly the formal nature of an argument is. We begin
by reviewing informally what characteristics may comprise the predica-
tive complex that makes up a verb’s meaning. These include, but are not
limited to:
The question that I wish to address in this paper is the following: which
of these aspects can be abstracted as selectional restrictions to arguments,
and which of these can be abstracted as arguments in their own right? To
answer this question, I will first examine the role that lexical decomposi-
tion plays in the theory of grammar. I will characterize four approaches to
decomposition that have been adopted in the field, and illustrate what as-
sumptions each approach makes regarding selectional restrictions on the
arguments to a verb.
Linguists who do adopt some form of lexical decomposition do not
typically concern themselves with the philosophical consequences of their
enterprise. Still, it is hard to ignore the criticism leveled against the field by
Fodor and LePore (1998), who claim that any model of semantics involving decomposition is without support and leads to the anarchy of conceptual holism. In fact, however, most linguists assume some kind of decompositional structure for the semantic representations associated with lexical items, including, as it happens, Fodor and LePore themselves.1
How do we decompose the meaning of a verb? In order to catego-
rize the various techniques of decomposition, I will assume that a predica-
tive expression such as a verb has both an argument list and a body. This is
schematically illustrated in (2) below.
(2)
λxi [Φ]
(Args: λxi; Body: Φ)
(4) a. λx[die(x)]
b. The flower died.
3. This is the θ-theory in varieties of Chomsky's framework from the 1980s, and the Functional Uniqueness Principle from LFG.
4. For the present discussion, I assume that the subpredicates in the expressions below are related by means of standard first-order logical connectives.
For each of these approaches, the representation adopted for the pred-
icate meaning will have consequences for the subsequent mapping of its
parameters to syntax, namely, the problem of argument realization. To
better illustrate the nature of these strategies, let us consider some exam-
ples of each approach, beginning with parametric decomposition. Within
this approach, the intuitive idea is to motivate additional parameters over
which a relation is evaluated in the model. These can be contextual vari-
ables, parameters identifying properties of the speaker, hearer, presuppo-
sitional information, and other pragmatic or domain specific variables.
Perhaps the most widely adopted case of parametric decomposition is
Davidson’s proposed addition of the event variable to action predicates
in language (Davidson, 1967). Under this proposal, two-place predicates
such as eat and three-place predicates such as give contain an additional
argument, the event variable, e, as depicted below.
(7) a. λyλxλe[eat(e)(y)(x)]
b. λzλyλxλe[give(e)(z)(y)(x)]
In this manner, Davidson is able to capture the appropriate entailments
between propositions involving action and event expressions through the
conventional mechanisms of logical entailment. For example, to capture
the entailments between (8b-d) and (8a) below,
In this example, each more specifically described event entails the one
above it by virtue of and-elimination (conjunctive generalization) on the
expression.
(9) a. ∃e[eat(e, m, the-soup)]
b. ∃e[eat(e, m, the-soup) ∧ with(e, a spoon)]
c. ∃e[eat(e, m, the-soup) ∧ with(e, a spoon) ∧ in(e, the kitchen)]
d. ∃e[eat(e, m, the-soup)∧with(e, a spoon)∧in(e, the kitchen)∧at(e, 3:00pm)]
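The and-elimination pattern in (9) can be sketched computationally: if each logical form is represented as a set of conjuncts over the event variable, entailment by conjunctive generalization is just the subset relation. The encoding below is purely illustrative, not part of the formalism.

```python
# A minimal sketch of the Davidsonian entailment pattern in (9):
# each logical form is a set of conjuncts over an event variable e,
# and and-elimination means a richer description entails any subset
# of its conjuncts. The predicate spellings are illustrative.

def entails(richer, poorer):
    """Conjunction elimination: a description entails any sub-description."""
    return poorer <= richer

a = {("eat", "e", "m", "the-soup")}
b = a | {("with", "e", "a-spoon")}
c = b | {("in", "e", "the-kitchen")}
d = c | {("at", "e", "3:00pm")}

assert entails(d, c) and entails(c, b) and entails(b, a)
assert not entails(a, b)  # the entailment is one-directional
```

Each description in the chain entails the one above it, but not conversely, mirroring the relation between (9d) and (9a).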
There are of course many variants of the introduction of events into pred-
icative forms, including the identification of arguments with specific named
roles (or partial functions, cf. Dowty, 1989, Chierchia, 1989) such as the-
matic relations over the event. Such a move is made in Parsons (1980).5
Within AI and computational linguistics, parameter decomposition has
involved not only the addition of event variables, but of conventional ad-
junct arguments as well. Hobbs et al. (1993), for example, working within
a framework of first-order abductive inference, model verbs of change-of-location such as come and go as directly selecting for the “source” and
“goal” location arguments. As a result, directional movement verbs such
as follow will also incorporate the locations as direction arguments.
Because some parameters are not always expressed, such a theory must
take into consideration the conditions under which the additional param-
eters are expressed. For this reason, we can think of parametric decompo-
sition as requiring both argument identification and argument reduction (or
Skolemization) in the mapping to syntax. That is, something has to ensure
that an argument may be elided or must be expressed.
We turn next to simple predicative decomposition. Perhaps the best known
examples of lexical decomposition in the linguistics literature are the com-
ponential analysis expressions proposed in Katz and Fodor (1963). Under
this strategy, concepts such as bachelor are seen as conjunctions of more
“primitive” features:6
5. The neo-Davidsonian position adopted by Kratzer (1994) does not fall into this category, but rather into the supralexical decomposition category below. Reasons for this will become clear in the discussion that follows.
6. Whether the concept of married is any less complex than that of the definiendum bachelor has, of course, been a matter of some dispute. Cf. Weinreich (1972).
(12) ∀x[bachelor(x) =⇒ [male(x) ∧ adult(x) ∧ ¬married(x)]]
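Read operationally, the decomposition in (12) makes bachelor a conjunction of more primitive feature tests. A minimal sketch, with illustrative feature encodings (the attribute names and the age threshold are our own assumptions, not part of the analysis):

```python
# Predicative decomposition of "bachelor" as in (12): the defined
# predicate is simply a conjunction of more "primitive" features.
# The dictionary encoding and the age cutoff are illustrative only.

def male(x): return x.get("sex") == "male"
def adult(x): return x.get("age", 0) >= 18   # assumed threshold
def married(x): return x.get("married", False)

def bachelor(x):
    # bachelor(x) <-> male(x) & adult(x) & ~married(x)
    return male(x) and adult(x) and not married(x)

assert bachelor({"sex": "male", "age": 30})
assert not bachelor({"sex": "male", "age": 30, "married": True})
```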
Again, using our simple args-body description of the expression, the pred-
icative content in the body of (13) has become more complex, while leaving
the arguments unaffected, both in number and type. The mapping to syn-
tax from a simple predicative decomposition structure can be summarized
as the following relation:
(15) a. kill:
λyλxλe1λe2[act(e1, x, y) ∧ ¬dead(e1, y) ∧ dead(e2, y) ∧ e1 < e2]
b. The gardener killed the flower.
Thus, in the sentence in (18), the external argument along with the seman-
tics of agency and causation are external to the meaning of the verb build.
This view has broad consequences for the theory of selection, but I will not
discuss these issues here as they are peripheral to the current discussion.
3 Types and the Selection of Arguments
Having introduced the basic strategies for semantic decomposition in pred-
icates, we now examine the problem of argument selection. We will dis-
cuss the relation between selection and the elements that are assumed as
part of the type inventory of the compositional semantic system.
In the untyped entity domain of classical type theory as conventionally
adopted in linguistics (e.g., Montague Grammar), determining the condi-
tions under which arguments to a relation or function can “be satisfied” is
part of the interpretation function over the entire expression being eval-
uated. The only constraint or test performed prior to interpretation in
the model is the basic typing carried by a function. For example, to de-
termine the interpretations of both sentence (19a) and (19b), the interpre-
tation function, [[.]]M,g tests all assignments according to g within the model
M.
Hence, our assignment and model will determine the correct valuation
for the proposition in (19a). As it happens, however, there will be no as-
signment that satisfies (19b) in the model. We, of course, as speakers of
language, intuit this result; the model does not express this intuition, yet it still evaluates to the correct answer. The valuation may always be correct (the correct truth-value universally assigned), but the computation required to arrive at this result might be costly and unnecessary: costly because we must evaluate every world within the model with the appropriate assignment function, and unnecessary because the computation could effectively be avoided if our system were designed differently.
This can be accomplished by introducing a larger inventory of types
and imposing strict conditions under which these types are accepted in a
computation. A richer system of types works to effectively introduce the
test of “possible satisfaction” of an argument to a predicate. The types in
the entity domain encode the possible satisfaction of the argument. We
can think of argument typing as a pre-test. If an expression fails to pass
the pre-test imposed by the type, it will not even get interpreted by the
interpretation function.8 This is what we will call a “fail early” selection
strategy. Hence, the domain of interpretation for the expression is reduced
by the type restriction.
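The "fail early" strategy can be sketched as follows: the typing pre-test runs before any interpretation in the model, so an ill-typed argument is rejected without the costly model evaluation ever starting. The lexicon and type labels below are illustrative assumptions.

```python
# A sketch of the "fail early" strategy: the typing pre-test rejects
# an ill-typed argument before the (potentially costly) evaluation in
# the model ever runs. Types and the lexicon here are illustrative.

class SelectionFailure(Exception):
    pass

LEXICON = {"sleep": "animate", "rock": "inanimate", "dog": "animate"}

def apply(pred, required_type, arg):
    arg_type = LEXICON[arg]
    if arg_type != required_type:            # the pre-test
        raise SelectionFailure(
            f"{pred}({arg}): wanted {required_type}, got {arg_type}")
    # only a well-typed expression reaches model interpretation
    return ("interpret-in-model", pred, arg)

assert apply("sleep", "animate", "dog") == ("interpret-in-model", "sleep", "dog")
try:
    apply("sleep", "animate", "rock")
    failed = False
except SelectionFailure:
    failed = True
assert failed  # rejected before model evaluation
```

The domain of interpretation is thereby reduced by the type restriction, exactly as the prose describes.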
In the discussion above, we distinguished the argument list from the
body of the predicate. To better understand what I mean by a “fail early”
strategy of selection, let us examine the computation involved in the in-
terpretation of a set of related propositions. Consider the following sen-
tences.
Imagine tracing the interpretation of each sentence above into our model.
Given a domain, for each sentence the assignment function, g, and the interpretation function, I, result in a valuation of each sentence. What is notable about the sentences in (20) is that the trace for each sentence will share certain computations towards their respective interpretations.
Namely, the argument bound to the subject position in each sentence is
animate. How is this common trace in the interpretation of these predicates represented, if at all, in the grammar?9
Consider the λ-expression for a two-place predicate, Φ, which consists
of the subpredicates Φ1 , . . . , Φk . The variables are typed as individuals,
i.e., e, and the entire expression is therefore a typical first-order relation,
typed as e → (e → t).
(21)
λx2 λx1 [Φ1, . . . , Φk]
(Args: λx2 λx1; Body: Φ1, . . . , Φk)
8. In programming languages, the operation of semantic analysis verifies that the typing assignments associated with expressions are valid. This is essentially done at compile time, as a pre-test, filtering out arguments that would otherwise have the wrong type. In a model that does not perform predicate decomposition to incorporate typing constraints, sentences like (19b) are just false.
9. Regarding argument selection, there are two possible strategies for how the argument accommodates to the typing requirement. Given that the type requirement is a pre-test, the argument expression can fail (strict monomorphic typing) or coerce to the appropriate type (polymorphic typing). We will not discuss coercion in the context of the fail-early strategy in this paper.
A richer typing structure for the arguments would accomplish three things:
(1) it acts to identify specific predicates in the body of the expression that
are characteristic functions of a given argument;
(22)
λx2 λx1 [Φ1, . . . , Φx1, . . . , Φx2, . . . , Φk]
(where Φx1 and Φx2 are the characteristic predicates of x1 and x2, reified as the types τ and σ, respectively)
(2) it factors those characteristic predicates out of the body, as in (23);

(23)
λx2 λx1 [Φ1, . . . , Φk − {Φx1, Φx2}]
and (3) it takes the set of predicates associated with each argument and
reifies them as type restrictions on the λ-expression, i.e., as the types τ and
σ.
(24)
λx2 : σ λx1 : τ [Φ1 , . . . , Φk − {Φx1 , Φx2 }]
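The three steps around (22)-(24) can be sketched as a single transformation: identify each argument's characteristic predicates, subtract them from the body, and reify them as type restrictions on the binders. All names below are schematic placeholders.

```python
# A concrete rendering of the move from (22) to (24): given a body of
# subpredicates and a map from each argument to its characteristic
# predicate (the Phi_x1, Phi_x2 of (22)), subtract those predicates
# from the body and attach them as type restrictions on the binders.

def reify_types(args, body, characteristic):
    """characteristic: argument -> its characteristic predicate."""
    types = {a: characteristic[a] for a in args}               # step (3)
    residual = [p for p in body if p not in characteristic.values()]  # step (2)
    return types, residual

args = ["x1", "x2"]
body = ["Phi1", "animate(x1)", "physical(x2)", "Phik"]
char = {"x1": "animate(x1)", "x2": "physical(x2)"}             # step (1)

types, residual = reify_types(args, body, char)
assert types == {"x1": "animate(x1)", "x2": "physical(x2)"}
assert residual == ["Phi1", "Phik"]   # body minus {Phi_x1, Phi_x2}, as in (23)
```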
(26)
Entity
├── Physical
│   ├── animate
│   └── inanimate
└── Abstract
    ├── Mental
    └── Ideal
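The type tree in (26) can be encoded as a parent map, with subtyping computed as reachability toward the root; a minimal sketch, using only the types shown in (26):

```python
# The type tree in (26) as a parent map; subtyping is reachability
# toward the root. Only the types appearing in (26) are encoded.

PARENT = {
    "Physical": "Entity", "Abstract": "Entity",
    "animate": "Physical", "inanimate": "Physical",
    "Mental": "Abstract", "Ideal": "Abstract",
}

def subtype(t, s):
    """True iff t is s or t lies below s in the tree."""
    while t is not None:
        if t == s:
            return True
        t = PARENT.get(t)
    return False

assert subtype("animate", "Entity")
assert subtype("Mental", "Abstract")
assert not subtype("animate", "Abstract")
```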
APPROACH          TYPE        EXPRESSION
atomic            e → t       λx[sleep(x)]
predicative       e → t       λx[animate(x) ∧ sleep(x)]
enriched typing   anim → t    λx : anim[sleep(x)]
Similar remarks hold for the semantics of nouns, and in particular, the
predicative decomposition of relational nouns (cf. Borschev and Partee,
2001) and agentive nouns (Busa, 1996).
In the remainder of this paper, I will examine in more detail the conse-
quences of enriching the inventory of types. First, however, we examine
what linguistic motivations exist for such a move.
available to that type in the grammar. What this says is that there is a direct
mapping from semantic representations and their types to specific syntac-
tic effects. Specifically, it states that such a mapping must be a property
of semantic categories generally, and not merely selectively. The thesis as
stated may in fact be too strong, and indeed there appear to be areas of
grammar where direct semantic transparency seems to fail (such as the
syntactic realization of mass and count terms cross-linguistically). Never-
theless, I will adopt semantic transparency to help structure our intuitions
regarding the linguistic modeling of types for selection in grammar.
The standard theory of selection in grammar can be viewed as follows.
There is some inventory of types, T , associated with the entities in the do-
main, along with t, a Boolean type. Verbs are analyzed as functional types,
meaning that they are functions from this set of types to t (i.e., employing
a functional type constructor such as →). The selectional constraints im-
posed on the arguments to a verb are inherited from the type associated
with that argument in the functional type that the verb carries. This is generally quite weak; if any further constraints are imposed on the semantics of an argument, they come through some notion of selectional restrictions, construed as presuppositions during interpretation.
The approach taken here differs from the standard theory in two re-
spects. First, we will aim to make the selectional constraints imposed
on a verb’s arguments transparently part of the typing of the verb itself.
This entails enriching the system of types manipulated by the composi-
tional rules of the grammar. Following Pustejovsky (2001), I will assume
the theory of type levels, where a distinction is maintained between natu-
ral, artifactual, and complex types for all major categories in the language.
Secondly, the mechanisms of selection available to the grammar are not
simply the application of a function to its argument (function application,
argument identification, θ-discharge), but involve three type-sensitive op-
erations: type matching, coercion, and accommodation. These will be intro-
duced in subsequent sections.
1989), presupposing the discussion of the problem as presented in Putnam
(1975) and Kripke (1980). Although the problem emerges in a superficial
manner in the semantics and knowledge representation literature (Fell-
baum, 1998), there is surprisingly little discussion of the conceptual under-
pinnings of natural kinds and how this impacts the linguistic expression
of our concepts. This section addresses the linguistic and conceptual con-
sequences of the notion of natural kind. Particularly, I will examine what
it means, from the perspective of linguistic modeling, for the grammar to
make reference to a natural or unnatural kind in the conceptual system.
The world of entities inherited from Montague’s theory of semantics is,
in many respects, a very restricted one. In that model, there is no princi-
pled type-theoretic distinction made between the kinds of things that exist
within the domain of entities. Similarly, the only distinctions made in the
domain of relations pertains mostly to the number of arguments a relation
takes, or the intensional force introduced over an argument (cf. Dowty et
al, 1981, Heim and Kratzer, 1998). Many enrichments and modifications
have been made to this model over the past thirty years, including the ad-
dition of stages and kinds (cf. Carlson, 1977), but interestingly enough, no
extensions have ever been made for modeling natural kinds.
From a linguistic point of view, this might not seem surprising, since
the grammatical behavior of natural kind terms doesn’t noticeably distin-
guish itself from that of other nominal classes. In fact, there has never
been sufficient evidence presented for making such a grammatical distinc-
tion. Consider, for example, the sentences in (29) below. The natural kind
terms dog, man, and bird behave no differently as nominal heads than the
artifactual nouns pet, doctor, and plane.
In this section, however, I discuss three linguistic diagnostics which ap-
pear to motivate a fundamental distinction between natural and unnatural
kinds. These diagnostics are:
(34) a. !This box is large and small.
b. !Your gift is round and square.
While it is true that pianists are humans, this subtyping relation is different
from that with musicians in (37b). We return to this distinction below in
the next section.
While natural kind terms seem to distinguish themselves from other
sortal terms with nominal predicative constructions, the same holds for
certain adjectival predications as well. Consider the adjectival modifica-
tions in (39), with natural kind terms as head.
(39) a. very old gold
b. a new tree
c. a young tiger
d. such a beautiful flower
With the NPs in (40), observe that the adjectives can modify aspects of the
nominal head other than the physical object: blue in (40a) can refer to the
color of the object or the color of the ink; bright in (40b) most likely refers
to the bulb when illuminated; and long in (40c) can refer only to the length
of time a CD will play.12
Turning to the agentive nominal heads in (41), a similar possibility of
dual adjectival modification exists. The adjective old in (41a) can refer to
the individual as a human or the friendship; good in (41b) can refer to
teaching skills or humanity; and beautiful in (41c) can refer to dance tech-
nique or physical attributes.
From this brief examination of the data, it is clear that not all kind
terms are treated equally in nominal predication and adjectival modifi-
cation. As a final diagnostic illustrating grammatical distinctions between
natural and unnatural kind terms, let us consider the selection of NPs in
11. This class of adjectives has been studied extensively. Bouillon (1997) analyzes such constructions as subselective predication of a qualia role in the head. Larson and Cho (2003) provide a more conventional interpretation without the need for decompositional representations.
12. In both (40b) and (40c), interpretations are possible with modification over the object, but they are semantically marked with bright and contradictory with long.
type coercive contexts. Verbs that select for multiple syntactic frames for
the same argument can be viewed as polymorphic predicates. In Puste-
jovsky (1993, 1995), it is argued that predicates such as believe and enjoy, as
well as aspectual verbs such as begin and finish can coerce their arguments
to the type they require. For example, consider the verb-object pairs in
(42)-(43):
(42) a. Mary enjoyed drinking her beer.
b. Mary enjoyed her beer.
(43) a. John began to write his thesis.
b. John began writing his thesis.
c. John began his thesis.
Although the syntactic form for each sentence is distinct, the semantic type
selected for by enjoy and begin, respectively, remains the same. For the
readings in (42b) and (43c), following Pustejovsky (1995), we assume that
the NP has undergone a type coercion operation to the type selected by
the verb. For example, in (43c), the coercion “wraps” the meaning of the
NP “his thesis” with a controlled event predicate, in this case defaulting
to “writing”.
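The coercion in (42b) and (43c) can be sketched as a wrapping operation that consults the noun's qualia for a default controlled-event predicate. The qualia values below, and the preference for the AGENTIVE over the TELIC default, are illustrative assumptions rather than claims of the theory.

```python
# Sketch of the type coercion in (42b)/(43c): when an event-selecting
# verb receives an entity-typed NP, the NP is wrapped with a controlled
# event predicate defaulted from its qualia. Qualia values and the
# AGENTIVE-before-TELIC preference are illustrative assumptions.

QUALIA = {"thesis": {"AGENTIVE": "write", "TELIC": "read"},
          "beer":   {"TELIC": "drink"}}

def coerce_to_event(verb, np):
    q = QUALIA.get(np, {})
    default = q.get("AGENTIVE") or q.get("TELIC")
    if default is None:   # natural kinds carry no default, cf. (44)
        raise ValueError(f"no default event to wrap {np!r}")
    return f"{verb}({default}({np}))"

assert coerce_to_event("begin", "thesis") == "begin(write(thesis))"
assert coerce_to_event("enjoy", "beer") == "enjoy(drink(beer))"
try:
    coerce_to_event("begin", "tiger")   # no qualia default: coercion fails
    wrapped = True
except ValueError:
    wrapped = False
assert not wrapped
```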
What is interesting to note is that artifactual nouns seem to carry their
own default interpretation in coercive contexts. This property is com-
pletely absent with natural kind terms, however, as shown below.
(44) a. !John finished the tree.
b. !Mary began a tiger.
There are, of course, legitimate readings for each of these sentences, but
the interpretations are completely dependent on a specific context. Unlike
in the coercions above, natural kinds such as tree and tiger carry no prior
information to suggest how they would be “wrapped” in such a context.
In sum, we have discovered three grammatical diagnostics distinguish-
ing natural kind terms from non-natural kind terms. They are:
(45) a. Nominal Predication: How the common noun behaves predica-
tively;
b. Adjectival Predication: How adjectives modifying the common
noun can be interpreted;
c. Interpretation in Coercive Contexts: How NPs with the common
noun are interpreted in coercive environments.
Given this evidence, it would appear that natural kinds should be typed
distinctly from the class of non-naturals in language. The latter, however,
is itself heterogeneous, and deserves further examination. As explored
in Pustejovsky (2001), there are specific and identifiable diagnostics in-
dicating that the class of non-natural entities divides broadly into two
classes, what I call artifactual types and complex types. Because this distinc-
tion largely mirrors that made in Pustejovsky (1995) between unified and
complex types, I will not review the linguistic motivations in this paper.
In the next section, I show how the representations and mechanisms of
Generative Lexicon (GL) theory can account for these distinctions. These
facts can be accounted for by establishing a fundamental distinction be-
tween natural types and non-natural types within our model. We first
review the basics of GL and then present our analysis.
(46) a. LEXICAL TYPING STRUCTURE: giving an explicit type for a word
positioned within a type system for the language;
b. ARGUMENT STRUCTURE: specifying the number and nature of
the arguments to a predicate;
c. EVENT STRUCTURE: defining the event type of the expression and
any subeventual structure it may have;
d. QUALIA STRUCTURE: a structural differentiation of the predica-
tive force for a lexical item.
(47)
[ α
  ARGSTR   = [ ARG1 = x, ... ]
  EVENTSTR = [ E1 = e1, ... ]
  QUALIA   = [ CONST    = what x is made of
               FORMAL   = what x is
               TELIC    = function of x
               AGENTIVE = how x came into being ] ]
It is perhaps useful to analyze the above data structure in terms of the args-
body schema discussed in previous sections. The argument structure (AS)
captures the participants in the predicate, while the event structure (ES)
captures the predicate as an event or event complex of a particular sort
(Pustejovsky, 2001). The body is composed primarily of the qualia struc-
ture together with temporal constraints on the interpretation of the qualia
values, imposed by event structure. This is illustrated schematically be-
low, where QS denotes the qualia structure, and C denotes the constraints
imposed from event structure.
(48)
λxn . . . λx1 λem . . . λe1 [Q1 ∧ Q2 ∧ Q3 ∧ Q4; C]
(Args: AS = λxn . . . λx1 and ES = λem . . . λe1; Body: QS = Q1 ∧ Q2 ∧ Q3 ∧ Q4, constrained by C)
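The args-body organization of a GL entry can be mirrored in a plain record type; a sketch, where the field names follow the four qualia roles and the entry for knife is invented purely as illustration:

```python
# The typed feature structure of a GL lexical entry, rendered as a
# plain dataclass. Field names follow the four qualia roles; the
# string values and the "knife" entry are illustrative only.

from dataclasses import dataclass, field

@dataclass
class Qualia:
    const: str = ""      # what x is made of
    formal: str = ""     # what x is
    telic: str = ""      # function of x
    agentive: str = ""   # how x came into being

@dataclass
class LexicalEntry:
    name: str
    args: list           # ARGSTR: the participants in the predicate
    events: list         # EVENTSTR: e1, ...
    qualia: Qualia = field(default_factory=Qualia)

knife = LexicalEntry(
    name="knife", args=["x"], events=[],
    qualia=Qualia(formal="artifact_tool(x)",
                  telic="cut(e, x, y)",
                  agentive="make(e, z, x)"))

assert knife.qualia.telic.startswith("cut")
```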
Given this brief introduction to GL, let us return to the problem of argu-
ment selection. I propose that the selection phenomena can be accounted
for by both enriching the system of types and the mechanisms of com-
position. I will propose three mechanisms at work in the selection of an
argument by a predicative expression. These are:
The level of a type will be modeled by its structure, following Asher
and Pustejovsky’s (2001, 2005) Type Composition Logic. The set of types is
defined in (51) below.
The additional qualia values can be seen as structural complementa-
tion to the head type. Each quale value will be introduced by a tensor
operator, ⊗. To differentiate the qualia roles, we will subscript the opera-
tor accordingly; e.g., [TELIC = τ ] can be expressed as ⊗T τ , [AGENTIVE = σ]
can be expressed as ⊗A σ.
Now the feature structure for the expression α from (52) can be repre-
sented as a single composite type, as in (53), or written linearly, as β ⊗T
τ ⊗A σ.
(53) α : β ⊗T τ ⊗A σ
These will be our atomic types, from which we will construct our ⊗-types
and •-types (artifactual and complex types, respectively).
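The linear notation β ⊗T τ ⊗A σ suggests a simple constructor for ⊗-types: a head type plus optional tensored TELIC and AGENTIVE values. The encoding below is our own representation, not the paper's formalism:

```python
# The linear notation beta ⊗_T tau ⊗_A sigma from (53), sketched as a
# small constructor: an artifactual type is a head type plus tensored
# Telic / Agentive values. The tuple encoding is illustrative.

def tensor(head, telic=None, agentive=None):
    t = [head]
    if telic:
        t.append(("T", telic))     # ⊗_T telic
    if agentive:
        t.append(("A", agentive))  # ⊗_A agentive
    return tuple(t)

beer = tensor("liquid", telic="drink")
knife = tensor("phys", telic="cut", agentive="make")

assert beer == ("liquid", ("T", "drink"))
assert knife == ("phys", ("T", "cut"), ("A", "make"))
```

A natural type, on this encoding, is just the bare head with no tensored values.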
We will assume that the natural entity types, N, are just those entities formed from the Formal qualia value, i.e., the atomic types. The natural types are formally structured as a join semi-lattice (Pustejovsky, 2001), ⟨N, ⊑⟩ (cf. the structure in (26)).
Now consider the predicates that select for just these natural types.
Once natural type entities have been defined, we are in a position to de-
fine the natural predicates and relations that correspond to these types.
The creation of functions over the sub-domain of natural types follows
conventional functional typing assumptions: for any type τ in the sub-
domain of natural types, τ ∈ N , τ → t is a natural functional type.
First, let us review some notation. I assume a typing judgment, g ⊢ α : τ, with respect to a grammar to be an assignment, g, an expression, α, and a type, τ, such that under assignment g, the expression α has type τ. In the
case of the natural types, I will also assume the following equivalence:
(55) g ⊢ x : τ ∈ N =df g ⊢ x : eN
Hence, all of the predicates below are considered natural predicates, since
each is a functional type created over the sub-domain of natural entities.13
(56) a. die: eN → t
b. touch: eN → (eN → t)
c. be under: eN → (eN → t)
13. It is interesting to compare this to Anscombe’s (1958) discussion and Searle’s (1995) extension regarding “brute facts” as opposed to “institutional facts.” The natural predication
of a property over a natural entity is a judgment requiring no institutional context or
background. Facts (or at least judgments) can be classified according to the kinds of par-
ticipant they contain; in fact, as we shall see, the qualia and the principle of type ordering
will allow us to enrich this “fact classification” even further.
typist, water, and wine, all have distinct qualia structures reflecting their
conceptual distinctions. This has always been at the core of GL’s view of
lexical organization. What I wish to do here is demonstrate how these
differences are accounted for directly in terms of the structural typing in-
troduced above.
In the previous section, natural entities and natural functions were de-
fined as the atomic types, involving no ⊗- or •-constructor syntax. Arti-
factual objects, that is, entities with some function, purpose, or identified
origin, can now be constructed from the tensor constructor and a specific
value for the TELIC or AGENTIVE role. I will adopt the term artifact, in a
broad sense, to refer to artifactually constructed objects, or natural objects
that have been assigned or assume some function or use.14 Following the
discussion above, then, composing a natural entity type, eN , with a Telic
value by use of the ⊗-constructor results in what we will call an artifactual
type.15
As it stands, the definition in (58) is not general enough to model the set
of all artifacts and concepts with function or purpose. As argued in Puste-
jovsky (1995), the head type (the FORMAL quale role) need not be an atomic
type (natural), but can be arbitrarily complex itself. As a result, we will
broaden the type for the head to include artifactual types as well:
14. Dipert makes a similar move in his 1993 book Artifacts, Art Works, and Agency.
15. The judgments expressed by the predication of an artifactual predicate of an artifactual subject result in an artifactual proposition. This is formally similar to Searle's notion of institutional fact.
(60) ARTIFACTUAL T YPE (Final Version): For an expression α, whose
head type, β ∈ N ∪A, and any functional type γ, the ⊗R -construction
type, β ⊗R γ, is in the sub-domain of artifactual types, A.
As with the naturals, the creation of functions over the sub-domain of artifactual types is straightforward: for any type τ in the sub-domain of artifactual entity types, τ ∈ A, τ → t is an artifactual functional type. Below are some examples of such functional types, expressed as λ-expressions with typed arguments:
(63) C OMPLEX T YPE: For any entity types α, β ∈ N ∪A, the •-construction
type, α • β, is in the sub-domain of complex types, C.
Creating functions over the sub-domain of complex types is similarly
straightforward: for any type τ in the sub-domain of complex entity types,
τ ∈ C, τ → t is a complex functional type. Below is an example of the verb
read, a complex functional type, since it selects a complex type as its direct
object.
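The typing of read can be caricatured as a curried function whose outer argument carries a dot-type check. This is only an illustrative sketch of λy : phys • info λx : eN [read(x,y)], with types rendered as plain strings:

```python
# λy : phys • info . λx : e_N . read(x, y), with the dot-type check made explicit;
# types are plain strings here purely for illustration
def read_fn(y, y_type):
    if y_type != "phys • info":
        raise TypeError("read selects a complex (phys • info) object")
    return lambda x: f"read({x},{y})"

# pure selection: the book is already typed phys • info
assert read_fn("book", "phys • info")("mary") == "read(mary,book)"
```

A simple (natural or artifactual) argument type would fail the guard, which is precisely the situation the coercion mechanisms of the next section are designed to repair.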
How exactly this is accomplished we will explain below. In the next sec-
tion, we turn finally to the mechanisms of selection at work in ensuring
that predicates and their arguments are compatible in semantic composi-
tion.
6 Mechanisms of Selection
In this section, we examine the compositional processes at work in com-
municating the selectional specification of a predicate to its arguments. In
particular, we analyze domain-preserving selection between a predicate
and its arguments. As a result, we will not discuss type-shifting rules
across domains, such as the classic type coercion rules invoked in aspec-
tual and experiencer verb complementation contexts (e.g., enjoy the beer,
finish the coffee). How these operations are analyzed in terms of the com-
positional mechanisms presented here is described elsewhere (cf. Puste-
jovsky, 2006).
There are three basic mechanisms available in the grammar for medi-
ating the information required by a predicate, F , and that presented by
the predicate’s argument. For a predicate selecting an argument of type σ,
[ ]σ F, the following operations are possible:
(66) a. PURE SELECTION: The type a function requires of its argument,
A, is directly satisfied by that argument’s typing:
[Aα ]α F
b. ACCOMMODATION: The type the function requires is compatible
with the typing of the argument:
[Aβ ]α F, α ⊓ β ≠ ⊥
c. TYPE COERCION: The type the function requires is imposed on
the argument’s typing, by either:
i. Exploitation: taking a subcomponent of the argument’s type
to satisfy the function’s typing:
[Aατ ]β F, α ⊑ β
ii. Introduction: wrapping the argument with the type the func-
tion requires:
[Aα ]βσ F, α ⊑ β
The table below illustrates what operations are available in which selec-
tional contexts. Obviously, pure selection is only possible when both the
type selected and the argument type match exactly. Also, accommodation is
operative only within the same type domain.
The remaining cases are varieties of coercion: exploitation is present
when a subcomponent of the argument’s type is accessed; and introduction
is operative when the selecting type is richer than the type of its argument.[16]
[16] It might be possible to view pure selection as incorporating the accommodation rule as well, which would result in a more symmetric distribution of behavior in the table. Whether this is computationally desirable, however, is still unclear.
(67)
                            Type Selected
     Argument Type    Natural     Artifactual    Complex
     Natural          Sel/Acc     Intro          Intro
     Artifactual      Exploit     Sel/Acc        Intro
     Complex          Exploit     Exploit        Sel/Acc
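Every cell of the table above follows from a single ordering of the three domains: same domain gives selection/accommodation, a richer argument is exploited, and a richer selected type triggers introduction. The toy function below (all names illustrative) reproduces the table:

```python
# Domains ordered by richness: Natural < Artifactual < Complex
ORDER = ["Natural", "Artifactual", "Complex"]

def mechanism(argument_domain, selected_domain):
    a = ORDER.index(argument_domain)
    s = ORDER.index(selected_domain)
    if a == s:
        return "Sel/Acc"   # same domain: pure selection or accommodation
    # richer argument: exploit a subcomponent;
    # richer selected type: introduce (wrap) the argument
    return "Exploit" if a > s else "Intro"

assert mechanism("Natural", "Artifactual") == "Intro"      # e.g. the water spoiled
assert mechanism("Complex", "Natural") == "Exploit"        # e.g. the book fell
assert mechanism("Artifactual", "Artifactual") == "Sel/Acc"
```

Collapsing the table to an ordering makes explicit the observation in the text that accommodation operates only within a domain, while coercion always crosses one.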
To better understand the interactions between these operations, let us
walk through some examples illustrating each of these selectional opera-
tions. We start with the set of predicates selecting for a natural type ar-
gument. Consider the intransitive verb fall, as it appears with natural,
artifactual, and complex arguments, respectively. The typing on the head
noun for each example is given in parentheses.
(69) [S [NP the rock : phys ] [VP fell : λx : eN [fall(x)] ]] (selected type: phys)
For the second and third examples, exploitation applies to provide access
to the physical manifestation of the type appearing in the argument position.
Below is the derivation for (68c); the exploitation in (68b) is similarly
derived.[17]
[17] Exploitation on the info element of the dot object for book occurs in examples such as (i) below:
(i) I don’t believe this book at all.
Here the verb is selecting for propositional content, which is present by exploitation in the dot object of the direct object.
(70) [S [NP the book : phys • info ] [VP fell : λx : eN [fall(x)] ]] (selected type: phys)
Consider first the case of pure selection in (71b). Here the predicate is
selecting for an artifactual entity as subject, and the NP present is typed as
one. Hence, the typing requirement is satisfied.
(72) [S [NP the food : phys ⊗T eat ] [VP spoiled : λx : σ ⊗T τ [spoil(x)] ]] (selected type: σ ⊗T τ)
(73) [S [NP the water : liquid ] [VP spoiled : λx : σ ⊗T τ [spoil(x)] ]] (selected type: σ ⊗T τ)
In this case, sentence (74c) is the example of pure selection. The predicate
read requires a dot object of type phys • info as its direct object, and the NP
present, the book, satisfies this typing directly. This is shown in (75) below,
where p • i abbreviates the type phys • info.
(75) [VP read : λy : p • i λx : eN [read(x,y)] [NP the book : phys • info ]] (selected type: phys • info)
For all of the other cases, (74a), (74a’), and (74b), the NP in direct object
position is wrapped with the intended type by the rule of Introduction, as
shown below for sentence (74a).
(76) [VP read : λy : p • i λx : eN [read(x,y)] [NP the rumor : info ]] (selected type: phys • info)
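The wrapping step in (76) can be caricatured as a guarded coercion: the argument's simple type is promoted to the selected dot type when it matches one of its components. This is only an illustrative sketch with invented names, approximating the α ⊑ β condition by a component check:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Dot:
    # a complex (dot) type α • β
    left: str
    right: str

def introduce(arg_type, selected):
    """Wrap an argument's simple type with the selected dot type,
    provided the argument type is one of its components."""
    if arg_type in (selected.left, selected.right):
        return selected
    raise TypeError(f"cannot wrap {arg_type} as {selected}")

p_i = Dot("phys", "info")
# "read the rumor": info is wrapped to phys • info
assert introduce("info", p_i) == p_i
```

An argument sharing no component with the selected type would raise, reflecting the fact that Introduction is not an unconstrained repair strategy.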
7 Conclusion
In this paper, I have examined the relationship between decomposition
and argument typing in semantics. What emerges from the interplay of
these two formal strategies is a clearer understanding of some of the mech-
anisms of compositionality in language. I outlined a model of argument
selection for natural language involving two major components: a three-
level type system consisting of natural, artifactual, and complex types;
and three compositional mechanisms for mediating the type required by
a predicate and the type present in the argument. These are: pure selec-
tion (matching), accommodation, and coercion. There are two kinds of
coercion, exploitation and introduction, and we illustrated each of these
operations at work in the syntax.
8 References
Anscombe, G.E.M. 1958. “On Brute Facts”. Analysis 18: 69-72.
Asher, N. and Pustejovsky, J. (2005). “Word Meaning and Commonsense
Metaphysics”. ms. Brandeis University and University of Texas.
Asher, N. and Pustejovsky, J. (this volume). “A Type Composition Logic
for Generative Lexicon.”
Borschev, V. and B. Partee. 2001. “Genitive modifiers, sorts, and metonymy,”
Nordic Journal of Linguistics 24(2):140-160.
Bouillon, P. (1997). Polymorphie et sémantique lexicale : le cas des adjectifs,
Lille: Presses Universitaires du Septentrion.
Busa, F. (1996). Compositionality and the Semantics of Nominals, Ph.D. Dis-
sertation, Brandeis University.
Busa, F., Calzolari, N., Lenci, A. and Pustejovsky, J. (1999). Building a
Semantic Lexicon: Structuring and Generating Concepts. Proceedings of
IWCS-III, Tilburg, The Netherlands.
Chierchia, G. 1989. “Structured Meanings, Thematic Roles, and Control,”
in G. Chierchia, B. Partee, and R. Turner, eds., Properties, Types, and Mean-
ing, vol. 2, Kluwer Academic Publishers, Dordrecht, The Netherlands, 131-
166.
Copestake, A. and T. Briscoe 1992. “Lexical Operations in a Unification-
based Framework,” in J. Pustejovsky and S. Bergler, eds., Lexical Semantics
and Knowledge Representation, Springer Verlag, Berlin.
Corblin, F. (2003) “Presuppositions and commitment stores”, in Diabruck,
7th Workshop on the Semantics and the Pragmatics of Dialogue, 2003.
Cruse, D. A. 1986. Lexical Semantics, Cambridge University Press.
Davis, A., and J.-P. Koenig (2000). “Linking as constraints on word classes
in a hierarchical lexicon,” Language, 76.
Dipert, R. R. 1993. Artifacts, Art Works, and Agency, Philadelphia: Temple
University Press.
Dowty, D. R. (1979). Word Meaning and Montague Grammar, Dordrecht:
Kluwer Academic Publishers.
Dowty, D. R. 1989. “On the Semantic Content of the Notion ‘Thematic
Role’,” in G. Chierchia, B. Partee, and R. Turner, eds. Properties, Types, and
Meaning, Vol. 2, Semantic Issues, Dordrecht.
Fodor, J. (1998) Concepts, Oxford University Press.
Fodor, J. and Lepore, E. (1998). “The Emptiness of the Lexicon: Critical
Reflections on J. Pustejovsky’s The Generative Lexicon,” Linguistic Inquiry 29:269-288.
Gazdar, G. , E. Klein, G. Pullum, and I. Sag. 1985. Generalized Phrase Struc-
ture Grammar, Harvard University Press.
Goldberg, A. (1995) Constructions: A Construction Grammar Approach to Ar-
gument Structure, University of Chicago Press.
Gupta, A. (1980). The Logic of Common Nouns, Yale University Press, New
Haven.
Heim, I. and A. Kratzer. 1998. Semantics in Generative Grammar, Blackwell.
Hobbs, J., M. Stickel, P. Martin, D. Edwards. 1993. “Interpretation as Ab-
duction,” Artificial Intelligence 63:69-142.
Jackendoff, R. (2002) Foundations of Language, Oxford University Press, Ox-
ford.
Kamp, H. and B. Partee. 1995. “Prototype Theory and Compositionality,”
Cognition 57(2):129-191.
Katz, J. and J. Fodor. 1963. “The Structure of a Semantic Theory,” Language
39:170-210.
Kratzer, A. (1994). “On external arguments,” in Functional Projections, eds.
Jeffrey Runner and Elena Benedicto, 103-129. Amherst, MA: GLSA.
Lakoff, G. 1965/1970. Irregularity in Syntax. Holt, Rinehart, and Winston.
Larson, R. and S. Cho. 2003. “Temporal Adjectives and the Structure of
Possessive DPs”. Natural Language Semantics 11:3.
Levin, B. and M. Rappaport Hovav. 1995. Unaccusatives: At the Syntax-
Lexical Semantics Interface, MIT Press, Cambridge, MA.
Levin, B. and M. Rappaport, (2005) Argument Realization, Cambridge: Cam-
bridge University Press.
Lyons, J. (1968) Introduction to Theoretical Linguistics, Cambridge University
Press.
McCawley, J. D. 1968. “The Role of Semantics in a Grammar,” in E. Bach
and R. T. Harms (eds.), Universals in Linguistic Theory, Holt, Rinehart, and
Winston, New York, NY.
Marantz, A.P. (1984) On the Nature of Grammatical Relations, MIT Press,
Cambridge, MA.
Moravcsik, J. M. 1975. “Aitia as Generative Factor in Aristotle’s Philoso-
phy,” Dialogue 14:622-36.
Parsons, T. (1990) Events in the Semantics of English, MIT Press, Cambridge,
MA.
Partee, B. 1992. “Syntactic Categories and Semantic Type,” in M. Rosner
and R. Johnson (eds.), Computational Linguistics and Formal Semantics, Cambridge
University Press.
Partee, B. and V. Borschev. 2003. “Genitives, relational nouns, and argument-
modifier ambiguity,” in E. Lang, C. Maienborn, and C. Fabricius-Hansen
(eds.), Modifying Adjuncts, Interface Explorations, Mouton de Gruyter.
Pinker, S. 1989. Learnability and Cognition: The Acquisition of Argument
Structure, MIT Press, Cambridge, MA.
Pollard, C. and I. Sag. 1994. Head-Driven Phrase Structure Grammar, Uni-
versity of Chicago Press and Stanford CSLI, Chicago.
Pustejovsky, J. 1991a. “The Generative Lexicon,” Computational Linguistics,
17:409-441.
Pustejovsky, J. (1995). The Generative Lexicon, Cambridge, MA: MIT Press.
Pustejovsky, J. (2001) “Type Construction and the Logic of Concepts”, in
P. Bouillon and F. Busa (eds.) The Syntax of Word Meaning, Cambridge
University Press, Cambridge.
Pustejovsky, J. (2006). Language Meaning and The Logic of Concepts, MIT
Press.
Pustejovsky, J. 2005. “Natural Kinds in Linguistics”, in Festschrift for Chung-
min Lee, Hankook Munhwasa Publishers.
Pustejovsky, J. and B. Boguraev. 1993. “Lexical Knowledge Representation
and Natural Language Processing,” Artificial Intelligence 63:193-223.
Searle, J. (1995) The Construction of Social Reality, Free Press.
Van Valin, R. (2005) Exploring the Syntax-Semantics Interface, Cambridge,
Cambridge University Press.
Weinreich, U. 1972. Explorations in Semantic Theory, Mouton, The Hague.