
CHAPTER 2

AN OVERVIEW OF LEXICAL SEMANTICS

2.1. Introduction

This chapter presents an overview of lexical semantics and examines the history of the field in brief. Definitions and concepts of lexical semantics are reviewed from scholars such as Cruse (1986, 2000, 2011), Palmer (1981), Lyons (1977, 1986), Pustejovsky (1995), Saeed (1997), Jeffries (1998), Murphy (2003) and Zimmermann and Sternefeld (2013), especially those that are important for the chapters that describe the semantics of A∙we.

The main aim of this chapter is to specify the meaning of a word or a lexeme. Lexical semantics deals with the meaning of individual words and also with the normal conditions on the meaning of lexical items; it also sometimes deals with grammatical conditions on meaning in a particular language. Cruse (1986) assumed that "the semantic properties of a lexical item are fully reflected in appropriate aspects of the relations it contracts with actual and potential contexts" (p. 1).

This chapter discusses the different aspects of meaning: the definition and description of Semantics, the different types of Semantics (with the focus mostly on Lexical Semantics), the birth of Lexical Semantics, the description of Sense Relations, and previous works on Lexical Semantics.

Semantics is one of the important branches of linguistics; it deals with the interpretation and meaning of words, syntax (sentence structure) and symbols, and with how readers understand and comprehend others and their interpretations. Furthermore, semantics establishes relations between adjoining words and clarifies the sense of a sentence, whether the meanings of its words are literal or figurative.

Crystal (1987, p. 102) puts it as follows: "Semantics is not directly concerned

with the study of the external world, or its conceptualization. The world of non-

linguistic experience is the province of physicists, geographers, psychologists, and

others. Nor is semantics easily able to cope with the study of how language refers

to this external world. Rather, the primary focus of the modern subject is on the

way people relate words to each other within the framework of their language - on

their 'sense', rather than their reference."

The branch of linguistics which studies meaning is known as Semantics. Lyons (1988) defines semantics as the study of meaning. Semantics did not exist as a distinct branch of linguistics during the eighteenth century; it took shape only in the late nineteenth century, although people had been interested in the question of meaning for centuries. It is arguably the area of linguistics which is most closely related to the

philosophy of language. According to Malmkjær (1991, p. 522), "the main difference between the linguist's and the philosopher's way of dealing with the

question of meaning is that the linguist tends to concentrate on the way in which

meaning operates in language, while the philosopher is more interested in the

nature of meaning itself”. Thus, the study of meaning had its ups and downs in the

history of linguistics. Michel Bréal, a French linguist, made a serious and largely

successful attempt to introduce semantics into European linguistic work (Trask

1999, p.178). Bréal also coined the term “Semantics”.

Lexical Semantics which is the focus of this research is concerned with the

study of meaning of lexical items. It is also concerned with the identification and

representation of the semantics of lexical items. Lexical items are the individual words together with the meanings they possess in a sentence or in isolation. Thus, lexical semanticists are interested in the what and why of individual lexical items: what the individual lexical items mean, why they mean something, what they are meant to represent, what they do, how we can represent all of this, how they are represented in the speakers' minds, how they are used in text and discourse, and where the combined interpretation of an utterance comes from.

Lexical semantics includes paradigmatic relations of identity and inclusion

such as synonymy, hyponymy and meronymy, and also paradigmatic relations of exclusion and oppositeness such as antonyms, opposites and complementaries; syntagmatic relations of meaning; taxonomic hierarchies; and processes of meaning extension like metaphor and metonymy.

2.1.1. Types of Semantics

2.1.1.1. Connotative Semantics

This type of semantics is concerned with the study of figurative meaning. It shows

a set of associations such as imaginative or emotional suggestion connected with

words. The readers can relate to such associations and therefore words used in

poems are usually connotative in nature (Leech, 1981).

2.1.1.2. Denotative Semantics

Grammatical symbols such as the exclamation mark, quotation mark, apostrophe and colon are used in denotative semantics to suggest the literal, explicit or dictionary meanings of words (Leech, 1981).

2.2. History of Lexical Semantics

The first stage in the history of Lexical Semantics lasted from around 1830 to 1930. In this period the main concern of Lexical Semantics lay in the change of word meaning: the identification, classification and explanation of semantic changes. Research in this tradition resulted in both theoretical proposals and empirical descriptions.

The nineteenth century marked the point at which Lexical Semantics made its way into the academic discipline, but this does not mean that lexical meaning had not been discussed earlier. Lyons (1977) and Cruse (1986) formed a pathway for the development of lexical semantics, and a remarkable change can be observed since the publication of their classic texts on Lexical Semantics. Their texts were written during the structuralist semanticists' period, and thus Lexical Semantics was carried out separately from the major Generative theories of grammar; since the 1980s, however, theories of grammar have become more lexically driven, drawing more attention to issues related to lexical meaning.

The history of Semantics can be divided into a number of phases, although the origin of semantics dates back to ancient times, to Aristotle and Plato. These phases are dealt with separately below, instead of being clubbed together:

2.2.1. Prestructuralist Approach

Prestructuralist diachronic semantics ruled the scene of research in word meaning from around 1870 to 1930. The existence of lexical semantics before 1870 cannot be denied, but it was only during this period that research on word meaning became more and more popular. This approach, also known as historical-philological semantics and spanning roughly 1830 to 1930, was a historical approach to lexical semantics which placed an emphasis on the dynamic qualities and psychological aspects of individual lexical meanings. It was only in the 19th century that research on word meaning gained a place as a distinct subdiscipline of linguistics and secured a prominent position for itself. Some of the early promoters of prestructuralist semantics were Bréal, Paul, Darmesteter, Nyrop, Carnoy and Stern (Nerlich, 1992).

Three features characterise this approach. First, in line with the nature of 19th-century linguistics, its orientation is diachronic: semantics was researched on the basis of change in meaning. Secondly, the orientation was predominantly semasiological rather than onomasiological or grammatical, focusing on the changing meanings of individual words. Thirdly, its conception of meaning was predominantly psychological: lexical meanings were considered to be psychological thoughts or ideas, and change in meaning was explained in terms of psychological processes.

2.2.2. Structuralist Approach

Trier (1931) is recognised as the individual who first promoted Structuralist Semantics, but it was Weisgerber (1927) who, clearly influenced by de Saussure, showcased the first major descriptive work in Structural Semantics. He criticized the prestructuralist approach on precisely three characteristic points. At the outset, he argued that meaning should not be studied atomistically but in terms of semantic structures. Secondly, he pointed out that the study should be synchronic and not diachronic, and finally, that the study of linguistic meaning should be independent and should proceed in a properly linguistic way.

Linguistic semantics should deal with linguistic structures directly, since the linguistic sign's meaning is determined by the position it occupies within the linguistic structure, irrespective of the way it might be present in an individual's mind. The methodology of linguistic semantics is also independent because the subject matter of semantics consists of independent linguistic phenomena.

The structuralist approach, which began in the 1930s, takes a synchronic point

of view, seeing meaning and its structure as a self-ruling framework or an

autonomous system. Its primary mechanisms incorporate lexical field theory,

componential analysis, and relational semantics.

2.2.3. Neostructuralist Approach

Neostructuralist semantics started in the 1990s, taking its inspiration from structuralist semantics and afterwards following its own distinct course. The decompositional approach incorporates Natural Semantic

Metalanguage, Conceptual Semantics, Generative Lexicon, and Two-Level

Semantics, while the relational approach incorporates WordNet, meaning-text

theory / lexical function, and distributional corpus examination.

The Meaning-Text Theory of Mel'čuk (1996) and the Natural Semantic

Metalanguage (NSM) theory of Wierzbicka (1972) are also referred to as “neo-

structuralist” approaches (Geeraerts 2006). Both theories assume a compositional

view of meaning though they vary in terms of lexical decomposition. The NSM is

based on cross-linguistic data from diverse language families. Major works on

NSM are: Goddard and Wierzbicka (1994), and Wierzbicka and Harkins (2001).

WordNet (Fellbaum 1998) is a work based on a relational conception of lexical structure.
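As an illustrative aside, the relational conception embodied in WordNet can be inspected computationally. The short Python sketch below uses the NLTK interface to WordNet; the library, the required data download, and the choice of car as the example word are assumptions of this illustration rather than resources used in this study.

# A minimal sketch of querying WordNet's relational lexicon through NLTK.
# Requires: pip install nltk, then nltk.download('wordnet').
from nltk.corpus import wordnet as wn

for synset in wn.synsets('car'):
    # each synset groups lemmas that WordNet treats as synonyms
    print(synset.name(), '-', synset.definition())
    print('  synonyms :', [lemma.name() for lemma in synset.lemmas()])
    print('  hypernyms:', [h.name() for h in synset.hypernyms()])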

2.2.4. Generativist Approach

Generative semantics is the result of the merging of structuralist analysis with Generative Grammar, as carried out by Katz and Fodor from the 1960s to the 1970s. Generativist studies began during the 1960s: in 1963 Katz and Fodor initiated a model of lexical-semantic description, which was later developed by Katz (1972). The Katzian model incorporates the culmination of the structuralist approach into generative grammar and takes over Chomsky's requirement that linguistic analyses can be formalized and Chomsky's mentalistic

self-conception. Katzian semantics brought together three types of semantic relations: paradigmatic similarity; syntagmatic restrictions on the combination of words, as in selectional restrictions (such as the requirement that the direct object of nose has to refer to something with a smell); and the paradigmatic lexical relations of synonymy, antonymy, and hyponymy.

2.2.5. Neogenerativist Approach

Neogenerativist semantics began during the 1970s and it is part of the same phase

as the Generativist semantics which was initially started by Katz and Fodor (1963)

and later on developed by Katz in 1972. The neogenerativist approach is mainly characterized by its semasiological orientation, because it usually concerns itself with the senses, that is, the number and types of meanings that a lexical item can have and the relationships that exist between these meanings.

Pustejovsky (1995), during the 1990s, gave new impetus to the Katzian idea of a formalized semantic representation by basing it on a logical instead of a featural formalism. He also goes well beyond the Katzian approach by emphasizing the need to build a lexicon that is genuinely generative - in the sense that it does not just consist of an enumeration of word senses, but includes the possibility of

formally deriving new readings from already stored ones. In this sense,

Pustejovsky's approach (which is closely linked with Jackendoff's) is a

'neogenerativist' one as it develops the Katzian ideal of a formal semantic

portrayal or representation by introducing semantic adaptability and flexibility

and a logical formalism.

2.2.6. Cognitive Approach

The Cognitive Approach, which started in the 1980s as a reaction to the objectivist world-view and truth-conditional semantics, considers conceptual structure to be embodied, the portrayal of meaning to be comprehensive and encyclopaedic, the origin of meaning to lie in conceptualization, and the process of conceptualization to be interpretation and construal. The primary

mechanisms of cognitive semantics incorporate prototype theory, conceptual

metaphor and metonymy, the Idealized Cognitive Model and frame semantics.

Mentalist approaches led to a straightforwardly cognitive orientation and to the further development of lexical semantics, with works such as those of Lakoff (1987), Langacker (1987), and Lakoff and Johnson (1980). There was a shift of emphasis from lexical semantics to sentential semantics with the advance of logical semantics, pioneered by Thomason (1974). Pustejovsky (1995) extended Katzian semantics by introducing semantic flexibility and a logical formalism, emphasizing the necessity of building a generative lexicon that goes beyond a mere enumeration of word senses. Fillmore's (1977) frame theory proved a stimulating framework for the description of verbal meaning, both

theoretically and lexicographically.

Taylor (2003), Aitchison (2003), Violi (2001), Croft and Cruse (2004),

Ungerer and Schmid (2006), Evans and Green (2006), and Kristiansen, Achard,

Dirven, and Ruiz de Mendoza Ibanez (2006), Geeraerts and Cuyckens (2007), and

Evans, Bergen, and Zinken (2007) contributed to semantic research within the

framework of Cognitive Linguistics.

The introduction of computer technology into the field of Lexical

semantics also led to the formation of a new branch, Computational Semantics. In the early 1960s, machine-readable dictionaries and applications were developed in Natural Language Processing (NLP), mainly for machine translation. Electronic resources, which include large-scale corpora and lexical databases, find a variety of NLP applications in machine translation, text classification, word-sense disambiguation, information extraction, information retrieval,

question answering, and text summarization.

2.3. Approaches to Lexical Semantics

2.3.1. One-level vs. two-level approaches

A major dividing line which separates semanticists is the question of whether a

distinction can be made between semantics and encyclopaedic knowledge. On the two-level view, the variety of 'raw' meanings (pertaining to encyclopaedic knowledge) is virtually infinite, but only a limited number of these are truly linguistic and interact systematically with other aspects of the linguistic system.

The vast detailed knowledge of the world, which speakers undoubtedly possess,

is, according to the dual-level view, a property, not of language elements, but of

concepts, which are strictly extra linguistic. Truly linguistic meaning elements are

of a much 'leaner' sort, and are (typically) thought of as (more) amenable to

formalization. One criterion suggested for recognizing 'linguistic' meaning is its

involvement with syntax.

On the other hand, most cognitive linguists would take the view that all

meaning is conceptual (one-level approach).

2.3.2. Monosemic vs. polysemic approaches

The point at issue between the monosemic and the polysemic approach is how many meanings should be associated with a word. There is no debate about obvious instances of homonymy, such as bear (the animal) and bear (to tolerate), where there is no plausible way of deriving one meaning from the other. The debate centres on groups of related senses characteristic of polysemy. The monosemic view is that the smallest possible number of senses should be recognized separately in the (ideal) lexicon of a language, with the remaining senses derived from these. If one reading of a word is a motivated extension of another, then, according to the monosemic view, only one should be recorded and the other left to the operation of lexical rules.

The polysemic approach rejects the view that a motivated sense extension should not be recorded in the lexicon. The fundamental reason for this is that the lexical rules only specify potential meaning extensions, only some of which become conventional and incorporated in the lexicon.

2.3.3. The componential approach

One of the first, most persistent and most comprehensive ways of dealing with the meaning of words or lexical items is to think of the meaning of a word as constructed from smaller, more elementary units of meaning, somewhat on the analogy of atomic structure. These "semantic atoms" are variously known as semes, semantic features, semantic components, semantic markers, and semantic primes.
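A toy Python sketch of how such semantic components can be represented and compared is given below; the feature inventory (HUMAN, ADULT, MALE) and the word set are the classic textbook illustration, assumed here purely for exposition and not drawn from the analysis of A∙we.

# A toy componential analysis: each word is mapped to binary semantic features.
FEATURES = {
    'man':   {'HUMAN': True,  'ADULT': True,  'MALE': True},
    'woman': {'HUMAN': True,  'ADULT': True,  'MALE': False},
    'boy':   {'HUMAN': True,  'ADULT': False, 'MALE': True},
    'girl':  {'HUMAN': True,  'ADULT': False, 'MALE': False},
}

def contrast(word1, word2):
    # return the semantic features on which the two words differ
    f1, f2 = FEATURES[word1], FEATURES[word2]
    return {feature for feature in f1 if f1[feature] != f2[feature]}

print(contrast('man', 'boy'))    # {'ADULT'}
print(contrast('man', 'woman'))  # {'MALE'}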

2.3.4. 'Holist' approaches

All componentialists believe that the meaning of a word can, in some useful sense, be finely specified in isolation from the meanings of other words in the language. Among philosophers of language, this is known as the localist view. The opposing position is the holist view, according to which the meaning of a word cannot be known independently of the meanings of all the other words in a language. There are several versions of holism; two of them are those of Haas and Lyons.

2.3.5. Conceptual approaches

Conceptual approaches, otherwise known as single-level approaches, identify the meaning of a word or lexical item, or at least a significant part of it, with the concept or concepts to which it gives access in the cognitive system. Among cognitive linguists, the prototype model of concept structure is an important contribution.

2.4. Sense Relations and Word Sense Disambiguation

Sense relations, according to Cruse (1986, 2000, 2011), Palmer (1995) and Saeed (1997), are further divided into paradigmatic relations and syntagmatic relations, while word sense disambiguation addresses the ambiguity existing in a language, which arises from two sense relations, homonymy and polysemy.

2.4.1. Paradigmatic Relations

Paradigmatic relations for the most part reflect the way infinitely and continuously varied experienced reality is apprehended and controlled through being categorized, sub-categorized and graded along specific dimensions of variation. They represent systems of choices a speaker faces when encoding his message (Cruse 1986, p. 86). These are relations of choice and have no inherent order; they are vertical relationships, unlike syntagmatic relationships.

Paradigmatic relations can be further sub-divided into:

(i) Paradigmatic relations of identity and inclusion

(ii) Paradigmatic relations of exclusion and opposition

There are different types of lexical relations, which can be characterized in terms of identity and inclusion or exclusion and opposition.

2.4.1.1 Paradigmatic relations of identity and inclusion

Synonymy, Meronymy and Hyponymy are types of lexical relations which fall

under Paradigmatic relations of identity and inclusion.

2.4.1.1.1. Synonymy

A synonym is a word that sounds different from another word but has the same or almost the same meaning. Palmer (1976, p. 59) says, "Synonymy is used to signify 'similarity in meaning'." However, it can be argued that there are no true synonyms, that is, that no two words have exactly the same meaning. Cruse (1986) termed synonymy the lexical

relation which parallels identity in the membership of two classes. He also

discusses the various types of synonymy such as absolute synonymy,

propositional synonymy and near synonymy. The types of synonyms discussed by

Cruse (2011, p. 142-143) are:

i. Absolute synonymy: Absolute synonymy denotes that a word can be

interchanged or substituted with another word and retain the same

meaning in a sentence. It is not a relation between meanings, but between

word forms. Absolute synonyms are words which are mutually

substitutable in all contexts without change of normality. The following

examples illustrate the difficulty of finding unconditional pairs of absolute synonyms: sentences (1) and (3), marked (+), are relatively more normal, while sentences (2) and (4), marked (-), are relatively less normal.

a. calm: placid

(1) She was quite calm just a few minutes ago. (+)

(2) She was quite placid just a few minutes ago. (-)

b. almost: nearly

(3) She looks almost Chinese. (+)

(4) She looks nearly Chinese. (-)

ii. Propositional synonymy: Propositional synonymy can be defined in terms

of entailment. Two sentences which differ only in that one has one

member of a pair of propositional synonyms where the other has the other

member of the pair are mutually entailing. For example, John bought a

violin entails and is entailed by John bought a fiddle; I heard him tuning a

fiddle entails and is entailed by I heard him tuning his violin; She is going

to play a violin concerto entails and is entailed by She is going to play a

fiddle concerto.

iii. Near synonymy: The borderline between propositional synonymy and near

synonymy is at least in principle clear, even if the decision may be difficult in particular cases. Two points should be made clear at the outset: the first is that language users do have intuitions as to which pairs of words are synonyms and which are not, and the second is that there is a scale of semantic distance, and that synonyms are words whose meanings are relatively

close. Example, a scale of decreasing semantic distance between pairs: entity: process; living thing: object; animal: plant; animal: bird; dog: cat; spaniel: poodle; etc.

2.4.1.1.2. Meronymy

Meronymy is not a single relationship but a set of different relationships between part and whole. A meronym is a word that indicates a constituent part or a member of something; together with other such elements it constitutes a set or a whole.

An important and interesting type of semantic relation, expressed in language, 'is the relation between the parts of things and the wholes which they comprise'. Relationships which are expressed either with the term part, or which by their position in a part-whole expression signal part, are considered to be meronymic and to 'structure semantic space in a hierarchical fashion' (Winston et al. 1987, p. 417-418).

Cruse (2011) termed meronymy the conceptual reflex of the part-whole relation between individual referents. He also mentioned that meronymy shows interesting parallels with hyponymy, but the two must not be confused: for instance, a dog is not a part of an animal, and a finger is not a kind of hand. In both cases there is inclusion, but in different directions according to whether one takes an extensional or an intensional view. Examples of meronymy are: hand: finger, teapot:

spout, wheel: spoke, car: engine, telescope: lens, tree: branch, and so on (Cruse

2011, p.137). The example can be interpreted as, the finger is a part of the hand,

the spout is a part of a teapot, the spoke is a part of the wheel, the engine is a part

of the car, the lens is a part of the telescope and the branch is a part of the tree.

Figure 6: Meronymy of 'plant' (a rose is-a plant; a plant has-a leaf; a rose has-a thorn)

Figure 6 shows the meronymy or part-whole relationship of a plant, in which the arrows labelled "has-a" show the meronymic relationship: a plant has a leaf and a rose has a thorn. This shows that meronymy, like taxonymy, is a hierarchical, branching relationship.
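As a computational aside, part-whole relations of this kind are recorded explicitly in WordNet and can be retrieved through NLTK; the sketch below is illustrative only, and the choice of tree as the example word is an assumption of the illustration.

# A minimal sketch of meronymy lookups in WordNet via NLTK
# (requires nltk and the 'wordnet' data).
from nltk.corpus import wordnet as wn

tree = wn.synset('tree.n.01')
print(tree.part_meronyms())    # lexicalised parts of a tree (e.g. trunk, limb)
print(tree.member_holonyms())  # wholes of which a tree is a member (e.g. forest)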

2.4.1.1.3. Hyponymy

Hyponymy is a relationship between two words where the meaning of a word

includes the meaning of the other word. It is not limited to objects, abstract or

substantial concepts. It can be identified in many other areas of the lexicon.

Saeed (2003, p. 69) describes hyponymy as a relation of inclusion in which a hyponym includes the meaning of a more general word. Palmer (1976, p. 76) states that hyponymy involves us in the notion of inclusion.

Hyponymy is the lexical relation corresponding to the inclusion of one

class in another (Cruse 1986, p.88). Saeed (1997, p.68) defines hyponymy as “a

relation of inclusion".

A hyponym includes the meaning of a more general word, as is exemplified below:

1) dog and cat are hyponyms of animal

2) sister and mother are hyponyms of woman

The more general term is called the superordinate or hypernym. Much of

the vocabulary is linked by such systems of inclusion, and the resulting semantic

networks form the hierarchical taxonomies (Saeed 1998, p.68).

Some examples are given below, following the pattern: X is a kind of/type of/sort of Y.

The relation of hyponymy can be shown as a tree, with bird at the top; crow, hawk, duck and sparrow below it; and kestrel, sparrowhawk, etc. below hawk.

Figure 5: Hypernym-Hyponym relation

In Figure 5, kestrel is the hyponym of hawk and hawk is the hyponym of bird and

hawk and crow are co-hyponyms of bird (Saeed 1997, p.69).
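Hierarchies of this kind are also encoded in WordNet, and the hypernym (superordinate) links can be read off computationally; the Python sketch below, using NLTK, is purely illustrative and roughly parallels the bird-hawk portion of Figure 5.

# A minimal sketch of reading hypernym/hyponym links from WordNet via NLTK.
from nltk.corpus import wordnet as wn

for sense in wn.synsets('hawk', pos=wn.NOUN):
    # each noun sense of 'hawk' with its immediate superordinate(s)
    print(sense.name(), '->', [h.name() for h in sense.hypernyms()])

bird = wn.synset('bird.n.01')
print([h.name() for h in bird.hyponyms()][:5])  # a few immediate hyponyms of 'bird'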

Palmer (1981) explains that hyponymy involves the notion of inclusion in

the sense that tulip and rose are included in flower, and lion and elephant in

mammal. Similarly scarlet is included in red. Inclusion is thus a matter of class

membership. The 'upper' term is the superordinate and the 'lower' term the

hyponym.

Crystal (1980, p. 233) defined hyponymy as “a term used in semantics as

part of the study of the sense relations which relate lexical items. Hyponymy is

the relationship that holds between specific and general lexical items, such that

the former is 'included' in the latter (i.e. 'is a hyponym of' the latter). For

example, cat is a hyponym of animal, flute of instrument, chair of furniture, and

so on. In each case, there is a superordinate term (sometimes called a hypernym

or hyperonym), with reference to which the subordinate term can be defined, as

is the usual practice in dictionary definitions ('a cat is a type of animal . . .'). The

set of terms which are hyponyms of the same superordinate term are co-

hyponyms, e.g. flute, clarinet, trumpet. A term which is a hyponym of itself, in

that the same lexical item can operate at both superordinate and subordinate

levels, is an autohyponym: for example, cow contrasts with horse, at one level,

but at a lower level it contrasts with bull (in effect, 'a cow is a kind of cow')."

2.4.1.2. Paradigmatic relations of exclusion and opposition

Paradigmatic relations of exclusion and opposition include opposites, antonymy,

complementaries, incompatibility and co-meronymy. They are discussed below:

2.4.1.2.1.Opposites

Opposites are words that stand in an inherently incompatible binary relationship, as in the opposite pairs long: short and lead: follow. A lexical opposite relationship is an association between two lexical units whose core meanings are opposed in some contexts. Not all words have an opposite. Oppositeness means that a word has another word or lexical item which is opposite in meaning and contrasts with it.

In many languages, including English, the most commonly used opposites

tend to be morphologically unrelated (good: bad, high: low, beautiful: ugly, big:

small, old: young). But these are outnumbered in the vocabulary by such

morphologically related pairs as married: unmarried, friendly: unfriendly, formal:

informal, legitimate: illegitimate, etc. (Lyons 1977, p. 275). Cruse (2000) describes opposites as the only sense relation which receives direct lexical recognition in

everyday language. Thus, opposites are words that lie in an inherently

incompatible binary relationship as in the opposite pairs male: female, long: short

etc.

According to Palmer (1976, p.80-81) and Cruse (2011, p.153-154), there

are different ways to identify opposites. They are:

i. Binary: By the definition given above, binary opposites are incompatibles where x is long entails x is not short. However, they are not just incompatibles, since there is nothing in the notion of incompatibility itself which limits the number of terms in a set of incompatibles, whereas there can only be two members of a set of opposites. Hence, binarity is a prerequisite.

Lyons (1977, p. 271) declares that 'binary opposition is one of the most important principles governing the structure of languages', commenting on 'what appears to be the human tendency to categorize experience in terms of dichotomous contrasts' (1977, p. 277). Murphy (2003, p. 169) concurs, calling binary opposition 'the archetypal lexical semantic relation'.

ii. Inherent binarity: Inherent binarity can be considered a prototypical feature of oppositeness. One must, however, distinguish between accidental and inherent binarity. There are, for instance, only two classes of buses on the '-decker' dimension, namely single-deckers and double-deckers. There may well be reasons, to do with stability and the height of bridges and so forth, for the absence of triple-deckers, but there is no logical reason.

Likewise, there are only two sources of heat for cooking in the average

suburban kitchen, namely gas and electricity. But there is no more than the

feeblest hint of oppositeness about single-decker: double-decker, gas:

electricity or tea: coffee. That is because the binarity is accidental and

pragmatic, rather than inherent. By contrast, the possibilities of movement

along a linear axis are logically limited to two: the binarity of the pair up:

down is thus ineluctable, and they form a satisfactory pair of opposites.

iii. Patency: Inherent binarity is necessary for a prototypical pair of opposites,

but is not sufficient. Take the case of Monday: Wednesday. The time

dimension is linear, and Monday and Wednesday are situated in opposite

directions from Tuesday. Yet they do not feel like opposites. It seems that

in the case of Monday and Wednesday, their location in opposite

directions along the time axis relative to Tuesday and hence the binarity of

their relationship is not encoded in their meanings, but has to be inferred,

whereas the directionality of yesterday and tomorrow relative to today is a

salient part of their meaning. In Cruse (1986) this difference was referred

to as latent as opposed to patent binarity. The patency of the binary

relation can thus be added to the list of prototypical features of opposites.

The kinds of opposites are shown in Figure 8.

Figure 8: Kinds of opposites. Opposites divide into complementaries and antonyms, with antonyms subdivided into polar, overlapping and equipollent; the example pairs given in the figure include sickly: healthy, beautiful: ugly, thin: fat, nice: mean and hot: cold.

2.4.1.2.2. Antonyms

Antonyms are words which denote the direct opposite, or something close to the opposite, of the meaning of another word or words. When antonyms are used, the opposite reaction, whether negative or positive, can be expected. Justeson and Katz (1992, p. 176) assert that 'antonymy is not only a semantic but also a lexical relation, specific to words rather than concepts'. Saeed (1997) describes antonyms as words which are opposite in meaning. Jones (2002, p. 1-2) uses the term 'antonymy' to mean all opposites. Cruse (2006) describes antonyms as a variety of lexical opposites.

Cruse (1986, p. 208) categorised the sub-classes of antonyms as follows:

i. Equipollent antonyms: Equipollent antonyms refer to distinctly subjective sensations or emotions, for example hot: cold, happy: sad, or to evaluations based on subjective reactions rather than on 'objective' standards, such as nice: nasty, pleasant: unpleasant.

ii. Polar antonyms: Polar antonyms are typically evaluatively neutral, and

objectively descriptive. In the majority of cases, the underlying scaled

property can be measured in conventional units, such as inches, grams, or

miles per hour. For example, long: short.

iii. Overlapping antonyms: Overlapping antonyms refer to the antonyms that

have an evaluative polarity as part of their meaning: one term is

commendatory, for example, good, pretty, polite, kind, clean, safe, honest,

and the other deprecatory such as bad, plain, rude, cruel, dirty, dangerous,

and dishonest.

Sub-classes of antonyms according to Sapir (1944), Lyons (1977) and Saeed (1997) are:

iv. Gradable antonyms: This is a relationship between opposites where the

positive of one term does not necessarily imply the negative of the other,

e.g. rich/poor, fast/slow, young/old, beautiful/ugly. This relation is

typically associated with adjectives, and has two major identifying

characteristics: firstly, there are usually intermediate terms so that between

the gradable antonyms hot and cold we can find:

hot (warm tepid cool) cold

Secondly, the terms are usually relative, so a thick pencil is

likely to be thinner than a thin girl. A third characteristic is that in some

pairs one term is more basic and common. Other examples of gradable

antonyms are: tall/short, clever/stupid, near/far, interesting/boring.

v. Converse: Converses are terms which describe a relation between two

entities from alternative viewpoints, as in pairs: own/belong to,

above/below, employer/employee. Thus, in the sentence Alan owns this

book it can be automatically deduced that The book belongs to Alan.

vi. Reverse: The characteristic reverse relation is between terms describing

movement, where one term describes movement in one direction, →, and

the other the same movement in the opposite direction, ←; for example the

terms push and pull on a swing door, which instructs in which direction to

apply force. Other such pairs are come/go, go/return, ascend/descend.

When describing motion the following can be called reverses: (go)

up/down, (go) in/out, (turn) left/right.

By extension, the term is also applied to any process which can be

reversed: so other reverses are inflate/deflate, expand/contract, fill/empty

or knit/unravel.
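The lexical (rather than purely conceptual) character of antonymy noted by Justeson and Katz above is reflected in WordNet, where antonym links hold between lemmas (word forms) rather than between whole synsets. The NLTK sketch below, with long as an arbitrary example, is illustrative only.

# A minimal sketch of retrieving antonym pairs from WordNet via NLTK.
from nltk.corpus import wordnet as wn

for synset in wn.synsets('long', pos=wn.ADJ):
    for lemma in synset.lemmas():
        for antonym in lemma.antonyms():
            print(lemma.name(), ':', antonym.name())  # e.g. long : short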

2.4.1.2.3. Complementaries

Complementaries are denoted and shown in pairs. These pairs show the opposite

effect or the opposite circumstances to each other. Palmer (1976, p.80) explains

that “complementary pairs are predicates which come in pairs and between them

exhaust all the relevant possibilities”. If one predicate is applicable then the other

cannot be and vice versa in these pairs. Consequently, complementary pairs show

or denote the relation between words in such a way that the positive of a word or

lexical unit implies the negative of the other. Cruse (1986, 2011) expresses

complementaries as those that constitute a very basic form of oppositeness and

display inherent binarity in perhaps its purest form. The following pairs represent

typical complementaries: dead: alive, true: false, obey: disobey, inside: outside,

continue (V-ing): stop (V-ing), possible: impossible, stationary: moving, male:

female. The essence of complementaries is that between them they exhaustively

divide some conceptual domain into two mutually exclusive compartments, so

that what does not fall into one of the compartments must necessarily fall into the

other. The point about complementaries is that, once a decision has been reached

regarding one term, in all relevant circumstances, a decision has effectively been

made regarding the other term, too.

2.4.1.2.4. Incompatibility

Incompatibility refers to the fact that one word in an incompatible pair entails that it is not the other member of the pair. The entailment that one term does not apply when the other does is a characteristic feature of incompatibility: saying that something is A entails that it is not B. Thus, two lexical items X and Y are incompatible if a sentence of the form A is f(X) can be found which entails a parallel sentence of the form A is not f(Y). It's a cat entails It's not a dog, It's a carnation entails It's not a rose, John is the one who is walking entails John is not the one who is running, and John is near the building entails John is not in the

building. Thus, Cruse (1986) describes incompatibility as the sense relation which

is analogous to the relation between classes with no members in common.

Incompatibility is sometimes defined in terms of a contrary relation

between sentences. For instance, X is a dog and X is a cat cannot both be true, but

can both be false, similarly John is walking and John is running. Or, in terms of

entailment, X is a dog entails but is not entailed by X is not a cat, similarly, John

is walking entails but is not entailed by John is not running (Cruse 2011, p.152).

For example, He is a boy entails He is not a girl; It is a daffodil entails It is

not a lily; Sam is the one who is jogging entails Sam is not the one who is

running.

2.4.1.2.5. Co-Meronymy

Co-meronyms are lexical items denoting the parts which together make up a whole. Whereas meronymy, for Cruse (1986), is the semantic relation between a lexical item denoting a part and one denoting the corresponding whole, co-meronymy is the relation of exclusion between the parts themselves, for instance upper arm: lower arm. Co-meronymy is thus the relation between lexical items designating sister parts. For example, a hand has fingers; finger is a meronym of hand, and the parts of the hand are sister parts termed co-meronyms of one another.

2.4.2 Syntagmatic Relations

Syntagmatic relations are relations which are non-branching in nature; they are direct horizontal relationships between lexical items. Syntagmatic relations have nothing to do with meaning. They cover the lexical company that a word keeps (collocation) and the grammatical patterns in which it occurs (colligation). Saussure (1959) explains that syntagmatic relationships are those in which words are chained together to form a sentence, and hence he called syntagmatic relations chained relations. "In the syntagm a term acquires

its value because it stands in opposition to everything that precedes or follows or

to both” (Saussure 1959, p.123).

Syntagmatic aspects of lexical meaning serve discourse cohesion, adding necessary informational redundancy to the message while at the same time controlling the semantic contribution of individual utterance elements, through disambiguation, for instance, or by signalling alternative (e.g. figurative) strategies of interpretation (Cruse 1986, p. 86). Because the relationship is horizontal, the combinations it supports are linear: syntagmatic relations always hold in a fixed order.
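Collocations of the kind mentioned above can be extracted from a corpus automatically; the Python sketch below uses NLTK's collocation finder over the Brown corpus, where the corpus, the frequency cut-off and the association measure are all arbitrary choices made for illustration.

# A minimal sketch of collocation extraction with NLTK
# (requires nltk.download('brown')).
import nltk
from nltk.collocations import BigramAssocMeasures, BigramCollocationFinder

measures = BigramAssocMeasures()
words = nltk.corpus.brown.words(categories='news')
finder = BigramCollocationFinder.from_words(words)
finder.apply_freq_filter(3)            # ignore bigrams seen fewer than 3 times
print(finder.nbest(measures.pmi, 10))  # ten strongest collocations by PMI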

2.4.3. Word Sense Disambiguation (WSD)

Word Sense Disambiguation (WSD) is a process through which the meaning of a

word which is used in a sentence can be identified especially when the word has

different meanings. Edmonds and Agirre (2008) explain that in natural language

processing, word sense disambiguation (WSD) is the problem of determining

which "sense" (meaning) of a word is activated by the use of the word in a

particular context, a process which appears to be largely unconscious in people.

Kilgarriff (1999, p. 1) defines word sense disambiguation as a tool that "describes the various kinds of ways in which a word's meaning can deviate from

its core meaning”.

Hussain and Beg (2013) also elucidate that in natural language

processing (NLP), WSD is defined as the task of assigning the appropriate

meaning (sense) to a given word in a text or discourse. Natural language is

ambiguous, so that many words can be interpreted in multiple ways depending on

the context in which they occur. The computational identification of meaning for

words in context is called word sense disambiguation (WSD). Word sense

ambiguity means a single word or sentence is interpreted differently by different

users.

Thus, it is seen that word sense disambiguation (WSD) is the task of

assigning meaning to an ambiguous word given the context in which it occurs. It

therefore serves as an intermediate step for many applications, such as automatic

translation, information retrieval, hyperlink navigation, content analysis and

thematic speech processing. Therefore, it has been a central theme since the early

days of computational studies of natural language. WSD requires a set of meanings for each word to be disambiguated and a means of choosing the right one from that set; a common practice is to take the meaning distinctions from an electronic dictionary, such as the Fieldworks software used in chapter 6.
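For illustration only (and quite separate from the FieldWorks-based procedure used in Chapter 6), a simple dictionary-based disambiguator such as the simplified Lesk algorithm shipped with NLTK picks the WordNet sense whose gloss overlaps most with the surrounding context; the example sentence below is an arbitrary assumption.

# A minimal sketch of knowledge-based WSD with NLTK's simplified Lesk algorithm
# (requires nltk and the 'wordnet' data).
from nltk.wsd import lesk

context = 'I went to the bank to deposit some money'.split()
sense = lesk(context, 'bank')                 # choose a WordNet sense for 'bank'
print(sense.name(), ':', sense.definition())  # gloss of the selected sense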

Lexical Ambiguity

Lexical ambiguity means that the ambiguity resides in the lexical item or the word itself. The word can have two or more meanings, and its use might result in different interpretations according to the situation. Hurford and Heasley (1983, p. 128) state that lexical ambiguity results from the ambiguity of a word, that is, a word in a sentence has more than one sense. Lexical ambiguity of an expression results from a polysemous word, i.e. a word that has more than one meaning. Palmer (1976, p. 67) explains that "polysemy is one word with several meanings". It can be disambiguated by giving further information.

Examples:

a. We can serve.

There are two senses in this sentence:

- to have the ability to serve somebody and earn money

- to have the ability to serve in tennis, table tennis or badminton

b. The trunks are here.

There are two senses in this sentence:

- a box, usually made of steel, where a person can store clothes, objects, etc.

- an elephant's trunk

Structural Ambiguity

Structural ambiguity lies in the structure of a sentence, where a change in the structure of the sentence leads to different interpretations or meanings. Hurford and Heasley (1983, p. 128) elucidate that "structural ambiguity happens because its words relate to each other in different ways, even though none of the individual words is ambiguous". It means that an expression is structurally ambiguous if the ambiguity results from the way the constituents are grouped into larger syntactic units.

Examples:

a. We need more intelligent teachers

- We need teachers that are more intelligent.

- We need more teachers that are intelligent.

b. Visiting the sick (people) can be dangerous.

- The sick (people) who are visiting can be dangerous.

- To visit the sick (people) can be dangerous.

These sentences are structurally ambiguous. The possible constituents and their meaning structures can be noted. However, there are some structural ambiguities that cannot be disambiguated in the same way. Examples:

a. They hated shouting at the maid

- They hated it when they shouted at the maid

- They hated the maid's shouting

b. The fish is ready to eat

- The fish is ready to eat (something)

- The fish is ready to be eaten

2.4.3.1 Homonymy

Homonymy is a term used to denote words which have the same pronunciation and spelling but different meanings, or the same pronunciation but different spellings and different meanings. Palmer (1976, p. 67) states that homonymy is when there are several words with the same shape. In other words, homonyms are different words which are pronounced the same but have different meanings. For Saeed (1997, p. 63), homonyms are unrelated senses of the same phonological or orthographic word. A homonym is an ambiguous word whose different senses are far apart from each other and not obviously related to each other.

Examples of homonymy are:

1. bank1 – a shore; bank2 – an institution for receiving, lending, exchanging, and safeguarding money.

2. ball1 – a sphere; any spherical body; ball2 – a large dancing party.

Types of Homonymy

There are two types of homonyms, namely, homographs and homophones.

i. Homographs: Homographs are those words which have the same spelling and the same pronunciation but different meanings, such as lie as in to tell a lie and lie as in to lie down in bed. Saeed (1997) explains homographs as senses of the same written word. Examples of homographs are fly, as in an insect, and fly, as in to spread one's wings and to fly in the air; lie, as in to tell a lie, and lie, as in to sleep or to lie down in bed; pen, as in stationery, with which we can write, and pen, as in a shelter for pigs.

ii. Homophones: Homophones are those words which have the same

pronunciation but different spelling and also have different meanings.

Some examples of homophones are die, dye; four, for and flower, flour.

Saeed (1997) describes homophones as senses of the same spoken word. Bussmann (1998) elucidates homophones as a type of lexical ambiguity in

which two or more expressions have an identical pronunciation but

different spellings and meanings, e.g. pray vs prey and course vs coarse.

2.4.3.2 Polysemy

Polysemy can be described as the case in which the same word has several meanings. Sameness of meaning is not very easy to deal with, but there seems nothing inherently difficult about difference of meaning. Not only do different words have different meanings; it is also the case that the same word may have a set of different meanings. This is called polysemy (Palmer 1976, p. 65). Saeed (1997) describes polysemy as the case in which the same phonological word has multiple senses that are related to each other. Unlike homonymous senses, which are given separate entries by lexicographers in dictionaries, polysemous senses are listed under the same lexical entry. Nerlich and Clarke (1997, p. 378) define polysemy as 'the always synchronic pattern of meanings surrounding a word, which is itself the ever-changing result of semantic change'.

Polysemy is the case where a word has several closely related senses. In other words, a native speaker of the language has clear intuitions that the different senses are related to each other in some way. For example, mouth, which can denote the mouth of a river or the mouth of an animal, is a case of polysemy.

A polysemous word has a direct sense from which other senses can, in

semantic analysis, be derived by assuming that they are characterised by some

added connotation, or by the sense being figurative, or similarly by transference

and specialisation (Zgusta 1971, p. 61).

Many linguists and lexicographers agree in defining polysemy as a case where the same word has two or more different, but conceptually related, meanings or variants of the same meaning (Lyons 1977, p. 552; Palmer 1981, p. 101; Hurford and Heasley 1983, p. 123; Saeed 1997, p. 64; Zgusta 1971, p. 61; Jackson 1988, p. 5; Landau 1984, p. 100).
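As a purely illustrative aside, the multiple senses that lexicographers record for a word can be listed from WordNet with NLTK, as in the sketch below for mouth; note that WordNet itself does not mark whether two senses are related (polysemy) or unrelated (homonymy), a judgement that is left to the analyst.

# A minimal sketch of listing the recorded senses of an ambiguous word via NLTK.
from nltk.corpus import wordnet as wn

for synset in wn.synsets('mouth', pos=wn.NOUN):
    print(synset.name(), ':', synset.definition())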

2.5. Non-Literal Meanings

Concepts which are related to other types of meaning, namely, metonymy,

metaphor, simile, word pair, riddle, idiom and proverb are discussed in this sub-

section. Cruse (2011) explains that most people, or adults at least, are aware that if someone says Jane's eyes nearly popped out of her head, a literal truth has not been expressed: Jane's eyes were not, as a matter of fact, on the point of being projected from her head; the message is rather that Jane was very surprised. At the

everyday level, the contrast between literal and figurative use does not seem

problematical.
2.5.1. Metonymy

Metonymy is a term used to describe the figurative use of a lexical item in place of the actual activity, habit or situation it is associated with. According to the classical definition, metonymy is 'a figure in which one word is substituted for another on the basis of some material, causal, or conceptual relation' (Preminger & Brogan (eds.) 1993).

Metonymy is a kind of non-literal language in which one entity is used to

refer to another entity that is associated with it in some way. In other words,

metonymic concepts 'allow us to conceptualize one thing by means of its relation to something else' (Lakoff and Johnson, 1980, p. 39).

For Crystal (2008), a metonymy is a term used in semantics and stylistics

referring to a figure of speech in which the name of an attribute of an entity is

used in place of the entity itself. For example, people are using metonymy when they talk about the bottle for the drinking of alcohol.

2.5.2. Metaphor

Metaphors are used to express everyday life through expressions that carry both a literal and a transferred meaning. Metaphors are conceptual (mental)

operations reflected in human language that enable speakers to structure and

construe abstract areas of knowledge and experience in more concrete experiential

terms (Hurford, Heasley and Smith, 2007, p.331).

Lakoff and Johnson (2003) observe that metaphor is pervasive in everyday

life, not just in language, but also in thought, and that metaphorical thought is

normal and ubiquitous in our mental life, both conscious and unconscious. They

further note that fundamentally metaphors are mechanisms of the mind and that

our conceptual system is metaphorical in nature.

Metaphor is defined as a case where a word appears to have both a literal

and a transferred meaning (Jackson and Amvela 2000, p. 59-60).

Lakoff and Johnson (2003) differentiate three types of conceptual metaphor:

structural, orientational, and ontological.

1. The structural metaphors are metaphors that involve the structuring of one

kind of experience or activity in terms of another kind of experience or

activity. Structural metaphors are abstract metaphorical systems in which

an entire (typically abstract) complex mental concept is structured in terms

of some other (usually more concrete) concept (Lakoff and Johnson 1980,

p.61, 197).

2. Orientational metaphors are those that organize a whole system of

concepts in terms of physical orientation.For example, happiness is up

(boosted or high spirits, raise morale) while sadness is down (depressed,

down in the dumps, feeling low). Similarly, health, consciousness, having

control, more, good, virtue, and rational are all up, while sickness,

unconsciousness, being controlled, less, bad, depravity, and emotional

thinking are all generally down. Not all orientational metaphors are up-

down. Future and past are ahead and behind, though which is which

depends on the culture.

3. Ontological metaphors give incorporeal things a sense of substance so we

can refer to an abstract concept in terms of quantity (a lot of patience),

character (brutality of war), agency (love drove him mad), directionality

(prices are rising), etc. We also view events, actions, activities and states as

containers--such as getting into or out of trouble, being in a race, getting

satisfaction out of doing something. Ontological metaphors can be

extended by giving the object or substance certain characteristics. A

common example is thinking of something as a person or similar agent.

Metaphor is a way of understanding a concept and according to Lakoff and

Johnson (2003), meaning and truth depend on understanding. Truth is not

objective, but depends on context; it relies on a human thinker. Thus, metaphors structure what we perceive as truth, taking cultural specificity and individual bias into account.

2.5.3. Simile

A simile can be described as a comparison between two things or situations, for instance between a non-living and a living being. Cruse (2011) describes simile as an explicit

comparison between two different conceptions (entities, properties, actions, etc.)

Cruse (2006) again explains that a simile involves an explicit comparison between two things or actions. The majority of similes include the word 'like'. For example, You are behaving like a spoilt child, Their house is like a renaissance palace. 'As if' is also quite frequent. For example, He treats her as if she were a delicate piece of porcelain... True similes are considered by many to be

a type of figurative language.

For Cuddon (2013) a simile is a figure of speech in which one thing is

likened to another, in such a way as to clarify and enhance an image. It is an explicit comparison, as opposed to the metaphor where the comparison is implicit, and is recognizable by the use of the words 'like' or 'as'.

2.5.4 Word Pair

A word pair is a set of two things used together or regarded as a unit. An interesting feature of the Garo language is the vast number of pair words, called katta-jikse or ku-jikse, which are very frequently used in the language. A ku-jikse comprises two words which complement each other semantically, but each word can also be used independently, thereby distinguishing them from reduplicated words. Katta-jikse are words used in pairs and often made to rhyme, so that they not only make the meaning of a sentence clearer but also embellish it, sounding good to the ear and thereby enriching the word as well as the sentence. They are used very often in ordinary speech and form a rich part of the vocabulary. As Ingty (2008, p. 249) rightly says, word pairs occur very frequently in the spoken language; in the written form they are not used as often, but when they are, they are used with great effect.

2.5.5. Riddle

Riddles are used to develop curiosity among children, young people and elders alike. They are usually used to pass waiting or leisure time and take the form of a question, a statement, a puzzle or

a phrase. Literary Devices Eds. (2013) describes riddle as a question, a puzzle, a

phrase or a statement devised to get unexpected or clever answers. It is a folklore

genre as well as rhetorical device, having often veiled or double meanings.

Cuddon (2013) describes the riddle as an ancient and universal form of literature; in its commonest form it consists of a puzzle question: the equivalent of a

conundrum or an enigma.

2.5.6. Idiom

Idioms are usually used to convey a double meaning, or when a speaker does not want another individual to know what he or she is hinting at. One has to think before

deciphering the real meaning behind an idiom. Hurford, Heasley and Smith

(2007) describe idioms as multi-word phrases whose overall meanings are

idiosyncratic and largely unpredictable, reflecting speaker meanings that are not

derivable by combining the literal senses of the individual words in each phrase

according to the regular semantic rules of the language (p.328).

Cruse (1986) explains that it has long been recognised that expressions

such as to pull someone's leg, to have a bee in one's bonnet, to kick the bucket,

to cook someone's goose, to be off one's rocker, round the bend, up the creek, etc.

are semantically peculiar. They are usually described as idioms.

2.5.7. Proverb

Proverbs are usually used to give advice to another person or warn someone so

that they do not commit a mistake. A proverb can indicate a real-life event that

has taken place or an experience faced by the individual who tells it. Cruse (2011)

describes a proverb as a specific event which represents/applies metaphorically to

other (more abstract) events or states with a similar image (image-schematic)

structure, for instance, a nod is as good as a wink to a blind bat.

Cruse (2000) describes a proverb as a specific event or state of affairs

which is applicable metaphorically to a range of different events or states of

affairs provided they have the same or sufficiently similar image-schematic

structure.

For Cuddon (2013), a proverb is a short pithy saying which embodies a

general truth. It is related in form and content to the maxim and the aphorism.

Common to most nations and peoples, it is a form of expression of great antiquity.

Many writers have made use of them. The best-known collection is the Book of

Proverbs which follows the Psalms in the Old Testament. Some examples of

proverbs include: Send a fool to close the shutters and he'll close them all over the

town (Yiddish); we cannot step twice into the same river (Classical Greek); when

you want a drink of milk you don't buy the cow (Cretan). A fine collection of

English proverbs is the Oxford Dictionary of English Proverbs (1935).

2.6. Field Semantics

The concept of lexical hierarchies and the significance of field semantics are

highlighted in this sub-section.

The vocabulary of a language is not just a collection of words scattered at

random throughout the mental landscape. It is at least partly structured, and at

various levels. Linguistic structures in the lexicon may have a phonological,

grammatical, or semantic basis; those which we shall be concerned with here are

defined semantically, in terms of meaning relations. Obvious examples of

grammatical structuring are word classes, which are groupings of words according

to their syntactic properties, and word families, which are sets of words derived

from a common root

(Cruse 2000, p. 179).


77
2.6.1. Hierarchies-Taxonymy and Meronymy

The presence of hierarchies denotes that there is branching or classification

among a particular group of lexical items and in order to explain the parts better,

hierarchies are drawn. Cruse (2000) explains a lexical hierarchy as a grouping of

lexical items whose meanings are related in a way that can be represented by means

of a 'tree diagram'. The two types of hierarchies, taxonomic hierarchies and

meronymic hierarchies, are described in detail below.

Taxonymy

Taxonymy is a branching hierarchy in which 'kind-of' relations among similar types of lexical

items are shown. Cruse (2000) describes taxonomic hierarchies as essentially

classificatory systems and explains that a well-formed taxonomy offers an orderly

and efficient set of categories at different levels of specificity. Taxonymy is the

relation of dominance in taxonomic hierarchies and the relation of difference is

co-taxonymy. In a taxonomic hierarchy, the 'kind-of' or 'is a' relationship is

depicted.

Cruse (2006) explains that the first type of lexical hierarchy is 'taxonymy'

or 'classificatory hierarchy', in which the vertical relation is taxonymy (a variety of

hyponymy) and the horizontal relation is co-taxonymy (a variety of

incompatibility). An example of a 'kind-of' taxonomy is given in Figure 9.

A useful diagnostic frame for taxonymy is:

An X is a kind/type of Y (Cruse 1986, p.137).

Figure 9: Lexical hierarchy of plant (flora). In the figure, plant (flora) branches into flower, vegetables, trees and herbs; these in turn branch into rose, lily, ... (flower), brinjal, tomato, ... (vegetables), sal, ... (trees) and rosemary, ... (herbs).
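
The structure of such a taxonomic hierarchy can be made concrete with a small illustrative sketch (hypothetical Python code written for this chapter, not drawn from any cited work): the hierarchy in Figure 9 is stored as a mapping from each category to its subordinate categories, and the diagnostic frame 'An X is a kind/type of Y' is tested by walking down the tree.

    # A minimal, hypothetical sketch of the taxonomy in Figure 9.
    # Each key dominates the categories listed as its value.
    TAXONOMY = {
        "plant": ["flower", "vegetable", "tree", "herb"],
        "flower": ["rose", "lily"],
        "vegetable": ["brinjal", "tomato"],
        "tree": ["sal"],
        "herb": ["rosemary"],
    }

    def is_a_kind_of(x, y, taxonomy=TAXONOMY):
        """True if x lies below y in the hierarchy (taxonymy),
        i.e. the frame 'An X is a kind/type of Y' is satisfied."""
        if x in taxonomy.get(y, []):
            return True
        return any(is_a_kind_of(x, child, taxonomy) for child in taxonomy.get(y, []))

    def co_taxonyms(x, taxonomy=TAXONOMY):
        """Sisters of x under the same immediate parent (co-taxonymy)."""
        for parent, children in taxonomy.items():
            if x in children:
                return [c for c in children if c != x]
        return []

    print(is_a_kind_of("rose", "plant"))   # True:  a rose is a kind of plant
    print(is_a_kind_of("rose", "tree"))    # False: a rose is not a kind of tree
    print(co_taxonyms("rose"))             # ['lily']

The dots in Figure 9 indicate that each branch is open-ended; the sketch simply lists a few representative items at each level.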

Meronymy

Meronymy denotes the part-whole relationship. Meronymy falls under the

branching hierarchy as the part-whole relationship is represented in the form of a

classification or tree diagram.

Meronymy is subject to a greater number of complicating factors than

taxonomic relations are; instead of there being a single clearly distinguished

relation, there is in reality a numerous family of more-or-less similar relations.

Virtually all word pairs which one would wish to recognise as having a

meronymic relation will yield normal sentences in the test frame: A Y has Xs/an X

(Cruse 1986, p. 160).

Cruse (2000) describes a meronomic hierarchy as a type of lexical hierarchy in which

the relation of dominance is meronymy and the relation of differentiation is co-

meronymy. A meronomic hierarchy is also known as a part-whole hierarchy.

Thus, meronomic hierarchies depict part-whole relationships between

lexical items, such as 'part-of' or 'has-a'.

Figure 10 illustrates a portion of a part-whole hierarchy.

Figure 10: Lexical hierarchy of 'rose plant'. In the figure, rose plant branches into flower, stem, leaf, ...; these in turn branch into petals, pistil, ... (flower), axil, node, ... (stem) and blade, vein, ... (leaf).
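
As a parallel illustration (again a hypothetical sketch, not taken from Cruse), the part-whole hierarchy of Figure 10 can be represented in the same way, with the test frame 'A Y has Xs/an X' answered by collecting parts transitively.

    # A minimal, hypothetical sketch of the meronomic hierarchy in Figure 10.
    PARTS = {
        "rose plant": ["flower", "stem", "leaf"],
        "flower": ["petal", "pistil"],
        "stem": ["axil", "node"],
        "leaf": ["blade", "vein"],
    }

    def all_parts(whole, parts=PARTS):
        """All direct and indirect parts of `whole` (transitive closure)."""
        result = []
        for p in parts.get(whole, []):
            result.append(p)
            result.extend(all_parts(p, parts))
        return result

    def has_part(whole, part, parts=PARTS):
        """True if the frame 'A <whole> has <part>s/a <part>' holds in this model."""
        return part in all_parts(whole, parts)

    print(has_part("rose plant", "petal"))   # True:  a rose plant has petals
    print(has_part("flower", "vein"))        # False in this model
    print(all_parts("leaf"))                 # ['blade', 'vein']

Treating meronymy as freely transitive here is a simplification; as the passage from Cruse (1986) quoted above notes, meronymy is in reality a family of more-or-less similar relations rather than a single clearly distinguished one.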

2.7. Computational Lexical Semantics

Computational lexical semantics involves the construction of large lexical

resources that go beyond traditional NLP lexicons and support highly flexible,

dynamic interpretation processing.

Dizier and Viegas (2005) elucidate that lexical semantics has become a

major research area within computational linguistics, drawing from

psycholinguistics, knowledge representation, computer algorithms and

architecture.

Research programmes whose goal is the definition of large lexicons are

asking what the appropriate representation structure is for different facets of

lexical information. Among these facets, semantic information is probably the

most complex and the least explored.
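
As one concrete illustration of such resources (a minimal sketch assuming the NLTK toolkit and its WordNet corpus are installed; it is not tied to any particular research programme cited here), the WordNet lexical database exposes exactly the taxonomic and part-whole relations discussed above:

    # Minimal sketch: querying taxonomic and part-whole relations in WordNet.
    # Assumes `pip install nltk`; the corpus is fetched on first run.
    import nltk
    nltk.download("wordnet", quiet=True)
    from nltk.corpus import wordnet as wn

    rose = wn.synsets("rose", pos=wn.NOUN)[0]          # first noun sense of 'rose'
    print(rose.definition())                           # gloss of the sense
    print([h.name() for h in rose.hypernyms()])        # 'kind-of' (taxonomic) parents
    print([m.name() for m in rose.part_meronyms()])    # 'part-of' (meronomic) relations, if any

WordNet is only one example of the kind of large-scale lexicon such research programmes aim at, but it shows how the sense relations of the preceding sections become machine-queryable structures.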

2.8. Propositional Relations

Propositional relations are the relations that systematize sentences or

statements. Poythress (1982, pp. 159, 162) explains that statements, sentences,

commands and questions are not put together haphazardly; rather they are

organized, consolidated and connected to each other through certain relations. He

also discusses three major types of propositional relations: first, relations of

dynamicity, that is, cause-effect relations; second, relations of determinateness

or definiteness, where two or more propositions are connected to one another

primarily by the fact that they share a common topic; and third, relations of

coherence, where two propositions are connected because they denote events or

states connected in time or space.

Zimmermann and Sternefeld (2013) have explained that in semantics, the

technical term for the information conveyed by a sentence is the proposition expressed by that sentence.

In the examples below, the truth values of the sentences in (II) may differ because the propositions expressed in (I) do;

the relation between (I) and (II) thus illustrates a propositional relation.

Consider the following two sentences:

(I) a. Hamburg is larger than Cologne

b. Pfäffingen is larger than Breitenholz

It so happens that both sentences in (I) are true, which means that they have

the same extension; indeed their extensions can be calculated from the extensions

of the names and the relation larger than (the set of pairs 〈x, y〉 where x is larger

than y).

But now consider so-called propositional attitude reports, i.e., sentences

that tell us something about a person's information state:

(II) a. John knows that Hamburg is larger than Cologne

b. John knows that Pfäffingen is larger than Breitenholz
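
The difference between extension (truth value) and proposition can be worked through in a small sketch (hypothetical code with purely illustrative size figures, not actual data): both sentences in (I) come out true in the actual situation, yet the propositions, modelled here as functions from situations to truth values, are distinct, which is why the reports in (II) can differ in truth value.

    # Hypothetical sketch: same truth value (extension), different propositions.
    # Sizes are purely illustrative, not real figures.
    actual = {"Hamburg": 1800, "Cologne": 1100, "Pfäffingen": 2, "Breitenholz": 1}
    counterfactual = {"Hamburg": 1800, "Cologne": 1100, "Pfäffingen": 1, "Breitenholz": 2}

    def larger_than(x, y, situation):
        """Holds of the pairs <x, y> where x is larger than y in the given situation."""
        return situation[x] > situation[y]

    # Propositions as functions from situations (possible worlds) to truth values.
    prop_a = lambda s: larger_than("Hamburg", "Cologne", s)        # (I)a
    prop_b = lambda s: larger_than("Pfäffingen", "Breitenholz", s)  # (I)b

    # Same extension (truth value) in the actual situation ...
    print(prop_a(actual), prop_b(actual))                  # True True
    # ... but different propositions: they come apart in another situation.
    print(prop_a(counterfactual), prop_b(counterfactual))  # True False

Because the propositions differ even where the extensions coincide, John may know (II)a without knowing (II)b, which is exactly the point the attitude reports are meant to bring out.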

2.9. Earlier Works

Numerous studies have been done on lexical semantics in

many of the world's languages, and some of the main contributors to lexical

semantics are Lyons (1963, 1977 & 1981), Lepschy (1970), Palmer (1976), Fodor

(1977), Newmeyer (1980), Leech (1981), Cruse (1986), Harris (1993) and

Geeraerts (2006). They have also contributed to different branches of semantics

including lexical semantics, generative semantics and interpretive semantics,

with work done mainly on English. Though many grammar books have been

produced for almost all the major languages of India, very few comprehensive

works have been done particularly in lexical semantics.

Some of the significant contributions to semantics on Indian languages

include Ilakkuvanar (1961), who worked on semantics in Tamil; Hook (1974),

who worked on the compound verb in Hindi and on semantic neutralization in complex

predicates in East and South-East Asian languages; Aiyar (1975), who worked

extensively on many Dravidian languages; and Abbi (1980), who worked on semantic

grammar in Hindi and correlates of India as a linguistic area. Srimanaramayana

(1984) has also contributed to the semantics of Sanskrit.

There are hardly any works on the semantics of Garo except for Burling

(1956 and 1961), who described the lexico-statistic dating of Boro-Garo

linguistics and also published a Garo grammar. Walling (2010) worked on the
semantic agent in Tibeto-Burman languages, and Matisoff (1978 and 2012) also

published a book on variational semantics in Tibeto-Burman languages. Turin

and Zeisler (2011) worked on and published Himalayan Languages and

Linguistics, which focusses on the phonology, semantics, morphology and

syntax of the Austro-Asiatic and Tibeto-Burman languages spoken in the

Himalayan areas.

