1-19 of 19 Results for keyword: semantics

Article

Floris Roelofsen

This survey article discusses two basic issues that semantic theories of questions face. The first is how to conceptualize and formally represent the semantic content of questions. This issue arises in particular because the standard truth-conditional notion of meaning, which has been fruitful in the analysis of declarative statements, is not applicable to questions. This is because questions are not naturally construed as being true or false. Instead, it has been proposed that the semantic content of a question must be characterized in terms of its answerhood or resolution conditions. This article surveys a number of theories which develop this basic idea in different ways, focusing on so-called proposition-set theories (alternative semantics, partition semantics, and inquisitive semantics). The second issue that will be considered here concerns questions that are embedded within larger sentences. Within this domain, one important puzzle is why certain predicates can take both declarative and interrogative complements (e.g., Bill knows that Mary called / Bill knows who called), while others take only declarative complements (e.g., Bill thinks that Mary called / *Bill thinks who called) or only interrogative complements (e.g., Bill wonders who called / *Bill wonders that Mary called). We compare two general approaches that have been pursued in the literature. One assumes that declarative and interrogative complements differ in semantic type. On this approach, the fact that predicates like think do not take interrogative complements can be accounted for by assuming that such complements do not have the semantic type that think selects for. The other approach treats the two kinds of complement as having the same semantic type, and seeks to connect the selectional restrictions of predicates like think to other semantic properties (e.g., the fact that think is neg-raising).
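
To make the proposition-set idea concrete, the following minimal sketch (not taken from the article; the world labels and the resolution test are illustrative assumptions) models propositions as sets of possible worlds and the question Who called? as the set of its complete answers.

```python
# A minimal sketch of the proposition-set view of questions.  The worlds,
# the question denotation, and the resolution test are illustrative
# assumptions, not the article's own formalism.

# Three possible worlds, distinguished by who called.
WORLDS = {"w_ann", "w_bob", "w_nobody"}

# A proposition is the set of worlds in which it is true.
ann_called = frozenset({"w_ann"})
bob_called = frozenset({"w_bob"})
nobody_called = frozenset({"w_nobody"})

# On a proposition-set analysis, "Who called?" denotes the set of its
# complete answers rather than a single truth condition.
who_called = {ann_called, bob_called, nobody_called}

def resolves(info: frozenset, question: set) -> bool:
    """A non-empty information state resolves the question if it is
    contained in at least one answer in the question's denotation."""
    return bool(info) and any(info <= answer for answer in question)

print(resolves(frozenset({"w_ann"}), who_called))           # True: "Ann called" settles the issue
print(resolves(frozenset({"w_ann", "w_bob"}), who_called))   # False: still undecided between Ann and Bob
```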

Article

Francis Jeffry Pelletier

Most linguists have heard of semantic compositionality. Some will have heard that it is the fundamental truth of semantics. Others will have been told that it is so thoroughly and completely wrong that it is astonishing that it is still being taught. The present article attempts to explain all this. Much of the discussion of semantic compositionality takes place in three arenas that are rather insulated from one another: (a) philosophy of mind and language, (b) formal semantics, and (c) cognitive linguistics and cognitive psychology. A truly comprehensive overview of the writings in all these areas is not possible here. However, this article does discuss some of the work that occurs in each of these areas. A bibliography of general works, and some Internet resources, will help guide the reader to some further, undiscussed works (including further material in all three categories).
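
As a toy illustration of what the principle of semantic compositionality claims, the sketch below (with an invented lexicon, not drawn from any of the works discussed) computes the meaning of a complex expression strictly from the meanings of its parts and their mode of combination.

```python
# A toy illustration of semantic compositionality: the meaning of a complex
# expression is computed only from the meanings of its parts and their mode
# of combination (here, function application).  The lexicon is invented.

LEXICON = {
    "Ann": "ann",
    "Bob": "bob",
    "smokes": lambda x: x in {"ann"},           # property: the set of smokers
    "doesn't": lambda p: (lambda x: not p(x)),  # predicate negation
}

def interpret(expr):
    """The meaning of a word is looked up; the meaning of a pair [f, a] is
    the meaning of f applied to the meaning of a."""
    if isinstance(expr, str):
        return LEXICON[expr]
    fun, arg = expr
    return interpret(fun)(interpret(arg))

print(interpret(["smokes", "Ann"]))               # True
print(interpret([["doesn't", "smokes"], "Bob"]))  # True
```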

Article

Annie Zaenen

Hearers and readers make inferences on the basis of what they hear or read. These inferences are partly determined by the linguistic form that the writer or speaker chooses to give to her utterance. The inferences can be about the states of the world that the speaker or writer wants the hearer or reader to conclude are pertinent, or they can be about the attitude of the speaker or writer vis-à-vis this state of affairs. The focus here is on inferences of the first type. Research in semantics and pragmatics has isolated a number of linguistic phenomena that make specific contributions to the process of inference. Broadly, entailments of asserted material, presuppositions (e.g., factive constructions), and invited inferences (especially scalar implicatures) can be distinguished. While we make these inferences all the time, they have been studied only piecemeal in theoretical linguistics. When attempts are made to build natural language understanding systems, the need for a more systematic and wholesale approach to the problem is felt. Some of the approaches developed in Natural Language Processing are based on linguistic insights, whereas others use methods that do not require (full) semantic analysis. In this article, I give an overview of the main linguistic issues and of a variety of computational approaches, especially those stimulated by the RTE challenges first proposed in 2004.
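
To connect these inference types to the computational RTE setting just mentioned, here is a minimal sketch of the task format together with a naive word-overlap baseline; the sentence pairs and the threshold are invented for illustration, and the baseline's failures on the presupposition and implicature cases show why fuller semantic analysis is needed.

```python
# A minimal sketch of the Recognizing Textual Entailment (RTE) task format
# and a naive word-overlap baseline.  The example pairs and the threshold
# are illustrative only; actual RTE systems are far more sophisticated.

pairs = [
    # (text, hypothesis, gold label)
    ("Mary stopped smoking last year.", "Mary used to smoke.", True),                 # presupposition survives
    ("Some of the students passed the exam.", "All of the students passed.", False),  # scalar implicature, not entailment
    ("The cat is asleep on the mat.", "An animal is on the mat.", True),              # lexical entailment (cat -> animal)
]

def word_overlap(text: str, hypothesis: str) -> float:
    t = set(text.lower().strip(".").split())
    h = set(hypothesis.lower().strip(".").split())
    return len(t & h) / len(h)

def predict_entailment(text: str, hypothesis: str, threshold: float = 0.6) -> bool:
    """Predict entailment if enough hypothesis words also occur in the text."""
    return word_overlap(text, hypothesis) >= threshold

# The baseline gets the surface-dissimilar entailment and the implicature
# case wrong, which is precisely why deeper semantic analysis is needed.
for text, hyp, gold in pairs:
    print(predict_entailment(text, hyp), "gold:", gold)
```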

Article

The central goal of the Lexical Semantic Framework (LSF) is to characterize the meaning of simple lexemes and affixes and to show how these meanings can be integrated in the creation of complex words. LSF offers a systematic treatment of issues that figure prominently in the study of word formation, such as the polysemy question, the multiple-affix question, the zero-derivation question, and the form and meaning mismatches question. LSF has its source in a confluence of research approaches that follow a decompositional approach to meaning; it thus defines simple lexemes and affixes by way of a systematic representation, achieved via a constrained formal language that enforces consistency of annotation. Lexical-semantic representations in LSF consist of two parts: the Semantic/Grammatical Skeleton and the Semantic/Pragmatic Body (henceforth ‘skeleton’ and ‘body’ respectively). The skeleton comprises features that are relevant to the syntax. These features act as functions and may take arguments. Functions and arguments of a skeleton are hierarchically arranged. The body encodes all those aspects of meaning that are perceptual, cultural, and encyclopedic. Features in LSF are used in (a) a cross-categorial, (b) an equipollent, and (c) a privative way. This means that they are used to account for the distinction between the major ontological categories, may have a binary (i.e., positive or negative) value, and may or may not form part of the skeleton of a given lexeme. In order to account for the fact that several distinct parts integrate into a single referential unit that projects its arguments to the syntax, LSF makes use of the Principle of Co-indexation. Co-indexation is a device needed to tie together the arguments that come with the different parts of a complex word so as to yield only those arguments that are syntactically active. LSF has an important impact on the study of the morphology-lexical semantics interface and provides a unitary theory of meaning in word formation.
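
The following schematic sketch shows one way to picture the skeleton/body split and the Principle of Co-indexation as a simple data structure; it is a rendering of my own rather than Lieber's notation, and all feature values and argument labels are illustrative.

```python
# A schematic rendering (not Lieber's own notation) of an LSF-style entry:
# a skeleton of syntactically relevant features with argument slots, a body
# of encyclopedic content, and co-indexation tying the affix's argument to
# the base verb's highest argument.  All feature values are illustrative.

from dataclasses import dataclass, field

@dataclass
class Entry:
    form: str
    skeleton_features: list                    # syntactically relevant features
    skeleton_args: list                        # hierarchically ordered argument slots
    body: list = field(default_factory=list)   # perceptual/cultural/encyclopedic content

drive = Entry("drive", ["dynamic"], ["arg1 (driver)", "arg2 (vehicle)"],
              body=["operate and steer a vehicle"])
er = Entry("-er", ["+material", "dynamic"], ["arg_R"],
           body=["typically human or instrument"])

def coindex(affix: Entry, base: Entry) -> Entry:
    """Principle of Co-indexation, schematically: identify the affix's
    argument with the base's highest argument, so the derived word forms a
    single referential unit and projects only the remaining arguments."""
    bound = f"{affix.skeleton_args[0]} = {base.skeleton_args[0]}"
    remaining = base.skeleton_args[1:]          # still syntactically active
    return Entry(base.form + affix.form.lstrip("-"),
                 affix.skeleton_features,
                 [bound] + remaining,
                 base.body + affix.body)

driver = coindex(er, drive)
print(driver.form)             # driver
print(driver.skeleton_args)    # ['arg_R = arg1 (driver)', 'arg2 (vehicle)']
```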

Article

The investigation of morphology and lexical semantics is an investigation into the very essence of the semantics of word formation: the meaning of morphemes and how they can be combined to form meanings of complex words. Discussion of this question within the scholarly literature has been dependent on (i) the adopted morphological model (morpheme-based or word-based); and (ii) the adopted theoretical paradigm (such as formal/generativist accounts vs. construction-based approaches)—which also determined what problem areas received attention in the first place. One particular problem area that has surfaced most consistently within the literature (irrespective of the adopted morphological model or theoretical paradigm) is the so-called semantic mismatch question, which also serves as the focus of the present article. In essence, semantic mismatch pertains to the question of why there is no one-to-one correspondence between form and meaning in word formation. In other words, it is very frequently not possible out of context to give a precise account of what the meaning of a newly coined word might be based simply on the constituents that the word originates from. The article considers the extent to which the meaning of complex words is (at least partly) based on nondecompositional knowledge, implying that the meaning-bearing feature of morphemes might in fact be a graded affair. Thus, depending on the entrenchment and strength of the interrelations among sets of words, the meaning of the components contributes more or less to the meaning of a word, suggesting that “mismatches” might be neither unusual nor uncommon.

Article

Dirk Geeraerts

Lexical semantics is the study of word meaning. Descriptively speaking, the main topics studied within lexical semantics involve either the internal semantic structure of words, or the semantic relations that occur within the vocabulary. Within the first set, major phenomena include polysemy (in contrast with vagueness), metonymy, metaphor, and prototypicality. Within the second set, dominant topics include lexical fields, lexical relations, conceptual metaphor and metonymy, and frames. Theoretically speaking, the main theoretical approaches that have succeeded each other in the history of lexical semantics are prestructuralist historical semantics, structuralist semantics, and cognitive semantics. These theoretical frameworks differ as to whether they take a system-oriented rather than a usage-oriented approach to word-meaning research but, at the same time, in the historical development of the discipline, they have each contributed significantly to the descriptive and conceptual apparatus of lexical semantics.

Article

Haihua Pan and Yuli Feng

Cross-linguistic data can add new insights to the development of semantic theories or even induce a shift in the research paradigm. The major topics in semantic studies, such as bare noun denotation, quantification, degree semantics, polarity items, donkey anaphora and binding principles, long-distance reflexives, negation, tense and aspect, and eventuality, are all discussed by semanticists working on the Chinese language. The issues of particular interest include, but are not limited to: (i) the denotation of Chinese bare nouns; (ii) categorization and quantificational mapping strategies of Chinese quantifier expressions (i.e., whether the behaviors of Chinese quantifier expressions fit into the dichotomy of A-quantification and D-quantification); (iii) multiple uses of quantifier expressions (e.g., dou) and their implications for the interrelation of semantic concepts like distributivity, scalarity, exclusiveness, exhaustivity, maximality, etc.; (iv) the interaction among universal adverbials and that between universal adverbials and various types of noun phrases, which may pose a challenge to the Principle of Compositionality; (v) the semantics of degree expressions in Chinese; (vi) the non-interrogative uses of wh-phrases in Chinese and their influence on the theories of polarity items, free choice items, and epistemic indefinites; (vii) how the concepts of E-type pronouns and D-type pronouns are manifested in the Chinese language and whether such pronoun interpretations correspond to specific sentence types; (viii) what devices Chinese adopts to locate time (i.e., whether tense interpretation corresponds to certain syntactic projections or is solely determined by semantic information and pragmatic reasoning); (ix) how the interpretation of Chinese aspect markers can be captured by event structures, possible world semantics, and quantification; (x) how the long-distance binding of Chinese ziji ‘self’ and the blocking effect by first and second person pronouns can be accounted for by the existing theories of beliefs, attitude reports, and logophoricity; (xi) the distribution of various negation markers and their correspondence to the semantic properties of the predicates with which they are combined; and (xii) whether Chinese topic-comment structures are constrained by both semantic and pragmatic factors or by syntactic factors only.

Article

There are two main theoretical traditions in semantics. One is based on realism, where meanings are described as relations between language and the world, often in terms of truth conditions. The other is cognitivistic, where meanings are identified with mental structures. This article presents some of the main ideas and theories within the cognitivist approach. A central tenet of cognitively oriented theories of meaning is that there are close connections between the meaning structures and other cognitive processes. In particular, parallels between semantics and visual processes have been studied. As a complement, the theory of embodied cognition focuses on the relation between actions and components of meaning. One of the main methods of representing cognitive meaning structures is to use image schemas and idealized cognitive models. Such schemas focus on spatial relations between various semantic elements. Image schemas are often constructed using Gestalt psychological notions, including those of trajector and landmark, corresponding to figure and ground. In this tradition, metaphors and metonymies are considered to be central meaning-transforming processes. A related approach is force dynamics. Here, the semantic schemas are constructed from forces and their relations rather than from spatial relations. Recent extensions involve cognitive representations of actions and events, which then form the basis for a semantics of verbs. A third approach is the theory of conceptual spaces. In this theory, meanings are represented as regions of semantic domains such as space, time, color, weight, size, and shape. For example, strong evidence exists that color words in a large variety of languages correspond to such regions. This approach has been extended to a general account of the semantics of some of the main word classes, including adjectives, verbs, and prepositions. The theory of conceptual spaces shows similarities to the older frame semantics and feature analysis, but it puts more emphasis on geometric structures. A general criticism against cognitive theories of semantics is that they only consider the meaning structures of individuals, but neglect the social aspects of semantics, that is, that meanings are shared within a community. Recent theoretical proposals counter this by suggesting that semantics should be seen as a meeting of minds, that is, communicative processes that lead to the alignment of meanings between individuals. On this approach, semantics is seen as a product of communication, constrained by the cognitive mechanisms of the individuals.
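
As a concrete toy rendering of the conceptual-spaces claim that color terms carve out regions of a quality space, the sketch below classifies points by their nearest prototype; the HSB-like coordinates and the choice of prototypes are illustrative assumptions.

```python
# A toy rendering of the conceptual-spaces idea: color terms as regions of a
# quality space, here induced by nearest-prototype (Voronoi) classification.
# The HSB-like prototype coordinates are illustrative assumptions.

import math

PROTOTYPES = {
    "red":    (0.0, 0.9, 0.5),    # (hue in degrees, saturation, brightness)
    "yellow": (60.0, 0.9, 0.7),
    "green":  (120.0, 0.8, 0.5),
    "blue":   (240.0, 0.8, 0.4),
}

def distance(p, q):
    """Euclidean distance in the quality space (hue circularity ignored)."""
    return math.dist(p, q)

def color_term(point):
    """Assign the color term whose prototype is closest to the point."""
    return min(PROTOTYPES, key=lambda term: distance(point, PROTOTYPES[term]))

print(color_term((10.0, 0.8, 0.5)))    # 'red'
print(color_term((200.0, 0.7, 0.4)))   # 'blue'
```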

Article

Donka Farkas

Nominal reference is central to both linguistic semantics and philosophy of language. On the theoretical side, both philosophers and linguists wrestle with the problem of how the link between nominal expressions and their referents is to be characterized, and what formal tools are most appropriate to deal with this issue. The problem is complex because nominal expressions come in a large variety of forms, from simple proper names, pronouns, or bare nouns (Jennifer, they, books) to complex expressions involving determiners and various quantifiers (the/every/no/their answer). While the reference of such expressions is varied, their basic syntactic distribution as subjects or objects of various types, for instance, is homogeneous. Important advances in understanding this tension were made with the advent of the work of R. Montague and that of his successors. The problems involved in understanding the relationship between pronouns and their antecedents in discourse have led to another fundamental theoretical development, namely that of dynamic semantics. On the empirical side, issues at the center of both linguistic and philosophical investigations concern how to best characterize the difference between definite and indefinite nominals, and, more generally, how to understand the large variety of determiner types found both within a language and cross-linguistically. These considerations led to refining the definite/indefinite contrast to include fine-grained specificity distinctions that have been shown to be relevant to various morphosyntactic phenomena across the world’s languages. Considerations concerning nominal reference are thus relevant not only to semantics but also to morphology and syntax. Some questions within the domain of nominal reference have grown into rich subfields of inquiry. This is the case with generic reference, the study of pronominal reference, the study of quantifiers, and the study of the semantics of nominal number marking.
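
One influential outcome of the Montagovian line mentioned here is the treatment of determiners as relations between sets (generalized quantifiers); the small sketch below, with an invented domain, illustrates that uniform treatment of every, some, no, and a simplified the.

```python
# A small sketch of the generalized-quantifier treatment of determiners that
# grew out of Montague's work: each determiner denotes a relation between
# two sets (the noun's and the predicate's denotations).  The domain and
# denotations are invented for the example.

DOMAIN = {"ann", "bob", "carl"}

students = {"ann", "bob"}
called   = {"bob"}

def every(restrictor: set, scope: set) -> bool:
    return restrictor <= scope

def some(restrictor: set, scope: set) -> bool:
    return bool(restrictor & scope)

def no(restrictor: set, scope: set) -> bool:
    return not (restrictor & scope)

def the(restrictor: set, scope: set) -> bool:
    # Simplified uniqueness-based 'the': defined only for singleton restrictors.
    if len(restrictor) != 1:
        raise ValueError("presupposition failure: no unique referent")
    return restrictor <= scope

print(every(students, called))   # False: not every student called
print(some(students, called))    # True
print(no(students, called))      # False
print(the({"bob"}, called))      # True: 'the' with a unique referent
```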

Article

The category of Personal/Participant/Inhabitant derived nouns comprises a conglomeration of derived nouns that denote, among others, agents, instruments, patients/themes, inhabitants, and followers of a person. Based on the thematic relations between the derived noun and its base lexeme, Personal/Participant/Inhabitant nouns can be classified into two subclasses. The first subclass comprises derived nouns that are deverbal and carry thematic readings (e.g., driver). The second subclass consists of derived nouns with athematic readings (e.g., Marxist). The examination of the category of Personal/Participant/Inhabitant nouns allows one to delve deeply into the study of multiplicity of meaning in word formation and the factors that bear on the readings of derived words. These factors range from the historical mechanisms that lead to multiplicity of meaning and the lexical-semantic properties of the bases that derived nouns are built on, to the syntactic context in which derived nouns occur, and the pragmatic-encyclopedic facets of both the base and the derived lexeme.

Article

Jesús Fernández-Domínguez

The onomasiological approach is a theoretical framework that emphasizes the cognitive-semantic component of language and the primacy of extra-linguistic reality in the process of naming. With a tangible background in the functional perspective of the Prague School of Linguistics, this approach holds that name-giving is essentially governed by the needs of language users, and hence assigns a subordinate role to the traditional levels of linguistic description. This stance characterizes the onomasiological framework in opposition to other theories of language, especially generativism, which first tackle the form of linguistic material and then move on to meaning. The late 20th and early 21st centuries have witnessed the emergence of several cognitive-onomasiological models, all of which share an extensive use of semantic categories as working units and a particular interest in the area of word-formation. Despite a number of divergences, such proposals all confront mainstream morphological research by heavily revising conventional concepts and introducing model-specific terminology regarding, for instance, the independent character of the lexicon, the (non-)regularity of word-formation processes, or their understanding of morphological productivity. The models adhering to such a view of language have earned a pivotal position as an alternative to dominant theories of word-formation.

Article

Natalia Beliaeva

Blending is a type of word formation in which two or more words are merged into one so that the blended constituents are either clipped, or partially overlap. An example of a typical blend is brunch, in which the beginning of the word breakfast is joined with the ending of the word lunch. In many cases such as motel (motor + hotel) or blizzaster (blizzard + disaster) the constituents of a blend overlap at segments that are phonologically or graphically identical. In some blends, both constituents retain their form as a result of overlap, for example, stoption (stop + option). These examples illustrate only a handful of the variety of forms blends may take; more exotic examples include formations like Thankshallowistmas (Thanksgiving + Halloween + Christmas). The visual and auditory amalgamation in blends is reflected on the semantic level. It is common to form blends meaning a combination or a product of two objects or phenomena, such as an animal breed (e.g., zorse, a cross between a zebra and a horse), an interlanguage variety (e.g., franglais, a French blend of français and anglais meaning a mixture of French and English), or another type of mix (e.g., a shress is a garment having features of both a shirt and a dress). Blending as a word formation process can be regarded as a subtype of compounding because, like compounds, blends are formed of two (or sometimes more) content words and semantically either are hyponyms of one of their constituents, or exhibit some kind of paradigmatic relationships between the constituents. In contrast to compounds, however, the formation of blends is restricted by a number of phonological constraints given that the resulting formation is a single word. In particular, blends tend to be of the same length as the longest of their constituent words, and to preserve the main stress of one of their constituents. Certain regularities are also observed in terms of the ordering of the words in a blend (e.g., shorter first, more frequent first), and in the position of the switch point, that is, where one blended word is cut off and switched to another (typically at the syllable boundary or at the onset/rime boundary). The regularities of blend formation can be related to the recognizability of the blended words.
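
As a rough operationalization of the overlap mechanism described above, the sketch below splices the beginning of one word onto the end of another at a shared segment, falling back to simple clipping when there is no overlap; it works on spelling only and ignores the stress and syllable-structure constraints just mentioned.

```python
# A toy operationalization of blend formation: splice the beginning of w1
# onto the end of w2 at a shared segment if one exists (motor + hotel ->
# motel), otherwise fall back to simple clipping (breakfast + lunch ->
# brunch).  It works on spelling only and ignores the phonological
# constraints (stress, syllable structure) discussed above.

def overlap_blend(w1: str, w2: str, min_len: int = 2):
    """Return a blend spliced at the longest shared segment of at least min_len letters."""
    for length in range(min(len(w1), len(w2)), min_len - 1, -1):
        for i in range(len(w1) - length + 1):
            segment = w1[i:i + length]
            j = w2.find(segment)
            if j != -1:
                return w1[:i + length] + w2[j + length:]
    return None

def clip_blend(w1: str, w2: str, keep1: int, keep2: int) -> str:
    """Join the first keep1 letters of w1 with the last keep2 letters of w2."""
    return w1[:keep1] + w2[-keep2:]

print(overlap_blend("motor", "hotel"))         # 'motel'  (overlap at 'ot')
print(clip_blend("breakfast", "lunch", 2, 4))  # 'brunch'
```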

Article

Stergios Chatzikyriakidis and Robin Cooper

Type theory is a regime for classifying objects (including events) into categories called types. It was originally designed to overcome problems in the foundations of mathematics related to Russell’s paradox. It has made an immense contribution to the study of logic and computer science and has also played a central role in formal semantics for natural languages since the initial work of Richard Montague, which built on the typed λ-calculus. More recently, type theories following in the tradition created by Per Martin-Löf have presented an important alternative to Montague’s type theory for semantic analysis. These more modern type theories yield a rich collection of types which take on the role of representing semantic content, rather than simply structuring the universe in order to avoid paradoxes.
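
To give a feel for how types classify semantic objects in the Montagovian tradition, the sketch below uses Python type aliases as a rough stand-in for the basic types of the typed λ-calculus (entities, truth values, and functions over them); it is only an analogy with an invented mini-model, and it does not attempt to represent Martin-Löf-style rich types.

```python
# A rough analogue, in Python type hints, of the basic Montagovian type
# system: entities (type e), truth values (type t), and functional types
# built from them, e.g. <e,t> for predicates and <<e,t>,t> for quantifiers.
# The mini-model is invented; this only gestures at the typed lambda-calculus.

from typing import Callable

Entity = str                            # type e
Truth = bool                            # type t
Pred = Callable[[Entity], Truth]        # type <e,t>
Quant = Callable[[Pred], Truth]         # type <<e,t>,t>

DOMAIN = ["ann", "bob", "carl"]

# An <e,t> meaning: the property of having called.
called: Pred = lambda x: x in {"bob"}

# An <<e,t>,t> meaning: the generalized quantifier 'everyone'.
everyone: Quant = lambda p: all(p(x) for x in DOMAIN)

# A proper name can be lifted to quantifier type, as in Montague's treatment.
def lift(individual: Entity) -> Quant:
    return lambda p: p(individual)

print(everyone(called))     # False
print(lift("bob")(called))  # True
```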

Article

Philippe Schlenker, Emmanuel Chemla, and Klaus Zuberbühler

Rich data gathered in experimental primatology in the last 40 years are beginning to benefit from analytical methods used in contemporary linguistics, especially in the area of semantics and pragmatics. These methods have started to clarify five questions: (i) What morphology and syntax, if any, do monkey calls have? (ii) What is the ‘lexical meaning’ of individual calls? (iii) How are the meanings of individual calls combined? (iv) How do calls or call sequences compete with each other when several are appropriate in a given situation? (v) How did the form and meaning of calls evolve? Four case studies from this emerging field of ‘primate linguistics’ provide initial answers, pertaining to Old World monkeys (putty-nosed monkeys, Campbell’s monkeys, and colobus monkeys) and New World monkeys (black-fronted Titi monkeys). The morphology mostly involves simple calls, but in at least one case (Campbell’s -oo) one finds a root–suffix structure, possibly with a compositional semantics. The syntax is in all clear cases simple and finite-state. With respect to meaning, nearly all cases of call concatenation can be analyzed as being semantically conjunctive. But a key question concerns the division of labor between semantics, pragmatics, and the environmental context (‘world’ knowledge and context change). An apparent case of dialectal variation in the semantics (Campbell’s krak) can arguably be analyzed away if one posits sufficiently powerful mechanisms of competition among calls, akin to scalar implicatures. An apparent case of noncompositionality (putty-nosed pyow–hack sequences) can be analyzed away if one further posits a pragmatic principle of ‘urgency’. Finally, rich Titi sequences in which two calls are re-arranged in complex ways so as to reflect information about both predator identity and location are argued not to involve a complex syntax/semantics interface, but rather a fine-grained interaction between simple call meanings and the environmental context. With respect to call evolution, the remarkable preservation of call form and function over millions of years should make it possible to lay the groundwork for an evolutionary monkey linguistics, illustrated with cercopithecine booms.
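
To illustrate the analytical ingredients described here, the toy model below treats call meanings as sets of situations, concatenation as conjunction (set intersection), and adds an implicature-like competition step under which a weaker call is understood as excluding what a stronger applicable call would cover; the situation labels and call meanings are invented placeholders, not the field data discussed in the article.

```python
# A toy model of the analytical ingredients described above: call meanings
# as sets of situations, call concatenation as conjunction (intersection),
# and an implicature-like competition step.  The situation labels and call
# meanings are invented placeholders, not actual field data.

SITUATIONS = {"eagle_near", "eagle_far", "leopard_near", "leopard_far", "tree_fall"}

CALL_MEANINGS = {
    "hok":  {"eagle_near", "eagle_far"},        # 'aerial alarm' (placeholder)
    "krak": SITUATIONS - {"tree_fall"},         # weak 'general alarm' (placeholder)
    "boom": {"tree_fall"},                      # 'non-predation event' (placeholder)
}

def sequence_meaning(calls):
    """Conjunctive semantics: a call sequence is true of the situations
    compatible with every call in it."""
    meaning = set(SITUATIONS)
    for call in calls:
        meaning &= CALL_MEANINGS[call]
    return meaning

def strengthened(call):
    """Competition: a call is understood as excluding the situations for
    which a strictly more informative call is also available."""
    meaning = set(CALL_MEANINGS[call])
    for other, other_meaning in CALL_MEANINGS.items():
        if other != call and other_meaning < CALL_MEANINGS[call]:
            meaning -= other_meaning
    return meaning

print(sequence_meaning(["krak", "hok"]))  # {'eagle_near', 'eagle_far'}
print(strengthened("krak"))               # general alarm minus what 'hok' would cover
```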

Article

Laura A. Michaelis

Meanings are assembled in various ways in a construction-based grammar, and this array can be represented as a continuum of idiomaticity, a gradient of lexical fixity. Constructional meanings are the meanings to be discovered at every point along the idiomaticity continuum. At the leftmost, or ‘fixed,’ extreme of this continuum are frozen idioms, like the salt of the earth and in the know. The set of frozen idioms includes those with idiosyncratic syntactic properties, like the fixed expression by and large (an exceptional pattern of coordination in which a preposition and adjective are conjoined). Other frozen idioms, like the unexceptionable modified noun red herring, feature syntax found elsewhere. At the rightmost, or ‘open’ end of this continuum are fully productive patterns, including the rule that licenses the string Kim blinked, known as the Subject-Predicate construction. Between these two poles are (a) lexically fixed idiomatic expressions, verb-headed and otherwise, with regular inflection, such as chew/chews/chewed the fat; (b) flexible expressions with invariant lexical fillers, including phrasal idioms like spill the beans and the Correlative Conditional, such as the more, the merrier; and (c) specialized syntactic patterns without lexical fillers, like the Conjunctive Conditional (e.g., One more remark like that and you’re out of here). Construction Grammar represents this range of expressions in a uniform way: whether phrasal or lexical, all are modeled as feature structures that specify phonological and morphological structure, meaning, use conditions, and relevant syntactic information (including syntactic category and combinatoric potential).
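
To make the uniform feature-structure representation concrete, here is a minimal sketch in which a phrasal idiom and a fully productive pattern are encoded as structures of the same kind; the attribute names and values are illustrative rather than any particular Construction Grammar formalism.

```python
# A minimal sketch of the idea that lexical idioms and productive syntactic
# patterns can be modeled as feature structures of the same kind.  The
# attribute names and values are illustrative, not any particular
# Construction Grammar formalism.

spill_the_beans = {
    "name": "spill-the-beans",
    "syn": {"category": "VP", "head": "spill", "complement": "the beans"},
    "phon": "flexible: the verb inflects (spills/spilled the beans)",
    "sem": "reveal a secret",
    "use": "informal register",
}

subject_predicate = {
    "name": "Subject-Predicate",
    "syn": {"category": "S", "daughters": ["NP (subject)", "VP (predicate)"]},
    "phon": "no fixed lexical material",
    "sem": "the predicate holds of the subject referent",
    "use": "fully productive",
}

def describe(construction: dict) -> str:
    """Read the same attributes off both a frozen idiom and an open pattern."""
    return f'{construction["name"]}: {construction["syn"]["category"]}, meaning = {construction["sem"]}'

for cx in (spill_the_beans, subject_predicate):
    print(describe(cx))
```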

Article

Paolo Acquaviva

Number is the category through which languages express information about the individuality, numerosity, and part structure of what we speak about. As a linguistic category it has a morphological, a morphosyntactic, and a semantic dimension, which are variously interrelated across language systems. Number marking can apply to a more or less restricted part of the lexicon of a language, being most likely to appear on personal pronouns and human/animate nouns, and least likely on inanimate nouns. In the core contrast, number allows languages to refer to ‘many’ through the description of ‘one’; the sets referred to consist of tokens of the same type, but also of similar types, or of elements pragmatically associated with one named individual. In other cases, number opposes a reading of ‘one’ to a reading as ‘not one,’ which includes masses; when the ‘one’ reading is morphologically derived from the ‘not one,’ it is called a singulative. It is rare for a language to have no linguistic number at all, since a ‘one–many’ opposition is typically implied at least in pronouns, where the category of person discriminates the speaker as ‘one.’ Beyond pronouns, number is typically a property of nouns and/or determiners, although it can appear on other word classes by agreement. Verbs can also express part-structural properties of events, but this ‘verbal number’ is not isomorphic to nominal number marking. Many languages allow a variable proportion of their nominals to appear in a ‘general’ form, which expresses no number information. The main values of number-marked elements are singular and plural; dual and a much rarer trial also exist. Many languages also distinguish forms interpreted as paucals or as greater plurals, respectively, for small and usually cohesive groups and for generically large ones. A broad range of exponence patterns can express these contrasts, depending on the morphological profile of a language, from word inflections to freestanding or clitic forms; certain choices of classifiers also express readings that can be described as ‘plural,’ at least in certain interpretations. Classifiers can co-occur with other plurality markers, but not when these are obligatory as expressions of an inflectional paradigm, although this is debated, partly because the notion of classifier itself subsumes distinct phenomena. Many languages, especially those with classifiers, encode number not as an inflectional category, but through word-formation operations that express readings associated with plurality, including large size. Current research on number concerns all its morphological, morphosyntactic, and semantic dimensions, in particular their interrelations, as part of the study of natural language typology and of the formal analysis of nominal phrases. The grammatical and semantic functions of number and plurality are particularly prominent in formal semantics and in syntactic theory.

Article

Compound and complex predicates—predicates that consist of two or more lexical items and function as the predicate of a single sentence—present an important class of linguistic objects that pertain to an enormously wide range of issues in the interactions of morphology, phonology, syntax, and semantics. Japanese makes extensive use of compounding to expand a single verb into a complex one. These compounding processes range over multiple modules of the grammatical system, thus straddling the borders between morphology, syntax, phonology, and semantics. In terms of degree of phonological integration, two types of compound predicates can be distinguished. In the first type, called tight compound predicates, two elements from the native lexical stratum are tightly fused and inflect as a whole for tense. In this group, Verb-Verb compound verbs such as arai-nagasu [wash-let.flow] ‘to wash away’ and hare-agaru [sky.be.clear-go.up] ‘for the sky to clear up entirely’ are preponderant in number and productivity over Noun-Verb compound verbs such as tema-doru [time-take] ‘to take a lot of time (to finish).’ The second type, called loose compound predicates, takes the form of “Noun + Predicate (Verbal Noun [VN] or Adjectival Noun [AN]),” as in post-syntactic compounds like [sinsya : koonyuu] no okyakusama ([new.car : purchase] GEN customers) ‘customer(s) who purchase(d) a new car,’ where the symbol “:” stands for a short phonological break. Remarkably, loose compounding allows combinations of a transitive VN with its agent subject (external argument), as in [Supirubaagu : seisaku] no eiga ([Spielberg : produce] GEN film) ‘a film/films that Spielberg produces/produced’—a pattern that is illegitimate in tight compounds and has in fact been considered universally impossible in the world’s languages in verbal compounding and noun incorporation. In addition to a huge variety of tight and loose compound predicates, Japanese has a further class of syntactic constructions that as a whole function as complex predicates. Typical examples are the light verb construction, where a clause headed by a VN is followed by the light verb suru ‘do,’ as in Tomodati wa sinsya o koonyuu (sae) sita [friend TOP new.car ACC purchase (even) did] ‘My friend (even) bought a new car,’ and the human physical attribute construction, as in Sensei wa aoi me o site-iru [teacher TOP blue eye ACC do-ing] ‘My teacher has blue eyes.’ In these constructions, the nominal phrases immediately preceding the verb suru are semantically characterized as indefinite and non-referential and reject syntactic operations such as movement and deletion. The semantic indefiniteness and syntactic immobility of the NPs involved are also observed with a construction composed of a human subject and the verb aru ‘be,’ as in Gakkai ni wa oozei no sankasya ga atta ‘There was a large number of participants at the conference.’ The constellation of such “word-like” properties shared by these compound and complex predicates poses challenging problems for current theories of morphology-syntax-semantics interactions with regard to such topics as lexical integrity, morphological compounding, syntactic incorporation, semantic incorporation, pseudo-incorporation, and indefinite/non-referential NPs.

Article

Rochelle Lieber

Derivational morphology is a type of word formation that creates new lexemes, either by changing syntactic category or by adding substantial new meaning (or both) to a free or bound base. Derivation may be contrasted with inflection on the one hand or with compounding on the other. The distinctions between derivation and inflection and between derivation and compounding, however, are not always clear-cut. New words may be derived by a variety of formal means including affixation, reduplication, internal modification of various sorts, subtraction, and conversion. Affixation is best attested cross-linguistically, especially prefixation and suffixation. Reduplication is also widely found, with various internal changes like ablaut and root and pattern derivation less common. Derived words may fit into a number of semantic categories. For nouns, event and result, personal and participant, collective and abstract noun are frequent. For verbs, causative and applicative categories are well-attested, as are relational and qualitative derivations for adjectives. Languages frequently also have ways of deriving negatives, relational words, and evaluatives. Most languages have derivation of some sort, although there are languages that rely more heavily on compounding than on derivation to build their lexical stock. A number of topics have dominated the theoretical literature on derivation, including productivity (the extent to which new words can be created with a given affix or morphological process), the principles that determine the ordering of affixes, and the place of derivational morphology with respect to other components of the grammar. The study of derivation has also been important in a number of psycholinguistic debates concerning the perception and production of language.

Article

The noun-modifying clause construction (NMCC) in Japanese is a complex noun phrase in which a prenominal clause is dependent on the head noun. Naturally occurring instances of the construction demonstrate that a single structure, schematized as [[… predicate (finite/adnominal)] Noun], represents a wide range of semantic relations between the head noun and the dependent clause, encompassing some that would be expressed by structurally distinct constructions such as relative clauses, noun complement clauses, and other types of complex noun phrases in other languages, such as English. In that way, the Japanese NMCC demonstrates a clear case of the general noun-modifying construction (GNMCC), that is, an NMCC that has structural uniformity across interpretations that extend beyond the range of relative clauses. One of the notable properties of the Japanese NMCC is that the modifying clause may consist only of the predicate, reflecting the fact that referential density is moderate in Japanese—arguments of a predicate are not required to be overtly expressed either in the main clause or in the modifying clause. Another property of the Japanese NMCC is that there is no explicit marking in the construction that indicates the grammatical or semantic relation between the head noun and the modifying clause. The two major constituents are simply juxtaposed to each other. Successful construal of the intended interpretations of instances of such a construction, in the absence of explicit markings, likely relies on an aggregate of structural, semantic, and pragmatic factors, including the semantic content of the linguistic elements, verb valence information, and the interpreter’s real-world knowledge, in addition to the basic structural information. Researchers with different theoretical approaches have studied Japanese NMCCs or subsets thereof. Syntactic approaches, inspired by generative grammar, have focused mostly on relative clauses and aimed to identify universally recognized syntactic principles. Studies that take the descriptive approach have focused on detailed descriptions and the classification of a wide spectrum of naturally occurring instances of the construction in Japanese. The third and most recent group of studies has emphasized the importance of semantics and pragmatics in accounting for a wide variety of naturally occurring instances. The examination of Japanese NMCCs provides information about the nature of clausal noun modification and affords insights into languages beyond Japanese, as similar phenomena have reportedly been observed crosslinguistically to varying degrees.