1–20 of 23 Results for: Linguistic Theories

Article

Malka Rappaport Hovav

Words are sensitive to syntactic context. Argument realization is the study of the relation between argument-taking words, the syntactic contexts they appear in, and the interpretive properties that constrain the relation between them.

Article

Bracketing paradoxes—constructions whose morphosyntactic and morpho-phonological structures appear to be irreconcilably at odds (e.g., unhappier)—are unanimously taken to point to truths about the derivational system that we have not yet grasped. Consider that the prefix un- must be structurally separate in some way from happier both for its own reasons (its [n] surprisingly does not assimilate in Place to a following consonant (e.g., u[n]popular)), and for reasons external to the prefix (the suffix -er must be insensitive to the presence of un-, as the comparative cannot attach to bases of three syllables or longer (e.g., *intelligenter)). But, un- must simultaneously be present in the derivation before -er is merged, so that unhappier can have the proper semantic reading (‘more unhappy’, and not ‘not happier’). Bracketing paradoxes emerged as a problem for generative accounts of both morphosyntax and morphophonology only in the 1970s. With the rise of restrictions on and technology used to describe and represent the behavior of affixes (e.g., the Affix-Ordering Generalization, Lexical Phonology and Morphology, the Prosodic Hierarchy), morphosyntacticians and phonologists were confronted with this type of inconsistent derivation in many unrelated languages.

Article

Cognitive semantics (CS) is an approach to the study of linguistic meaning. It is based on the assumption that the human linguistic capacity is part of our cognitive abilities, and that language in general and meaning in particular can therefore be better understood by taking into account the cognitive mechanisms that control the conceptual and perceptual processing of extra-linguistic reality. Issues central to CS are (a) the notion of prototype and its role in the description of language, (b) the nature of linguistic meaning, and (c) the functioning of different types of semantic relations. The question concerning the nature of meaning is an issue that is particularly controversial between CS on the one hand and structuralist and generative approaches on the other hand: is linguistic meaning conceptual, that is, part of our encyclopedic knowledge (as is claimed by CS), or is it autonomous, that is, based on abstract and language-specific features? According to CS, the most important types of semantic relations are metaphor, metonymy, and different kinds of taxonomic relations, which, in turn, can be further broken down into more basic associative relations such as similarity, contiguity, and contrast. These play a central role not only in polysemy and word formation, that is, in the lexicon, but also in the grammar.

Article

Connectionism is an important theoretical framework for the study of human cognition and behavior. Also known as Parallel Distributed Processing (PDP) or Artificial Neural Networks (ANN), connectionism advocates that learning, representation, and processing of information in the mind are parallel, distributed, and interactive in nature. It argues for the emergence of human cognition as the outcome of large networks of interactive processing units operating simultaneously. Inspired by findings from neural science and artificial intelligence, connectionism is a powerful computational tool, and it has had a profound impact on many areas of research, including linguistics. Since the beginning of connectionism, many connectionist models have been developed to account for a wide range of important linguistic phenomena observed in monolingual research, such as speech perception, speech production, semantic representation, and early lexical development in children. Recently, the application of connectionism to bilingual research has also gathered momentum. Connectionist models are often precise in the specification of modeling parameters and flexible in the manipulation of relevant variables in the model to address relevant theoretical questions; therefore, they can provide significant advantages in testing mechanisms underlying language processes.

Article

Denominal verbs are verbs formed from nouns by means of various word-formation processes such as derivation, conversion, or less common mechanisms like reduplication, change of pitch, or root and pattern. Because their well-formedness is determined by morphosyntactic, phonological, and semantic constraints, they have been analyzed from a variety of lexicalist and non-lexicalist perspectives, including Optimality Theory, Lexical Semantics, Cognitive Grammar, Onomasiology, and Neo-Construction Grammar. Independently of their structural shape, denominal verbs have in common that they denote events in which the referents of their base nouns (e.g., computer in the case of computerize) participate in a non-arbitrary way. While traditional labels like ‘ornative’, ‘privative’, ‘locative’, ‘instrumental’ and the like allow for a preliminary classification of denominal verbs, a more formal description has to account for at least three basic aspects, namely (1) competition among functionally similar word-formation patterns, (2) the polysemy of affixes, which precludes a neat one-to-one relation between derivatives displaying a particular affix and a particular semantic class, and (3) the relevance of generic knowledge and contextual information for the interpretation of (innovative) denominal verbs.

Article

Klaus Abels

Displacement is a ubiquitous phenomenon in natural languages. Grammarians often speak of displacement in cases where the rules for the canonical word order of a language lead to the expectation of finding a word or phrase in a particular position in the sentence whereas it surfaces instead in a different position and the canonical position remains empty: ‘Which book did you buy?’ is an example of displacement because the noun phrase ‘which book’, which acts as the grammatical object in the question, does not occur in the canonical object position, which in English is after the verb. Instead, it surfaces at the beginning of the sentence and the object position remains empty. Displacement is often used as a diagnostic for constituent structure because it affects only (but not all) constituents. In the clear cases, displaced constituents show properties associated with two distinct linear and hierarchical positions. Typically, one of these two positions c-commands the other and the displaced element is pronounced in the c-commanding position. Displacement also shows strong interactions with the path between the empty canonical position and the position where the element is pronounced: one often encounters morphological changes along this path and evidence for structural placement of the displaced constituent, as well as constraints on displacement induced by the path. The exact scope of displacement as an analytically unified phenomenon varies from theory to theory. If more than one type of syntactic displacement is recognized, the question of the interaction between movement types arises. Displacement phenomena are extensively studied by syntacticians. Their enduring interest derives from the fact that the complex interactions between displacement and other aspects of syntax offer a powerful probe into the inner workings and architecture of the human syntactic faculty.

Article

Focus is key to understanding processes of syntactic and prosodic readjustments in the Romance languages. Since, prosodically, it must be the most prominent constituent in the sentence, focus associates with the nuclear pitch accent, which may be shifted from its default rightmost position when the syntactic position of the focus also changes. The application of specific syntactic operations depends both on the size and on the subtype of focus, although not always unambiguously. Subject inversion characterizes focus structures where the domain of focus covers either the whole sentence (broad-focus) or a single constituent (narrow-focus). Presentational constructions distinctively mark broad focus, avoiding potential ambiguity with an SVO structure where the predicate is the focus and the subject is interpreted as topic. In narrow-focus structures, the focus constituent typically occurs sentence-final (postverbal focalization), but it may also be fronted (focus fronting), depending on the specific interpretation associated with the focus. Semantically, focus indicates the presence of alternatives, and the different interpretations arise from the way the set of alternatives is pragmatically exploited, giving rise to a contextually open set (information focus), to contrast or correction (contrastive or corrective focus), or to surprise or unexpectedness (mirative focus). Whether a subtype of focus may undergo fronting in a Romance language is subject to variation. In most varieties it is indeed possible with contrastive or corrective focus, but it has been shown that focus fronting is also acceptable with noncontrastive focus in several languages, especially with mirative focus. Finally, certain focus-sensitive operators or particles directly interact with the narrow-focus constituent of the sentence and their association with focus has semantic effects on the interpretation of the sentence.

Article

Olaf Koeneman and Hedde Zeijlstra

The relation between the morphological form of a pronoun and its semantic function is not always transparent, and syncretism abounds in natural languages. In a language like English, for instance, three types of indefinite pronouns can be identified, often grouped in series: the some-series, the any-series, and the no-series. However, this does not mean that there are also three semantic functions for indefinite pronouns. Haspelmath (1997), in fact, distinguishes nine functions. Closer inspection shows that these nine functions must be reduced to four main functions of indefinites, each with a number of subfunctions: (i) Negative Polarity Items; (ii) Free-Choice Items; (iii) negative indefinites; and (iv) positive or existential indefinites. These functions and subfunctions can be morphologically realized differently across languages, but need not be. In English, functions (i) and (ii), unlike (iii) and (iv), may morphologically group together, both expressed by the any-series. Where morphological correspondences between the kinds of functions that indefinites may express call for a classification, such classifications turn out to be semantically well motivated too. Similar observations can be made for definite pronouns, where it turns out that various functions, such as the first person inclusive/exclusive distinction or dual number, are sometimes, but not always, morphologically distinguished, showing that these may be subfunctions of higher, more general functions. The question as to how to demarcate the landscape of indefinite and definite pronouns thus does not depend on semantic differences alone: Morphological differences are at least as telling. The interplay between morphological and semantic properties can provide serious answers to how to define indefinites and the various forms and functions that these may take on.

Article

Ariel Cohen

Generics are sentences such as Birds fly, which express generalizations. They are prevalent in speech, and as far as is known, no human language lacks generics. Yet, it is very far from clear what they mean. After all, not all birds fly—penguins don’t! There are two general views about the meaning of generics in the literature, and each view encompasses many specific theories. According to the inductivist view, a generic states that a sufficient number of individuals satisfy a certain property—in the example above, it says that sufficiently many birds fly. This view faces the complicated problem of spelling out exactly how many is “sufficiently many” in a way that correctly captures the intuitive truth conditions of generics. An alternative, the rules and regulations view, despairs of this project and proposes instead that generics directly express rules in the world. Rules are taken to be abstract objects, which are not related to the properties of specific individuals. This view faces the difficult problem of explaining how people come to know of such rules when judging the truth or falsity of generics, and accounting for the strong intuition that a sentence such as Birds fly talks about birds, not abstract objects. What seems to be beyond dispute is that generics, even if they do not express rules, are lawlike: they state non-accidental generalizations. Many scholars have taken this fact to indicate that generics are parametric on possible worlds: they refer to worlds other than the actual world. This, again, raises the problem of how people come to know about what happens in these other worlds. However, a rigorous application of standard tests for intensionality shows that generics are not, in fact, parametric on possible worlds, but only on time. This unusual property may explain much of the mystery surrounding generics.
Another mysterious property of generics is that although there is no language without them, there is no linguistic construction that is devoted to the expression of genericity. Rather, generics can be expressed in a variety of ways, each of which can also express nongenerics. Yet, each manifestation of generics differs subtly (or sometimes not so subtly) in its meaning from the others. Even when these and other puzzles of genericity are solved, one mystery would remain: Why are generics, which are so easy to produce and understand in conversation, so difficult to analyze?

Article

M. Teresa Espinal and Jaume Mateu

Idioms, conceived as fixed multi-word expressions that conceptually encode non-compositional meaning, are linguistic units that raise a number of questions relevant in the study of language and mind (e.g., whether they are stored in the lexicon or in memory, whether they have internal or external syntax similar to other expressions of the language, whether their conventional use is parallel to their non-compositional meaning, whether they are processed in similar ways to regular compositional expressions of the language, etc.). Idioms show similarities and differences with other sorts of formulaic expressions; the linguistic literature has characterized the main types of idioms and the dimensions on which idiomaticity lies. Syntactically, idioms manifest a set of syntactic properties, as well as a number of constraints that account for their internal and external structure. Semantically, idioms present an interesting behavior with respect to a set of semantic properties that account for their meaning (i.e., conventionality, compositionality, and transparency, as well as aspectuality, referentiality, thematic roles, etc.). The study of idioms has been approached from lexicographic and computational, as well as from psycholinguistic and neurolinguistic perspectives.

Article

Laura A. Michaelis

Meanings are assembled in various ways in a construction-based grammar, and this array can be represented as a continuum of idiomaticity, a gradient of lexical fixity. Constructional meanings are the meanings to be discovered at every point along the idiomaticity continuum. At the leftmost, or ‘fixed,’ extreme of this continuum are frozen idioms, like the salt of the earth and in the know. The set of frozen idioms includes those with idiosyncratic syntactic properties, like the fixed expression by and large (an exceptional pattern of coordination in which a preposition and adjective are conjoined). Other frozen idioms, like the unexceptionable modified noun red herring, feature syntax found elsewhere. At the rightmost, or ‘open’ end of this continuum are fully productive patterns, including the rule that licenses the string Kim blinked, known as the Subject-Predicate construction. Between these two poles are (a) lexically fixed idiomatic expressions, verb-headed and otherwise, with regular inflection, such as chew/chews/chewed the fat; (b) flexible expressions with invariant lexical fillers, including phrasal idioms like spill the beans and the Correlative Conditional, such as the more, the merrier; and (c) specialized syntactic patterns without lexical fillers, like the Conjunctive Conditional (e.g., One more remark like that and you’re out of here). Construction Grammar represents this range of expressions in a uniform way: whether phrasal or lexical, all are modeled as feature structures that specify phonological and morphological structure, meaning, use conditions, and relevant syntactic information (including syntactic category and combinatoric potential).

Article

Due to the agglutinative character of these languages, Japanese and Ryukyuan morphology is predominantly concatenative, encompassing garden-variety word-formation processes such as compounding, prefixation, suffixation, and inflection, though nonconcatenative morphology like clipping, blending, and reduplication is also available and sometimes interacts with concatenative word formation. The formal simplicity of the principal morphological devices is counterbalanced by their complex interaction with syntax and semantics as well as by the intricate interactions of four lexical strata (native, Sino-Japanese, foreign, and mimetic) with particular morphological processes. A wealth of phenomena is adduced that pertain to central issues in theories of morphology, such as the demarcation between words and phrases; the feasibility of the lexical integrity principle; the controversy over lexicalism and syntacticism; the distinction of morpheme-based and word-based morphology; the effects of the stage-level vs. individual-level distinction on the applicability of morphological rules; the interface of morphology, syntax, semantics, and pragmatics; and the role of conjugation and inflection in predicate agglutination. In particular, the formation of compound and complex verbs/adjectives takes place in both lexical and syntactic structures, and the compound and complex predicates thus formed are further followed in syntax by suffixal predicates representing grammatical categories like causative, passive, negation, and politeness as well as inflections of tense and mood to form a long chain of predicate complexes. In addition, an array of morphological objects—bound root, word, clitic, nonindependent word or fuzoku-go, and (for Japanese) word plus—participate productively in word formation.
The close association of morphology and syntax in Japonic languages thus demonstrates that morphological processes are spread over lexical and syntactic structures, whereas words are equipped with the distinct property of morphological integrity, which distinguishes them from syntactic phrases.

Article

The noun-modifying clause construction (NMCC) in Japanese is a complex noun phrase in which a prenominal clause is dependent on the head noun. Naturally occurring instances of the construction demonstrate that a single structure, schematized as [[… predicate (finite/adnominal)] Noun], represents a wide range of semantic relations between the head noun and the dependent clause, encompassing some that would be expressed by structurally distinct constructions such as relative clauses, noun complement clauses, and other types of complex noun phrases in other languages, such as English. In that way, the Japanese NMCC demonstrates a clear case of the general noun-modifying construction (GNMCC), that is, an NMCC that has structural uniformity across interpretations that extend beyond the range of relative clauses. One of the notable properties of the Japanese NMCC is that the modifying clause may consist only of the predicate, reflecting the fact that referential density is moderate in Japanese—arguments of a predicate are not required to be overtly expressed either in the main clause or in the modifying clause. Another property of the Japanese NMCC is that there is no explicit marking in the construction that indicates the grammatical or semantic relation between the head noun and the modifying clause. The two major constituents are simply juxtaposed to each other. Successful construal of the intended interpretations of instances of such a construction, in the absence of explicit markings, likely relies on an aggregate of structural, semantic, and pragmatic factors, including the semantic content of the linguistic elements, verb valence information, and the interpreter’s real-world knowledge, in addition to the basic structural information. Researchers with different theoretical approaches have studied Japanese NMCCs or subsets thereof. 
Syntactic approaches, inspired by generative grammar, have focused mostly on relative clauses and aimed to identify universally recognized syntactic principles. Studies that take the descriptive approach have focused on detailed descriptions and the classification of a wide spectrum of naturally occurring instances of the construction in Japanese. The third and most recent group of studies has emphasized the importance of semantics and pragmatics in accounting for a wide variety of naturally occurring instances. The examination of Japanese NMCCs provides information about the nature of clausal noun modification and affords insights into languages beyond Japanese, as similar phenomena have reportedly been observed crosslinguistically to varying degrees.

Article

Numerical expressions are linguistic forms related to numbers or quantities, which directly reflect the relationship between linguistic symbols and mathematical cognition. Featuring some unique properties, numeral systems are somewhat distinguished from other language subsystems. For instance, numerals can appear in various grammatical positions, including adjective positions, determiner positions, and argument positions. Thus, linguistic research on numeral systems, especially the research on the syntax and semantics of numerical expressions, has been a popular and recurrent topic. For the syntax of complex numerals, two analyses have been proposed in the literature. The traditional constituency analysis maintains that complex numerals are phrasal constituents, which has been widely accepted and defended as a null hypothesis. The nonconstituency analysis, by contrast, claims that a complex numeral projects a complementative structure in which a numeral is a nominal head selecting a lexical noun or a numeral-noun combination as its complement. On this analysis, additive numerals are derived from full NP coordination. Whether numerals denote numbers or sets has been the subject of a long-running debate. The number-denoting view assumes that numerals refer to numbers, which are abstract objects, grammatically equivalent to nouns. The primary issue with this analysis comes from the introduction of a new entity, numbers, into the model of ontology. The set-denoting view argues that numerals refer to sets, which are equivalent to adjectives or quantifiers in grammar. One main difficulty of this view is how to account for numerals in arithmetic sentences.

Article

Jesús Fernández-Domínguez

The onomasiological approach is a theoretical framework that emphasizes the cognitive-semantic component of language and the primacy of extra-linguistic reality in the process of naming. With a tangible background in the functional perspective of the Prague School of Linguistics, this approach believes that name giving is essentially governed by the needs of language users, and hence assigns a subordinate role to the traditional levels of linguistic description. This stance characterizes the onomasiological framework in opposition to other theories of language, especially generativism, which first tackle the form of linguistic material and then move on to meaning. The late 20th and early 21st centuries have witnessed the emergence of several cognitive-onomasiological models, all of which share an extensive use of semantic categories as working units and a particular interest in the area of word-formation. Despite a number of divergences, such proposals all confront mainstream morphological research by heavily revising conventional concepts and introducing model-specific terminology regarding, for instance, the independent character of the lexicon, the (non-)regularity of word-formation processes, or their understanding of morphological productivity. The models adhering to such a view of language have earned a pivotal position as an alternative to dominant theories of word-formation.

Article

The category of Personal/Participant/Inhabitant derived nouns comprises a conglomeration of derived nouns that denote, among others, agents, instruments, patients/themes, inhabitants, and followers of a person. Based on the thematic relations between the derived noun and its base lexeme, Personal/Participant/Inhabitant nouns can be classified into two subclasses. The first subclass comprises derived nouns that are deverbal and carry thematic readings (e.g., driver). The second subclass consists of derived nouns with athematic readings (e.g., Marxist). The examination of the category of Personal/Participant/Inhabitant nouns allows one to delve deeply into the study of multiplicity of meaning in word formation and the factors that bear on the readings of derived words. These factors range from the historical mechanisms that lead to multiplicity of meaning and the lexical-semantic properties of the bases that derived nouns are based on, to the syntactic contexts in which derived nouns occur, and the pragmatic-encyclopedic facets of both the base and the derived lexeme.

Article

This paper provides an overview of polarity phenomena in human languages. There are three prominent paradigms of polarity items: negative polarity items (NPIs), positive polarity items (PPIs), and free choice items (FCIs). What they all have in common is that they have limited distribution: they cannot occur just anywhere, but only inside the scope of a licenser, which is negation or, more broadly, a nonveridical licenser. PPIs, conversely, must appear outside the scope of negation. The need to be in the scope of a licenser creates a semantic and syntactic dependency, as the polarity item must be c-commanded by the licenser at some syntactic level. Polarity, therefore, is a true interface phenomenon and raises the question of well-formedness that depends on both semantics and syntax. Nonveridical polarity contexts can be negative, but also non-monotonic such as modal contexts, questions, other non-assertive contexts (imperatives, subjunctives), generic and habitual sentences, and disjunction. Some NPIs and FCIs appear freely in these contexts in many languages, and some NPIs prefer negative contexts. Within negative licensers, we make a distinction between classically and minimally negative contexts. There are no NPIs that appear only in minimally negative contexts. The distributions of NPIs and FCIs crosslinguistically can be understood in terms of general patterns, and there are individual differences due largely to the lexical semantic content of the polarity item paradigms. Three general patterns can be identified as possible lexical sources of polarity. The first is the presence of a dependent variable in the polarity item—a property characterizing NPIs and FCIs in many languages, including Greek, Mandarin, and Korean. The second is that the polarity item may be scalar: English any and FCIs can be scalar, but Greek, Korean, and Mandarin NPIs are not.
Finally, it has been proposed that NPIs can be exhaustive, but exhaustivity is hard to precisely identify in a non-stipulative way, and does not characterize all NPIs. NPIs that are not exhaustive tend to be referentially vague, which means that the speaker uses them only if she is unable to identify a specific referent for them.

Article

Agustín Vicente and Ingrid L. Falkum

Polysemy is characterized as the phenomenon whereby a single word form is associated with two or several related senses. It is distinguished from monosemy, where one word form is associated with a single meaning, and homonymy, where a single word form is associated with two or several unrelated meanings. Although the distinctions between polysemy, monosemy, and homonymy may seem clear at an intuitive level, they have proven difficult to draw in practice. Polysemy proliferates in natural language: Virtually every word is polysemous to some extent. Still, the phenomenon has been largely ignored in the mainstream linguistics literature and in related disciplines such as philosophy of language. However, polysemy is a topic of relevance to linguistic and philosophical debates regarding lexical meaning representation, compositional semantics, and the semantics–pragmatics divide. Early accounts treated polysemy in terms of sense enumeration: each sense of a polysemous expression is represented individually in the lexicon, such that polysemy and homonymy were treated on a par. This approach has been strongly criticized on both theoretical and empirical grounds. Since at least the 1990s, most researchers have converged on the hypothesis that the senses of at least many polysemous expressions derive from a single meaning representation, though the status of this representation is a matter of lively debate: Are the lexical representations of polysemous expressions informationally poor and underspecified with respect to their different senses? Or do they have to be informationally rich in order to store and be able to generate all these polysemous senses?
Alternatively, senses might be computed from a literal, primary meaning via semantic or pragmatic mechanisms such as coercion, modulation or ad hoc concept construction (including metaphorical and metonymic extension), mechanisms that apparently play a role also in explaining how polysemy arises and is implicated in lexical semantic change.

Article

Floris Roelofsen

This survey article discusses two basic issues that semantic theories of questions face. The first is how to conceptualize and formally represent the semantic content of questions. This issue arises in particular because the standard truth-conditional notion of meaning, which has been fruitful in the analysis of declarative statements, is not applicable to questions. This is because questions are not naturally construed as being true or false. Instead, it has been proposed that the semantic content of a question must be characterized in terms of its answerhood or resolution conditions. This article surveys a number of theories which develop this basic idea in different ways, focusing on so-called proposition-set theories (alternative semantics, partition semantics, and inquisitive semantics). The second issue that will be considered here concerns questions that are embedded within larger sentences. Within this domain, one important puzzle is why certain predicates can take both declarative and interrogative complements (e.g., Bill knows that Mary called / Bill knows who called), while others take only declarative complements (e.g., Bill thinks that Mary called / *Bill thinks who called) or only interrogative complements (e.g., Bill wonders who called / *Bill wonders that Mary called). We compare two general approaches that have been pursued in the literature. One assumes that declarative and interrogative complements differ in semantic type. On this approach, the fact that predicates like think do not take interrogative complements can be accounted for by assuming that such complements do not have the semantic type that think selects for. The other approach treats the two kinds of complement as having the same semantic type, and seeks to connect the selectional restrictions of predicates like think to other semantic properties (e.g., the fact that think is neg-raising).

Article

Miguel Casas Gómez and Martin Hummel

Structural semantics is a primarily European structural linguistic approach to the content level of language which basically derives from two historical sources. The main inspiration stems from Ferdinand de Saussure’s Cours de linguistique générale (1916), where the Genevan linguist also formulates the fundamental principles of semantic analysis: the twofold character of the linguistic sign, the inner determination of its content by the—allegedly autonomous—linguistic system, the consequent exclusion of the extralinguistic reality, the notion of opposition inside the system, and the concept of “associative relations” in the domain of semantics. This tradition was later refined by Hjelmslev and Coseriu, who introduced theoretical and methodological strength and rigor, suggesting systematic analyses in terms of semantic features linked by (binary) opposition. The second source of inspiration was the more holistic concept elaborated by Wilhelm von Humboldt, who saw language as a means of structuring the world. In the second half of the 20th century, structural semantics was mainstream semantics (to the extent that semantic analysis was accepted at all). A long series of authors deepened these historical traditions in theoretical and empirical studies, some of them suggesting secondary and/or partial models. Finally, prototype semantics and cognitive semantics strove to downgrade structural semantics by turning back to a more holistic conception of meaning including the speakers’ knowledge of the world, although not without introducing the alternative structural notion of “network.”