1-8 of 8 Results

  • Keywords: metaphor

Article

Myrto Grigoroglou and Anna Papafragou

To become competent communicators, children need to learn that what a speaker means often goes beyond the literal meaning of what the speaker says. The acquisition of pragmatics, as a field, studies how children learn to bridge the gap between the semantic meaning of words and structures and the intended meaning of an utterance. Of interest is whether young children are capable of reasoning about others’ intentions and how this ability develops over time. For a long period, estimates of children’s pragmatic sophistication were mostly pessimistic: early work on a number of phenomena showed that very young communicators were egocentric, oblivious to other interlocutors’ intentions, and overall insensitive to subtle pragmatic aspects of interpretation. Recent years have seen major shifts in the study of children’s pragmatic development. Novel methods and more fine-grained theoretical approaches have led to a reconsideration of older findings on how children acquire pragmatics across a number of phenomena and have produced a wealth of new evidence and theories. Three areas that have generated a considerable body of developmental work on pragmatics are reference (the relation between words or phrases and entities in the world), implicature (a type of inferred meaning that arises when a speaker violates conversational rules), and metaphor (a case of figurative language). Findings from these three domains suggest that children actively use pragmatic reasoning to delimit potential referents for newly encountered words, can take into account the perspective of a communicative partner, and are sensitive to some aspects of implicated and metaphorical meaning. Nevertheless, children’s success with pragmatic communication remains fragile and task-dependent.

Article

Elizabeth Closs Traugott

Traditional approaches to semantic change typically focus on outcomes of meaning change and list types of change such as metaphoric and metonymic extension, broadening and narrowing, and the development of positive and negative meanings. Examples are usually considered out of context and are lexical members of nominal and adjectival word classes. However, language is a communicative activity that is highly dependent on context, whether that of the ongoing discourse or of social and ideological changes. Much recent work on semantic change has focused not on the results of change but on the pragmatic factors enabling change in the flow of speech. Attention has been paid to the contributions of cognitive processes, such as analogical thinking, production of cues as to how a message is to be interpreted, and perception or interpretation of meaning, especially in grammaticalization. Mechanisms of change such as metaphorization, metonymization, and subjectification have been among the topics of special interest and debate. The work has been enabled by the fine-grained approach to contextual data that electronic corpora allow.

Article

Dirk Geeraerts

Lexical semantics is the study of word meaning. Descriptively speaking, the main topics studied within lexical semantics involve either the internal semantic structure of words or the semantic relations that occur within the vocabulary. Within the first set, major phenomena include polysemy (in contrast with vagueness), metonymy, metaphor, and prototypicality. Within the second set, dominant topics include lexical fields, lexical relations, conceptual metaphor and metonymy, and frames. Theoretically speaking, the main theoretical approaches that have succeeded each other in the history of lexical semantics are prestructuralist historical semantics, structuralist semantics, and cognitive semantics. These theoretical frameworks differ as to whether they take a system-oriented or a usage-oriented approach to word-meaning research but, at the same time, in the historical development of the discipline, they have each contributed significantly to the descriptive and conceptual apparatus of lexical semantics.

Article

M. Teresa Espinal and Jaume Mateu

Idioms, conceived as fixed multi-word expressions that conceptually encode non-compositional meaning, are linguistic units that raise a number of questions relevant to the study of language and mind (e.g., whether they are stored in the lexicon or in memory, whether they have internal or external syntax similar to other expressions of the language, whether their conventional use is parallel to their non-compositional meaning, whether they are processed in similar ways to regular compositional expressions of the language, etc.). Idioms show similarities with, and differences from, other sorts of formulaic expressions; the linguistic literature has characterized the main types of idioms and the dimensions along which idiomaticity lies. Syntactically, idioms manifest a set of characteristic properties, as well as a number of constraints that account for their internal and external structure. Semantically, idioms behave in interesting ways with respect to a set of properties that account for their meaning (i.e., conventionality, compositionality, and transparency, as well as aspectuality, referentiality, thematic roles, etc.). The study of idioms has been approached from lexicographic and computational as well as psycholinguistic and neurolinguistic perspectives.

Article

Cognitive linguistics and morphology bear the promise of a happy marriage. Cognitive linguistics provides theoretical concepts and analytical tools for empirical analysis, while morphology offers fertile ground for testing hypotheses and refining core concepts. It is no wonder, then, that numerous contributions to the field of morphology have been couched in cognitive linguistics, and that morphological phenomena have figured prominently in cognitive linguistics. Cognitive linguistics is a family of closely related frameworks that share the idea that language should be analyzed in terms of what is known about the mind and brain from disciplines other than linguistics. Cognitive linguistics furthermore adopts a semiotic perspective, claiming that the raison d’être of language is to convey meaning. Another central tenet is the usage-based approach, the idea that grammar emerges through usage, which implies a strong focus on language use in cognitive linguistics. An example of how cognitive linguistics relates morphology to general principles of cognition is the application of general principles of categorization to morphology. Morphological categories are analyzed as radial categories, that is, networks structured around a prototype. Such category networks can consist of the allomorphs of a morpheme or be used to model theoretical concepts such as paradigm and inflection class. The radial category is also instrumental in analyzing the meaning of morphological concepts. Rather than assuming abstract invariant meanings for morphemes, cognitive linguistics analyzes the meaning of morphological phenomena through networks of interrelated meanings. The relationships among the nodes in a category network are analyzed in terms of general cognitive processes, such as metaphor, metonymy, and blending. The usage-based approach of cognitive linguistics manifests itself in the strong focus on frequency effects in morphology. It is argued that frequency is an important structuring principle in cognition, and that frequent forms have a privileged status in a morphological paradigm.

Article

Deirdre Wilson

Relevance theory is a cognitive approach to pragmatics which starts from two broadly Gricean assumptions: (a) that much human communication, both verbal and non-verbal, involves the overt expression and inferential recognition of intentions, and (b) that in inferring these intentions, the addressee presumes that the communicator’s behavior will meet certain standards, which for Grice are based on a Cooperative Principle and maxims, and for relevance theory are derived from the assumption that, as a result of constant selection pressures in the course of human evolution, both cognition and communication are relevance-oriented. Relevance is defined in terms of cognitive (or contextual) effects and processing effort: other things being equal, the greater the cognitive effects and the smaller the processing effort, the greater the relevance. A long-standing aim of relevance theory has been to show that building an adequate theory of communication involves going beyond Grice’s notion of speaker’s meaning. Another is to provide a conceptually unified account of how a much broader variety of communicative acts than Grice was concerned with—including cases of both showing that and telling that—are understood. The resulting pragmatic theory differs from Grice’s in several respects. It sees explicit communication as much richer and more inferential than Grice thought, with encoded sentence meanings providing no more than clues to the speaker’s intentions. It rejects the close link that Grice saw between implicit communication and (real or apparent) maxim violation, showing in particular how figurative utterances might arise naturally and spontaneously in the course of communication. It offers an account of vagueness or indeterminacy in communication, which is often abstracted away from in more formally oriented frameworks. It investigates the role of context in comprehension, and shows how tentative hypotheses about the intended combination of explicit content, contextual assumptions, and implicatures might be refined and mutually adjusted in the course of the comprehension process in order to satisfy expectations of relevance. Relevance theory treats the borderline between semantics and pragmatics as co-extensive with the borderline between (linguistic) decoding and (pragmatic) inference. It sees encoded sentence meanings as typically fragmentary and incomplete, and as having to undergo inferential enrichment or elaboration in order to yield fully propositional forms. It reanalyzes Grice’s conventional implicatures—which he saw as semantic but non-truth-conditional aspects of the meaning of words like but and so—as encoding procedural information with dedicated pragmatic or more broadly cognitive functions, and extends the notion of procedural meaning to a range of further items such as pronouns, discourse particles, mood indicators, and affective intonation.

Article

Sándor Martsa

Conversion is traditionally viewed as a word-formation process whereby a word is formed from a formally identical but categorially different word without the addition of a(n explicit) morphological exponent. Despite its apparent formal simplicity, manifested above all in the sameness of input and output, properly understanding what exactly happens during conversion, morphosyntactically and semantically alike, is by no means easy even for a single language, let alone for languages representing different typological groups or subgroups. Determining the linguistic status of conversion and its place among other types of word formation is not simple either, and, paradoxically, this is especially so for English conversion, the most extensively studied case. The reason is that the traditional view of conversion has often been called into question, giving rise to a diversity of interpretations of conversion not only in English but also from a cross-linguistic perspective. Conversion research has gone a long way toward exploring the mechanism of conversion as a kind of word formation; nevertheless, further research is necessary to understand every detail of this mechanism.

Article

Agustín Vicente and Ingrid L. Falkum

Polysemy is characterized as the phenomenon whereby a single word form is associated with two or several related senses. It is distinguished from monosemy, where one word form is associated with a single meaning, and from homonymy, where a single word form is associated with two or several unrelated meanings. Although the distinctions between polysemy, monosemy, and homonymy may seem clear at an intuitive level, they have proven difficult to draw in practice. Polysemy proliferates in natural language: Virtually every word is polysemous to some extent. Still, the phenomenon has been largely ignored in the mainstream linguistics literature and in related disciplines such as the philosophy of language. However, polysemy is a topic of relevance to linguistic and philosophical debates regarding lexical meaning representation, compositional semantics, and the semantics–pragmatics divide. Early accounts treated polysemy in terms of sense enumeration: each sense of a polysemous expression was represented individually in the lexicon, such that polysemy and homonymy were treated on a par. This approach has been strongly criticized on both theoretical and empirical grounds. Since at least the 1990s, most researchers have converged on the hypothesis that the senses of at least many polysemous expressions derive from a single meaning representation, though the status of this representation is a matter of lively debate: Are the lexical representations of polysemous expressions informationally poor and underspecified with respect to their different senses? Or do they have to be informationally rich in order to store, and be able to generate, all these polysemous senses? Alternatively, senses might be computed from a literal, primary meaning via semantic or pragmatic mechanisms such as coercion, modulation, or ad hoc concept construction (including metaphorical and metonymic extension), mechanisms that also appear to play a role in explaining how polysemy arises and how it is implicated in lexical semantic change.