The Penutian language family, Penutian phylum, or better still, Penutian hypothesis is one of the largest genealogical linguistic groupings to have been proposed for western North America. It involves 16 families or isolates. Only a few of these families are demonstrably relatable to one another according to current knowledge and diachronic techniques. Observers sometimes split Penutian into groups of languages assumed to be interrelated, without assuming that these groups are themselves related to one another.
This article focuses on the Canadian and US languages in “Sapir’s Penutian,” the most commonly accepted version; the most southerly family within Penutian is thus taken to be Yokutsan of California’s Sierra Nevada. It discusses the subclassification of the so-called Penutian languages into families and smaller units; aspects of their phonology, morphosyntax, and contact histories; and issues in their revitalization and the potential reconstruction of Proto-Penutian.
Article
Penutian Languages
Anthony P. Grant
Article
Personal/Participant/Inhabitant in Morphology
Marios Andreou
The category of Personal/Participant/Inhabitant derived nouns comprises a conglomeration of derived nouns that denote, among others, agents, instruments, patients/themes, inhabitants, and followers of a person. Based on the thematic relations between the derived noun and its base lexeme, Personal/Participant/Inhabitant nouns can be classified into two subclasses. The first subclass comprises derived nouns that are deverbal and carry thematic readings (e.g., driver). The second subclass consists of derived nouns with athematic readings (e.g., Marxist).
The examination of the category of Personal/Participant/Inhabitant nouns allows one to delve deeply into the study of multiplicity of meaning in word formation and the factors that bear on the readings of derived words. These factors range from the historical mechanisms that lead to multiplicity of meaning and the lexical-semantic properties of the bases from which these nouns are derived, to the syntactic context in which derived nouns occur and the pragmatic-encyclopedic facets of both the base and the derived lexeme.
Article
The Phonology of Compounds
Irene Vogel
A number of recent developments in phonological theory, beginning with The Sound Pattern of English, are particularly relevant to the phonology of compounds. They address both the phonological phenomena that apply to compound words and the phonological structures that are required as the domains of these phenomena: segmental and nonsegmental phenomena that operate within each member of a compound separately, as well as at the juncture between the members of compounds and throughout compounds as a whole. In all cases, what is crucial for the operation of the phonological phenomena of compounds is phonological structure, in terms of constituents of the Prosodic Hierarchy, as opposed to morphosyntactic structure. Specifically, only two phonological constituents are required: the Phonological Word, which provides the domain for phenomena that apply to the individual members of compounds and at their junctures, and a larger constituent that groups the members of compounds together. The nature of the latter is somewhat controversial, the main issue being whether or not there is a constituent in the Prosodic Hierarchy between the Phonological Word and the Phonological Phrase. When present, this constituent, the Composite Group (revised from the original Clitic Group), includes the members of compounds, as well as “stray” elements such as clitics and “Level 2” affixes. In its absence, compounds, and often the same “stray” elements, are analyzed as a type of Recursive Phonological Word, although crucially, the combinations of such elements do not exhibit the same properties as the basic Phonological Word.
Article
The Playful Lexicon in the Romance Languages: Prosodic Templates, Onomatopoeia, Reduplication, Clipping, Blending
David Pharies
A lexical item is described as “playful” or “ludic” when it shows evidence of manipulation of the relation that inheres between its form (signifier) and its meaning (signified). The playful lexicon of any given language, therefore, is the sum total of its lexical items that show signs of such manipulation. Linguists have long recognized that the only necessary link between a word’s form and its meaning is the arbitrary social convention that binds them. However, nothing prevents speakers from creating additional, unnecessary and therefore essentially “playful” links, associating forms with meanings in a symbolic, hence non-arbitrary way. This semantic effect is most evident in the case of onomatopoeia, through which the phonetic form of words that designate sounds is designed to be conventionally imitative of the sound. A second group of playful words combines repeated sequences of sounds with meanings that are themselves suggestive of repetition or related concepts such as collectivity, continuity, or actions in sequence, as well as repeated, back-and-forth, or uncontrolled movements, or even, more abstractly, intensity and hesitation. The playfulness of truncated forms such as clips and blends is based on a still more abstract connection between forms and meanings. In the case of clipping, the truncation of the full form of a word triggers a corresponding connotative truncation or diminution of the meaning, that is, a suggestion that the referent is small—either endearingly, humorously, or contemptuously so. In blending, truncation is often accompanied by overlapping, which symbolically highlights the interrelatedness or juxtaposition of the constituents’ individual meanings. Prosodic templates do not constitute a separate category per se; instead, they may play a part in the formation or alteration of words in any of the other categories discussed here.
Article
Polarity in the Semantics of Natural Language
Anastasia Giannakidou
This paper provides an overview of polarity phenomena in human languages. There are three prominent paradigms of polarity items: negative polarity items (NPIs), positive polarity items (PPIs), and free choice items (FCIs). What they all have in common is that they have limited distribution: they cannot occur just anywhere, but only inside the scope of a licenser, which is negation or, more broadly, a nonveridical licenser; PPIs, conversely, must appear outside the scope of negation. The need to be in the scope of a licenser creates a semantic and syntactic dependency, as the polarity item must be c-commanded by the licenser at some syntactic level. Polarity, therefore, is a true interface phenomenon and raises the question of well-formedness that depends on both semantics and syntax.
Nonveridical polarity contexts can be negative, but also non-monotonic, such as modal contexts, questions, other non-assertive contexts (imperatives, subjunctives), generic and habitual sentences, and disjunction. Some NPIs and FCIs appear freely in these contexts in many languages, and some NPIs prefer negative contexts. Within negative licensers, we make a distinction between classically and minimally negative contexts. There are no NPIs that appear only in minimally negative contexts.
The distributions of NPIs and FCIs crosslinguistically can be understood in terms of general patterns, and there are individual differences due largely to the lexical semantic content of the polarity item paradigms. Three general patterns can be identified as possible lexical sources of polarity. The first is the presence of a dependent variable in the polarity item—a property characterizing NPIs and FCIs in many languages, including Greek, Mandarin, and Korean. Secondly, the polarity item may be scalar: English any and FCIs can be scalar, but Greek, Korean, and Mandarin NPIs are not. Finally, it has been proposed that NPIs can be exhaustive, but exhaustivity is hard to precisely identify in a non-stipulative way, and does not characterize all NPIs. NPIs that are not exhaustive tend to be referentially vague, which means that the speaker uses them only if she is unable to identify a specific referent for them.
Article
Polysemy
Agustín Vicente and Ingrid L. Falkum
Polysemy is characterized as the phenomenon whereby a single word form is associated with two or more related senses. It is distinguished from monosemy, where one word form is associated with a single meaning, and homonymy, where a single word form is associated with two or more unrelated meanings. Although the distinctions between polysemy, monosemy, and homonymy may seem clear at an intuitive level, they have proven difficult to draw in practice.
Polysemy proliferates in natural language: Virtually every word is polysemous to some extent. Still, the phenomenon has been largely ignored in the mainstream linguistics literature and in related disciplines such as philosophy of language. However, polysemy is a topic of relevance to linguistic and philosophical debates regarding lexical meaning representation, compositional semantics, and the semantics–pragmatics divide.
Early accounts treated polysemy in terms of sense enumeration: each sense of a polysemous expression was represented individually in the lexicon, such that polysemy and homonymy were treated on a par. This approach has been strongly criticized on both theoretical and empirical grounds. Since at least the 1990s, most researchers have converged on the hypothesis that the senses of at least many polysemous expressions derive from a single meaning representation, though the status of this representation is a matter of vivid debate: Are the lexical representations of polysemous expressions informationally poor and underspecified with respect to their different senses? Or do they have to be informationally rich in order to store and be able to generate all these polysemous senses?
Alternatively, senses might be computed from a literal, primary meaning via semantic or pragmatic mechanisms such as coercion, modulation, or ad hoc concept construction (including metaphorical and metonymic extension), mechanisms that apparently also play a role in explaining how polysemy arises and how it becomes implicated in lexical semantic change.
Article
Psycholinguistic Methods and Tasks in Morphology
Daniel Schmidtke and Victor Kuperman
Lexical representations in an individual mind are not given to direct scrutiny. Thus, in their theorizing of mental representations, researchers must rely on observable and measurable outcomes of language processing, that is, perception, production, storage, access, and retrieval of lexical information. Morphological research pursues these questions utilizing the full arsenal of analytical tools and experimental techniques that are at the disposal of psycholinguistics. This article outlines the most popular approaches, and aims to provide, for each technique, a brief overview of its procedure in experimental practice. Additionally, the article describes the link between the processing effect(s) that the tool can elicit and the representational phenomena that it may shed light on. The article discusses methods of morphological research in the two major human linguistic faculties—production and comprehension—and provides a separate treatment of spoken, written and sign language.
Article
Raciolinguistics
Jennifer Phuong, María Cioè-Peña, and Arianna Chinchilla
Raciolinguistics, or the study of language in relation to race, is an emergent field primarily stemming from U.S. academia and centering critical theories, including educational and applied linguistics. There currently exists a debate as to whether the theories that undergird raciolinguistics should be the grounding for a field (i.e., raciolinguistics) or a theoretical underpinning (i.e., raciolinguistic perspectives/ideologies) of applied linguistics and sociolinguistics, particularly works that are rooted in the embodied experiences of racialized people. H. Samy Alim, John R. Rickford, and Arnetha F. Ball edited a volume that brought together scholars whose work addresses the intersection of race and language to consider raciolinguistics as a field. Still others believe it is necessary to understand phenomena that go beyond named languages while remaining rooted in hierarchical conceptualizations of race. As such, Nelson Flores and Jonathan Rosa have introduced and continue to build on a raciolinguistic perspective by rooting contemporary phenomena in colonial histories. Using this lens, they position language evaluations and assessments of racialized people as extensions of colonial racial projects rooted in dehumanization and commodification. Since then, scholars from multiple fields have engaged with raciolinguistics and raciolinguistic ideologies to explore language and race using a variety of methods (e.g., discourse analysis, mixed methods), contexts (e.g., diverse places and participants), scales (e.g., policy, interpersonal interactions), and institutions (e.g., healthcare, education). Regardless of the specific framing, the field and the perspective both foreground racial and linguistic justice.
Article
Scrambling in Korean Syntax
Heejeong Ko
Scrambling is one of the most widely discussed and prominent factors affecting word order variation in Korean. Scrambling in Korean exhibits various syntactic and semantic properties that cannot be subsumed under standard A/A'-movement. Clause-external scrambling as well as clause-internal scrambling in Korean show mixed A/A'-effects in a range of tests such as anaphor binding, weak crossover, Condition C, negative polarity item licensing, wh-licensing, and scopal interpretation. VP-internal scrambling, by contrast, is known to lack reconstruction effects, conforming to the claim that short scrambling is A-movement. Clausal scrambling, on the other hand, shows total reconstruction effects, unlike phrasal scrambling. The diverse properties of Korean scrambling have received extensive attention in the literature. Some studies argue that scrambling is a type of feature-driven A-movement with special reconstruction effects. Others argue that scrambling can be A-movement or A'-movement depending on the landing site. Yet others claim that scrambling is not standard A/A'-movement, but must be treated as cost-free movement with optional reconstruction effects. Each approach, however, faces non-trivial empirical and theoretical challenges, and further study is needed to understand the complex nature of scrambling. As the theory develops in the Minimalist Program, a variety of proposals have also been advanced to capture properties of scrambling without resorting to A/A'-distinctions.
Scrambling in Korean applies optionally but not randomly. It may be blocked due to various factors in syntax and its interfaces in the grammar. In the syntax proper, scrambling obeys general constraints on movement (e.g., island conditions, the left branch condition, the coordinate structure condition, the proper binding condition, the ban on string-vacuous movement). Various semantic and pragmatic factors (e.g., specificity, presuppositionality, topic, focus) also play a crucial role in the acceptability of sentences with scrambling. Moreover, current studies show that certain instances of scrambling are filtered out at the interface due to cyclic Spell-out and linearization, which strengthens the claim that scrambling is not a free option. Data from Korean pose important challenges to base-generation approaches to scrambling, and lend further credence to the view that scrambling is an instance of movement. The exact nature of scrambling in Korean—whether it is cost-free or feature-driven—must be further investigated in future research, however. The research on Korean scrambling leads us to the pursuit of a general theory that covers obligatory A/A'-movement as well as optional displacement with mixed semantic effects in languages with free word order.
Article
The Semantics of Chinese Noun Phrases
Xuping Li
Chinese nominal phrases are typologically distinct from their English counterparts in many aspects. Most strikingly, Chinese has a general classifier system, which not only helps to categorize nouns but also bears on the issue of quantification. Moreover, it has neither noncontroversial plural markers nor (in)definite markers. Its bare nouns are allowed in various argument positions. As a consequence, Chinese is sometimes characterized as a classifier language, as an argumental language, or as an article-less language. One of the questions arising is whether these apparently different but related properties underscore a single issue: that it is the semantics of nouns that is responsible for all these peculiarities of Mandarin nominal phrases. It has been claimed that Chinese nouns are born as kind terms, from which the object-level readings can be derived, being either existential or definite. Nevertheless, the existence of classifiers in Chinese is claimed to be independent of the kind denotation of its bare nouns.
Within the general area of noun semantics, a number of other semantic issues have generated much interest. One is concerned with the availability of the mass/count distinction in Mandarin nominal phrases. Another issue has to do with the semantics of classifiers. Are classifiers required by the noun semantics or the numeral semantics, when occurring in the syntactic context of Numeral/Quantifier-Classifier-Noun? Finally, how is the semantic notion of definiteness understood in article-less languages like Mandarin Chinese? Should its denotation be characterized with uniqueness or familiarity?
Article
Semantic Theories of Questions
Floris Roelofsen
This survey article discusses two basic issues that semantic theories of questions face. The first is how to conceptualize and formally represent the semantic content of questions. This issue arises in particular because the standard truth-conditional notion of meaning, which has been fruitful in the analysis of declarative statements, is not applicable to questions. This is because questions are not naturally construed as being true or false. Instead, it has been proposed that the semantic content of a question must be characterized in terms of its answerhood or resolution conditions. This article surveys a number of theories which develop this basic idea in different ways, focusing on so-called proposition-set theories (alternative semantics, partition semantics, and inquisitive semantics).
The second issue that will be considered here concerns questions that are embedded within larger sentences. Within this domain, one important puzzle is why certain predicates can take both declarative and interrogative complements (e.g., Bill knows that Mary called / Bill knows who called), while others take only declarative complements (e.g., Bill thinks that Mary called / *Bill thinks who called) or only interrogative complements (e.g., Bill wonders who called / *Bill wonders that Mary called). We compare two general approaches that have been pursued in the literature. One assumes that declarative and interrogative complements differ in semantic type. On this approach, the fact that predicates like think do not take interrogative complements can be accounted for by assuming that such complements do not have the semantic type that think selects for. The other approach treats the two kinds of complement as having the same semantic type, and seeks to connect the selectional restrictions of predicates like think to other semantic properties (e.g., the fact that think is neg-raising).
Article
Special Language Domain in Which Grammatical Rules May Be Violated Legitimately in Chinese
Jie Xu and Yewei Qin
“Special language domain” (SLD) refers to domains or areas of language use in which linguistic rules may be violated legitimately. SLDs are similar to “free trade zones,” “special administrative regions,” and “special economic zones,” in which tariff, executive, and economic regulations may be legitimately violated to an extent. Innovative use in SLDs is another major resource for language evolution and language change, as well as for language contact and language acquisition, since some temporary and innovative forms of usage in an SLD may later develop beyond it to become part of the core system of linguistic rules. Focusing on relevant grammatical phenomena observed in the Chinese language, this article identifies poetry in various forms, titles and slogans, and Internet language as the three major types of SLD, whose violations of linguistic rules are motivated differently. Furthermore, although core linguistic rules may be violated in SLDs, the violations are still subject to certain limits and restrictions. Only some language-particular rules can be violated legitimately in an SLD; the principles of Universal Grammar, applicable generally to all human languages, have to be observed even there. The study of special language domains provides an ideal and fascinating window for linguists to understand language mechanisms, explain historical change in language, and plausibly predict the future direction of language evolution.
Article
The Status of the Morpheme
Tom Leu
The morpheme was the central notion in morphological theorizing in the 20th century. It has a very intuitive appeal as the indivisible and invariant unit of form and meaning, a minimal linguistic sign. Ideally, that would be all there is to build words and sentences from. But this ideal does not appear to be entirely adequate. At least under a perhaps superficial understanding of form as a series of phonemes, and of meaning as concepts and morphosyntactic feature sets, the form side and the meaning side of words are often not structured isomorphically. Different analytical reactions are possible to deal with the empirical challenges resulting from the various kinds of non-isomorphism between form and meaning. One prominent option is to reject the morpheme and to recognize conceptually larger units such as the word or the lexeme and its paradigm as the operands of morphological theory. This contrasts with various theoretical options maintaining the morpheme, terminologically or at least conceptually at some level. One such option is to maintain the morpheme as a minimal unit of form, relaxing the tension imposed by the meaning requirement. Another option is to maintain it as a minimal morphosyntactic unit, relaxing the requirements on the form side. The latter (and to a lesser extent also the former) has been understood in various profoundly different ways: association of one morpheme with several form variants, association of a morpheme with non-self-sufficient phonological units, or association of a morpheme with a formal process distinct from affixation. Variants of all of these possibilities have been entertained and have established distinct schools of thought. The overall architecture of the grammar, in particular the way that the morphology integrates with the syntax and the phonology, has become a driving force in the debate. If there are morpheme-sized units, are they pre-syntactic or post-syntactic units? Is the association between meaning and phonological information pre-syntactic or post-syntactic? Do morpheme-sized pieces have a specific status in the syntax? Invoking some of the main issues involved, this article draws a profile of the debate, following the term morpheme on a by-and-large chronological path from the late 19th century to the 21st century.
Article
The Status of Heads in Morphology
Beata Moskal and Peter W. Smith
Headedness is a pervasive phenomenon throughout different components of the grammar, which fundamentally encodes an asymmetry between two or more items, such that one is in some sense more important than the other(s). In phonology for instance, the nucleus is the head of the syllable, and not the onset or the coda, whereas in syntax, the verb is the head of a verb phrase, rather than any complements or specifiers that it combines with. It makes sense, then, to question whether the notion of headedness applies to the morphology as well; specifically, do words—complex or simplex—have heads that determine the properties of the word as a whole? Intuitively it makes sense that words have heads: a noun that is derived from an adjective like redness can function only as a noun, and the presence of red in the structure does not confer on the whole form the ability to function as an adjective as well.
However, this question is a complex one for a variety of reasons. While it seems clear for some phenomena such as category determination that words have heads, there is a lot of evidence to suggest that the properties of complex words are not all derived from one morpheme, but rather that the features are gathered from potentially numerous morphemes within the same word. Furthermore, properties that characterize heads compared to dependents, particularly based on syntactic behavior, do not unambiguously pick out a single element: the tests applied to morphology at times pick out affixes, and at times pick out bases, as the head of the whole word.
Article
Structural Semantics in the Romance Languages
Miguel Casas Gómez and Martin Hummel
Structural semantics is a primarily European structural linguistic approach to the content level of language which basically derives from two historical sources. The main inspiration stems from Ferdinand de Saussure’s Cours de linguistique générale (1916), where the Genevan linguist also formulates the fundamental principles of semantic analysis: the twofold character of the linguistic sign, the inner determination of its content by the—allegedly autonomous—linguistic system, the consequent exclusion of the extralinguistic reality, the notion of opposition inside the system, and the concept of “associative relations” in the domain of semantics. This tradition was later refined by Hjelmslev and Coseriu, who introduced theoretical and methodological strength and rigor, suggesting systematic analyses in terms of semantic features linked by (binary) opposition. The second source of inspiration was the more holistic concept elaborated by Wilhelm von Humboldt, who saw language as a means of structuring the world. In the second half of the 20th century, structural semantics was mainstream semantics (to the extent that semantic analysis was accepted at all). A long series of authors deepened these historical traditions in theoretical and empirical studies, some of them suggesting secondary and/or partial models. Finally, prototype semantics and cognitive semantics strove to downgrade structural semantics by turning back to a more holistic conception of meaning including the speakers’ knowledge of the world, although not without introducing the alternative structural notion of “network.”
Article
Subtraction in Morphology
Stela Manova
Subtraction consists in shortening the shape of the word. It operates on morphological bases such as roots, stems, and words in word-formation and inflection. Cognitively, subtraction is the opposite of affixation, since the latter adds meaning and form (an overt affix) to roots, stems, or words, while the former adds meaning through subtraction of form. As subtraction and affixation work at the same level of grammar (morphology), they sometimes compete for the expression of the same semantics in the same language; for example, the pattern ‘science—scientist’ in German has derivations such as Physik ‘physics’—Physik-er ‘physicist’ and Astronom-ie ‘astronomy’—Astronom ‘astronomer’. Subtraction can delete phonemes and morphemes. In the case of phoneme deletion, it is usually the final phoneme of a morphological base that is deleted, and sometimes that phoneme can coincide with a morpheme.
Some analyses of subtraction(-like shortenings) rely not on morphological units (roots, stems, morphological words, affixes) but on the phonological word, which sometimes results in alternative definitions of subtraction. Additionally, syntax-based theories of morphology that do not recognize a morphological component of grammar and operate only with additive syntactic rules claim that subtraction actually consists in the addition of defective phonological material that causes adjustments in phonology and leads to deletion of form on the surface. Other scholars postulate subtraction only if the deleted material does not coincide with an existing morpheme elsewhere in the language; if it does, they call the change backformation. There is also some controversy regarding what counts as a proper word-formation process and whether what is derived by subtraction is true word-formation or just marginal or extragrammatical morphology; that is, the question is whether shortenings such as hypocoristics and clippings should be treated on a par with derivations such as the pattern science—scientist.
Finally, research in subtraction also faces terminology issues in the sense that in the literature different labels have been used to refer to subtraction(-like) formations: minus feature, minus formation, disfixation, subtractive morph, (subtractive) truncation, backformation, or just shortening.
Article
Syntactic Cartography
Ur Shlonsky and Giuliano Bocci
Syntactic cartography emerged in the 1990s as a result of the growing consensus in the field about the central role played by functional elements and by morphosyntactic features in syntax. The declared aim of this research direction is to draw maps of the structures of syntactic constituents, characterize their functional structure, and study the array and hierarchy of syntactically relevant features. Syntactic cartography has made significant empirical discoveries, and its methodology has been very influential in research in comparative syntax and morphosyntax. A central theme in current cartographic research concerns the source of the emerging featural/structural hierarchies. The idea that the functional hierarchy is not a primitive of Universal Grammar but derives from other principles does not undermine the scientific relevance of the study of the cartographic structures. On the contrary, the cartographic research aims at providing empirical evidence that may help answer these questions about the source of the hierarchy and shed light on how the computational principles and requirements of the interface with sound and meaning interact.
Article
Syntactic Categorization of Roots
Terje Lohndal
A root is a fundamental minimal unit in words. Some languages do not allow their roots to appear on their own, as in the Semitic languages where roots consist of consonant clusters that become stems or words by virtue of vowel insertion. Other languages appear to allow roots to surface without any additional morphology, as in English car. Roots are typically distinguished from affixes in that affixes need a host, although this varies within different theories.
Traditionally roots have belonged to the domain of morphology. More recently, though, new theories have emerged according to which words are decomposed and subject to the same principles as sentences. That makes roots, rather than words, a fundamental building block of sentences. Contemporary syntactic theories of roots hold that they have little if any grammatical information, which raises the question of how they acquire their seemingly grammatical properties. A central issue has revolved around whether roots have a lexical category inherently or whether they are given a lexical category in some other way. Two main theories are distributed morphology and the exoskeletal approach to grammar. The former holds that roots merge with categorizers in the grammar: a root combined with a nominal categorizer becomes a noun, and a root combined with a verbal categorizer becomes a verb. On the latter approach, it is argued that roots are inserted into syntactic structures which carry the relevant category, meaning that the syntactic environment is created before roots are inserted into the structure. The two views make different predictions and differ in particular in their view of the status of empty categorizers.
Article
Syntactic Typology
Masayoshi Shibatani
The major achievements in syntactic typology garnered nearly 50 years ago by acclaimed typologists such as Edward Keenan and Bernard Comrie continue to exert enormous influence in the field, deserving periodic appraisals in the light of new discoveries and insights. With an increased understanding of them in recent years, typologically controversial ergative and Philippine-type languages provide a unique opportunity to reassess the issues surrounding the delicately intertwined topics of grammatical relations and relative clauses (RCs), perhaps the two foremost topics in syntactic typology.
Keenan’s property-list approach to the grammatical relation subject yields wrong results for ergative and Philippine-type languages, both of which have at their disposal two primary grammatical relations: subject and absolutive in the former, and subject and topic in the latter. Ergative languages are characterized by their deployment of arguments according to both the nominative (S=A≠P) and the ergative (S=P≠A) pattern. Phenomena such as nominal morphology and relativization are typically controlled by the absolutive relation, defined as a union of {S, P} resulting from a P-based generalization. Other phenomena such as second person imperative deletion and gap control in compound (coordinate) sentences involve as a pivot the subject relation, defined as an {S, A} grouping resulting from an A-based generalization. Ergative languages thus clearly demonstrate that grammatical relations are phenomenon/construction specific. Philippine-type languages reinforce this point by their possession of subjects, as defined above, and a pragmatico-syntactic relation of topic correlated with the referential prominence of a noun phrase (NP) argument. As in ergative languages, certain phenomena, for example, controlling of a gap in the want-type control construction, operate in terms of the subject, while others, for example, relativization, revolve around the topic.
With regard to RCs, the points made above bear directly on the claim by Keenan and Comrie that subjects are universally the most relativizable of NPs, justifying the high end of the Noun Phrase Accessibility Hierarchy. A new nominalization perspective on relative clauses reveals that grammatical relations are actually irrelevant to the relativization process per se, and that the widely embraced typology of RCs, recognizing so-called headless and internally headed RCs and others as construction types, is misguided in that RCs in fact do not exist as independent grammatical structures; they are merely epiphenomenal to the usage patterns of two types of grammatical nominalizations.
The so-called subject relativization (e.g., You should marry a man who loves you) involves a head noun and a subject argument nominalization (e.g., [who [Ø loves you]]) that are joined together forming a larger NP constituent, in a manner similar to the way a head noun and an adjectival modifier are brought together in a simple attributive construction (e.g., a rich man), with no regard to grammatical relations. The same argument nominalization can head an NP (e.g., You should marry who loves you). This is known as a headless RC, while it is in fact no more than an NP use of an argument nominalization, as opposed to the modification use of the same structure in the ordinary restrictive RC seen above. So-called internally headed RCs involve event nominalizations (e.g., Quechua Maria wallpa-ta wayk’u-sqa-n-ta mik”u-sayku [Maria chicken-acc cook-P.nmlzr-3sg-acc eat-prog.1pl], lit. “We are eating Maria cook a chicken,” and English I heard John sing in the kitchen) that evoke various substantive entities metonymically related to the event, such as event protagonists (as in the Quechua example), results (as in the English example), and abstract entities such as facts and propositions (e.g., I know that John sings in the kitchen).
Article
Syntax of Ditransitives
Heidi Harley and Shigeru Miyagawa
Ditransitive predicates select for two internal arguments, and hence minimally entail the participation of three entities in the event described by the verb. Canonical ditransitive verbs include give, show, and teach; in each case, the verb requires an agent (a giver, shower, or teacher, respectively), a theme (the thing given, shown, or taught), and a goal (the recipient, viewer, or student). The property of requiring two internal arguments makes ditransitive verbs syntactically unique. Selection in generative grammar is often modeled as syntactic sisterhood, so ditransitive verbs immediately raise the question of whether a verb may have two sisters, requiring a ternary-branching structure, or whether one of the two internal arguments is not in a sisterhood relation with the verb.
Another important property of English ditransitive constructions is the two syntactic structures associated with them. In the so-called “double object construction,” or DOC, the goal and theme both are simple NPs and appear following the verb in the order V-goal-theme. In the “dative construction,” the goal is a PP rather than an NP and follows the theme in the order V-theme-to goal. Many ditransitive verbs allow both structures (e.g., give John a book/give a book to John). Some verbs are restricted to appear only in one or the other (e.g. demonstrate a technique to the class/*demonstrate the class a technique; cost John $20/*cost $20 to John). For verbs which allow both structures, there can be slightly different interpretations available for each. Crosslinguistic results reveal that the underlying structural distinctions and their interpretive correlates are pervasive, even in the face of significant surface differences between languages. The detailed analysis of these questions has led to considerable progress in generative syntax. For example, the discovery of the hierarchical relationship between the first and second arguments of a ditransitive has been key in motivating the adoption of binary branching and the vP hypothesis. Many outstanding questions remain, however, and the syntactic encoding of ditransitivity continues to inform the development of grammatical theory.