Alan Reed Libert
Artificial languages—languages which have been consciously designed—have been created for more than 900 years, although the number of them has increased considerably in recent decades, and by the early 21st century the total figure was probably in the thousands. There have been several goals behind their creation; the traditional one (which applies to some of the best-known artificial languages, including Esperanto) is to make international communication easier. Some other well-known artificial languages, such as Klingon, have been designed in connection with works of fiction. Still others are simply personal projects.
A traditional way of classifying artificial languages involves the extent to which they make use of material from natural languages. Those artificial languages which are created mainly by taking material from one or more natural languages are called a posteriori languages (which again include well-known languages such as Esperanto), while those which do not use natural languages as sources are a priori languages (although many a posteriori languages have a limited amount of a priori material, and some a priori languages have a small number of a posteriori components). Between these two extremes are the mixed languages, which have large amounts of both a priori and a posteriori material. Artificial languages can also be classified typologically (as natural languages are) and by how and how much they have been used.
Many linguists seem to be biased against research on artificial languages, although some major linguists of the past have been interested in them.
Blocking can be defined as the non-occurrence of some linguistic form, whose existence could be expected on general grounds, due to the existence of a rival form. *Oxes, for example, is blocked by oxen, *stealer by thief. Although blocking is closely associated with morphology, in reality the competing “forms” need not be morphemes or words; they can also be syntactic units. In German, for example, the compound Rotwein ‘red wine’ blocks the phrasal unit *roter Wein (in the relevant sense), just as the phrasal unit rote Rübe ‘beetroot; lit. red beet’ blocks the compound *Rotrübe. In these examples, one crucial factor determining blocking is synonymy; speakers apparently have a deep-rooted presumption against synonyms. Whether homonymy can also lead to a similar avoidance strategy is still controversial. But even if homonymy blocking exists, it certainly is much less systematic than synonymy blocking.
In all the examples mentioned above, it is a word stored in the mental lexicon that blocks a rival formation. However, besides such cases of lexical blocking, one can observe blocking among productive patterns. Dutch has three suffixes for deriving agent nouns from verbal bases, -er, -der, and -aar. Of these three suffixes, the first one is the default choice, while -der and -aar are chosen in very specific phonological environments: as Geert Booij describes in The Morphology of Dutch (2002), “the suffix -aar occurs after stems ending in a coronal sonorant consonant preceded by schwa, and -der occurs after stems ending in /r/” (p. 122). In contrast to lexical blocking, the effect of this kind of pattern blocking does not depend on words stored in the mental lexicon and their token frequency but on abstract features (in the case at hand, phonological features).
Blocking was first recognized by the Indian grammarian Pāṇini in the 5th or 4th century BCE.
John E. Joseph
Ferdinand de Saussure (1857–1913), the founding figure of modern linguistics, made his mark on the field with a book he published a month after his 21st birthday, in which he proposed a radical rethinking of the original system of vowels in Proto-Indo-European. A year later, he submitted his doctoral thesis on a morpho-syntactic topic, the genitive absolute in Sanskrit, to the University of Leipzig. He went to Paris intending to do a second, French doctorate, but instead he was given responsibility for courses on Gothic and Old High German at the École Pratique des Hautes Études, and for managing the publications of the Société de Linguistique de Paris. He abandoned more than one large publication project of his own during the decade he spent in Paris. In 1891 he returned to his native Geneva, where the University created a chair in Sanskrit and the history and comparison of languages for him. He produced some significant work on Lithuanian during this period, connected to his early book on the Indo-European vowel system, and yielding Saussure’s Law, concerning the placement of stress in Lithuanian. He undertook writing projects about the general nature of language, but again abandoned them. In 1907, 1908–1909, and 1910–1911, he gave three courses in general linguistics at the University of Geneva, in which he developed an approach to languages as systems of signs, each sign consisting of a signifier (sound pattern) and a signified (concept), both of them mental rather than physical in nature, and conjoined arbitrarily and inseparably. The socially shared language system, or langue, makes possible the production and comprehension of parole, utterances, by individual speakers and hearers. Each signifier and signified is a value generated by its difference from all the other signifiers or signifieds with which it coexists on an associative (or paradigmatic) axis, and affected as well by its syntagmatic relations.
Shortly after Saussure’s death at 55, two of his colleagues, Bally and Sechehaye, gathered together students’ notes from the three courses, as well as manuscript notes by Saussure, and from them constructed the Cours de linguistique générale, published in 1916. Over the course of the next several decades, this book became the basis for the structuralist approach, initially within linguistics, and later adapted to other fields. Saussure left behind a large quantity of manuscript material that has gradually been published over the last few decades, and continues to be published, shedding new light on his thought.
The German sinologist and general linguist Georg von der Gabelentz (1840–1893) occupies an interesting place at the intersection of several streams of linguistic scholarship at the end of the 19th century. As Professor of East Asian languages at the University of Leipzig from 1878 to 1889 and then Professor for Sinology and General Linguistics at the University of Berlin from 1889 until his death, Gabelentz was present at some of the main centers of linguistics at the time. He was, however, generally critical of mainstream historical-comparative linguistics as propagated by the neogrammarians, and instead emphasized approaches to language inspired by a line of researchers including Wilhelm von Humboldt (1767–1835), H. Steinthal (1823–1899), and his own father, Hans Conon von der Gabelentz (1807–1874).
Today Gabelentz is chiefly remembered for several theoretical and methodological innovations which continue to play a role in linguistics. Most significant among these are his contributions to cross-linguistic syntactic comparison and typology, grammar-writing, and grammaticalization. His earliest linguistic work emphasized the importance of syntax as a core part of grammar and sought to establish a framework for the cross-linguistic description of word order, as had already been attempted for morphology by other scholars. The importance he attached to syntax was motivated by his engagement with Classical Chinese, a language almost devoid of morphology and highly reliant on syntax. In describing this language in his 1881 Chinesische Grammatik, Gabelentz elaborated and implemented the complementary “analytic” and “synthetic” systems of grammar, an approach to grammar-writing that continues to serve as a point of reference up to the present day. In his summary of contemporary thought on the nature of grammatical change in language, he became one of the first linguists to formulate the principles of grammaticalization in essentially the form that this phenomenon is studied today, although he did not use the current term. One key term of modern linguistics that he did employ, however, is “typology,” a term that he in fact coined. Gabelentz’s typology was a development on various contemporary strands of thought, including his own comparative syntax, and is widely acknowledged as a direct precursor of the present-day field.
Gabelentz is a significant transitional figure from the 19th to the 20th century. On the one hand, his work seems very modern. Beyond his contributions to grammaticalization avant la lettre and his christening of typology, his conception of language prefigures the structuralist revolution of the early 20th century in important respects. On the other hand, he continues to entertain several preoccupations of the 19th century—in particular the judgment of the relative value of different languages—which were progressively banished from linguistics in the first decades of the 20th century.
The grammatization of European vernacular languages began in the Late Middle Ages and Renaissance and continued up until the end of the 18th century. Through this process, grammars were written for the vernaculars and, as a result, the vernaculars were able to establish themselves in important areas of communication. Vernacular grammars largely followed the example of those written for Latin, using Latin descriptive categories without fully adapting them to the vernaculars. In accord with the Greco-Latin tradition, the grammars typically contain sections on orthography, prosody, morphology, and syntax, with the most space devoted to the treatment of word classes in the section on “etymology.” The earliest grammars of vernaculars had two main goals: on the one hand, making the languages described accessible to non-native speakers, and on the other, supporting the learning of Latin grammar by teaching the grammar of speakers’ native languages. Initially, it was considered unnecessary to engage with the grammar of native languages for their own sake, since they were thought to be acquired spontaneously. Only gradually did a need for normative grammars develop which sought to codify languages. This development relied on an awareness of the value of vernaculars that attributed a certain degree of perfection to them. Grammars of indigenous languages in colonized areas were based on those of European languages and today offer information about the early state of those languages, and are indeed sometimes the only sources for now extinct languages. Grammars of vernaculars came into being in the contrasting contexts of general grammar and the grammars of individual languages, between grammar as science and as art and between description and standardization. 
In the standardization of languages, the guiding principle could either be that of anomaly, which took a particular variety of a language as the basis of the description, or that of analogy, which permitted interventions into a language aimed at making it more uniform.
Ans van Kemenade
The status of English in the early 21st century makes it hard to imagine that the language started out as an assortment of North Sea Germanic dialects spoken in parts of England only by immigrants from the continent. Itself soon under threat, first from the language(s) spoken by Viking invaders, then from French as spoken by the Norman conquerors, English continued to thrive as an essentially West-Germanic language that did, however, undergo some profound changes resulting from contact with Scandinavian and French. A further decisive period of change is the late Middle Ages, which brought a tremendous societal scale-up that triggered pervasive multilingualism. These repeated layers of contact between different populations, first locally, then nationally, followed by standardization and 18th-century codification, metamorphosed English into a language closely related to, yet quite distinct from, its closest relatives Dutch and German in nearly all language domains, not least in word order, grammar, and pronunciation.
Irit Meir and Oksana Tkachman
Iconicity is a relationship of resemblance or similarity between the two aspects of a sign: its form and its meaning. An iconic sign is one whose form resembles its meaning in some way. The opposite of iconicity is arbitrariness. In an arbitrary sign, the association between form and meaning is based solely on convention; there is nothing in the form of the sign that resembles aspects of its meaning. The Hindu-Arabic numerals 1, 2, 3 are arbitrary, because their current form does not correspond to any aspect of their meaning. In contrast, the Roman numerals I, II, III are iconic, because the number of occurrences of the sign I correlates with the quantity that the numerals represent. Because iconicity has to do with the properties of signs in general and not only those of linguistic signs, it plays an important role in the field of semiotics—the study of signs and signaling. However, language is the most pervasive symbolic communicative system used by humans, and the notion of iconicity plays an important role in characterizing the linguistic sign and linguistic systems. Iconicity is also central to the study of literary uses of language, such as prose and poetry.
There are various types of iconicity; the form of a sign may resemble aspects of its meaning in several ways: it may create a mental image of the concept (imagic iconicity), or its structure and the arrangement of its elements may resemble the structural relationship between components of the concept represented (diagrammatic iconicity). An example of the first type is the word cuckoo, whose sounds resemble the call of the bird, or a sign such as RABBIT in Israeli Sign Language, whose form—the hands representing the rabbit's long ears—resembles a visual property of that animal. An example of diagrammatic iconicity is vēnī, vīdī, vīcī, where the order of clauses in a discourse is understood as reflecting the sequence of events in the world.
Iconicity is found on all linguistic levels: phonology, morphology, syntax, semantics, and discourse. It is found both in spoken languages and in sign languages. However, sign languages, because of the visual-gestural modality through which they are transmitted, are much richer in iconic devices, and therefore offer a rich array of topics and perspectives for investigating iconicity, and the interaction between iconicity and language structure.
During the period from the fall of the Roman empire in the late 5th century to the beginning of the European Renaissance in the 14th century, the development of linguistic thought in Europe was characterized by the enthusiastic study of grammatical works by Classical and Late Antique authors, as well as by the adaptation of these works to suit a Christian framework. The discipline of grammatica, viewed as the cornerstone of the ideal liberal arts education and as a key to the wider realm of textual culture, was understood to encompass both the systematic principles for speaking and writing correctly and the science of interpreting the poets and other writers. The writings of Donatus and Priscian were among the most popular and well-known works of the grammatical curriculum, and were the subject of numerous commentaries throughout the medieval period. Although Latin persisted as the predominant medium of grammatical discourse, there is also evidence from as early as the 8th century for the enthusiastic study of vernacular languages and for the composition of vernacular-medium grammars, including sources pertaining to Anglo-Saxon, Irish, Old Norse, and Welsh. The study of language in the later medieval period is marked by experimentation with the form and layout of grammatical texts, including the composition of textbooks in verse form. This period also saw a renewed interest in the application of philosophical ideas to grammar, inspired in part by the availability of a wider corpus of Greek sources than had previously been known to western European scholars, such as Aristotle’s Physics, Metaphysics, Ethics, and De Anima.
A further consequence of the renewed interest in the logical and metaphysical works of Aristotle during the later Middle Ages is the composition of so-called ‘speculative grammars’ written by scholars commonly referred to as the ‘Modistae’, in which the grammatical description of Latin formulated by Priscian and Donatus was integrated with the system of scholastic philosophy that was at its height from the beginning of the 13th to the middle of the 14th century.
Traditional Chinese linguistics grew out of two distinct interests in language: the philosophical reflection on things and their names, and the practical concern for literacy education and the correct understanding of classical works. The former is most typically found in the teachings of such pre-Qin masters as Confucius, Mozi, and Gongsun Long, who lived between the 6th and 3rd centuries BCE.
The picture just presented, in which Chinese philosophy and philology are combined to form a seemingly autonomous tradition, is complicated, however, by the fact that the Indic linguistic tradition started to influence the Chinese in the 2nd century CE.
Chinese, with its linguistic tradition, had a profound impact in ancient East Asia. Not only did traditional studies of Japanese, Tangut, and other languages show significant Chinese influence, not least in the invention of the earliest writing systems for these languages, but many scholars from Japan and Korea also took an active part in the study of Chinese, so that the Chinese linguistic tradition would itself be incomplete without the materials and findings these non-Chinese scholars have contributed. On the other hand, some of these scholars, most notably Motoori Norinaga and Fujitani Nariakira in Japan, were able to free themselves from the character-centered Chinese routine and develop rather original linguistic theories.
Indian linguistic thought begins around the 8th–6th centuries BCE.
The greater part of documented thought is related to Sanskrit (Ancient Indo-Aryan). Very early, the oral transmission of sacred texts—the Vedas, composed in Vedic Sanskrit—made it necessary to develop techniques based on a subtle analysis of language. The Vedas also—but presumably later—gave birth to bodies of knowledge dealing with language, which are traditionally called Vedāṅgas: phonetics (śikṣā), metrics (chandas), grammar (vyākaraṇa), and semantic explanation (nirvacana, nirukta). Later on, Vedic exegesis (mīmāṃsā), new dialectics (navya-nyāya), lexicography, and poetics (alaṃkāra) also contributed to linguistic thought.
Though languages other than Sanskrit were described in premodern India, the grammatical description of Sanskrit—given in Sanskrit—dominated and influenced them more or less strongly. Sanskrit grammar (vyākaraṇa) has a long history marked by several major steps (Padapāṭha versions of Vedic texts, Aṣṭādhyāyī of Pāṇini, Mahābhāṣya of Patañjali, Bhartṛhari’s works, Siddhāntakaumudī of Bhaṭṭoji Dīkṣita, Nāgeśa’s works), and the main topics it addresses (minimal meaning-bearer units, classes of words, relation between word and meaning/referent, the primary meaning/referent of nouns) are still central issues for contemporary linguistics.
Missionary dictionaries are printed books or manuscripts compiled by missionaries in which words are listed systematically followed by words which have the same meaning in another language. These dictionaries were mainly written as tools for language teaching and learning in a missionary-colonial setting, although quite a few dictionaries also have a more encyclopedic character, containing invaluable information on non-Western cultures from all continents. In this article, several types of dictionaries are analyzed: bilingual-monodirectional, bilingual and bidirectional, and multilingual. Most examples are taken from an illustrative selected corpus of missionary dictionaries describing non-Western languages during the colonial period, with particular focus on the function of these dictionaries in a missionary context, the users, macrostructure, organizational principles, and the typology of the microstructure and markedness in lemmatization.
Missionary grammars are printed books or manuscripts compiled by missionaries in which a particular language is described. These grammars were mainly written as pedagogical tools for language teaching and learning in a missionary-colonial setting, although quite a few grammars also have a more normative character. Missionary grammars usually contain an opening section, a prologue, in which the author sets out the objectives of his work. The first part is usually a short introduction to phonology and orthography, followed by the largest section, which is devoted to morphology, arranged according to the traditional division of the parts of speech. The final section is sometimes devoted to syntax, but the topics included can vary considerably. Sometimes word lists are appended, containing body parts, measures, counting, manners of speaking, or rhetorical figures. In some grammars the data presented are mainly based on an oral corpus, whereas in other cases high registers from prestigious texts are used in which the eloquence or elegance of the language under study is illustrated. These grammars are modeled according to the traditional Greco-Latin framework and often contain invaluable information regarding language typologies, semantics, and pragmatics. In the New World, Asia, and elsewhere, missionaries had to find an adequate methodology in order to describe typological features they had never seen before. They adapted European models to new linguistic realities and created original works which deserve our attention within the discipline of the history of linguistics alongside contemporary pedagogical works written in Europe. This article concentrates on sources written in Spanish, Portuguese, and Latin during the colonial period, since these sources outnumber the production of missionary grammars in other languages.
Computational models of human sentence comprehension help researchers reason about how grammar might actually be used in the understanding process. Taking a cognitivist approach, this article relates computational psycholinguistics to neighboring fields (such as linguistics), surveys important precedents, and catalogs open problems.
Edwin L. Battistella
Nikolai Trubetzkoy (1890–1938) was a Russian émigré scholar who settled in Austria in 1922, serving as Head of Slavic Linguistics at the University of Vienna and participating in the Prague Linguistics Circle. Trubetzkoy wrote nearly 150 works on phonology, prosody, comparative linguistics, linguistic geography, folklore, literature, history, and political theory. His posthumously published Grundzüge der Phonologie (Principles of Phonology) is regarded as one of the key works in the science of phonology. Here Trubetzkoy, influenced by Saussurean insights, elaborated on the linguistic function of speech sounds, the role of oppositions, and markedness. He was also concerned with developing universal laws of phonological patterning, and his work involves the discussion of a wide variety of languages. The Grundzüge became the classic statement of part of Prague School linguistics, which later influenced both European and American linguistics, notably in Chomsky and Halle’s The Sound Pattern of English. Less well-known are Trubetzkoy’s historical and political works on Eurasia and Eurasianism. In Europe and Mankind, Trubetzkoy argued that Russia was not culturally part of Europe but should evolve to form its own political systems based on its geography and common legacy with the peoples of Eurasia.
Howard Lasnik and Terje Lohndal
Noam Avram Chomsky is one of the central figures of modern linguistics. He was born in Philadelphia, Pennsylvania, on December 7, 1928. In 1945, Chomsky enrolled in the University of Pennsylvania, where he met Zellig Harris (1909–1992), a leading Structuralist, through their shared political interests. His first encounter with Harris’s work was when he proofread Harris’s book Methods in Structural Linguistics, published in 1951 but completed already in 1947. Chomsky grew dissatisfied with Structuralism and started to develop his own major idea that syntax and phonology are in part matters of abstract representations. This was soon combined with a psychobiological view of language as a unique part of the mind/brain.
Chomsky spent 1951–1955 as a Junior Fellow of the Harvard Society of Fellows, after which he joined the faculty at MIT under the sponsorship of Morris Halle. He was promoted to full professor of Foreign Languages and Linguistics in 1961, appointed Ferrari Ward Professor of Linguistics in 1966, and Institute Professor in 1976, retiring in 2002. Chomsky is still remarkably active, publishing, teaching, and lecturing across the world.
In 1967, both the University of Chicago and the University of London awarded him honorary degrees, and since then he has been the recipient of scores of honors and awards. In 1988, he was awarded the Kyoto Prize in basic science, created in 1984 in order to recognize work in areas not included among the Nobel Prizes. These honors are all a testimony to Chomsky’s influence and impact in linguistics and cognitive science more generally over the past 60 years. His contributions have of course also been heavily criticized, but nevertheless remain crucial to investigations of language.
Chomsky’s work has always centered around the same basic questions and assumptions, especially that human language is an inherent property of the human mind. The technical part of his research has continuously been revised and updated. In the 1960s phrase structure grammars were developed into what is known as the Standard Theory, which transformed into the Extended Standard Theory and X-bar theory in the 1970s. A major transition occurred at the end of the 1970s, when the Principles and Parameters Theory emerged. This theory provides a new understanding of the human language faculty, focusing on the invariant principles common to all human languages and the points of variation known as parameters. Its recent variant, the Minimalist Program, pushes the approach even further in asking why grammars are structured the way they are.
Matthew J. Gordon
William Labov (b. 1927) is an American linguist who pioneered the study of variationist sociolinguistics. Born and raised in northern New Jersey, Labov studied English and philosophy at Harvard University (BA, 1948) and worked as an industrial chemist for several years before entering graduate school in linguistics at Columbia University in 1961. He completed his PhD in 1964, under the direction of Uriel Weinreich. He worked at Columbia until 1971, when he joined the faculty of the University of Pennsylvania, where he taught until his retirement in 2014.
Labov’s influence on the field began with research he conducted in graduate school. His study of changing pronunciations on Martha’s Vineyard, the subject of his master’s thesis, introduced a method for observing sound change in progress and broke with tradition by exploring social motivations for linguistic innovations. For his PhD dissertation, Labov carried out a study of dialect patterns on the Lower East Side of New York City. Using a systematic, quantitative methodology, he demonstrated that linguistic variation is socially stratified, such that the use of pronunciation features (e.g., dropping of post-vocalic /r/) correlates with social class, ethnicity, etc. in regular patterns. Labov’s early research was greatly influential and inspired many scholars to carry out similar projects in other communities. The paradigm came to be known as variationist sociolinguistics.
Much of Labov’s scholarship seeks to advance our understanding of language change. Historical linguists traditionally study completed linguistic changes, often long after they occurred, but Labov developed a method for examining active changes through a quantitative comparison of speakers representing several generations. This approach produces a new perspective on the change process by revealing intermediate stages. Labov has brought insights from this research to bear on theoretical debates within historical linguistics and the field more broadly. His work in this area has also documented many active sound changes in American English. Among these changes are innovations underway in particular dialects, such as the vowel changes in Philadelphia, as well as broader regional patterns, such as the Northern Cities Shift heard in the Great Lakes states.
Throughout his career, social justice concerns have fueled Labov’s research. He has sought to demonstrate that the speech of stigmatized groups is as systematic and rule-governed as any other. He led a pioneering study in Harlem in the late 1960s that shone new light on African American English, demonstrating, for example, that grammatical usages like the deletion of the copula (e.g., He fast) are subject to regular constraints. Labov has served as an expert witness in court and before the U.S. Congress to share insights from his study of African American English. He has also worked to promote literacy for speakers of non-standard dialects, carrying out research on reading and developing material for the teaching of reading to these populations.