
Article

Zygmunt Frajzyngier

Afroasiatic languages are the fourth largest linguistic phylum, spoken by some 350 million people in North, West, Central, and East Africa, in the Middle East, and in scattered communities in Europe, the United States, and the Caucasus. Some Afroasiatic languages, such as Arabic, Hausa, Amharic, Somali, and Oromo, are spoken by millions of people, while others are threatened with extinction. As of the early 21st century, the phylum is composed of six families: Egyptian (extinct), Semitic, Cushitic, Omotic, Berber, and Chadic. Some typological features are shared by all families, particularly in the domain of phonology, but the languages are typologically quite distinct with respect to syntax and the functions encoded in their grammatical systems. Some Afroasiatic languages, such as Egyptian, Akkadian, Phoenician, Hebrew, Arabic, and Ge’ez, have a long written tradition, but for many languages no writing system has yet been proposed or adopted. The Old Semitic writing system gave rise to the modern alphabets used in thousands of unrelated contemporary languages. Two Semitic languages, Hebrew (with some Aramaic) and Arabic, were used to write the Old Testament and the Koran, the holy books of Judaism and Islam.

Article

Susan Edwards and Christos Salis

Aphasia is an acquired language disorder subsequent to brain damage in the left hemisphere. It is characterized by diminished abilities to produce and understand both spoken and written language compared with the speaker’s presumed ability before the cerebral damage. The type and severity of the aphasia depend not only on the location and extent of the cerebral damage but also on the effect the lesion has on connecting areas of the brain. Type and severity of aphasia are diagnosed in comparison with assumed normal adult language. Language changes associated with normal aging are not classed as aphasia. The diagnosis and assessment of aphasia in children, which is unusual, takes account of age norms. The most common cause of aphasia is a cerebrovascular accident (CVA), commonly referred to as a stroke, but brain damage following traumatic head injury, such as that sustained in road accidents or from gunshot wounds, can also cause aphasia. Aphasia following such traumatic events is non-progressive, in contrast to aphasia arising from a brain tumor, some types of infection, or language disturbances in progressive conditions such as Alzheimer’s disease, where the language disturbance increases as the disease progresses. The diagnosis of primary progressive aphasia (as opposed to non-progressive aphasia, the main focus of this article) is based on inclusion and exclusion criteria proposed by M. Marsel Mesulam in 2001. The inclusion criteria are as follows: difficulty with language that interferes with activities of daily living, with aphasia as the most prominent symptom. The exclusion criteria are as follows: other non-degenerative disease or medical disorder, psychiatric diagnosis, impairment of episodic memory, visual memory, or visuo-perceptual processing, and, finally, initial behavioral disturbance. Aphasia involves one or more of the building blocks of language: phonemes, morphology, lexis, syntax, and semantics; the deficits occur in various clusters or patterns across the spectrum. The degree of impairment varies across modalities, with written language often, but not always, more affected than spoken language. In some cases, understanding of language is relatively preserved; in others, both production and understanding are affected. In addition to varied degrees of impairment in spoken and written language, any one component of language, or more than one, can be affected. At the most severe end of the spectrum, a person with aphasia may be unable to communicate by either speech or writing and may be able to understand virtually nothing, or only very limited social greetings. At the least severe end of the spectrum, the aphasic speaker may experience occasional word-finding difficulties, often involving nouns; but unlike difficulties in recalling proper nouns in normal aging, word retrieval problems in mild aphasia include other word classes. Descriptions of different clusters of language deficits have led to the notion of syndromes. Despite great variation in the condition, patterns of language deficits associated with different areas of brain damage have been influential in understanding language-brain relationships. Increasingly sophisticated language assessment and neurological investigations are contributing to a greater, yet still incomplete, understanding of language-brain relationships.

Article

Alan Reed Libert

Artificial languages—languages which have been consciously designed—have been created for more than 900 years, although the number of them has increased considerably in recent decades, and by the early 21st century the total figure probably was in the thousands. There have been several goals behind their creation; the traditional one (which applies to some of the best-known artificial languages, including Esperanto) is to make international communication easier. Some other well-known artificial languages, such as Klingon, have been designed in connection with works of fiction. Still others are simply personal projects. A traditional way of classifying artificial languages involves the extent to which they make use of material from natural languages. Those artificial languages which are created mainly by taking material from one or more natural languages are called a posteriori languages (which again include well-known languages such as Esperanto), while those which do not use natural languages as sources are a priori languages (although many a posteriori languages have a limited amount of a priori material, and some a priori languages have a small number of a posteriori components). Between these two extremes are the mixed languages, which have large amounts of both a priori and a posteriori material. Artificial languages can also be classified typologically (as natural languages are) and by how and how much they have been used. Many linguists seem to be biased against research on artificial languages, although some major linguists of the past have been interested in them.

Article

Bilingualism/multilingualism is a natural phenomenon worldwide. Unwittingly, however, monolingualism has been used as a standard to characterize and define bilingualism/multilingualism in linguistic research. Such a conception led to a “fractional,” “irregular,” and “distorted” view of bilingualism, which is rapidly becoming outmoded in the light of multipronged, fast-growing interdisciplinary research. This article presents a complex and holistic view of bilinguals and multilinguals on conceptual, theoretical, and pragmatic/applied grounds. In that process, it attempts to explain why bilinguals are not a mere composite of two monolinguals. If bilinguals were simply a composite of two monolinguals, the study of bilingualism would merit no substantive consideration in its own right; all one would have to do is study a monolingual person. Interestingly, no two bilinguals are clones of each other, let alone composites of two monolinguals. This article examines the multiple worlds of bilinguals in terms of their social life and social interaction. The intricate problem of defining and describing bilinguals is addressed; the process and end result of becoming bilingual are explored alongside bilinguals’ verbal interactions and language organization in the brain. The role of social and political bilingualism is also explored as it interacts with individual bilingualism and global bilingualism (e.g., the issue of language endangerment and language death). Other central concepts that set bilinguals apart from monolinguals, such as bilingual language attitudes, language choices, and their consequences, are also addressed. Language acquisition is as much an innate, biological phenomenon as a social one; these two complementary dimensions receive consideration in this article, along with the educational issues of school performance by bilinguals. Is bilingualism a blessing or a curse? The linguistic and cognitive consequences of individual, societal, and political bilingualism are examined.

Article

Cedric Boeckx and Pedro Tiago Martins

All humans can acquire at least one natural language. Biolinguistics is the name given to the interdisciplinary enterprise that aims to unveil the biological bases of this unique capacity.

Article

Children’s acquisition of language is an amazing feat. Children master the syntax, the sentence structure of their language, through exposure and interaction with caregivers and others but, notably, with no formal tuition. How children come to be in command of the syntax of their language has been a topic of vigorous debate since Chomsky argued against Skinner’s claim that language is ‘verbal behavior.’ Chomsky argued that knowledge of language cannot be learned through experience alone but is guided by a genetic component. This language component, known as ‘Universal Grammar,’ is composed of abstract linguistic knowledge and a computational system that is special to language. The computational mechanisms of Universal Grammar give even young children the capacity to form hierarchical syntactic representations for the sentences they hear and produce. The abstract knowledge of language guides children’s hypotheses as they interact with the language input in their environment, ensuring they progress toward the adult grammar. An alternative school of thought denies the existence of a dedicated language component, arguing that knowledge of syntax is learned entirely through interactions with speakers of the language. Such ‘usage-based’ linguistic theories assume that language learning employs the same learning mechanisms that are used by other cognitive systems. Usage-based accounts of language development view children’s earliest productions as rote-learned phrases that lack internal structure. Knowledge of linguistic structure emerges gradually and in a piecemeal fashion, with frequency playing a large role in the order of emergence for different syntactic structures.

Article

Daniel Recasens

The study of coarticulation—namely, the articulatory modification of a given speech sound arising from coproduction or overlap with neighboring sounds in the speech chain—has attracted the close attention of phonetic researchers for at least the last 60 years. Knowledge about coarticulatory patterns in speech should provide information about the planning mechanisms of consecutive consonants and vowels and the execution of coordinative articulatory structures during the production of those segmental units. Coarticulatory effects involve changes in articulatory displacement over time toward the left (anticipatory) or the right (carryover) of the trigger, and their typology and extent depend on the articulator under investigation (lip, velum, tongue, jaw, larynx) and the articulatory characteristics of the individual consonants and vowels, as well as nonsegmental factors such as speech rate, stress, and language. A challenge for studying coarticulation is that different speakers may use different coarticulatory mechanisms when producing a given phonemic sequence and they also use coarticulatory information differently for phonemic identification in perception. More knowledge about all these research issues should contribute to a deeper understanding of coarticulation deficits in speakers with speech disorders, how the ability to coarticulate develops from childhood to adulthood, and the extent to which the failure to compensate for coarticulatory effects may give rise to sound change.

Article

Pius ten Hacken

Compounding is a word formation process based on the combination of lexical elements (words or stems). In the theoretical literature, compounding is a matter of controversy, and the disagreement extends to basic issues. In the study of compounding, the questions guiding research can be grouped into four main areas, labeled here as delimitation, classification, formation, and interpretation. Depending on the perspective taken in the research, some of these may be highlighted or backgrounded. In the delimitation of compounding, one question is how important it is to be able to determine for each expression unambiguously whether it is a compound or not. Compounding borders on syntax and on affixation. In some theoretical frameworks, it is not a problem to have more typical and less typical instances, without a precise boundary between them. However, if, for instance, word formation and syntax are strictly separated and compounding belongs to word formation, it is crucial to draw this borderline precisely. Another question is which types of criteria should be used to distinguish compounding from other phenomena. Criteria based on form, on syntactic properties, and on meaning have been used. In all cases, it is also controversial whether such criteria should be applied crosslinguistically. In the classification of compounds, the question of how important the distinction between the classes is for the theory in which they are used arises in much the same way as the corresponding question for the delimitation. A common classification uses headedness as a basis. Other criteria are based on the forms of the elements that are combined (e.g., stem vs. word) or on the semantic relationship between the components. Again, whether these criteria can and should be applied crosslinguistically is controversial. The issue of the formation rules for compounds is particularly prominent in frameworks that emphasize form-based properties of compounding. Rewrite rules for compounding have been proposed, as have generalizations over the selection of the input form (stem or word) and of linking elements, and rules for stress assignment. Compounds are generally thought of as consisting of two components, although these components may consist of more than one element themselves. For some types of compounds with three or more components, for example copulative compounds, a nonbinary structure has been proposed. The question of interpretation can be approached from two opposite perspectives. In a semasiological perspective, the meaning of a compound emerges from the interpretation of a given form. In an onomasiological perspective, the meaning precedes the formation in the sense that a form is selected to name a particular concept. The central question in the interpretation of compounds is how to determine the relationship between the two components. The range of possible interpretations can be constrained by the rules of compounding, by the semantics of the components, and by the context of use. A much-debated question concerns the relative importance of these factors.

Article

Grant Goodall

The term coordination refers to the juxtaposition of two or more conjuncts often linked by a conjunction such as and or or. The conjuncts (e.g., our friend and your teacher in Our friend and your teacher sent greetings) may be words or phrases of any type. They are a defining property of coordination, while the presence or absence of a conjunction depends on the specifics of the particular language. As a general phenomenon, coordination differs from subordination in that the conjuncts are typically symmetric in many ways: they often belong to like syntactic categories, and if nominal, each carries the same case. Additionally, if there is extraction, this must typically be out of all conjuncts in parallel, a phenomenon known as Across-the-Board extraction. Extraction of a single conjunct, or out of a single conjunct, is prohibited by the Coordinate Structure Constraint. Despite this overall symmetry, coordination does sometimes behave in an asymmetric fashion. Under certain circumstances, the conjuncts may be of unlike categories or extraction may occur out of one conjunct, but not another, thus yielding apparent violations of the Coordinate Structure Constraint. In addition, case and agreement show a wide range of complex and sometimes asymmetric behavior cross-linguistically. This tension between the symmetric and asymmetric properties of coordination is one of the reasons that coordination has remained an interesting analytical puzzle for many decades. Within the general area of coordination, a number of specific sentence types have generated much interest. One is Gapping, in which two sentences are conjoined, but material (often the verb) is missing from the middle of the second conjunct, as in Mary ate beans and John _ potatoes. Another is Right Node Raising, in which shared material from the right edge of sentential conjuncts is placed in the right periphery of the entire sentence, as in The chefs prepared __ and the customers ate __ [a very elaborately constructed dessert]. Finally, some languages have a phenomenon known as comitative coordination, in which a verb has two arguments, one morphologically plural and the other comitative (e.g., with the preposition with), but the plural argument may be understood as singular. English does not have this phenomenon, but if it did, a sentence like We went to the movies with John could be understood as John and I went to the movies.

Article

Pieter Muysken

Creole languages have a curious status in linguistics, and at the same time they often have very low prestige in the societies in which they are spoken. These two facts may be related, in part because they circle around notions such as “derived from” or “simplified” instead of “original.” Rather than simply taking the notion of “creole” as a given and trying to account for its properties and origin, this essay tries to explore the ways scholars have dealt with creoles. This involves, in particular, trying to see whether we can define “creoles” as a meaningful class of languages. There is a canonical list of languages that most specialists would not hesitate to call creoles, but the boundaries of the list and the criteria for being listed are vague. It also becomes difficult to distinguish sharply between pidgins and creoles, and likewise the boundaries between some languages claimed to be creoles and their lexifiers are rather vague. Several possible criteria to distinguish creoles will be discussed. Simply defining them as languages of which we know the point of birth may be a necessary, but not sufficient, criterion. Displacement is also an important criterion, necessary but not sufficient. Mixture is often characteristic of creoles but, it is argued, not crucial. Essential in any case is substantial restructuring of some lexifier language, which may take the form of morphosyntactic simplification, but it is dangerous to assume that simplification always has the same outcome. The combination of these criteria—time of genesis, displacement, mixture, restructuring—contributes to the status of a language as creole, but “creole” is far from a unified notion. There turn out to be several types of creoles, and then a whole range of creole-like languages, which differ in the way these criteria are combined. Thus the proposal made here is to stop looking at creoles as a separate class and instead to take them as special cases of the general phenomenon that the way languages emerge and are used determines their properties to a considerable extent. This calls for a new, socially informed typology of languages, which will involve all kinds of different types of languages, including pidgins and creoles.

Article

Rochelle Lieber

Derivational morphology is a type of word formation that creates new lexemes, either by changing syntactic category or by adding substantial new meaning (or both) to a free or bound base. Derivation may be contrasted with inflection on the one hand or with compounding on the other. The distinctions between derivation and inflection and between derivation and compounding, however, are not always clear-cut. New words may be derived by a variety of formal means including affixation, reduplication, internal modification of various sorts, subtraction, and conversion. Affixation is best attested cross-linguistically, especially prefixation and suffixation. Reduplication is also widely found, while internal changes such as ablaut and root-and-pattern derivation are less common. Derived words may fit into a number of semantic categories. For nouns, event and result, personal and participant, and collective and abstract nouns are frequent. For verbs, causative and applicative categories are well attested, as are relational and qualitative derivations for adjectives. Languages frequently also have ways of deriving negatives, relational words, and evaluatives. Most languages have derivation of some sort, although there are languages that rely more heavily on compounding than on derivation to build their lexical stock. A number of topics have dominated the theoretical literature on derivation, including productivity (the extent to which new words can be created with a given affix or morphological process), the principles that determine the ordering of affixes, and the place of derivational morphology with respect to other components of the grammar. The study of derivation has also been important in a number of psycholinguistic debates concerning the perception and production of language.

Article

Terttu Nevalainen

In the Early Modern English period (1500–1700), steps were taken toward Standard English, and this was also the time when Shakespeare wrote, but these perspectives are only part of the bigger picture. This chapter looks at Early Modern English as a variable and changing language not unlike English today. Standardization is found particularly in spelling, and new vocabulary was created as a result of the spread of English into various professional and occupational specializations. New research using digital corpora, dictionaries, and databases reveals the gradual nature of these processes. Ongoing developments were no less gradual in pronunciation, with processes such as the Great Vowel Shift, or in grammar, where many changes resulted in new means of expression and greater transparency. Word order was also subject to gradual change, becoming more fixed over time.

Article

Chris Rogers and Lyle Campbell

The reduction of the world’s linguistic diversity has accelerated over the last century and correlates with a loss of knowledge, collective and individual identity, and social value. Often a language is pushed out of use before scholars and language communities have a chance to document or preserve this linguistic heritage. Many are concerned about this loss, believing it to be one of the most serious issues facing humanity today. To address the issues concomitant with an endangered language, we must know how to define “endangerment,” how different situations of endangerment can be compared, and how each language fits into the cultural practices of individuals. The discussion about endangered languages focuses on addressing the needs, causes, and consequences of this loss. Concern over endangered languages is not just an academic catch phrase. It involves real people and communities struggling with real social, political, and economic issues. To understand the causes and consequences of language endangerment for these individuals and communities requires a multifaceted perspective on the place of each language in the lives of its users. The loss of a language affects not only the world’s linguistic diversity but also an individual’s social identity and a community’s sense of itself and its history.

Article

Geoffrey K. Pullum

English is both the most studied of the world’s languages and the most widely used. It comes closer than any other language to functioning as a world communication medium and is very widely used for governmental purposes. This situation is the result of a number of historical accidents of different magnitudes. The linguistic properties of the language itself would not have motivated its choice (contra the talk of prescriptive usage writers who stress the clarity and logic that they believe English to have). Divided into multiple dialects, English has a phonological system involving remarkably complex consonant clusters and a large inventory of distinct vowel nuclei; a bad, confusing, and hard-to-learn alphabetic orthography riddled with exceptions, ambiguities, and failures of the spelling to correspond to the pronunciation; a morphology that is rather more complex than is generally appreciated, with seven or eight paradigm patterns and a couple of hundred irregular verbs; a large multilayered lexicon containing roots of several quite distinct historical sources; and a syntax that despite its very widespread SVO (Subject-Verb-Object) basic order in the clause is replete with tricky details. For example, there are crucial restrictions on government of prepositions, many verb-preposition idioms, subtle constraints on the intransitive prepositions known as “particles,” an important distinction between two (or under a better analysis, three) classes of verb that actually have different syntax, and a host of restrictions on the use of its crucial “wh-words.” It is only geopolitical and historical accidents that have given English its enormous importance and prestige in the world, not its inherent suitability for its role.

Article

John E. Joseph

Ferdinand de Saussure (1857–1913), the founding figure of modern linguistics, made his mark on the field with a book he published a month after his 21st birthday, in which he proposed a radical rethinking of the original system of vowels in Proto-Indo-European. A year later, he submitted his doctoral thesis on a morpho-syntactic topic, the genitive absolute in Sanskrit, to the University of Leipzig. He went to Paris intending to do a second, French doctorate, but instead he was given responsibility for courses on Gothic and Old High German at the École Pratique des Hautes Études, and for managing the publications of the Société de Linguistique de Paris. He abandoned more than one large publication project of his own during the decade he spent in Paris. In 1891 he returned to his native Geneva, where the University created a chair in Sanskrit and the history and comparison of languages for him. He produced some significant work on Lithuanian during this period, connected to his early book on the Indo-European vowel system, and yielding Saussure’s Law, concerning the placement of stress in Lithuanian. He undertook writing projects about the general nature of language, but again abandoned them. In 1907, 1908–1909, and 1910–1911, he gave three courses in general linguistics at the University of Geneva, in which he developed an approach to languages as systems of signs, each sign consisting of a signifier (sound pattern) and a signified (concept), both of them mental rather than physical in nature, and conjoined arbitrarily and inseparably. The socially shared language system, or langue, makes possible the production and comprehension of parole, utterances, by individual speakers and hearers. Each signifier and signified is a value generated by its difference from all the other signifiers or signifieds with which it coexists on an associative (or paradigmatic) axis, and it is affected as well by its syntagmatic axis. Shortly after Saussure’s death at 55, two of his colleagues, Bally and Sechehaye, gathered together students’ notes from the three courses, as well as manuscript notes by Saussure, and from them constructed the Cours de linguistique générale, published in 1916. Over the course of the next several decades, this book became the basis for the structuralist approach, initially within linguistics, and later adapted to other fields. Saussure left behind a large quantity of manuscript material that has gradually been published over the last few decades, and continues to be published, shedding new light on his thought.

Article

First-language acquisition of morphology refers to the process whereby native speakers gain full and automatic command of the inflectional and derivational machinery of their mother tongue. Despite language diversity, evidence shows that morphological acquisition follows a shared developmental path, evolving from semantically and structurally simplex and non-productive forms to more complex and productive ones. The emergence and consolidation of the central morphological systems in a language typically take place between the ages of two and six years, while mature command of all systems and subsystems can take up to 10 more years and is mediated by the consolidation of literacy skills. Morphological learning in both inflection and derivation is always interwoven with lexical growth, and derivational acquisition is highly dependent on the development of a large and coherent lexicon. Three critical factors underpin the acquisition of morphology. One factor is the input patterns in the ambient language, including various types of frequency. Input provides the context for children to pay attention to morphological markers as meaningful cues to caregivers’ intentions in interactive sociopragmatic settings of joint attention. A second factor is language typology, given that languages differ in the amount of word-internal information they package in words. The “typological impact” in morphology directs children to the ways pertinent conceptual and structural information is encoded in morphological structures. It is thus responsible for great differences among languages in the timing and pace of learning morphological categories such as passive verbs. Finally, development itself is a central mechanism that drives morphological acquisition from emergence to productivity in three senses: as the filtering device that enables the break into the morphological system, in providing the span of time necessary for the consolidation of morphological systems in children, and in hosting the cognitive changes that usher in mature morphological systems in both speech and writing in adolescents and adults.

Article

Gender  

Jenny Audring

Gender is a grammatical feature, in a family with person, number, and case. In the languages that have grammatical gender—according to a representative typological sample, almost half of the languages in the world—it is a property that separates nouns into classes. These classes are often meaningful and often linked to biological sex, which is why many languages are said to have a “masculine” and a “feminine” gender. A typical example is Italian, which has masculine words for male persons (il bambino “the.m little boy”) and feminine words for female persons (la bambina “the.f little girl”). However, gender systems may be based on other semantic distinctions or may reflect formal properties of the noun. In all cases, the defining property is agreement: the behavior of associated words. In Italian, the masculine gender of the noun bambino matches its meaning as well as its form—the noun ends in –o and inflects like a regular –o class noun—but the true indicator of gender is the form of the article. This can be seen in words like la mano “the.f hand,” which is feminine despite its final -o, and il soprano “the.m soprano,” which is masculine, although it usually refers to a woman. For the same reasons, we speak of grammatical gender only if the distinction is reflected in syntax; a language that has words for male and female persons or animals does not necessarily have a gender system. Across the languages of the world, gender systems vary widely. They differ in the number of classes, in the underlying assignment rules, and in how and where gender is marked. Since agreement is a definitional property, gender is generally absent in isolating languages as well as in young languages with little bound morphology, including sign languages. Therefore, gender is considered a mature phenomenon in language. Gender interacts in various ways with other grammatical features. For example, it may be limited to the singular number or the third person, and it may be crosscut by case distinctions. These and other interrelations can complicate the task of figuring out a gender system in first or second language acquisition. Yet, children master gender early, making use of a broad variety of cues. By contrast, gender is famously difficult for second-language learners. This is especially true for adults and for learners whose first language does not have a gender system. Nevertheless, tests show that even for this group, native-like competence is possible to attain.

Article

Knut Tarald Taraldsen

This article presents different types of generative grammar that can be used as models of natural languages, focusing on a small subset of all the systems that have been devised. The central idea behind generative grammar may be rendered in the words of Richard Montague: “I reject the contention that an important theoretical difference exists between formal and natural languages” (“Universal Grammar,” Theoria, 36 [1970], 373–398).

Article

Béatrice Godart-Wendling

The term “philosophy of language” is intrinsically paradoxical: it denominates the main philosophical current of the 20th century but is devoid of any univocal definition. While the emergence of this current was based on the idea that philosophical questions were only language problems that could be elucidated through a logico-linguistic analysis, the interest in this approach gave rise to philosophical theories that, although having points of convergence for some of them, developed very different philosophical conceptions. The only constant in all these theories is the recognition that this current of thought originated in the work of Gottlob Frege (b. 1848–d. 1925), thus marking what was to be called “the linguistic turn.” Despite the theoretical diversity within the philosophy of language, the history of this current can however be traced in four stages: The first one began in 1892 with Frege’s paper “Über Sinn und Bedeutung” and aimed to clarify language by using the rules of logic. The Fregean principle underpinning this program was that we must banish psychological considerations from linguistic analysis in order to avoid associating the meaning of words with mental pictures or states. The work of Frege, Bertrand Russell (1872–1970), George Moore (1873–1958), Ludwig Wittgenstein (1921), Rudolf Carnap (1891–1970), and Willard Van Orman Quine (1908–2000) is representative of this period. In this logicist point of view, the questions raised mainly concerned syntax and semantics, since the goal was to define a formalism able to represent the structure of propositions and to explain how language can describe the world by mirroring it. The problem specific to this period was therefore the function of representing the world by language, thus placing at the heart of the philosophical debate the notions of reference, meaning, and truth. The second phase of the philosophy of language was adumbrated in the 1930s with the courses given by Wittgenstein (1889–1951) in Cambridge (The Blue and Brown Books), but it did not really take off until 1950–1960 with the work of Peter Strawson (1919–2006), Wittgenstein (1953), John Austin (1911–1960), and John Searle (1932–). In spite of the very different approaches developed by these theorists, the two main ideas that characterized this period were: one, that only the examination of natural (also called “ordinary”) language can give access to an understanding of how language functions, and two, that the specificity of this language resides in its ability to perform actions. It was therefore no longer a question of analyzing language in logical terms, but rather of considering it in itself, by examining the meaning of statements as they are used in given contexts. In this perspective, the pivotal concepts explored by philosophers became those of (situated) meaning, felicity conditions, use, and context. The beginning of the 1970s initiated the third phase of this movement by orienting research toward two quite distinct directions. The first, resulting from the work on proper names, natural-kind words, and indexicals undertaken by the logician philosophers Saul Kripke (1940–), David Lewis (1941–2001), Hilary Putnam (1926–2016), and David Kaplan (1933–), brought credibility to the semantics of possible worlds. The second, conducted by Paul Grice (1913–1988) on human communicational rationality, harked back to the psychologism dismissed by Frege and conceived of the functioning of language as highly dependent on a theory of mind. 
The focus was then put on the inferences that the different protagonists in a linguistic exchange construct from the recognition of hidden intentions in the discourse of others. In this perspective, the concepts of implicitness, relevance, and cognitive efficiency became central and required involving a greater number of contextual parameters to account for them. In the wake of this research, many theorists turned to the philosophy of mind as evidenced in the late 1980s by the work on relevance by Dan Sperber (1942–) and Deirdre Wilson (1941–). The contemporary period, marked by the thinking of Robert Brandom (1950–) and Charles Travis (1943–), is illustrated by its orientation toward a radical contextualism and the return of inferentialism that draws strongly on Frege. Within these theoretical frameworks, the notions of truth and reference no longer fall within the field of semantics but rather of pragmatics. The emphasis is placed on the commitment that the speakers make when they speak, as well as on their responsibility with respect to their utterances.

Article

Silvio Moreira de Sousa, Johannes Mücke, and Philipp Krämer

As an institutionalized subfield of academic research, Creole studies (or Creolistics) emerged in the second half of the 20th century on the basis of pioneering works in the last decades of the 19th century and first half of the 20th century. Yet its research traditions—just like the Creole languages themselves—are much older and are deeply intertwined with the history of European colonialism, slavery, and Christian missionary activities all around the globe. Throughout the history of research, creolists focused on the emergence of Creole languages and their grammatical structures—often in comparison to European colonial languages. In connection with the observations in grammar and history, creolists discussed theoretical matters such as the role of language acquisition in creolization, the status of Creoles among the other languages in the world, and the social conditions in which they are or were spoken. These discussions molded the way in which the acquired knowledge was transmitted to the following generations of creolists.