Article

A Survey of Econometric Approaches to Convergence Tests of Emissions and Measures of Environmental Quality  

Junsoo Lee, James E. Payne, and Md. Towhidul Islam

The analysis of convergence behavior with respect to emissions and measures of environmental quality can be categorized into four types of tests: absolute and conditional β-convergence, σ-convergence, club convergence, and stochastic convergence. In the context of emissions, absolute β-convergence occurs when countries with high initial levels of emissions have a lower emission growth rate than countries with low initial levels of emissions. Conditional β-convergence allows for possible differences among countries through the inclusion of exogenous variables to capture country-specific effects. Given that absolute and conditional β-convergence do not account for the dynamics of the growth process, which can potentially lead to dynamic panel data bias, σ-convergence evaluates the dynamics and intradistributional aspects of emissions to determine whether the cross-section variance of emissions decreases over time. The more recent club convergence approach tests the decline in the cross-sectional variation in emissions among countries over time and whether heterogeneous time-varying idiosyncratic components converge over time after controlling for a common growth component in emissions among countries. In essence, the club convergence approach evaluates both conditional σ- and β-convergence within a panel framework. Finally, stochastic convergence examines the time series behavior of a country’s emissions relative to another country or group of countries. Using univariate or panel unit root/stationarity tests, stochastic convergence is present if relative emissions, defined as the log of emissions for a particular country relative to another country or group of countries, is trend-stationary. The majority of the empirical literature analyzes carbon dioxide emissions and varies in terms of both the convergence tests deployed and the results. While the results supportive of emissions convergence for large global country coverage are limited, empirical studies that focus on country groupings defined by income classification, geographic region, or institutional structure (i.e., EU, OECD, etc.) are more likely to provide support for emissions convergence. The vast majority of studies have relied on tests of stochastic convergence, with tests of σ-convergence and of the distributional dynamics of emissions used less often. With respect to tests of stochastic convergence, an alternative testing procedure that accounts for structural breaks and cross-correlations simultaneously is presented. Using data for OECD countries, the results based on the inclusion of both structural breaks and cross-correlations through a factor structure provide less support for stochastic convergence when compared to unit root tests with the inclusion of just structural breaks. Future studies should give greater attention to other air pollutants, including greenhouse gas emissions and their components, expand the range of geographical regions analyzed, and undertake more robust analysis of the various types of convergence tests to render a more comprehensive view of convergence behavior. The examination of convergence through the use of eco-efficiency indicators that capture both the environmental and economic effects of production may be more fruitful in contributing to the debate on mitigation strategies and allocation mechanisms.
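
To make the testing idea concrete, here is a minimal, illustrative sketch (not taken from the article): stochastic convergence is checked by testing log relative emissions for trend-stationarity with an augmented Dickey-Fuller test. The simulated series, the reference-group construction, and the use of statsmodels' adfuller are assumptions made for the example, not the authors' procedure.

```python
# Illustrative sketch only: stochastic convergence as trend-stationarity of
# log relative emissions, tested with an ADF regression including a trend.
# All data below are hypothetical placeholders.
import numpy as np
from statsmodels.tsa.stattools import adfuller

def stochastic_convergence_test(emissions_i, emissions_group):
    """ADF test (constant + trend) on log relative emissions.

    emissions_i: per-capita emissions series for one country
    emissions_group: per-capita emissions for the reference country/group
    Rejecting the unit-root null is read as evidence of stochastic convergence.
    """
    rel = np.log(np.asarray(emissions_i) / np.asarray(emissions_group))
    stat, pvalue, usedlag, nobs, crit, icbest = adfuller(rel, regression="ct")
    return {"adf_stat": stat, "p_value": pvalue, "critical_values": crit}

# Hypothetical usage with simulated data
rng = np.random.default_rng(0)
t = np.arange(50)
country = np.exp(0.02 * t + 0.05 * rng.standard_normal(50)) * 10.0
group = np.exp(0.02 * t) * 9.5
print(stochastic_convergence_test(country, group))
```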

Article

An Introduction to Bootstrap Theory in Time Series Econometrics  

Giuseppe Cavaliere, Heino Bohn Nielsen, and Anders Rahbek

While often simple to implement in practice, application of the bootstrap in econometric modeling of economic and financial time series requires establishing its asymptotic validity, which relies on verifying often nonstandard regularity conditions. In particular, bootstrap versions of classic convergence in probability and distribution, and hence of laws of large numbers and central limit theorems, are critical ingredients. Crucially, these depend on the type of bootstrap applied (e.g., wild or independently and identically distributed (i.i.d.) bootstrap) and on the underlying econometric model and data. Regularity conditions and their implications for possible improvements in terms of (empirical) size and power for bootstrap-based testing differ from standard asymptotic testing, which can be illustrated by simulations.
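
As a purely illustrative sketch (not the authors' setup), the following code contrasts the i.i.d. and wild residual bootstraps for the slope of a simple AR(1) model; the model, the recursive resampling scheme, and the simulated data are assumptions chosen only to show how the two bootstrap types differ.

```python
# Illustrative sketch only: i.i.d. vs. wild residual bootstraps for the slope
# of y_t = rho * y_{t-1} + e_t. Model and data are hypothetical placeholders.
import numpy as np

def ar1_slope(y):
    """OLS estimate of rho in y_t = rho * y_{t-1} + e_t (no intercept)."""
    y_lag, y_cur = y[:-1], y[1:]
    return float(y_lag @ y_cur / (y_lag @ y_lag))

def bootstrap_slopes(y, n_boot=999, scheme="wild", seed=0):
    """Recursive residual bootstrap of the AR(1) slope estimator."""
    rng = np.random.default_rng(seed)
    rho_hat = ar1_slope(y)
    resid = y[1:] - rho_hat * y[:-1]
    resid = resid - resid.mean()
    out = np.empty(n_boot)
    for b in range(n_boot):
        if scheme == "iid":                       # resample residuals with replacement
            e_star = rng.choice(resid, size=resid.size, replace=True)
        else:                                     # wild: Rademacher-weighted residuals
            e_star = resid * rng.choice([-1.0, 1.0], size=resid.size)
        y_star = np.empty_like(y)
        y_star[0] = y[0]
        for t in range(1, y.size):                # rebuild the series recursively
            y_star[t] = rho_hat * y_star[t - 1] + e_star[t - 1]
        out[b] = ar1_slope(y_star)
    return rho_hat, out

# Hypothetical usage with simulated data
rng = np.random.default_rng(1)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.5 * y[t - 1] + rng.standard_normal()
rho_hat, boot = bootstrap_slopes(y, scheme="wild")
print(rho_hat, np.percentile(boot, [2.5, 97.5]))
```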

Article

Policy Issues Surrounding Broadcasting  

Hilde Van den Bulck

In Europe and elsewhere broadcasting is considered by some a “thing of the past,” and broadcasting policy subsequently as hard to develop or even no longer relevant. Broadcasting has indeed seen a considerable number of changes since its inception in the 20th century, and the evolving market for audio-visual content has created policy challenges for policymakers and various stakeholders. In its early and “golden” years, broadcasting policies were inspired by social responsibility thinking about the relationship between the media and the state, resulting mostly in public service broadcasting monopolies. In the 1980s these monopolies were replaced by a liberalization of broadcasting policies and markets which led to a multichannel, commercializing television landscape. Digitization and ensuing and ongoing convergence have further changed the media landscape in recent decades, questioning old boundaries between once distinct media types and markets and opening up traditional media markets to new players. As a result, the traditional process of production and distribution, the valorization of this work in the different phases hereof (the so-called value chain), and the accompanying distribution of costs and revenues (the business model) have been and are being subjected to considerable changes. For instance, “free-to-air,” that is, traditional linear broadcasting, has stopped being the only channel of distribution as “video-on-demand” (VoD), pay television, “over-the-top content” services (OTT), and other platforms and services bring products to new and different markets, allowing for a diversification across several valorization “windows.” Broadcasting has evolved into an audiovisual industry, which poses new challenges to media policymakers, as the ex ante testing for new public services and signal integrity cases illustrate. Broadcasting thus is not so much dying as constantly transforming, posing ever new challenges to policymakers.

Article

Commodity Market Integration  

Giovanni Federico

The literature on market integration explores the development of the commodity market with data on prices, which is a useful complement to analysis of trade and the only feasible approach when data on trade are not available. Data on prices and quantities can help in understanding when markets developed, why, and the degree to which their development increased welfare and economic growth. Integration progressed slowly throughout the early modern period, with significant acceleration in the first half of the 19th century. Causes of integration include development of transportation infrastructure, changes in barriers to trade, and short-term shocks, such as wars. Literature on the effects of market integration is limited, and strategies for estimating the effects of market integration have yet to be developed.

Article

The Environmental Kuznets Curve  

David I. Stern

The environmental Kuznets curve (EKC) is a hypothesized relationship between environmental degradation and GDP per capita. In the early stages of economic growth, pollution emissions and other human impacts on the environment increase, but beyond some level of GDP per capita (which varies for different indicators), the trend reverses, so that at high income levels, economic growth leads to environmental improvement. This implies that environmental impacts or emissions per capita are an inverted U-shaped function of GDP per capita. The EKC has been the dominant approach among economists to modeling ambient pollution concentrations and aggregate emissions since Grossman and Krueger introduced it in 1991 and is even found in introductory economics textbooks. Despite this, the EKC was criticized almost from the start on statistical and policy grounds, and debate continues. While concentrations and also emissions of some local pollutants, such as sulfur dioxide, have clearly declined in developed countries in recent decades, evidence for other pollutants, such as carbon dioxide, is much weaker. Initially, many understood the EKC to imply that environmental problems might be due to a lack of sufficient economic development, rather than the reverse, as was conventionally thought. This alarmed others because a simplistic policy prescription based on this idea, while perhaps addressing some issues like deforestation or local air pollution, could exacerbate environmental problems like climate change. Additionally, many of the econometric studies that supported the EKC were found to be statistically fragile. Some more recent research integrates the EKC with alternative approaches and finds that the relation between environmental impacts and development is subtler than the simple picture painted by the EKC. This research shows that usually, growth in the scale of the economy increases environmental impacts, all else held constant. However, the impact of growth might decline as countries get richer, and richer countries are likely to make more rapid progress in reducing environmental impacts. Finally, there is often convergence among countries, so that countries that have relatively high levels of impacts reduce them more quickly or increase them more slowly, all else held constant.
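
A minimal sketch of the reduced-form relationship described above, assuming simulated data: log emissions per capita are regressed on log GDP per capita and its square, and the implied income turning point is computed. The coefficient values and the OLS setup are placeholders for illustration, not estimates from the literature.

```python
# Illustrative sketch only: the reduced-form EKC regression of log emissions
# per capita on log GDP per capita and its square, with the implied turning
# point. Data below are simulated placeholders.
import numpy as np

rng = np.random.default_rng(0)
log_gdp = rng.uniform(6.0, 11.0, size=300)                  # log GDP per capita
log_emis = (-20.0 + 5.0 * log_gdp - 0.28 * log_gdp**2
            + 0.3 * rng.standard_normal(300))               # inverted U plus noise

X = np.column_stack([np.ones_like(log_gdp), log_gdp, log_gdp**2])
b0, b1, b2 = np.linalg.lstsq(X, log_emis, rcond=None)[0]    # OLS coefficients

# An inverted U requires b1 > 0 and b2 < 0; income at the turning point is
# exp(-b1 / (2 * b2)).
turning_point_income = np.exp(-b1 / (2.0 * b2))
print(b1, b2, turning_point_income)
```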

Article

Growth Econometrics  

Jonathan R. W. Temple

Growth econometrics is the application of statistical methods to the study of economic growth and levels of national output or income per head. Researchers often seek to understand why growth rates differ across countries. The field developed rapidly in the 1980s and 1990s, but the early work often proved fragile. Cross-section analyses are limited by the relatively small number of countries in the world and problems of endogeneity, parameter heterogeneity, model uncertainty, and cross-section error dependence. The long-term prospects look better for approaches using panel data. Overall, the quality of the evidence has improved over time, due to better measurement, more data, and new methods. As longer spans of data become available, the methods of growth econometrics will shed light on fundamental questions that are hard to answer any other way.
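
For illustration only, the sketch below runs the canonical cross-country growth regression of average growth on log initial income, with a negative slope read as absolute convergence; the simulated data are placeholders, and the mapping from the slope to a convergence speed follows the standard formula rather than anything specific to the article.

```python
# Illustrative sketch only: cross-country beta-convergence regression on
# simulated placeholder data. A negative coefficient on log initial income
# is read as (absolute) beta-convergence.
import numpy as np

rng = np.random.default_rng(0)
T = 30                                                      # years in the sample
log_y0 = rng.uniform(6.0, 10.5, size=120)                   # log initial income per head
growth = 0.08 - 0.006 * log_y0 + 0.01 * rng.standard_normal(120)  # avg annual growth

X = np.column_stack([np.ones_like(log_y0), log_y0])
a_hat, b_hat = np.linalg.lstsq(X, growth, rcond=None)[0]

# Implied speed of convergence under the standard mapping
# b = -(1 - exp(-lambda * T)) / T  =>  lambda = -ln(1 + b * T) / T
lam = -np.log(1.0 + b_hat * T) / T
print(b_hat, lam)
```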

Article

Public Reason and Contemporary Political Theory  

Steven Wall

The publication of John Rawls’s Political Liberalism put public reason squarely on the agenda of contemporary political theory. Ever since, it has been a central topic in the field. Although Rawls developed a distinctive account of public reason, his account is but one among many. Indeed, some commentators have insisted that public reason is a very old notion, one that can be found in the political writings of Hobbes, Locke, Rousseau, and Kant, for example. Public reason has a distinctive subject matter. It applies to the common good of a modern political society and the political institutions that serve that common good, and it contrasts with forms of reasoning that apply to less inclusive associations and communities that exist within a modern political society, such as churches, voluntary clubs, or professional associations. Public reason also contrasts with applications of reason that are not transparent and/or acceptable to adult citizens of modern political societies. The demands of transparency and acceptability have proven to be complex and contentious, and rival articulations of these notions have generated rival accounts of public reason. Public reason informs public political justification, and proponents of public reason often hold that public, political justification of at least the fundamental political arrangements of a political society is necessary for its political order to be legitimate. The reasons for insisting on public reason and the reasons for rejecting it are diverse. Common to all defenses of public reason is the thought that it represents a fitting response to the fact of intractable disagreement in modern political societies.

Article

History of Japanese Labor and Production Management  

William M. Tsutsui

Tracking with Japan’s macroeconomic fortunes since World War II, global interest in Japanese management practices emerged in the 1950s with the start of Japan’s “miracle economy,” soared in the 1980s as Japanese industrial exports threatened manufacturers around the world, and declined after 1990 as Japan’s growth stalled. Japanese techniques, especially in labor and production management, fascinated Western scholars and practitioners in their striking divergence from U.S. and European conventions and their apparent advantages in creating harmonious, highly productive workplaces. Two reductive approaches to the origins of Japan’s distinctive management methods—one asserting they were the organic outgrowth of Japan’s unique cultural heritage, the other stressing Japan’s proficiency at emulating and adapting American models—came to dominate the academic and popular literature. As historical analysis reveals, however, such stylized interpretations distort the complex evolution of Japanese industrial management over the past century and shed little light on the current debates over the potential convergence of Japanese practices and American management norms. Key features of the Japanese model of labor management—“permanent” employment, seniority-based wages and promotions, and enterprise unions—developed between the late 1800s and the 1950s from the contentious interaction of workers, managers, and government bureaucrats. The distinctive “Japanese Employment System” that emerged reflected both employers’ priorities (for low labor turnover and the affirmation of managerial authority in the workplace) and labor’s demands (for employment security and respect as full members of the firm). Since 1990, despite the widespread perception that Japanese labor management is inefficient and inflexible by international standards, many time-honored practices have endured, as Japanese corporations have pursued adaptive, incremental change rather than precipitous convergence toward a more market-oriented American model. The distinguishing elements of Japanese production management—the “lean production” system and just-in-time manufacturing pioneered in Toyota factories, innovative quality-control practices—also evolved slowly over the first century of Japanese industrialization. Imported management paradigms (especially Frederick Taylor’s scientific management) had a profound long-term impact on Japanese shop-floor methods, but Japanese managers were creative in adapting American practices to Japan’s realities and humanizing the rigid structures of Taylorism. Japanese production management techniques were widely diffused internationally from the 1980s, but innovation has slowed in Japanese manufacturing in recent decades and Japanese firms have struggled to keep pace with the latest management advances from the United States and Europe. In sum, the histories of Japanese labor and production management cannot be reduced to simple narratives of cultural determinism, slavish imitation, or inevitable convergence. Additional research on Japanese practices in a wide range of firms, industries, sectors, regions, and historical periods is warranted to further nuance our understanding of the complex evolution, diverse forms, and contingent future of Japanese management.

Article

Message Convergence Framework Applied to Health and Risk Messaging  

Kathryn E. Anthony, Timothy L. Sellnow, Steven J. Venette, and Sean P. Fourney

Much current scholarship in the realm of information processing and decision making, particularly in the context of health risks, is derived from the logical-empiricist paradigm, involving a strong focus on cognition, routes of psychological processing of messages, and message heuristics. The message convergence framework (MCF), derived heavily from the writings of Perelman and Olbrechts-Tyteca, contributes to this body of literature by emphasizing the fact that people make decisions on health risks while being exposed to arguments from multiple sources on the same topic. The MCF offers an explanation for how people reconcile myriad messages to arrive at decisions. The MCF differs from other theories of message processing because of its distinct focus on arguments, messages, and the ways various arguments interact to create “convergence” in individuals’ minds. The MCF focuses on the ways that multiple messages converge to create meaning and influence in the minds of listeners. Convergence occurs when messages from multiple sources overlap in ways recognized by observers, creating perceptions of credibility and influencing their risk decisions. Perelman and Olbrechts-Tyteca explain that convergence occurs when “several distinct arguments lead to a single conclusion.” Individuals assess the strengths and weaknesses of the claims, and according to the scholars, the “strength” of the arguments “is almost always recognized.” Three key propositions focusing on message convergence articulate that audiences recognize message convergence, that they actively seek convergence in matters of concern, such as health risk, and that this convergence is potentially fleeting as new messages are introduced to the discussion. Conversely, Perelman and Olbrechts-Tyteca also discuss message divergence and the rationale for wanting to intentionally create divergence among interacting arguments. Divergence is particularly appropriate in the realm of health and risk messages when scholars must challenge potentially harmful beliefs or correct misinformation. Some strategies for invoking divergence include: dissociation, in which the speaker attempts to reframe the argument to create novel understandings; identification of the stock, hackneyed, and obsolete, where the speaker attempts to make existing claims appear commonplace or obsolete to the listener; refutation of fallacies, where the speaker points out the fallacious reasoning of the opponent; clash of interpretation, where the speaker publicly articulates that individuals have understood the convergence to mean different things; weakening through reaction, which involves the speaker’s attempting to incite a reactionary approach by the opponent; and finally, highlighting the consequence of invalid convergence, where the speaker describes the negative outcomes that may occur from following a false convergence based on incorrect information. For message design, environmental scanning enables scholars and practitioners to assess the messages in a particular health-risk context. This assessment can assist practitioners in emphasizing or building convergence among reputable sources and in introducing divergence in cases where misunderstanding or a lack of evidence has contributed to an unproductive perception of convergence.
Ultimately, the MCF can assist practitioners in scanning their health-risk environments for opportunities to establish or bolster convergence based on credible evidence and for introducing divergence to challenge inaccurate or misleading interpretations and evidence.

Article

New Public Management  

Per Lægreid

New Public Management (NPM) reforms have been around in many countries for over 30 years. NPM is an ambiguous, multifaceted, and expanded concept. There is not a single driving force behind it, but rather a mixture of structural and polity features, national historical-institutional contexts, external pressures, and deliberate choices from political and administrative executives. NPM is not the only show in town, and contextual features matter. There is no convergence toward one common NPM model, but significant variations exist between countries, government levels, policy areas, tasks, and over time. Its effects have been found to be ambiguous, inconclusive, and contested. Generally, there is a lack of reliable data on results and implications, and there is some way to go before one can claim evidence-based policymaking in this field. There is more knowledge regarding NPM’s effects on processes and activities than on outcomes, and reliable comparative data on variations over time and across countries are missing. NPM has enhanced managerial accountability and accountability to users and customers, but has this success been at the expense of political accountability? New trends in reforms, such as whole-of-government, have been added to NPM, thereby making public administration more complex and hybrid.

Article

God and Chance: Christian Perspectives  

David J. Bartholomew

In many quarters God and chance are still seen as mutually exclusive alternatives. It is common to hear that ascribing anything to “chance” rules out God’s action. Recent scientific developments have tended to reinforce that distinction. Quantum theory introduced an irreducible uncertainty at the atomic level by requiring that certain microscopic physical events were unpredictable in principle. This was followed by the biologists’ claim that mutations, on which evolution depends, were effectively random and hence that evolutionary development was undirected. The problem this posed to Christian apologists was put most forcibly by Jacques Monod when he asserted “Pure chance,… at the root of the stupendous edifice of evolution alone is the source of every innovation.” Several attempts have been made to include chance within a theistic account. One, advocated by the intelligent design movement, is to contend that some biological structures are too complex to have originated in the way that evolutionary theory supposes and therefore that they must be attributed to God. Another is to suppose that God acts in an undetectable way at the quantum level without destroying the random appearance of what goes on there. A third approach is to contend that chance is real and hence is a means by which God works. A key step in this argument is the recognition that chance and order are not mutually exclusive. Reality operates at a number of different levels of aggregation so that what is attributable to chance at one level emerges as near certainty at a higher level. Further arguments, based on what is known as the anthropic principle, are also used to judge whether or not chance is sufficient to account for existence. These are critically evaluated.

Article

Media Convergence Policy Issues  

Robin Mansell

Digital technologies are frequently said to have converged. This claim may be made with respect to the technologies themselves or to restructuring of the media industry over time. Innovations that are associated with digitalization (representing analogue signals by binary digits) often emerge in ways that cross the boundaries of earlier industries. When this occurs, technologies may be configured in new ways and the knowledge that supports the development of services and applications becomes complex. In the media industries, the convergence phenomenon has been very rapid, and empirical evidence suggests that the (de)convergence of technologies and industries also needs to be taken into account to understand change in this area. There is a very large literature that seeks to explain why convergence and (de)convergence phenomena occur. Some of this literature looks for economic and market-based explanations on the supply side of the industry, whereas other approaches explore the cultural, social, and political demand side factors that are important in shaping innovation in the digital media sector and the often unexpected pathways that it takes. Developments in digital media are crucially important because they are becoming a cornerstone of contemporary information societies. The benefits of digital media are often heralded in terms of improved productivity, opportunities to construct multiple identities through social media, new connections between close and distant others, and a new foundation for democracy and political mobilization. The risks associated with these technologies are equally of concern in part because the spread of digital media gives rise to major challenges. Policymakers are tasked with governing these technologies, and issues of privacy protection, surveillance, and commercial security, as well as ensuring that the skills base is appropriate to the digital media ecology, need to be addressed. The complexity of the converged landscape makes it difficult to provide straightforward answers to policy problems. Policy responses also need to be compatible with the cultural, social, political, and economic environments in different countries and regions of the world. This means that these developments must be examined from a variety of disciplinary perspectives and need to be understood in their historical context so as to take both continuities and discontinuities in the media industry landscape into account.

Article

Convergence in/of Journalism  

Ivar John Erdal

Since the mid-1990s, media organizations all over the world have experienced a series of significant changes related to technological developments, from the organizational level down to the single journalist. Ownership in the media sector has developed toward increased concentration, mergers, and cross-media ownership. At the same time, digitization of media production has facilitated changes in both the organization and the everyday practice of journalism. Converged multimedia news organizations have emerged, as companies increasingly implement some form of cross-media cooperation or synergy between previously separate journalists, newsrooms, and departments. These changes have raised a number of questions about the relationship between organizational strategies, new technology, and everyday newsroom practice. In the literature on convergence journalism, these questions have been studied from different perspectives. Adopting a meta-perspective, it is possible to sort the literature into two broad categories. The first group consists of research mainly occupied with convergence in journalism. These are typically studies of organizational changes and changes in professional practice, for example increased cooperation between print and online newsrooms, or the role of online journalism in broadcasting organizations. The second group contains research primarily concerning convergence of journalism. These are mainly studies concerned with changes in journalistic texts. Some examples of this are repurposing television news for online publication, increased use of multimedia, and genre development within online journalism. It has to be noted that the two angles are closely connected and also share an interest in the role of technological development and the relationship between changing technologies, work practices, and journalistic output.

Article

The Functional Organization of Vertebrate Retinal Circuits for Vision  

Tom Baden, Timm Schubert, Philipp Berens, and Thomas Euler

Visual processing begins in the retina—a thin, multilayered neuronal tissue lining the back of the vertebrate eye. The retina does not merely read out the constant stream of photons impinging on its dense array of photoreceptor cells. Instead, it performs a first, extensive analysis of the visual scene, while constantly adapting its sensitivity range to the input statistics, such as the brightness or contrast distribution. The functional organization of the retina abides by several key organizational principles. These include overlapping and repeating instances of both divergence and convergence, constant and dynamic range-adjustments, and (perhaps most importantly) decomposition of image information into parallel channels. This is often referred to as “parallel processing.” To support this, the retina features a large diversity of neurons organized in functionally overlapping microcircuits that typically sample the retinal surface uniformly in a regular mosaic. Ultimately, each circuit drives spike trains in the retina’s output neurons, the retinal ganglion cells. Their axons form the optic nerve to convey multiple, distinctive, and often already heavily processed views of the world to higher visual centers in the brain. From an experimental point of view, the retina is a neuroscientist’s dream. While part of the central nervous system, the retina is largely self-contained, and depending on the species, it receives little feedback from downstream stages. This means that the tissue can be disconnected from the rest of the brain and studied in a dish for many hours without losing its functional integrity, all while retaining excellent experimental control over the exclusive natural network input: the visual stimulus. Once removed from the eyecup, the retina can be flattened, so that its neurons are easily accessed optically or using visually guided electrodes. Retinal tiling means that function studied at any one place can usually be considered representative for the entire tissue. At the same time, species-dependent specializations offer the opportunity to study circuits adapted to different visual tasks: for example, in the case of our fovea, high-acuity vision. Taken together, today the retina is amongst the best understood complex neuronal tissues of the vertebrate brain.

Article

Poverty in South Asia: An Intellectual History  

Shailaja Fennell

The Oxford English Dictionary defines poverty as “destitution” with respect to lack of wealth and material possessions. It denotes a condition where an individual has inadequate resources and earnings to afford those necessities they require in order to stay alive and well. This condition can stem from extraneous shocks, such as the death of the head of the household or a poor harvest, or can result from systematic factors like power relations or institutions that have, since ancient times, kept some groups in society in precarious conditions. Descriptions of poverty are plentiful in ancient and medieval texts, which tend to characterize poverty with regard to natural, cultural, and personal features. In sharp contrast, the emergence of poverty as a public policy concern did not become evident until the latter part of the 19th century. It is also noteworthy that the means of measuring poverty that began to emerge in the 19th and early 20th centuries identified poverty as a cultural or individual trait, rather than as a consequence of legal or administrative policy making. These latter-day quantitative methods of measurement also provide the earliest evidence base for the design of public policies for poverty alleviation and advancing human development.

Article

Monitoring Migrants’ Health Risk Factors for Noncommunicable Diseases  

Stefano Campostrini

Noncommunicable diseases (NCDs) have become the leading cause of morbidity and mortality around the world. These have been targeted by most governments because they are associated with well-known risk factors and modifiable behaviors. Migrants, like any population subgroup, present peculiarities with regard to NCDs and, more relevantly, need specific information on associated risk factors to appropriately target policies and interventions. The country of origin, assimilation process, and many other migrant health aspects well studied in the literature can be related to migrants’ health risk factors. In most countries, existing sources of information are not sufficient or should be revised, and new sources of data should be found. Existing survey systems can meet organizational difficulties in changing their questionnaires; moreover, the number of changes in the adopted questionnaire should be limited for the sake of brevity to avoid excessive burden on respondents. Nevertheless, a limited number of additional variables can offer a lot of information on migrant health. Migrant status, country of origin, and time of arrival should be included in any survey concerned with migrant health. These, along with information on other social determinants of health and access to health services, can offer fundamental information to better understand migrants’ health and its evolution as they live in their host countries. Migrants are often characterized by a better health status than the native population, an advantage that typically is lost over the years. Public health and health promotion could have a relevant role in modifying, for the better, this evolution, but this action must be supported by timely and reliable information.

Article

Convergence Theory and the Salmon Effect in Migrant Health  

Yudit Namer and Oliver Razum

For decades, researchers have been puzzled by the finding that despite low socioeconomic status, fewer social mobility opportunities, and access barriers to health care, some migrant groups appear to experience lower mortality than the majority population of the respective host country (and possibly also of the country of origin). This phenomenon has been acknowledged as a paradox, and in turn, researchers have attempted to explain this paradox through theoretical interpretations, innovative research designs, and methodological speculations. Specific focus on the salmon effect/bias and the convergence theory may help characterize the past and current tendencies in migrant health research to explain the paradox of healthy migrants: the first examines whether the paradox reveals a real effect or is a reflection of methodological error, and the second suggests that even if migrants indeed have a mortality advantage, it may soon disappear due to acculturation. These discussions should encompass mental health in addition to physical health. It is impossible to forecast the future trajectories of migration patterns and equally impossible to always accurately predict the physical and mental health outcomes of migrants/refugees who cannot return to their country of origin in times of war, political conflict, and severe climate change. However, following individuals on their path to becoming acculturated to new societies will not only enrich our understanding of the relationship between migration and health but also contribute to the acculturation process by generating advocacy for inclusive health care.

Article

History of Languages for Specific Purposes  

Wolfgang Pöckl

It is often said that languages for specific purposes (also named special languages or technolects) are the product of a division of labor. Although this concept was introduced only as late as 1776 (by Adam Smith, in An Inquiry Into the Nature and Causes of the Wealth of Nations), the idea that professions or occupations of all kinds are characterized by a particular vocabulary that is not understood by all native speakers was already manifest in the writings of medieval scholars (for instance, in Dante’s De vulgari eloquentia). In the Middle Ages most Romance languages conquered a more or less wide range of domains. The question arose whether they were also appropriate to serve as a medium of scholarship. The disciplines taught at the universities (arts, theology, law, medicine) had a strong Latin tradition; their knowledge was popularized by means of translations, which enriched the vocabulary and the syntactic flexibility of the emerging languages. Thus, the translators—sometimes organized in “schools”—contributed to the elaboration of the target languages and to their emancipation from Latin. Aside from the septem artes liberales, however, a second group of (seven) disciplines without Latin roots (called artes mechanicae) became established and introduced mainly native vocabulary typical of the respective occupational fields. During the first centuries of modern times, more and more scholars felt that their mother tongue should take the place of Latin as a means of propagating scholarship and new findings. In the 17th and 18th centuries, French held the lead among the modern languages in nearly all fields of knowledge; it maintained its dominant position among the Romance languages until the second half of the 20th century. On a global level, German was a strong rival in the humanities and several scientific disciplines in the 19th century; for many decades, however, English has been the universal medium of communication in the scientific community. This process has given rise to many discussions about language planning measures to be taken in order to curtail the Anglo-American supremacy. Before the 18th century, special languages did not have a strong impact on the physiognomy of developed languages. In the sphere of academic disciplines, translations of canonical Latin texts entailed a general re-Latinization and, as a consequence, a process of convergence of the Romance languages. The technical languages of trade and artisanry were highly fragmented so that their special vocabulary was used and understood only in limited geographical areas. In the Age of Enlightenment, the growing prestige of experts, on the one hand, and philosophical considerations about the optimization of language(s), on the other hand, led to increasing harmonization efforts on national and supranational levels. Organizations were founded with the purpose of creating and standardizing terminologies for various kinds of subjects (technical products, medicine, etc.). Special languages, far from being homogeneous varieties, are differentiated vertically. Linguists usually distinguish between three levels of communication: specialists inter se (e.g., physician—physician), specialist—skilled worker (physician—nurse), and specialist—layman (physician—patient). Studying how technical terms seep into common language and what changes they undergo during this process is a great challenge for linguists.

Article

Frameworks of Critical Race Theory  

Lee E. Ross

Critical race theory (CRT) concerns the study and transformation of relationships among race, (ethnicity), racism, and power. For many scholars, CRT is a theoretical and interpretative lens that analyzes the appearance of race and racism within institutions and across literature, film, art, and other forms of social media. Unlike traditional civil rights approaches that embraced incrementalism and systematic progress, CRT questioned the very foundations of the legal order. Since the 1980s, various disciplines have relied on this theory—most notably the fields of education, history, legal studies, feminist studies, political science, psychology, sociology, and criminal justice—to address the dynamics and challenges of racism in American society. While earlier narratives may have exclusively characterized the plight of African Americans against institutional power structures, later research has advocated the importance of understanding and highlighting the narratives of all people of color. Moreover, the theoretical lenses of CRT have broadened its spectrum to include frameworks that capture the struggles and experiences of Latinx, Asian, and Native Americans as well. Taken collectively, these can be regarded as critical race studies. Each framework relies heavily on certain principles of CRT, exposing the easily obscured and often racialized power structures of American society. Included among these principles (and related tenets) are white supremacy, white privilege, interest convergence, legal indeterminacy, intersectionality, and storytelling, among others. An examination of each framework reveals its remarkable potential to inform and facilitate an understanding of racialized practices within and across American power structures and institutions, including education, employment, the legal system, housing, and health care.

Article

Romance in Contact With Basque  

Gerd Jendraschek

The convergence between Basque and Romance is now largely unidirectional, with Basque becoming more like Romance, but shared features suggest that Basque had historically a considerable influence on the emerging Romance varieties in southern France and northern Iberia. Similar phonemic distinctions and phonetic realizations are found in adjacent Basque and Romance varieties, and sometimes beyond. The phoneme inventories of Basque and Castilian Spanish are largely identical. The Romance influence on Basque is most visible in the lexicon, as over half of the words used in everyday speech are of Latin or Romance origin. While the Basque contribution to the Romance lexicon of common nouns has been much more modest, some Basque anthroponyms have become very popular beyond the Basque Country. The integration of Latin verbs into the Basque lexicon triggered and then accelerated the switch to a tense-aspect system modeled on that of Romance. Like Spanish, the Basque varieties in Spain distinguish between two ‘be’-copulas, and two ‘have’-verbs. Certain types of relative clauses and passive constructions replicate Romance models, and a Basque mediopassive can be systematically translated into a Spanish clause with the pronoun se. The default constituent order of Basque is verb-final, but dependent clauses are often found in post-predicate position, matching the order found in Romance. While sharing many features with Romance varieties across southwestern Europe, Basque is closest to Castilian and Gascon, the two languages with which it has a long history of bilingualism and localized language shift.