

Global Anomie Theory  

Anamika Twyman-Ghoshal

Global anomie theory (GAT), as articulated by Nikos Passas, explains how globalization and neoliberalism act on nations, and on the conditions within them, to create anomie that results in deviance. Drawing on Merton’s anomie theory, GAT includes an analysis of the global structural and cultural forces acting on the relations between society and individuals. The theory is integrative, incorporating anomie with other criminological approaches and with knowledge from related social sciences. GAT is designed to provide a comprehensive macro-level theory of the social context for deviance. The global anomie approach suggests that neoliberal globalization is a root cause of anomie and dysnomie, creating an environment conducive to crime and social harm. The theory posits that the growth and intensification of neoliberalization have multiplied criminogenic asymmetries, creating discrepancies between cultural goals and the legitimate means of achieving those goals. The interconnections generated by globalization are manifest in increased social mobility, enhanced international communication, and intensified international trade. This process has been magnified globally by an ethos that stresses an unfettered free market and espouses material goals, economic growth, and consumerism. In this environment of growing interconnectedness, reference groups broaden, influencing aspirations and steering them increasingly toward economic goals. Simultaneously, the process of globalization exposes inequities, stratifications, exclusions, and marginalization, which impede access to the sought-after material goals, creating both absolute and relative deprivation. Echoing Merton’s work, Passas argues that when aspirations are not realized, such blockages lead to systematic frustrations. Individuals adapt to the strain in different ways, some through deviance. 
Deviant behavior is rationalized under these structural conditions; when successful and allowed to continue with impunity, it becomes established and normative for others in society, including those who do not experience the original strain. At the same time, the theory identifies the impact of neoliberal globalization on governance. Normative standards and control mechanisms are reduced in an effort to shrink government intervention and oversight; this includes reducing social support mechanisms to make way for a privatized market. The ability of governments to act effectively is further impeded as deviant adaptations become normalized, creating an environment of dysnomie.


Normalization Principles in Computational Neuroscience  

Kenway Louie and Paul W. Glimcher

A core question in systems and computational neuroscience is how the brain represents information. Identifying principles of information coding in neural circuits is critical to understanding brain organization and function in sensory, motor, and cognitive neuroscience. This provides a conceptual bridge between the underlying biophysical mechanisms and the ultimate behavioral goals of the organism. Central to this framework is the question of computation: what are the relevant representations of input and output, and what algorithms govern the input-output transformation? Remarkably, evidence suggests that certain canonical computations exist across different circuits, brain regions, and species. Such computations are implemented by different biophysical and network mechanisms, indicating that the unifying target of conservation is the algorithmic form of information processing rather than the specific biological implementation. A prime candidate to serve as a canonical computation is divisive normalization, which scales the activity of a given neuron by the activity of a larger neuronal pool. This nonlinear transformation introduces an intrinsic contextual modulation into information coding, such that the selective response of a neuron to features of the input is scaled by other input characteristics. This contextual modulation allows the normalization model to capture a wide array of neural and behavioral phenomena not captured by simpler linear models of information processing. The generality and flexibility of the normalization model arise from the normalization pool, which allows different inputs to directly drive and suppress a given neuron, effectively separating information that drives excitation and contextual modulation. 
Originally proposed to describe responses in early visual cortex, normalization has been widely documented in different brain regions, hierarchical levels, and modalities of sensory processing; furthermore, recent work shows that normalization extends to cognitive processes such as attention, multisensory integration, and decision making. This ubiquity reinforces the canonical nature of the normalization computation and highlights the importance of an algorithmic framework in linking biological mechanism and behavior.
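The divisive computation described above can be written in a few lines. The following is a minimal sketch of the standard divisive normalization equation, in which a neuron's driving input is divided by the summed activity of a normalization pool; the parameter names (gain, semi-saturation constant, exponent) are the conventional ones and are not drawn from this abstract.

```python
import numpy as np

def divisive_normalization(drive, sigma=1.0, n=2.0, gamma=1.0):
    """Divisively normalize a vector of driving inputs.

    Implements R_i = gamma * d_i**n / (sigma**n + sum_j d_j**n),
    where gamma is a gain factor, sigma the semi-saturation
    constant, and n the exponent.
    """
    d = np.asarray(drive, dtype=float) ** n
    return gamma * d / (sigma ** n + d.sum())

# The same unit's response shrinks as the surrounding pool grows more
# active, even though its own driving input is unchanged -- the
# contextual modulation the abstract describes:
alone = divisive_normalization([4.0])[0]
in_context = divisive_normalization([4.0, 4.0, 4.0])[0]
```

Because the denominator grows with total pool activity, the same stimulus evokes a weaker normalized response in a more active context, which is how the model captures contextual suppression with a single nonlinearity.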


Historical and Philosophical Foundations of Inclusive Education  

Phil Foreman

Inclusive education is a widely accepted pedagogical and policy principle, but its genesis has been long and, at times, difficult. For example, in 1948, the Universal Declaration of Human Rights included statements about rights and freedoms that have, over the decades, been used to promote inclusive educational practices. Article 26 of the Declaration stated that parents “have a prior right to choose the kind of education that shall be given to their children.” This declaration later helped some parent groups and educators to advocate for equal access to schooling in regular settings, and for parental choice about where their child would be educated. Following the widespread influence of the human rights-based principle of normalization, the concept of inclusive education received major impetus from the Education of All Handicapped Children Act in the United States in 1975, the United Nations (UN) International Year of Disabled Persons in 1981, and the UN Convention on the Rights of Persons with Disabilities in 2006. A major focus of the UN initiatives has been the right of people with a disability to participate fully in society. This focus has obvious consequences for the way education is provided to students with a disability or other additional educational needs. For many years, up to the last quarter of the 20th century, the major focus for such students was on the provision of separate specialized services, with limited attention to the concept of full participation in society. Toward the end of the 20th century and into the 21st century, there has been increasing acceptance, through parental action, systemic policy, and government legislation, of inclusivity as a basic philosophical principle. Both the type of instruction that should be provided to students with a disability and the location of that instruction in regular or specialized settings have been topics for advocacy and research, sometimes with mixed and/or controversial conclusions.


Disability Studies  

Robert McRuer

Disability studies is an interdisciplinary mode of inquiry that flourished beginning in the late 20th century. Disability studies challenges the singularity of dominant models of disability, particularly the medical model that would reduce disability to diagnosis, loss, or lack, and that would insist on cure as the only viable approach to apprehending disability. Disability studies pluralizes ways of thinking about disability, and bodily, mental, or behavioral atypicality in general; it simultaneously questions the ways in which able-bodiedness has been made to appear natural and universal. Disability studies is an analytic that attends to how disability and ability are represented in language and in a wide range of cultural texts, and it is particularly attuned to the ways in which power relations in a culture of normalization have generally subordinated disabled people, particularly in capitalist systems that demand productive and efficient laborers. Disability studies is actively intersectional, drawing on feminist theory, critical race theory, queer theory, and other analytics to consider how gender, race, sexuality, and disability are co-constitutive, always implicated in each other. Crip theory has emerged as a particular mode of doing disability studies that draws on the pride and defiance of crip culture, art, and activism, with crip itself marking both the reclamation of a term designed to wound or demean and the fact that bodies and minds do not fit neatly within or beneath a historical able-bodied/disabled binary. “To crip,” as a critical process, entails recognizing how certain bodily and mental experiences have been made pathological, deviant, or perverse and how such experiences have subsequently been marginalized or invisibilized. 
Queer of color critique, which is arguably at the absolute center of the project of queer theory, shares a great deal with crip theory, as it consistently points outward to the relations of power that constitute and reconstitute the social. Queer of color critique focuses on processes of racialization and gendering that make certain groups perverse or pathological. Although the ways in which this queer of color project overlaps significantly with disability studies and crip theory have not always been acknowledged, vibrant modes of crip of color critique have emerged in the 21st century, making these connections explicit.


Speech Perception and Generalization Across Talkers and Accents  

Kodi Weatherholtz and T. Florian Jaeger

The seeming ease with which we usually understand each other belies the complexity of the processes that underlie speech perception. One of the biggest computational challenges is that different talkers realize the same speech categories (e.g., /p/) in physically different ways. We review the mixture of processes that enable robust speech understanding across talkers despite this lack of invariance. These processes range from automatic pre-speech adjustments of the distribution of energy over acoustic frequencies (normalization) to implicit statistical learning of talker-specific properties (adaptation, perceptual recalibration) to the generalization of these patterns across groups of talkers (e.g., gender differences).
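One classic way to make the idea of talker normalization concrete is Lobanov normalization, which z-scores each talker's vowel formant frequencies against that talker's own mean and spread. This is a hypothetical sketch of that one technique, offered only as an illustration; it is not necessarily among the specific processes the authors review, and the example formant values are invented.

```python
import numpy as np

def lobanov_normalize(formants):
    """Z-score formant measurements within a talker.

    Each column (e.g., F1, F2 in Hz) is centered on the talker's own
    mean and scaled by the talker's own standard deviation, removing
    talker-specific vocal-tract differences while preserving the
    relative positions of vowels in that talker's space.
    """
    f = np.asarray(formants, dtype=float)
    return (f - f.mean(axis=0)) / f.std(axis=0)

# Two hypothetical talkers producing the "same" three vowels, with
# talker B's formants uniformly 20% higher (a shorter vocal tract):
talker_a = [[300, 2300], [500, 1800], [700, 1200]]
talker_b = [[360, 2760], [600, 2160], [840, 1440]]

norm_a = lobanov_normalize(talker_a)
norm_b = lobanov_normalize(talker_b)
# After normalization the two vowel spaces coincide exactly.
```

Because a uniform scaling of one talker's formants changes the mean and standard deviation by the same factor, the z-scored spaces line up, which is the sense in which normalization discards physical differences between talkers while keeping the category structure.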


Speech Perception in Phonetics  

Patrice Speeter Beddor

In their conversational interactions with speakers, listeners aim to understand what a speaker is saying, that is, to arrive at the linguistic message, interwoven with social and other information, that the input speech signal conveys. Across the more than 60 years of speech perception research, a foundational issue has been to account for listeners’ ability to achieve stable linguistic percepts corresponding to the speaker’s intended message despite highly variable acoustic signals. Research has especially focused on acoustic variants attributable to the phonetic context in which a given phonological form occurs and on variants attributable to the particular speaker who produced the signal. These context- and speaker-dependent variants reveal the complex—albeit informationally rich—patterns that bombard listeners in their everyday interactions. How do listeners deal with these variable acoustic patterns? Empirical studies that address this question provide clear evidence that perception is a malleable, dynamic, and active process. Findings show that listeners perceptually factor out, or compensate for, the variation due to context yet also use that same variation in deciding what a speaker has said. Similarly, listeners adjust, or normalize, for the variation introduced by speakers who differ in their anatomical and socio-indexical characteristics, yet listeners also use that socially structured variation to facilitate their linguistic judgments. Investigations of the time course of perception show that these perceptual accommodations occur rapidly, as the acoustic signal unfolds in real time. Thus, listeners closely attend to the phonetic details made available by different contexts and different speakers. The structured, lawful nature of this variation informs perception. 
Speech perception changes over time not only in listeners’ moment-by-moment processing, but also across the life span of individuals as they acquire their native language(s), non-native languages, and new dialects and as they encounter other novel speech experiences. These listener-specific experiences contribute to individual differences in perceptual processing. However, even listeners from linguistically homogeneous backgrounds differ in their attention to the various acoustic properties that simultaneously convey linguistically and socially meaningful information. The nature and source of listener-specific perceptual strategies serve as an important window on perceptual processing and on how that processing might contribute to sound change. Theories of speech perception aim to explain how listeners interpret the input acoustic signal as linguistic forms. A theoretical account should specify the principles that underlie accurate, stable, flexible, and dynamic perception as achieved by different listeners in different contexts. Current theories differ in their conception of the nature of the information that listeners recover from the acoustic signal, with one fundamental distinction being whether the recovered information is gestural or auditory. Current approaches also differ in their conception of the nature of phonological representations in relation to speech perception, although there is increasing consensus that these representations are more detailed than the abstract, invariant representations of traditional formal phonology. Ongoing work in this area investigates how both abstract information and detailed acoustic information are stored and retrieved, and how best to integrate these types of information in a single theoretical model.


US-Vietnam Relations  

Amanda C. Demmer

It is a truism in the history of warfare that the victors impose the terms for postwar peace. The Vietnam War, however, stands as an exception to this general rule. There can be no doubt that with its capture of the former South Vietnamese capital on April 30, 1975, the Democratic Republic of Vietnam won unequivocal military victory. Thereafter, the North achieved its longtime goal of reuniting the two halves of Vietnam into a new nation, the Socialist Republic of Vietnam (SRV), governed from Hanoi. These changes, however, did not alter the reality that, despite its military defeat, the United States still wielded a preponderant amount of power in global geopolitics. This tension between the war’s military outcome and the relatively unchanged asymmetry of power between Washington and Hanoi, combined with the passion the war evoked in both countries, created a postwar situation that was far from straightforward. In fact, for years the relationship between the former adversaries stood in an uneasy state, somewhere between war and peace. Scholars call the process by which US-Vietnam relations moved from this nebulous state to more regular bilateral ties “normalization.” Normalization between the United States and Vietnam was a protracted, highly contentious process. Immediately after the fall of Saigon, the Gerald Ford administration responded in a hostile fashion by extending the economic embargo that the United States had previously imposed on North Vietnam to the entire country, refusing to grant formal diplomatic recognition to the SRV, and vetoing the SRV’s application to the United Nations. Briefly in 1977 it seemed as though Washington and Hanoi might achieve a rapid normalization of relations, but lingering wartime animosity, internal dynamics in each country, regional transformations in Southeast Asia, and the reinvigoration of the Cold War on a global scale scuttled the negotiations. 
Between the fall of 1978 and late 1991, the United States refused to have formal normalization talks with Vietnam, citing the Vietnamese occupation of Cambodia and the need to obtain a “full accounting” of missing American servicemen. In these same years, however, US-Vietnamese relations remained far from frozen. Washington and Hanoi met in a series of multilateral and bilateral forums to address the US quest to account for missing American servicemen and an ongoing refugee crisis in Southeast Asia. Although not a linear process, these discussions helped lay the personal and institutional foundations for US-Vietnamese normalization. Beginning in the late 1980s, internal, regional, and international transformations once again rapidly altered the larger geopolitical context of US-Vietnamese normalization. These changes led to the resumption of formal economic and diplomatic relations in 1994 and 1995, respectively. Despite this tangible progress, however, the normalization process continued. After 1995 the economic, political, humanitarian, and defense aspects of bilateral relations increased cautiously but significantly. By the first decade of the 21st century, US-Vietnamese negotiations in each of these areas had accelerated considerably.