
Article

Vanessa L. Burrows

Stress has not always been accepted as a legitimate medical condition. The biomedical concept of stress grew from tangled roots of varied psychosomatic theories of health that examined (a) the relationship between the mind and the body, (b) the relationship between an individual and his or her environment, (c) the capacity for human adaptation, and (d) biochemical mechanisms of self-preservation, and how these functions are altered during acute shock or chronic exposure to harmful agents. From disparate 19th-century origins in the fields of neurology, psychiatry, and evolutionary biology, a biological disease model of stress was originally conceived in the mid-1930s by Canadian endocrinologist Hans Selye, who correlated adrenocortical functions with the regulation of chronic disease. At the same time, the mid-20th-century epidemiological transition signaled the emergence of a pluricausal perspective on degenerative, chronic diseases such as cancer, heart disease, and arthritis, which were produced not by a specific etiological agent but by a complex combination of risk factors whose conditioning influence contributed to a process of maladaptation over time. Mass awareness of the therapeutic impact of adrenocortical hormones in the treatment of these prevalent diseases lent greater cultural currency to the biological disease model of stress. By the end of the Second World War, military neuropsychiatric research on combat fatigue promoted cultural acceptance of a dynamic and universal concept of mental illness that normalized the phenomenon of mental stress. This cultural shift encouraged the medicalization of anxiety, which stimulated the emergence of a market for anxiolytic drugs in the 1950s and helped to link psychological and physiological health. By the 1960s, a growing psychosomatic paradigm of stress focused on behavioral interventions and encouraged the belief that individuals could control their own health through responsible decision-making. The implication that mental power can affect one’s physical health reinforced the psycho-socio-biological ambiguity that has been an enduring legacy of stress ever since. This article examines the medicalization of stress—that is, the historical process by which stress became medically defined. It spans from the mid-19th century to the mid-20th century, focusing on nine distinct phases: (1) 19th-century psychosomatic antecedent disease concepts; (2) the emergence of shell shock as a medical diagnosis during World War I; (3) Hans Selye’s theorization of the General Adaptation Syndrome in the 1930s; (4) neuropsychiatric research on combat stress during World War II; (5) contemporaneous military research on stress hormones during World War II; (6) the emergence of a risk-factor model of disease in the post-World War II era; (7) the development of a professional cadre of stress researchers in the 1940s and 1950s; (8) the medicalization of anxiety in the early post-World War II era; and (9) the popularization of stress in the 1950s, marked by the cultural assimilation of paradigmatic stress behaviors and deterrence strategies, as well as pharmaceutical treatments for stress.

Article

The role of experience in brain organization and function can be studied by systematically manipulating developmental experiences. The most common protocols use extremes in experiential manipulation, such as environmental deprivation and/or enrichment. Studies of the effects of deprivation range from laboratory studies in which animals are raised in the absence of sensory or social experiences from infancy to children raised in orphanages with limited caregiver interaction. In both cases there are chronic perceptual, cognitive, and social dysfunctions that are associated with enduring changes in neuronal structure and connectivity. Deprivation can also be more subtle, such as being raised in a low-socioeconomic-status environment, which is often associated with poverty. Such experience is especially detrimental to language development, which, in turn, limits educational opportunities. Unfortunately, the effects of some forms of socioemotional deprivation are often difficult, if not impossible, to ameliorate. In contrast, adding sensory or social experiences can enhance behavioral functions. For example, placing animals in environments that are cognitively, motorically, and/or socially more complex than standard laboratory housing is associated with neuronal changes that are correlated with superior functions. Enhanced sensory experiences can be relatively subtle, however. For example, tactile stimulation of infant rats with a soft brush for 15 minutes, three times daily for just two weeks, leads to permanent improvement in a wide range of psychological functions, including motoric, mnemonic, and other cognitive functions. Both complex environments and sensory stimulation can also reverse the negative effects of many other experiences. Thus, tactile stimulation accelerates discharge from hospital for premature human infants and stimulates recovery from stroke in both infant and adult rats. In sum, brain and behavioral functions are exquisitely influenced by manipulation of sensory experiences, especially in development.

Article

Liane Gabora

Creativity is perhaps what most differentiates humans from other species. Understanding creativity is particularly important in times of accelerated cultural and environmental change, such as the present, in which novel approaches and perspectives are needed. The study of creativity is an exciting area that brings together many different branches of research: cognitive psychology, social psychology, personality psychology, developmental psychology, organizational psychology, clinical psychology, neuroscience, mathematical models, and computer simulations. The creative process is thought to involve the capacity to shift between divergent and convergent modes of thought in response to task demands. Divergent thought is conventionally characterized as the kind of thinking needed for open-ended tasks, and it is measured by the ability to generate multiple solutions, while convergent thought is commonly characterized as the kind of thinking needed for tasks in which there is only one correct solution. More recently, divergent thought has been conceived of as reflecting on the task from unconventional contexts or perspectives, while convergent thought has been conceived of as reflecting on it from conventional contexts or perspectives. Personality traits correlated with creativity include openness to experience, tolerance of ambiguity, impulsivity, and self-confidence. Evidence that creativity is linked with affective disorders is mixed. Neuroscientific research on creativity using electroencephalography (EEG) or functional magnetic resonance imaging (fMRI) suggests that creativity is associated with a loosening of cognitive control and decreased arousal. It has been shown that the distributed, content-addressable structure of associative memory is conducive to bringing task-relevant items to mind without the need for explicit search. Tangible evidence of human creativity dates back to the earliest stone tools devised over three million years ago, with the Middle-Upper Paleolithic marking the onset of art, science, and religion, and another surge of creativity in the present. Past and current areas of controversy concern the relative contributions of expertise, chance, and intuition, whether the emphasis should be on process versus product, whether creativity is a domain-specific or a domain-general function, the extent to which creativity is correlated with affective disorders, and whether divergent thinking entails the generation of multiple ideas or the honing of a single initially ambiguous mental representation that may manifest as different external outputs. Promising areas for further psychological study of creativity include computational modeling, research on the biological basis of creativity, and studies that track specific creative ideation processes over time.

Article

Martin J. Packer and Michael Cole

There is growing appreciation of the role of culture in children’s psychological development (also called human ontogenesis). However, there are several distinct approaches to research on this matter. Cross-cultural psychology explores the causal influence of culture on differences in children’s development, treated as dependent variables. Researchers interested in the role of cultural learning in human evolution view culture as beliefs and values that are transferred from the mind of one individual to that of another. By contrast, “cultural psychology” views culture not as a cause, but a constituent of human psychological functioning. It invites us to pay attention to the fact that humans live in societies filled with material artifacts, tools, and signs that mediate human activity; that is to say, they provide the means with which people interact with the world around them and with one another. From this perspective, culture provides constituents that are essential to human development: it has a constitutive role in development. Although there continues to be much debate over how to define culture, it is generally agreed that different human social groups have distinct cultures, and it is common to assume that cultural differences lead to differences in the trajectories of children’s development. This is true, but it is also the case that culture is a universal requirement for development. Every child is born into a family and community with a language, customs, and conventions, and in which people occupy institutional roles with rights and responsibilities. These facts define universal requisites of human psychological development and include the acquisition of language, the development of a social identity, the understanding of community obligations, and the ability to contribute to the reproduction of the community. The interdependence of human communities—which probably had its origins in collaborative foraging and cooperative childrearing—seems to have placed species-specific demands on children’s development, selecting for the capacity to acquire a sensitivity not only to people’s goals and intentions but also to rights and responsibilities.

Article

Joanne R. Smith

As social animals, humans are strongly influenced by the opinions and actions of those around them. Group norms are the expectations and behaviors associated with a social group, such as a nationality, an organization, or a sports team. Group norms can emerge during group interaction as group members are exposed to the opinions, or observe the actions, of fellow group members. Group norms can also emerge by comparing the attitudes and actions of the group with those of other groups. Leaders can also influence what is seen as acceptable behavior for group members to exhibit. One of the most dominant approaches to the study of group norms is the social identity approach, which proposes that belonging to a social group provides individuals with a definition of who they are, and a description and prescription of what is involved in being a group member. A large body of research has confirmed the power of group norms to determine the form and direction of group members’ attitudes and actions, particularly among individuals strongly attached to the group, across many behavioral domains. In thinking about group norms, it is important to recognize that norms have both prescriptive (i.e., what should be done) and descriptive (i.e., what is done) elements. Research has found that group norms are most influential when these elements are aligned, but that misaligned or conflicting norms—either within the group or across multiple groups to which an individual belongs—can be particularly detrimental to engagement in a desired behavior. It is critical to appreciate and understand these complexities to be able to change group norms and, therefore, group members’ actions. The insight that group norms are powerful determinants of behavior has been incorporated into behavior change interventions, including so-called “nudge” interventions. However, norms-based campaigns are not always successful, and can even lead to backlash effects, often because change agents have failed to consider identity-related processes, such as the role of leaders, the source of the influence attempt, and threats arising from attempts to change one’s group. Shared identity is a key mechanism through which people internalize (new) understandings of what it means to be a group member into the self-concept, and understanding these processes may lead to more enduring change in underlying motives, beliefs, and behavior.

Article

Hunger  

Neil E. Rowland

Hunger is a specific and compelling sensation, sometimes arising from internal signals of nutrient depletion but more often modulated by numerous environmental variables, including taste or palatability and the ease or cost of procurement. Hunger motivates appetitive or foraging behaviors to find food, followed by appropriate proximate or consummatory behaviors to eat it. A critical concept underlying food intake is the flux of chemical energy through an organism. This starts with inputs of food with particular energy content, continues with the storage of excess energy as adipose tissue or glycogen, and ends with energy expenditure, whether as resting metabolic rate (RMR) or as metabolic rate modified by physical activity. These concepts are relevant to theoretical accounts based on energy homeostasis; historically, these have been mainly static models, although it is now clear that such models do not address practical issues such as weight gain through life. Eating is essentially an episodic behavior, often clustered as meals, and this has led to the idea that the meal is a central theoretical concept, but demonstrations that meal patterns are greatly influenced by the environment present a challenge to this tenet. Patterns of eating acquired during infancy and early life may also play a role in establishing adult norms. Direct controls of feeding are those that emphasize food itself as generating internal signals to modify or terminate an ongoing bout of eating, and they include a variety of enteroendocrine hormones and brainstem mechanisms. Additionally, many studies point to the essential rewarding or hedonic aspects of food intake, including palatability, and this may involve integrative mechanisms in the forebrain and cerebral cortex.

Article

Thomas F. Pettigrew

Intergroup attribution refers to the causal attributions that people make about the behavior of out-groups and of their own in-group. Attribution theory began in the late 1950s and 1960s, and this initial interest was limited to how individuals causally interpreted the behavior of other individuals. But in the 1970s social psychologists began to consider causal attributions made about groups. Research in this area has been largely structured by the predictions of the ultimate attribution error (more accurately described as the intergroup attribution bias). Its principal contentions flow from phenomena already uncovered by attribution research on individual behavior. It holds that group attributions, especially among the highly prejudiced, will be biased for the in-group and against out-groups. In-group protection (explaining away negative in-group behavior as situationally determined: “given the situation, we had to act that way”) is typically a stronger effect than in-group enhancement (accepting positive in-group behavior as dispositionally determined: “as a people, we are kind and compassionate toward other groups”). Many moderators and mediators of the effect have been uncovered. Asian cultures, for example, tend to be less prone to the intergroup attribution bias, while strong emotions can induce either more or less of the bias, and empathy and special training can significantly reduce it. Together with such closely related processes as the fundamental attribution error and actor-observer asymmetry, the intergroup attribution bias has proven highly useful in a great variety of applications. Moreover, it serves as an integral component of the intergroup prejudice syndrome.

Article

Mind cure, or mental healing, was a late 19th-century American healing movement that extolled a metaphysical mind-over-matter approach to the treatment of illness. Emerging in New England in the mid-19th century out of a mix of mesmerism and metaphysical philosophies, the movement achieved national recognition by the 1880s owing to its effectiveness. Three individuals are credited with creating and popularizing mental (or metaphysical) healing: Phineas Parkhurst Quimby, Warren Felt Evans, and Mary Baker Eddy. Mind cure was appealing because it helped treat ailments for which the medicines of the day were ineffective, especially problems with the “nerves.” Mental healers employed non-invasive mental and spiritual methods, called mental therapeutics, to treat ailing people. As a practice and therapeutic philosophy, mind cure is historically noteworthy because it shaped the earliest forms of psychotherapy in the United States, advanced therapeutic work within the realm of mind-body medicine, birthed the influential New Thought movement, and helped set the stage for the beginnings of religious pluralism and the positive reception of Asian meditation teachers in the West.

Article

The use of psychological concepts and data to promote ideas of an enduring racial hierarchy dates from the late 1800s and has continued to the present. The history of scientific racism in psychology is intertwined with broader debates, anxieties, and political issues in American society. With the rise of intelligence testing, joined with ideas of eugenic progress and dysgenic reproduction, psychological concepts and data came to play an important role in naturalizing racial inequality. Although racial comparisons were not the primary concern of most early mental testing, results were employed to justify beliefs regarding Black “educability” and the dangers of Southern and Eastern European immigration. Mainstream American psychology became increasingly liberal and anti-racist in the late 1930s and after World War II. However, scientific racism did not disappear; it underwent renewal during the civil rights era and again during the 1970s and 1990s. Intelligence test scores were a primary weapon in attempts to preserve segregated schools and later to justify economic inequality. In the cases of Henry Garrett, Arthur Jensen, and Philippe Rushton, this work included active, public promotion of their ideas of enduring racial differences and involvement with publications and groups under the control of racial extremists and neo-Nazis. Despite 100 years of strong critiques of scientific racism, a small but active group of psychologists helped revive vicious 19th-century claims regarding Black intelligence, brain size, morality, criminality, and sexuality, presented as detached scientific facts. These new claims were used in popular campaigns that aimed to eliminate government programs, promote racial separation, and increase immigration restriction. This troubling history raises important ethical questions for the discipline.

Article

Susan C. Baker, Bernadette M. Watson, and Cindy Gallois

Language is a social behavior and a key aspect of social interaction. Language is ubiquitous and usually occurs with other human behaviors across diverse contexts, which makes it difficult to study in isolation. This difficulty may be why most, albeit not all, social psychologists tend to neglect language, in spite of its prominence in early 20th-century social psychology and the presence of numerous handbooks and reviews of this area. Language use has implications for many social psychological processes, and, given its role in daily social life, it is important to understand its social underpinnings. The field of language and social psychology highlights the relationship between language and communication and foregrounds the differences between the social-psychological and communication approaches. One central issue is bilingualism and the relationships among language, identity, and culture. Another is methodology, where social psychologists have tended to choose experimental and survey strategies to look at language (not always to the best advantage). This century has seen the development of new technologies that allow us to look at language on a large scale and in rich detail and that have the potential to transform this research. In part as a consequence, in the early 21st century many new topics are emerging in language and social psychology that help to set a new agenda for future research.