The first Italian social psychologies showed a pluralism of perspectives that disappeared in the subsequent development of the discipline. With the presence of a collective sociological psychology (SP), a philosophical SP, and a psychological SP rooted in the sociocentric dimension, the field appeared variously articulated, with negotiation and dialogue between different disciplinary approaches in the construction of its identity. This dialogue was destined to be swept away, first during the fascist period, and then in 1954, with the affirmation of a psychological and experimental SP, sanctioned by the first National Congress of SP. However, in Italy, unlike in the United States, SP maintained strong social roots. These roots had already been evident from the end of the 19th century to the beginning of the 20th century, when three central topics for SP were emerging in Europe: crowd psychology, psychology of public opinion, and race psychology. Each of these topics played a particular role under the totalitarian regimes. In Italy, Antonio Miotto and Paolo Orano were the scholars who dealt with these three themes, developing them with different degrees of involvement with the fascist regime. Antonio Miotto remained relatively autonomous from the political lines dictated by fascism. Thus, he articulated an original positive conception of the crowd, contrasting with the fascist vision of passive masses to be maneuvered. He did not express himself in favor of or against the censorship of the media and the control of public opinion, and only after fascism took hold did he reflect on the role of political propaganda, analyzing examples from totalitarian regimes. He avoided taking strong and clear positions on the theme of race, although a few of his statements on the subject were completely in line with the regime's racist ideology.
Orano, by contrast, had a marginal interest in crowds, sharing the negative prejudice typical of the conservative crowd psychology. However, Orano had a great deal to say on the role of public opinion. His thoughts developed along the lines of fascist totalitarian policy. He was one of the protagonists of this field, and in 1938 he founded the first Italian center of study of public opinion (Demodoxalogy Center). He created the center with the aim of knowing public opinion, guiding it, and controlling it. With respect to the theme of race, Orano was also completely involved in the fascist racist ideology, devoting considerable energy and framing his original contribution according to the historiographic point of view defined as “national racism.” Yet the development of SP that occurred after World War II showed no traces of these different forms of social psychologies and their role during the fascist regime. Postwar Italian social psychology completely removed the contribution of these two psychologists. Only recently has the prewar social psychology begun to be analyzed by a critical history centered on both disciplinary and sociocultural contexts.
Stanley Milgram’s experiments on obedience to authority are among the most influential and controversial social scientific studies ever conducted. They remain staples of introductory psychology courses and textbooks, yet their influence reaches far beyond psychology, with myriad other disciplines finding lessons in them. Indeed, the experiments have long since broken free of the confines of academia, occupying a place in popular culture that is unrivaled among psychological experiments. The present article begins with an overview of Milgram’s account of his experimental procedure and findings, before focusing on recent scholarship that has used materials from Milgram’s archive to challenge many of the long-held assumptions about the experiments. Three areas in which our understanding of the obedience experiments has undergone a radical shift in recent years are the subject of particular focus. First, work that has identified new ethical problems with Milgram’s studies is summarized. Second, hitherto unknown methodological variations in Milgram’s experimental procedures are considered. Third, the interactions that took place in the experimental sessions themselves are explored. This work has contributed to a shift in how we see the obedience experiments. Rather than viewing the experiments as demonstrations of people’s propensity to follow orders, it is now clear that people did not simply follow orders in Milgram’s experiments. The experimenter did a lot more than issue orders, and when he did issue them, participants found it relatively straightforward to defy them. These arguments are discussed in relation to the definition of obedience that has typically been adopted in psychology, the need for further historical work on Milgram’s experiments, and the possibilities afforded by the development of a broader project of secondary qualitative analysis of laboratory interaction in psychology experiments.
Barbara A. Wilson
Neuropsychological rehabilitation (NR) is concerned with the amelioration of deficits caused by insult to the brain. It adopts a goal-planning approach and addresses real-life difficulties. Neuropsychology studies how the brain affects behavior, emotion, and cognition. Rehabilitation is a process whereby people who are disabled work together with professional staff, relatives, and others to achieve optimum physical, psychological, and vocational well-being. Rehabilitation is not synonymous with recovery, nor is it treatment. It is a two-way interactive process with professional staff and others who aim to remediate or alleviate difficulties, adopting a holistic approach in which cognition, emotion, and psychosocial problems are treated together, aided by an increasing use of technological aids.
NR enables people with disabilities to achieve their optimum level of well-being and to reduce problems in everyday life, and it helps them return to the most appropriate environments. There may also be some partial or limited recovery of function and certainly some substitution of function. Accepting that a return to normal functioning is highly unlikely, rehabilitation finds ways to help people learn more efficiently, compensate for their difficulties, and, when necessary, modify the environment.
While theoretical models have proved helpful, indeed essential, in identifying cognitive strengths and weaknesses, in explaining phenomena, and in making predictions about behavior, they are insufficient, on their own, to seriously influence rehabilitation aimed at making lives more adaptable to problems encountered in everyday living. NR should focus on goals relevant to a person’s individual everyday life, it should be implemented in the environment where the person lives, and have personally meaningful themes, activities, settings, and interactions.
We know from numerous studies that NR can be clinically effective. Although rehabilitation can be expensive in the short term, there is evidence that it is cost-effective in the long term.
Sebastian E. Bartos
Both academic and lay definitions of sex vary. However, definitions generally gravitate around reproduction and the experience of pleasure. Some theoretical approaches, such as psychoanalysis and evolutionary psychology, have positioned sexuality at the center of psychological phenomena. Much research has also linked sex to health and disease. On the one hand, certain sexual thoughts, feelings, behaviors, and identities have been described as pathological. Over time, some of these have been accepted as normal (especially homosexuality), while new forms of pathology have also been proposed (e.g., “porn addiction”). On the other hand, some aspects of sexuality are being researched due to their relevance to public health (e.g., sex education) or to counseling (e.g., assisted reproduction). Sex research has always been controversial, paradoxically receiving both positive attention and disdain. These contradictory social forces have arguably affected both the content and the scientific quality of sex research.
Michael J. Zickar
Personnel and vocational testing has made a huge impact in public and private organizations by helping organizations choose the best employees for a particular job (personnel testing) and helping individuals choose occupations for which they are best suited (vocational testing). The history of personnel and vocational testing is one in which scientific advances were influenced by historical and technological developments.
The first systematic efforts at personnel and vocational testing began during World War I when the US military needed techniques to sort through a large number of applicants in a short amount of time. Techniques of psychological testing had just begun to be developed at around the turn of the 20th century and those techniques were quickly applied to the US military effort. After the war, intelligence and personality tests were used by business organizations to help choose applicants most likely to succeed in their organizations. In addition, when the Great Depression occurred, vocational interest tests were used by government organizations to help the unemployed choose occupations that they might best succeed in.
The development of personnel and vocational tests was greatly influenced by the developing techniques of psychometric theory as well as general statistical theory. From the 1930s onward, significant advances in reliability and validity theory provided a framework for test developers to be able to develop tests and validate them. In addition, the civil rights movement within the United States, and particularly the Civil Rights Act of 1964, forced test developers to develop standards and procedures to justify test usage. This legislation and subsequent court cases ensured that psychologists would need to be involved deeply in personnel testing. Finally, testing in the 1990s onward was greatly influenced by technological advances. Computerization helped standardize administration and scoring of tests as well as opening up the possibility for multimedia item formats. The introduction of the internet and web-based testing also provided additional challenges and opportunities.
Vanessa L. Burrows
Stress has not always been accepted as a legitimate medical condition. The biomedical concept of stress grew from tangled roots of varied psychosomatic theories of health that examined (a) the relationship between the mind and the body, (b) the relationship between an individual and his or her environment, (c) the capacity for human adaptation, and (d) biochemical mechanisms of self-preservation, and how these functions are altered during acute shock or chronic exposure to harmful agents. From disparate 19th-century origins in the fields of neurology, psychiatry, and evolutionary biology, a biological disease model of stress was originally conceived in the mid-1930s by Canadian endocrinologist Hans Selye, who correlated adrenocortical functions with the regulation of chronic disease.
At the same time, the mid-20th-century epidemiological transition signaled the emergence of a pluricausal perspective on degenerative, chronic diseases such as cancer, heart disease, and arthritis: these were produced not by a specific etiological agent, but by a process of maladaptation that occurred over time under the conditioning influence of multiple risk factors. Mass awareness of the therapeutic impact of adrenocortical hormones in the treatment of these prevalent diseases lent greater cultural currency to the biological disease model of stress.
By the end of the Second World War, military neuropsychiatric research on combat fatigue promoted cultural acceptance of a dynamic and universal concept of mental illness that normalized the phenomenon of mental stress. This cultural shift encouraged the medicalization of anxiety which stimulated the emergence of a market for anxiolytic drugs in the 1950s and helped to link psychological and physiological health. By the 1960s, a growing psychosomatic paradigm of stress focused on behavioral interventions and encouraged the belief that individuals could control their own health through responsible decision-making. The implication that mental power can affect one’s physical health reinforced the psycho-socio-biological ambiguity that has been an enduring legacy of stress ever since.
This article examines the medicalization of stress—that is, the historical process by which stress became medically defined. It spans from the mid-19th century to the mid-20th century, focusing on these nine distinct phases:
1. 19th-century psychosomatic antecedent disease concepts
2. The emergence of shell-shock as a medical diagnosis during World War I
3. Hans Selye’s theorization of the General Adaptation Syndrome in the 1930s
4. Neuropsychiatric research on combat stress during World War II
5. Contemporaneous military research on stress hormones during World War II
6. The emergence of a risk factor model of disease in the post–World War II era
7. The development of a professional cadre of stress researchers in the 1940s and 1950s
8. The medicalization of anxiety in the early post–World War II era
9. The popularization of stress in the 1950s, marked by the cultural assimilation of paradigmatic stress behaviors and deterrence strategies, as well as pharmaceutical treatments for stress
The role of experience in brain organization and function can be studied by systematically manipulating developmental experiences. The most common protocols use extremes of experiential manipulation, such as environmental deprivation and/or enrichment. Studies of the effects of deprivation range from laboratory studies in which animals are raised in the absence of sensory or social experiences from infancy to studies of children raised in orphanages with limited caregiver interaction. In both cases there are chronic perceptual, cognitive, and social dysfunctions that are associated with chronic changes in neuronal structure and connectivity. Deprivation can be more subtle too, such as being raised in a low socioeconomic environment, which is often associated with poverty. Such experience is especially detrimental to language development, which, in turn, limits educational opportunities. Unfortunately, the effects of some forms of socioemotional deprivation are often difficult, if not impossible, to ameliorate.
In contrast, adding sensory or social experiences can enhance behavioral functions. For example, placing animals in environments that are cognitively, motorically, and/or socially more complex than standard laboratory housing is associated with neuronal changes that are correlated with superior functions. Enhanced sensory experiences can be relatively subtle, however. For example, tactile stimulation with a soft brush for 15 minutes, three times daily for just two weeks in infant rats leads to permanent improvement in a wide range of psychological functions, including motoric, mnemonic, and other cognitive functions. Both complex environments and sensory stimulation can also reverse the negative effects of many other experiences. Thus, tactile stimulation accelerates discharge from hospital for premature human infants and stimulates recovery from stroke in both infant and adult rats. In sum, brain and behavioral functions are exquisitely influenced by manipulation of sensory experiences, especially in development.
Creativity is perhaps what most differentiates humans from other species. Understanding creativity is particularly important in times of accelerated cultural and environmental change, such as the present, in which novel approaches and perspectives are needed. The study of creativity is an exciting area that brings together many different branches of research: cognitive psychology, social psychology, personality psychology, developmental psychology, organizational psychology, clinical psychology, neuroscience, mathematical models, and computer simulations. The creative process is thought to involve the capacity to shift between divergent and convergent modes of thought in response to task demands. Divergent thought is conventionally characterized as the kind of thinking needed for open-ended tasks, and it is measured by the ability to generate multiple solutions, while convergent thought is commonly characterized as the kind of thinking needed for tasks in which there is only one correct solution. More recently, divergent thought has been conceived of as reflecting on the task from unconventional contexts or perspectives, while convergent thought has been conceived of as reflecting on it from conventional contexts or perspectives. Personality traits correlated with creativity include openness to experience, tolerance of ambiguity, impulsivity, and self-confidence. Evidence that creativity is linked with affective disorders is mixed. Neuroscientific research on creativity using electroencephalography (EEG) or functional magnetic resonance imaging (fMRI) suggests that creativity is associated with a loosening of cognitive control and decreased arousal. It has been shown that the distributed, content-addressable structure of associative memory is conducive to bringing task-relevant items to mind without the need for explicit search. 
Tangible evidence of human creativity dates back to the earliest stone tools devised over three million years ago, with the Middle-Upper Paleolithic marking the onset of art, science, and religion, and another surge of creativity in the present. Past and current areas of controversy concern the relative contributions of expertise, chance, and intuition, whether the emphasis should be on process versus product, whether creativity is a domain-specific or a domain-general function, the extent to which creativity is correlated with affective disorders, and whether divergent thinking entails the generation of multiple ideas or the honing of a single initially ambiguous mental representation that may manifest as different external outputs. Promising areas for further psychological study of creativity include computational modeling, research on the biological basis of creativity, and studies that track specific creative ideation processes over time.
Martin J. Packer and Michael Cole
There is growing appreciation of the role of culture in children’s psychological development (also called human ontogenesis). However, there are several distinct approaches to research on this matter. Cross-cultural psychology explores the causal influence of culture on differences in children’s development, treated as dependent variables. Researchers interested in the role of cultural learning in human evolution view culture as beliefs and values that are transferred from the mind of one individual to that of another.
By contrast, “cultural psychology” views culture not as a cause, but a constituent of human psychological functioning. It invites us to pay attention to the fact that humans live in societies filled with material artifacts, tools, and signs that mediate human activity; that is to say, they provide the means with which people interact with the world around them and with one another. From this perspective, culture provides constituents that are essential to human development: it has a constitutive role in development.
Although there continues to be much debate over how to define culture, it is generally agreed that different human social groups have distinct cultures, and it is common to assume that cultural differences lead to differences in the trajectories of children’s development. This is true, but it is also the case that culture is a universal requirement for development. Every child is born into a family and community with a language, customs, and conventions, and in which people occupy institutional roles with rights and responsibilities. These facts define universal requisites of human psychological development and include the acquisition of language, the development of a social identity, the understanding of community obligations, and the ability to contribute to the reproduction of the community. The interdependence of human communities—which probably had its origins in collaborative foraging and cooperative childrearing—seems to have placed species-specific demands on children’s development, selecting for the capacity to acquire a sensitivity not only to people’s goals and intentions but also to rights and responsibilities.
Joanne R. Smith
As social animals, humans are strongly influenced by the opinions and actions of those around them. Group norms are the expectations and behaviors associated with a social group, such as a nationality, an organization, or a sports team. Group norms can emerge during group interaction as group members are exposed to the opinions, or observe the actions, of fellow group members. Group norms can also emerge through comparison of the attitudes and actions of the group with those of other groups. Leaders can also influence what are seen as acceptable behaviors for group members to exhibit.
One of the most dominant approaches to the study of group norms is the social identity approach. The social identity approach proposes that belonging to a social group provides individuals with a definition of who they are, and a description and prescription of what is involved in being a group member. A large body of research has confirmed the power of group norms to determine the form and direction of group members’ attitudes and actions, particularly among individuals strongly attached to the group, across many behavioral domains.
In thinking about group norms, it is important to recognize that norms have both prescriptive (i.e., what should be done) and descriptive (i.e., what is done) elements. Research has found that group norms are most influential when these elements are aligned, but that misaligned or conflicting norms—either within the group or across the multiple groups to which an individual belongs—can be particularly harmful to engagement in a desired behavior. It is critical to appreciate and understand these complexities in order to change group norms and, therefore, group members’ actions.
The insight that group norms are powerful determinants of behavior has been incorporated into behavior change interventions, including so-called “nudge” interventions. However, norms-based campaigns are not always successful, and can even lead to backlash effects, often because change agents have failed to consider identity-related processes, such as the role of leaders, the source of the influence attempt, and threats arising from attempts to change one’s group. Shared identity is a key mechanism through which people internalize (new) understandings of what it means to be a group member into the self-concept, and understanding these processes may lead to more enduring change in underlying motives, beliefs, and behavior.