Article

With roots that range from medicine to politics, to jurisdiction and historiography in ancient Greece, the concept of “crisis” played an eminent role in the founding years of Western academic psychology and continued to be relevant during its development in the 19th and 20th centuries. “Crisis” conveys the idea of an imminent danger of disintegration and breakdown, as well as a pivotal turning point with the chance of a new beginning. To this day, both levels of meaning are present in psychological discourses. Early diagnoses of a state of “crisis” of psychology date back to the end of the 19th century and focused on the question of the correct metaphysical foundation of psychology. During the interwar period, warnings of a disintegration of the discipline reached their first climax in German academia, when many eminent psychologists expressed their worries about the increasing fragmentation of the discipline. The rise of totalitarian systems in the 1930s brought an end to these debates, silencing the theoretical polyphony with physical violence. The 1960s saw a resurgence of “crisis literature” and the emergence of a more positive connotation of the concept in U.S.-American experimental psychology, when it was connected with Thomas Kuhn’s ideas of scientific “revolutions” and “paradigm shifts.” Since that time, psychological crisis literature has revolved around the question of unity, disunity, and the scientific status of the discipline. Although psychological crisis literature showed little success in solving the fundamental problems it addressed, it still provides one of the most theoretically rich and thought-provoking bodies of knowledge for theoretical and historical analyses of the discipline.

Article

Influential theorists of pre-adult phases of the development of the individual person (infancy, childhood, and adolescence) have articulated myriad versions of stage theories, varying in specificity, rigidity, and many other parameters. Some stage theories are concerned with capacities defined somewhat narrowly and operationally defined by behavior. Elsewhere on the spectrum, some of the most influential stage theories have purported to indicate capacities or modes of considerable generality, by positing deep, structural changes either in intellectual capacity or in terms of some other aspect of human functioning treated as fundamental to the affective and the rational life. Jean Piaget’s stage theory of intellectual (cognitive) development is the paradigm of a theory of structural changes in the capacity for logical thought. Bluntly put, Piaget’s theory takes for granted the key characteristics of the thinking of the emotionally balanced, rational adult and attempts to define the necessary steps by which that state is to be attained from the time one starts life as a baby. Sigmund Freud’s theory of psychosexual stages, especially as articulated by Karl Abraham, is the paradigm of a stage theory in which significant aspects of adult functioning are redefined, rather than taken for granted. The steps intervening from babyhood, as thus articulated, thereby take on an innovative character. In both cases the substantial internal consistency of the stage model, notwithstanding numerous empirical shortcomings, has generated a kind of validity. But even such qualified praise cannot now be offered to G. Stanley Hall’s stage theory of individual development, which seems with hindsight little more than a derivative popularization of the recapitulationary evolutionism of the latter part of the 19th century.
From an historical perspective, Hall’s, Freud’s, and Piaget’s stage theories of development are all artefacts, products of the sociocultural and scientific environments of their times.

Article

Personnel and vocational testing has made a huge impact in public and private organizations by helping organizations choose the best employees for a particular job (personnel testing) and helping individuals choose occupations for which they are best suited (vocational testing). The history of personnel and vocational testing is one in which scientific advances were influenced by historical and technological developments. The first systematic efforts at personnel and vocational testing began during World War I when the US military needed techniques to sort through a large number of applicants in a short amount of time. Techniques of psychological testing had just begun to be developed at around the turn of the 20th century and those techniques were quickly applied to the US military effort. After the war, intelligence and personality tests were used by business organizations to help choose applicants most likely to succeed in their organizations. In addition, when the Great Depression occurred, vocational interest tests were used by government organizations to help the unemployed choose occupations that they might best succeed in. The development of personnel and vocational tests was greatly influenced by the developing techniques of psychometric theory as well as general statistical theory. From the 1930s onward, significant advances in reliability and validity theory provided a framework for test developers to be able to develop tests and validate them. In addition, the civil rights movement within the United States, and particularly the Civil Rights Act of 1964, forced test developers to develop standards and procedures to justify test usage. This legislation and subsequent court cases ensured that psychologists would need to be involved deeply in personnel testing. Finally, testing in the 1990s onward was greatly influenced by technological advances. 
Computerization helped standardize the administration and scoring of tests, as well as open up the possibility of multimedia item formats. The introduction of the internet and web-based testing also provided additional challenges and opportunities.

Article

Two different but related developments played an important role in the history of psychologists in the fields of mental health care in Germany during the 20th century. The first development took place in the field of applied psychology, which saw psychological professionals perform mental testing and engage in counseling and, increasingly, psychotherapy in practical contexts. This process slowly began in the first decades of the 20th century and included approaches from different schools of psychotherapy. The second relevant development was the emergence of clinical psychology as an academic sub-discipline of psychology. Having become institutionalized in psychology departments at German universities during the 1960s and 1970s, clinical psychology often defines itself as a natural science and almost exclusively focuses on cognitive-behavioral approaches. There are four phases of the growing relationship between psychology and psychotherapy in Germany in which the two developments were increasingly linked: first, the entry of psychology into psychiatric and psychotherapeutic fields from approximately 1900 until 1945; second, the rise of psychological psychotherapy and the emergence of clinical psychology after World War II until 1972, when the diploma regulations in West Germany were revised; third, a phase of consolidation and diversification from 1973 until the pivotal psychotherapy law of 1999; and fourth, the shifting equilibrium as an established profession and discipline up to the reform of the psychotherapy law in 2019. Overall, the emergence of psychological psychotherapy followed not one single trajectory but rather had multiple origins in the different and competing academic and professional fields of mental health care.

Article

The history of concepts about the adult and that of research into adult constructs show progression from a simple characterization of growth to a variety of complex constructs that define the terrain. Originally, the term adult encompassed all species and events that had attained full physical maturation, a product connotation. Later, time and events (e.g., marriage, the birth of children) became proxies for adult development. The absence of considerations of adult development was compounded by the fact that, for much of the past, adults could not be seen in long-term individual evolution since lifetimes were not extensive. In the 73 years of Psychological Abstracts, the adult, under various headings (e.g., adulthood, middle age), was referenced in a mere .01% of citations. The first mention of “adult” in a journal title was in 1994. Into the 21st century, although the exploration of various adult constructs abounds, the use of single terms (e.g., intelligence, wisdom) to describe multidimensional attributes leads to misunderstanding and reductionism. There is scant cross-construct analysis and, along with its parent discipline of psychology, analysis of adult development remains at the nascent descriptive level. Of the two major constructs, adult personality and intelligence, personality has had the lion’s share of publications. An examination of trends in its analysis reveals that the constructs are defined in various ways, little in the way of socio-contextual appraisal has occurred, and, with respect to the appraisal of intelligence, motivation to perform is ill-examined.

Article

Mike J.F. Robinson, Alicia S. Zumbusch, and Patrick Anselme

Many theoretical constructs have been formulated over the years to explain the phenomenon of addiction. While the incentive sensitization theory of addiction acknowledges the important contributions of many former theories, it postulates that addiction is a state of aberrant motivation. Through repeated drug use, individuals with addiction become hypersensitive to the effects of the drugs themselves and to the stimuli associated with these drugs, including a variety of drug paraphernalia. For all individuals consuming drugs, drug-related stimuli have an inherent predictive value that signals an impending dose of the drug. For people with addiction, these drug cues move beyond being merely predictors for the drug and are imbued with excessive motivational value (called incentive salience); they become powerful motivational magnets capable of instigating and enhancing cravings for the drug. This incentive sensitization occurs through a process of neuroadaptations in the mesocorticolimbic dopamine system that have been shown to be long-lasting. These brain changes yield increasingly intense, highly focused cravings for an addictive target and transform cues related to the target into incentive stimuli that promote compulsive reward-seeking and relapse. The incentive sensitization theory does not deny a role for pleasure, habits, and withdrawal in addiction, but posits that those individuals with addiction (a) continue to take drugs compulsively even while experiencing diminished pleasure, (b) demonstrate creative new ways to procure drugs when necessary, and (c) often relapse well beyond when withdrawal has subsided. The critical factor in the development and maintenance of addiction is the persistent neuroadaptation that sensitizes the attribution of incentive salience to drugs and their cues, which explains why recovering from addiction is a long and slow process. 
The incentive sensitization theory can account for drug-induced attentional bias as well as how addiction can develop toward nondrug reward sources such as food, sex, and gambling environments.

Article

The Macy Conferences on Cybernetics were a series of 10 interdisciplinary scientific meetings that took place in New York between 1946 and 1953. The meetings were sponsored by the Macy Foundation, which aimed to promote interdisciplinary approaches to the social, behavioral, and medical sciences. Co-organized by neuropsychiatrist Warren S. McCulloch and Frank Fremont-Smith, medical director of the Macy Foundation, the meetings brought together a variety of scientists from mathematics, psychology, engineering, anthropology, physics, ecology, psychiatry, neurophysiology, linguistics, and sociology. The conferences strove to apply tools from the physical sciences and mathematics to problems in the biological and human sciences. Such tools stemmed first from Norbert Wiener’s work on the anti-aircraft predictor, in which he employed the concept of negative feedback to explain purposeful behavior, and second from McCulloch’s work with Walter Pitts on the logic of neural activity, which purported to embody logical reasoning in the physiology of the brain. Wiener and McCulloch touted the practice of hypothetical modelling as a bridge over the divide between the natural and the artificial, and a method for explaining purposeful behavior in organisms and machines. Discussions at the Macy Conferences expanded on this work, and participants discussed and debated models of cognitive functions such as sensation, communication, memory, and learning, all cast as functions of the mind and exemplars of purposeful behavior. Thus, the meetings signal a major shift in 20th-century psychology, when discussions of the mind took on a more central place in psychological discourse. Behaviorist psychologists in the early 20th century had largely rejected concepts of mind as unscientific and not objective. 
The Macy Conferences, in contrast, placed the mind at the nexus of interdisciplinary inquiry across the divide between the physical and human sciences, and helped to bring back the mind as a topic of objective, scientific inquiry in psychology and in the emerging cognitive sciences.

Article

Vanessa L. Burrows

Stress has not always been accepted as a legitimate medical condition. The biomedical concept of stress grew from tangled roots of varied psychosomatic theories of health that examined (a) the relationship between the mind and the body, (b) the relationship between an individual and his or her environment, (c) the capacity for human adaptation, and (d) biochemical mechanisms of self-preservation, and how these functions are altered during acute shock or chronic exposure to harmful agents. From disparate 19th-century origins in the fields of neurology, psychiatry, and evolutionary biology, a biological disease model of stress was originally conceived in the mid-1930s by Canadian endocrinologist Hans Selye, who correlated adrenocortical functions with the regulation of chronic disease. At the same time, the mid-20th-century epidemiological transition signaled the emergence of a pluricausal perspective of degenerative, chronic diseases such as cancer, heart disease, and arthritis that were produced not by a specific etiological agent, but by a complex combination of multiple risk factors whose conditioning influence contributed to a process of maladaptation over time. The mass awareness of the therapeutic impact of adrenocortical hormones in the treatment of these prevalent diseases offered greater cultural currency to the biological disease model of stress. By the end of the Second World War, military neuropsychiatric research on combat fatigue promoted cultural acceptance of a dynamic and universal concept of mental illness that normalized the phenomenon of mental stress. This cultural shift encouraged the medicalization of anxiety, which stimulated the emergence of a market for anxiolytic drugs in the 1950s and helped to link psychological and physiological health.
By the 1960s, a growing psychosomatic paradigm of stress focused on behavioral interventions and encouraged the belief that individuals could control their own health through responsible decision-making. The implication that mental power can affect one’s physical health reinforced the psycho-socio-biological ambiguity that has been an enduring legacy of stress ever since. This article examines the medicalization of stress—that is, the historical process by which stress became medically defined. It spans the mid-19th century to the mid-20th century, focusing on nine distinct phases: (1) 19th-century psychosomatic antecedent disease concepts; (2) the emergence of shell shock as a medical diagnosis during World War I; (3) Hans Selye’s theorization of the General Adaptation Syndrome in the 1930s; (4) neuropsychiatric research on combat stress during World War II; (5) contemporaneous military research on stress hormones during World War II; (6) the emergence of a risk-factor model of disease in the post-World War II era; (7) the development of a professional cadre of stress researchers in the 1940s and 1950s; (8) the medicalization of anxiety in the early post-World War II era; and (9) the popularization of stress in the 1950s, marked by the cultural assimilation of paradigmatic stress behaviors and deterrence strategies, as well as pharmaceutical treatments for stress.

Article

Nikos Ntoumanis, Cecile Thørgersen-Ntoumani, Eleanor Quested, and Nikos Chatzisarantis

Compelling evidence worldwide suggests that the number of physically inactive individuals is high, and it is increasing. Given that lack of physical activity has been linked to a number of physical and mental health problems, identifying sustainable, cost-effective, and scalable initiatives to increase physical activity has become a priority for researchers, health practitioners, and policymakers. One way to identify such initiatives is to use knowledge derived from psychological theories of motivation and behavior change. There is a plethora of such theories and models that describe a variety of cognitive, affective, and behavioral mechanisms that can target behavior at a conscious or an unconscious level. Such theories have been applied, with varying degrees of success, to inform exercise and physical activity interventions in different life settings (e.g., schools, hospitals, and workplaces) using both traditional (e.g., face-to-face counseling and printed material) and digital technology platforms (e.g., smartphone applications and customized websites). This work has offered important insights into how to create optimal motivational conditions, both within individuals and in the social environments in which they operate, to facilitate long-term engagement in exercise and physical activity. However, we need to identify overlap and synergies across different theoretical frameworks in an effort to develop more comprehensive, and at the same time more distinct, theoretical accounts of behavior change with reference to physical activity promotion. It is also important that researchers and practitioners utilize such theories in interdisciplinary research endeavors that take into account the enabling or restrictive role of cultural norms, the built environment, and national policies on physical activity.

Article

Cognitive neuroimaging studies often report that older adults display more activation of neural networks relative to younger adults, referred to as overactivation. Greater or more widespread activity frequently involves bilateral recruitment of both cerebral hemispheres, especially the frontal cortex. In many reports, overactivation has been associated with superior cognitive performance, suggesting that this activity may reflect compensatory processes that offset age-related decline and maintain behavior. Several theories have been proposed to account for age differences in brain activation, including the Hemispheric Asymmetry Reduction in Older Adults (HAROLD) model, the Posterior-Anterior Shift in Aging (PASA) theory, the Compensation-Related Utilization of Neural Circuits Hypothesis (CRUNCH), and the Scaffolding Theory of Aging and Cognition (STAC and STAC-r). Each model has a different explanatory scope with regard to compensatory processes, and each has been highly influential in the field. HAROLD contrasts the general pattern of bilateral prefrontal activation in older adults with that of more unilateral activation in younger adults. PASA describes both anterior (e.g., frontal) overactivation and posterior (e.g., occipital) underactivation in older adults relative to younger adults. CRUNCH emphasizes that the level or extent of brain activity can change in response to the level of task demand at any age. Finally, STAC and STAC-r take the broadest perspective to incorporate individual differences in brain structure, the capacity to implement functional scaffolding, and life-course neural enrichment and depletion factors to predict cognition and cognitive change across the lifespan. 
Extant empirical work has documented that compensatory overactivation can be observed in regions beyond the prefrontal cortex, that variations in task difficulty influence the degree of brain activation, and that younger adults can show compensatory overactivation under high mental demands. Additional research utilizing experimental designs (e.g., transcranial magnetic stimulation), longitudinal assessments, greater regional precision, both verbal and nonverbal material, and measures of individual difference factors will continue to refine our understanding of age-related activation differences and adjudicate among these various accounts of neurocognitive aging.

Article

Thomas F. Pettigrew

Prejudice, especially intergroup prejudice, has long been a central topic of social psychology. The discipline has sought to be both socially relevant and useful. Thus, theory and research on prejudice fit directly into these central concerns of the discipline. The study of this topic has developed in direct correspondence with how social psychology itself has been able to devise new theoretical and empirical tools—from self-administered questionnaires and probability sample surveys to laboratory experiments and computer-assisted methods. Given the discipline’s intense research interest in intergroup prejudice, it is not surprising that there is a plethora of theories concerning prejudice. But these many theories tend not to conflict with one another. Rather, they typically coalesce around interrelated themes across three levels of analysis. The micro level of the attitudes of individuals was the primary focus for the first half-century of modern social psychology (1920–1970). Slowly, the field turned its attention to the meso level of intergroup interaction and how such contact influenced intergroup prejudice and discrimination. Finally, the discipline began to consider more systematically the many relevant structural and cultural factors at the macro level of analysis and how they shaped both intergroup prejudice and discrimination. With time, direct links between the three principal levels of analysis have been uncovered. With this order of attention, social psychology boasts many more theories and studies of prejudice at the micro level of individuals than at other levels. But the field has learned that all three levels of analysis are critical for a fully rounded, more complete understanding of the topic.

Article

Laurence R. Harris

Self-orientation perception refers to our perceived self-orientation relative to gravity. An internal representation of self-orientation is derived from sensory cues indicating either directly or indirectly the direction of gravity relative to the body. The internal representation can be measured in the laboratory or clinic using visual or haptic measures, or even the body itself. However, each measure is affected differently by the availability of cues and the observer’s actual orientation relative to gravity, suggesting multiple, simultaneous representations of gravity. Visual, vestibular, somatosensory, and proprioceptive cues are combined by multisensory integration to provide the most reliable estimates. Multisensory integration provides a robust perception of self-orientation for adults but means that children have much lower precision in judging the vertical before multisensory integration mechanisms are mature. The neurophysiological basis of the perception of self-orientation is a network of brain areas reflecting the multisensory processes that underlie it. This network provides some redundancy that can be exploited for potential patient recovery. Future work will refine models for predicting perceived self-orientation in ever more challenging situations, improve the performance of pilots, divers, and astronauts as they explore new situations and new gravity fields, and improve how we, and especially older people, can continue to enjoy our lifelong dance with gravity.

Article

Halie Olson and Anila D'Mello

Humans are fundamentally social animals, and a large portion of the human brain is dedicated to social cognition—the set of mental functions and processes that scaffold our ability to observe, understand, and interact with others. While early philosophers and scientists relied on observation or isolated cases of brain damage to gain insight into social cognition, the advent of new technologies, including noninvasive neuroimaging, has opened a new window into the brain regions that support social cognition in humans, referred to as the social brain. These technologies have elucidated with new precision that individual brain regions are specialized for a variety of social functions including comprehending language, processing faces and emotions, anticipating what a social partner might do next, and even thinking about others’ thoughts. While the building blocks for the social brain are present from birth, individual regions continue to develop into adulthood and are shaped by experience.

Article

The sociotechnical approach, developed by psychologists at the Tavistock Institute of Human Relations in the 1950s, proposes that the design of work should seek to optimize both the social and the technical systems within organizations, offering a counter to ideas of technological determinism. It further suggests that organizations should be viewed as open systems, subject to sometimes unpredictable external and internal influences leading to a need for adaptability. The work group is viewed as the most relevant unit of analysis resulting in advocacy of autonomous work groups offering group members high levels of control over their work. Workers should participate in the design of their work and receive training and support to enable their involvement. This influential concept stimulated a large body of research in many countries. Despite some notable positive examples, outcomes were often mixed, reflecting the challenges of managing and sustaining significant change. The concept of joint optimization has also proved problematic, with psychologists tending to focus on the social system, while engineers give greater emphasis to the technical system. The advent of digital technologies is providing a new impetus to the need to design work to optimize both the social and technical systems, provoking renewed interest in the approach.

Article

Thirst  

Neil E. Rowland

Thirst is a specific and compelling sensation, often arising from internal signals of dehydration but modulated by many environmental variables. There are several historical landmarks in the study of thirst and drinking behavior. The basic physiology of body fluid balance is important, in particular the mechanisms that conserve fluid loss. The transduction of fluid deficits can be discussed in relation to osmotic pressure (osmoreceptors) and volume (baroreceptors). Other relevant issues include the neurobiological mechanisms by which these signals are transformed to intracellular and extracellular dehydration thirsts, respectively, including the prominent role of structures along the lamina terminalis. Other considerations are the integration of signals from natural dehydration conditions, including water deprivation, thermoregulatory fluid loss, and thirst associated with eating dry food. These mechanisms should also be considered within a broader theoretical framework of organization of motivated behavior based on incentive salience.

Article

Christine Purdon

The idea that suppressing an unwanted thought results in an ironic increase in its frequency is accepted as psychological fact. Wegner’s ironic processes model has been applied to understanding the development and persistence of mood, anxiety, and other difficulties. However, results are highly inconsistent and heavily influenced by experimental artifact. A substantial number of methodological considerations and issues may underlie the inconsistent findings in the literature. These include the internal and external validity of the paradigms used to study thought suppression, conceptual issues such as what constitutes a thought, and consideration of participants’ history with, and motivation to suppress, the target thought. Paradigms that study the products of failed suppression, such as facilitated recall and attentional deployment to thought-relevant stimuli, may have greater validity. It is argued that a shift from conceptualizing persistent unwanted thoughts as products of failed suppression to conceptualizing them as internal threat stimuli may have merit.

Article

Training refers to the systematic processes, initiated by the organization, that facilitate relatively permanent changes in the knowledge, skills, or affect/attitudes of organizational members. Cumulative meta-analytic evidence indicates that training is effective, producing, on average, moderate effect sizes. Training is most effective when designed so that trainees are active and encouraged to self-regulate during training, and when it is well-structured and requires effort on the part of trainees. Additional characteristics of effective training are: the purpose, objectives, and intended outcomes of training are clearly communicated to trainees; the training content is meaningful, and training assignments, examples, and exercises are relevant to the job; trainees are provided with instructional aids that can help them organize, learn, and recall training content; opportunities for practice in a safe environment are provided; feedback is provided by trainers, observers, peers, or the task itself; and training enables learners to observe and interact with others. In addition, effective training requires a prior needs assessment to ensure the relevance of training content and provides conditions to optimize trainees’ motivation to learn. After training, care should be taken to provide opportunities for trainees to implement trained skills, and organizational and social support should be in place to optimize transfer. Finally, it is important that all training be evaluated to ensure learning outcomes are met and that training results in increased job performance and/or organizational effectiveness.

Article

Well-being is a core concept for individuals, groups, and societies alike. Greater understanding of trajectories of well-being in later life may contribute to the achievement and maintenance of well-being for as many as possible. This article reviews two main approaches to well-being, the hedonic and the eudaimonic, and shows that it is not chronological age per se, but various factors related to age that underlie trajectories of well-being at older ages. Next to the role of genes, heritability, and personality traits, well-being is determined to a substantial extent by external circumstances and resources (e.g., health and social relationships) and by malleable individual behaviors and beliefs (e.g., self-regulatory ability and control beliefs). Although many determinants have been identified, it remains difficult to decide which of them are most important. Moreover, the role of some determinants varies for different indicators of well-being, such as positive affect and life satisfaction. Several prominent goal- and need-based models of well-being in later life are discussed, which explicate mechanisms underlying trajectories of well-being at older ages. These are the model of Selection, Optimization, and Compensation, the Motivational Theory of Lifespan Development, Socio-emotional Selectivity Theory, Ryff’s model of Psychological Well-Being, Self-Determination Theory, and Self-Management of Well-being theory. Also, interventions based on these models are reviewed, although not all of them address older adults. It is concluded that the literature on well-being in later life is enormous, and, together with various conceptual models, offers many important insights. Still, the field would benefit from more theoretical integration, and from more attention to the development and testing of theory-based interventions.
This remains a challenge for the science of well-being in later life, and could be an important contribution to the well-being of a still growing proportion of the population.

Article

The history of psychology is characterized by the unparalleled complexity of its methodology and a uniquely ambiguous subject matter closely entangled with issues of power, social justice, and ethics. This complexity requires inordinate levels of reflexivity and conceptual sophistication. In effect, a historian of psychology needs to explicate no less than their worldview—a broad position as to how people are situated in the world, relate to, change, and get to know it, and how knowledge develops through time—all coupled with their broad sociopolitical ethos. Traditional histories of psychology have operated with an astonishing lack of reflection about these issues. One of many deplorable results is that psychology still grapples with its racist and sexist legacies and lacks awareness of social injustices in existence today. Recently emerging approaches have begun to remedy this situation by focusing on situated practices of knowledge production. This article addresses how human agency can be integrated into these approaches, while focusing on knowledge production as not only situated in context but also, and critically, as a world-forming and history-making process. In tackling the shortcomings of relational approaches, including social constructionism, the transformative activist stance approach draws on Marxist philosophy and epistemology, infused with insights from Vygotsky’s psychology and other critical theories of resistance. The core point is that knowledge is achieved in and through collaborative community practices realized by individually unique contributions as these come to embody and enact, in an inseparable blend, both cultural-historical contexts and the unique commitments and agency of community members. The acts of being-doing-knowing are non-neutral, transformative processes that produce the world, its history, and people themselves, all realized in the process of taking up the world, rather than passively copying it or coping with it. 
And since reality is in-the-making by people themselves, knowing is about creating the world and knowing it in the very act of bringing about transformative and creative change. Thus, the historicity and situativity of knowledge are ascertained alongside a focus on its ineluctable fusion with an activist, future-oriented, political-ethical stance. Therefore, the critical challenge for the history of psychology is to understand producers of knowledge in their role of actors in the drama of life (rather than only of ideas), that is, as agents of history- and world-making, while also engaging in self-reflection on the historians’ own role in these processes, in order to practice history in responsive and responsible, that is, activist ways.

Article

Craig D. Parks

A social dilemma is a situation of interdependence between people in which there is conflict between doing what is best for oneself and doing what is best for the group: trying to produce the best personal outcome (selfishness) hurts the group effort, while contributing to the group effort (cooperation) leads to a less-than-optimal personal outcome. The best personal outcome is realized by acting for oneself when everyone else acts for the group. Because of this, if each group member does what is best for himself or herself, the group will fail, and each person will end up with a poor outcome. Solving a social dilemma thus requires that at least some people forgo selfish interest in favor of the collective. Research into social dilemmas is primarily oriented around identifying the influences on a person’s willingness to cooperate and designing interventions that will encourage more frequent cooperation. There are many real examples of social dilemmas: clean air, charities, public broadcasting, and groundwater, to name a few.
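The incentive structure described above can be made concrete with a minimal public goods game, a standard laboratory model of a social dilemma. This sketch is illustrative only and is not drawn from the article; the endowment, multiplier, and group size are hypothetical values chosen to make the arithmetic visible.

```python
def payoff(my_contribution, others_contributions, endowment=10, multiplier=1.6):
    """Each member keeps whatever they do not contribute; pooled
    contributions are multiplied and shared equally by the group."""
    group = [my_contribution] + list(others_contributions)
    pool = sum(group) * multiplier
    share = pool / len(group)
    return (endowment - my_contribution) + share

# Four-person group, endowment 10, multiplier 1.6:
everyone_cooperates = payoff(10, [10, 10, 10])  # keep 0, share 64/4  -> 16.0
everyone_defects    = payoff(0, [0, 0, 0])      # keep 10, share 0    -> 10.0
lone_defector       = payoff(0, [10, 10, 10])   # keep 10, share 48/4 -> 22.0
```

The numbers reproduce the dilemma exactly as stated: the lone defector earns the best personal outcome (22 > 16), so selfishness is individually tempting, yet universal defection (10) leaves everyone worse off than universal cooperation (16).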