
Article

Judith F. Kroll and Guadalupe A. Mendoza

There has been an upsurge of research on the bilingual mind and brain. In an increasingly multilingual world, cognitive and language scientists have come to see that the use of two or more languages provides a unique lens to examine the neural plasticity engaged by language experience. But how? It is now uncontroversial to claim that the bilingual’s two languages are continually active, creating a dynamic interplay across the two languages. But there continues to be controversy about the consequences of that cross-language exchange for how cognitive and neural resources are recruited when a second language is learned and used actively and whether native speakers of a language retain privilege in their first acquired language. In the earliest months of life, minds and brains are tuned differently when exposed to more than one language from birth. That tuning has been hypothesized to open the speech system to new learning. But when initial exposure is to a home language that is not the majority language of the community—the experience common to heritage speakers—the value of bilingualism has been challenged, in part because there is not an adequate account of the variation in language experience. Research on the minds and brains of bilinguals reveals inherently complex and social accommodations to the use of multiple languages. The variation in the contexts in which the two languages are learned and used comes to shape the dynamics of cross-language exchange across the lifespan.

Article

Suzanne McKee and Preeti Verghese

This article describes human binocular vision. While it is focused primarily on human stereopsis, it also briefly covers other binocular functions, including binocular summation, rivalry, and vergence, the eye movement that is driven by stereopsis. Stereopsis refers to the depth perception generated by small differences in the locations of visual features in the two retinal images; these differences in retinal location are called disparities. Disparities are detected by specialized binocularly driven cortical neurons whose properties are outlined here; the article also describes studies that have used fMRI to show that many areas of human cortex respond to depth based on disparity. The development of stereopsis in human infants, as well as clinical abnormalities in stereopsis, is also documented.

Article

In ecological sciences, biodiversity is the dispersion of organisms across species and is used to describe the complexity of systems where species interact with each other and the environment. Some argue that biodiversity is important to cultivate and maintain because higher levels are indicative of the health and resilience of the ecosystem. Because each species performs functional roles, more diverse ecosystems have greater capability to respond, maintain function, resist damage, and recover quickly from perturbations or disruptions. In the behavioral sciences, diversity-type constructs and metrics are being defined and operationalized across a variety of functional domains (socioemotional, self, cognitive, activities and environment, stress, and biological). Emodiversity, for instance, is the dispersion of an individual’s emotion experiences across emotion types (e.g., happiness, anger, sadness). Although not always explicitly labeled as such, many core propositions in lifespan developmental theory—such as differentiation, dedifferentiation, and integration—imply intraindividual change in diversity and/or interindividual differences in diversity. For example, socioemotional theories of aging suggest that as individuals get older, they increasingly self-select into experiences that induce positive valence and low arousal emotions, which might suggest that diversity in positive and low arousal emotion experiences increases with age. When conceptualizing and studying diversity, important considerations include that diversity (a) provides a holistic representation of human systems, (b) differs in direction, interpretation, and linkages to other constructs such as health, (c) exists at multiple scales, (d) is context-specific, and (e) is flexible to many study designs and data types.
Additionally, there are a variety of methodological considerations in the study of diversity-type constructs, including nuances pertaining to theory-driven or data-driven approaches to choosing a metric. The relevance of diversity to a broad range of phenomena and the utility of biodiversity metrics for quantifying dispersion across categories in multivariate and/or repeated-measures data suggest further use of biodiversity conceptualizations and methods in studies of lifespan development.

Article

The intelligence test consists of a series of exercises designed to measure intelligence. Intelligence is generally understood as the mental capacity that enables a person to learn at school or, more generally, to reason, to solve problems, and to adapt to new (challenging) situations. There are many types of intelligence tests, depending on the kind of person (age, profession, culture, etc.) and the way intelligence is understood. Some tests are general; others are focused on evaluating language skills, memory, abstract and logical thinking, or abilities in a wide variety of areas, such as recognizing and matching implicit visual patterns. Scores may be presented as an IQ (intelligence quotient), as a mental age, or simply as a point on a scale. Intelligence tests are instrumental in ordering, ranking, and comparing individuals and groups. The testing of intelligence started in the 19th century and became a common practice in schools and universities, psychotechnical institutions, courts, asylums, and private companies on an international level during the 20th century. It is generally assumed that the first test was designed by the French scholars A. Binet and T. Simon in 1905, but the historical link between testing and experimenting points to previous tests, such as the word association test. Testing was practiced and understood in different ways, depending not only on the time, but also on the concrete local (cultural and institutional) conditions. For example, in the United States and Brazil, testing was immediately linked to race differences and eugenic programs, while in other places, such as Spain, it was part of an attempt to detect “feebleness” and to grade students at certain schools. Since its beginning, the intelligence test has received harsh criticism and triggered massive protests.
The debate played out in the mass media, leading to the infamous “IQ test wars.” Thus, nowadays, psychologists are aware of the inherent danger of cultural discrimination and social marginalization, and they are more careful in the promotion of intelligence testing. In order to understand the role the intelligence test plays in today’s society, it is necessary to explore its history with the help of well-documented case studies. Such studies show how the testing practice was employed in national contexts and how it was received, used, or rejected by different social groups or professionals. Current historical research adopts a more inclusive perspective, moving away from a narrative focused on the role testing played in North America. New work has appeared that explores how testing took place in different national and cultural environments, such as Russia (the former Soviet Union), India, Italy, the Netherlands, Sweden, Argentina, Chile, and many other places.

Article

Thomas M. Hess, Erica L. O'Brien, and Claire M. Growney

Blood pressure is a frequently used measure in studies of adult development and aging, serving as a biomarker for health, physiological reactivity, and task engagement. Importantly, it has helped elucidate the influence of cardiovascular health on behavioral aspects of the aging process, with research demonstrating the negative effect of chronic high blood pressure on various aspects of cognitive functioning in later life. An important implication of such research is that much of what is considered part and parcel of getting older may actually be reflective of changes in health as opposed to normative aging processes. Research has also demonstrated that situational spikes in blood pressure in response to emotional stressors (i.e., reactivity) have implications for health in later life. Although research is still somewhat limited, individual differences in personal traits and living circumstances have been found to moderate the strength of reactive responses, providing promise for the identification of factors that might ameliorate the effects of age-related changes in physiology that lead to normative increases in reactivity. Finally, blood pressure has also been successfully used to assess engagement levels. In this context, recent work on aging has focused on the utility of blood pressure as a reliable indicator of both (a) the costs associated with cognitive engagement and (b) the extent to which variation in these costs might predict both between-individual and age-related normative variation in participation in cognitively demanding—but potentially beneficial—activities. This chapter elaborates on these three approaches and summarizes major research findings along with methodological and interpretational issues.

Article

Ananiev’s approach shares the Activity Theory (AT) paradigm that was dominant in Soviet psychology. Ananiev builds on the main fundamentals of the AT paradigm, treating the psyche as a special product of matter, engendered by the active interaction of the individual with the environment. The unique feature of his approach to AT is that he turned it “toward the inside,” focusing on the relation of the human individual to his own physicality, to his own bodily substrate. Ananiev sought to maintain a holistic vision of the human being, considering the bodily substrate in its biological specificity within the context of the concrete sociohistorical life course of the personality. Like no other psychologist, Ananiev did not limit his research to the sphere of narrowly defined mental phenomena. He conducted a special kind of research, labeled “complex,” in the course of which characteristics of the same subjects (sociological, socio-psychological, mental, physiological, and psychophysiological indicators, together with the subjects’ life events) were monitored for many years. He focused on ontogenetic development in adulthood, which, ahead of his time, he considered a period of dynamic change and differentiated development of functions. His attention centered on individual differences in the ontogenetic development of mental and psychophysiological functions, especially those deviations from general regularities that resulted from the impact of the life course of the individual. For Ananiev, individualization, the increase of individual singularity, is both the main effect of human development and its measure. Ananiev developed a number of theoretical models and concepts.
The best-known part of Ananiev’s heritage is his theoretical model of human development, often named the “individuality concept.” According to this model, humans do not have any preassigned “structure of personality” or “initial harmony.” The starting point of human development is a combination of potentials—resources and reserves, biological and social. The human creates himself in the process of interaction with the world. Specialization, the individually specific development of functions, appears here not as a distortion of a pre-set harmony of the whole but as the way of self-determining progressive human development. He viewed the most important practical task of psychology as providing psychological support in the process of developing a harmonious individuality, based on the individual’s potentials.

Article

Holly Bridge

The sensation of vision arises from the detection of photons of light at the eye, but in order to produce the percept of the world, extensive regions of the brain are required to process the visual information. The majority of information entering the brain via the optic nerve from the eye projects via the lateral geniculate nucleus (LGN) of the thalamus to the primary visual cortex, the largest visual area, having been reorganized such that one side of the brain represents one side of the world. Damage to the primary visual cortex in one hemisphere therefore leads to a loss of conscious vision on the opposite side of the world, known as hemianopia. Despite this cortical blindness, many patients are still able to detect visual stimuli that are presented in the blind region if forced to guess whether a stimulus is present or absent. This is known as “blindsight.” For patients to gain any information (conscious or unconscious) about the visual world, the input from the eye must be processed by the brain. Indeed, there is considerable evidence from functional brain imaging that several visual areas continue to respond to visual stimuli presented within the blind region, even when the patient is unaware of the stimulus. Furthermore, the use of diffusion imaging allows the microstructure of white matter pathways within the visual system to be examined to see whether they are damaged or intact. By comparing patients who have hemianopia with and without blindsight it is possible to determine the pathways that are linked to blindsight function. Through understanding the brain areas and pathways that underlie blindsight in humans and non-human primates, the aim is to use modern neuroscience to guide rehabilitation programs for use after stroke.

Article

Robbin Gibb

The process of brain development begins shortly after conception and in humans takes decades to complete. Indeed, it has been argued that brain development occurs over the lifespan. A complex genetic blueprint provides the intricate details of the process of brain construction. Additional operational instructions that control gene and protein expression are derived from experience, and these operational instructions allow an individual to meet and uniquely adapt to the environmental demands they face. The science of epigenetics provides an explanation of how an individual’s experience adds a layer of instruction to the existing DNA that ultimately controls the phenotypic expression of that individual and can contribute to gene and protein expression in their children, grandchildren, and ensuing generations. Experiences that contribute to alterations in gene expression include gonadal hormones, diet, toxic stress, microbiota, and positive nurturing relationships, to name but a few. There are seven phases of brain development and each phase is defined by timing and purpose. As the brain proceeds through these genetically predetermined steps, various experiences have the potential to alter its final form and behavioral output. Brain plasticity refers to the brain’s ability to change in response to environmental cues or demands. Sensitive periods in brain development are times during which a part of the brain is particularly malleable and dependent on the occurrence of specific experiences in order for the brain to tune its connections and optimize its function. These periods open at different time points for various brain regions and the closing of a sensitive period is dependent on the development of inhibitory circuitry. Some experiences have negative consequences for brain development, whereas other experiences promote positive outcomes. It is the accumulation of these experiences that shape the brain and determine the behavioral outcomes for an individual.

Article

The role of experience in brain organization and function can be studied by systematically manipulating developmental experiences. The most common protocols use extremes in experiential manipulation, such as environmental deprivation and/or enrichment. Studies of the effects of deprivation range from laboratory studies, in which animals are raised in the absence of sensory or social experiences from infancy, to studies of children raised in orphanages with limited caregiver interaction. In both cases there are chronic perceptual, cognitive, and social dysfunctions that are associated with chronic changes in neuronal structure and connectivity. Deprivation can be more subtle too, such as being raised in a low socioeconomic environment, which is often associated with poverty. Such experience is especially detrimental to language development, which, in turn, limits educational opportunities. Unfortunately, the effects of some forms of socioemotional deprivation are often difficult, if not impossible, to ameliorate. In contrast, adding sensory or social experiences can enhance behavioral functions. For example, placing animals in environments that are cognitively, motorically, and/or socially more complex than standard laboratory housing is associated with neuronal changes that are correlated with superior functions. Enhanced sensory experiences can be relatively subtle, however. For example, tactile stimulation with a soft brush for 15 minutes, three times daily for just two weeks in infant rats leads to permanent improvement in a wide range of psychological functions, including motoric, mnemonic, and other cognitive functions. Both complex environments and sensory stimulation can also reverse the negative effects of many other experiences. Thus, tactile stimulation accelerates discharge from hospital for premature human infants and stimulates recovery from stroke in both infant and adult rats.
In sum, brain and behavioral functions are exquisitely influenced by manipulation of sensory experiences, especially in development.

Article

Ian Q. Whishaw and Megan Sholomiski

A brain lesion is an area of damage, injury, or abnormal change to a part of the brain. Brain lesions may be caused by head injury, disease, surgery, or congenital disorders, and they are classified by the cause, extent, and locus of injury. Lesions cause many behavioral symptoms. Symptom severity generally corresponds to the region and extent of damaged brain. Thus, behavior is often a reliable indicator of the type and extent of a lesion. Observations of patients suffering brain lesions were first recorded in detail in the 18th century, and lesion studies continue to shape modern neuroscience and to give insight into the functions of brain regions. Recovery, defined as any return of lost behavioral or cognitive function, depends on the age, sex, genetics, and lifestyle of patients, and recovery may be predicted by the cause of injury. Most recovery occurs within the first 6 to 9 months after injury and likely involves a combination of compensatory behaviors and physiological changes in the brain. Children often recover some function after brain lesions better than adults, though both children and adults experience residual deficits. Brain lesion survival rates are improved by better diagnostic tools and treatments. Therapeutic interventions and treatments for brain lesions include surgery, pharmaceuticals, transplants, and temperature regulation, each with varying degrees of success. Research in treating brain lesions is progressing, but in principle a cure will only be complete when brain lesions are replaced with healthy tissue.

Article

Leslee A. Fisher and Lars Dzikus

Bullying is a growing problem in sport and performance settings. Bullying falls under the umbrella of “athlete maltreatment,” which includes any form of harm and all relationships where harm could occur in sport and performance. Specifically, bullying is defined as repeated hostile and deliberate behavior from one person (the perpetrator) to another (the target) with the intent to harm or threaten harm to the target; it is marked by an imbalance of power. Often, after extreme bullying, the target feels terrorized. Athlete maltreatment in sport and performance has been categorized into one of two forms: relational maltreatment and nonrelational maltreatment. Bullying is a relational problem. In particular, sport and performance bullying can occur from coach to player, parent to player, or player to player, and often takes the form of (1) making unreasonable performance demands of the target, (2) repeated threats to restrict or remove the target’s privileges or opportunities, (3) screaming or yelling directed at the target that is unwarranted, (4) repeated and continual criticism of the target’s abilities, (5) discounting or denying the target’s accomplishments, (6) blaming the target for his or her mistakes, (7) threats of and/or actual physical violence toward the target, and (8) social media or e-mail messages with threats or insults toward the target. Sport and performance organizations should develop and implement antibullying policies. Six potential steps toward policy development and implementation include: (1) defining bullying behaviors, (2) referring to existing “best-practice” bullying policies, (3) specifically outlining the reporting of bullying incidents, (4) outlining clearly investigation and disciplinary actions to be taken, (5) outlining specific assistance for bullying targets, and (6) including prevention and training procedures. 
In the meantime, coaches as well as parents and players can recognize that they are role models for everyone with whom they come into contact in sport and performance settings. Coaches, parents, and players can also accept responsibility for creating a respectful and safe sport and performance environment, have a pre-season meeting to discuss antibullying policy, foster open and honest communication, accept critical feedback, not engage or allow bullying behavior themselves, create acceptable boundaries between themselves and others, and teach players to trust their instincts when things do not feel right. More advanced bullying prevention and training procedures can then take place.

Article

Michael P. Leiter and Jo Wintle

A starting point in examining job burnout is determining its definition. The burnout syndrome of exhaustion, cynicism, and inefficacy is at times distilled into a synonym for exhaustion, leading to some confusion in the research literature. Another critical question is whether burnout is a clinical condition requiring treatment for individuals or a management problem requiring changes in the organization of work and workplaces. Considering burnout as a problem in the relationship of people with workplaces opens additional possibilities for action. Intervention research evaluating systems for alleviating or preventing burnout continues to be rare in the research literature. Furthermore, these studies are largely focused on building individual capacity to endure or thrive in workplaces rather than changing conditions that aggravate exhaustion, cynicism, or inefficacy.

Article

Robert C. Eklund and J.D. Defreese

Athlete burnout is a cognitive-affective syndrome characterized by perceptions of emotional and physical exhaustion, reduced accomplishment, and devaluation of sport. A variety of theoretical conceptualizations are utilized to understand athlete burnout, including stress-based models, theories of identity, control and commitment, and motivational models. Extant research has highlighted myriad antecedents of athlete burnout including higher levels of psychological stress and amotivation and lower levels of social support and psychological need (i.e., autonomy, competence, relatedness) satisfaction. Continued longitudinal research efforts are necessary to confirm the directionality and magnitude of these associations. Moreover, theoretically focused intervention strategies may provide opportunities for prevention and treatment of burnout symptoms via athlete-focused stress-management and cognitive reframing approaches as well as environment-focused strategies targeting training loads and enhancement of athlete psychological need satisfaction. Moving forward, efforts to integrate research and practice to improve burnout recognition, prevention, and intervention in athlete populations likely necessitate collaboration among researchers and clinicians.

Article

Steve A. Nida

The brutal 1964 murder of Kitty Genovese sparked widespread public interest, primarily because it was reported to have taken place in view of some 38 witnesses, most of whom had seen the incident through the windows of their apartments in a high-rise building directly across the street. (Investigative work conducted some 50 years later suggests that there were not that many actual witnesses—more likely as few as seven or eight.) The ensuing analyses provided by newspaper columnists and others tended to focus on the callous indifference that had been demonstrated by those who had chosen not to intervene in the emergency, a state of affairs that came to be known, at least for a while, as “bystander apathy.” (It soon became clear, however, that bystanders in such events are rarely apathetic or indifferent.) Intrigued by the internal and interpersonal dynamics that might be involved, two social psychologists, Bibb Latané and John Darley, began a program of research that led to the conclusion that any notion of “safety in numbers” is illusory. In fact, it is the very presence of other people that may discourage helping in such circumstances. More specifically, other unresponsive bystanders may provide cues suggesting that the event is not serious and that inaction is the appropriate response. In addition, knowing that others are available to help allows the individual bystander to shift some of the responsibility for intervening to the others present, a process that Latané and Darley termed “diffusion of responsibility”; that is, the greater the number of others present, the easier it is for any one individual to assume that someone else will help. Subsequent research has demonstrated that this tendency for the individual to be less likely to help when part of a group than when alone—now known as the “bystander effect”—is a remarkably robust phenomenon. 
Even though social psychology has developed a thorough understanding of the mechanisms that drive this phenomenon, applying this knowledge is difficult, and significant incidents involving the bystander effect continue to occur.

Article

Benjamin T. Mast and Diana DiGasbarro

Clinicians conduct capacity evaluations to determine an older adult’s ability to make and execute a decision within key domains of functioning. Questions of capacity often arise when an older adult experiences a decline in cognitive functioning due to Alzheimer’s disease, stroke, or severe psychiatric illness, for example. Capacity is related to legal competency, and a lack of capacity may be established by evidence that an older adult is unable to understand the act or decision in question; appreciate the context and consequences of the decision or act; reason about the potential harms and benefits; or express a choice. Capacity is domain-specific, time-specific, and decision-specific. Domains include financial capacity, medical treatment and research consent capacity, driving capacity, sexual consent capacity, and voting capacity. Each capacity domain encompasses activities that may vary in complexity or risk, and thus require different levels of capacity. For example, within the medical treatment consent capacity domain, an older adult may lack the capacity to consent to a complicated and risky surgical procedure while retaining the capacity to consent to a routine blood draw. Clinicians determine capacity by using a combination of tools, including capacity assessment instruments, task-specific functional evaluations, interviews with the patient and family members, measures of cognitive functioning, and consideration of social, physical, and mental health factors. Extensive research has been conducted to determine the reliability and validity of a variety of capacity assessment instruments for many domains. These instruments generally assess the patient’s responses to vignettes pertaining to the domain in question, information gleaned from structured and semi-structured interviews, functional ability, or a combination of these methods.
Although there is still need for more research, especially in emerging domains, capacity assessments help to protect vulnerable older adults from harm while allowing them to retain the highest possible level of autonomy.

Article

Jos Akkermans, Daniel Spurk, and Nadya Fouad

The field of career studies primarily focuses on understanding people’s lifelong succession of work experiences, the structure of opportunity to work, and the relationship between careers and work and other aspects of life. Career research is conducted by scholars in a variety of disciplines, including psychology, management, and sociology. As such, it covers multiple levels of analysis and is informed by different theoretical frameworks, ranging from micro (i.e., individual) to macro (e.g., organizational, institutional, cultural). The most dominant theoretical perspectives that have been mobilized in career research are boundaryless and protean career theory, career construction theory, and social cognitive career theory. Other perspectives that have increasingly been adopted include sustainable careers, kaleidoscope careers, psychology of working theory, and theories from related disciplines, such as conservation of resources theory and social exchange theory. Key topics in the field of career studies include career self-management, career outcomes (e.g., career success, employability), career transitions and shocks, calling, and organizational career management. Research at the micro level, with outcomes at the individual level, has been dominant in the early 21st century, predominantly focusing on understanding individual career paths and outcomes. However, contextual factors, whether as additional predictors or as boundary conditions for career development, are increasingly considered important research topics.

Article

Caring for an older adult who needs help or supervision is in many cases associated with mental and physical health issues, especially if the care recipient has dementia, although positive consequences associated with caregiving have also been reported. Several theoretical models have shown the relevance of psychological variables for understanding variations in the stress process associated with caregiving and how interventions may benefit from psychological techniques and procedures. Since the 1990s there has been an increase in the number of studies aimed at analyzing caregiver health and at developing and testing interventions for decreasing caregiver distress. Several examples of interventions for helping caregivers are considered empirically supported, including interventions for ethnically and culturally diverse caregivers, with psychotherapeutic and psychoeducational interventions showing strong effect sizes. However, efforts are still needed to maintain the results of the interventions in the long term and to make the interventions accessible (e.g., through technological resources) to the large number of caregivers who, because of time-pressure issues associated with caregiving or a lack of support, are not benefiting from them. Making these interventions available in routine healthcare settings would help a large population in need that presents with high levels of psychological suffering.

Article

Cerebral palsy (CP) is defined as non-progressive damage to the brain at or around birth, which leads to varying symptoms depending on the extent and location of damage. The leading symptom is sensory-motor impairment of varying expression, but additional perceptual, cognitive, and socio-emotional symptoms are common. CP can be divided into four types, with bilateral spastic being by far the most frequent, followed by the unilateral spastic, the dyskinetic, and the ataxic variants. The intellectual, linguistic, and cognitive profile of CP is extremely variable, but all of these abilities correlate more or less with CP type and motor impairment. Early diagnosis is important since early intervention may promote all developmental dimensions. Generally, individuals with unilateral spastic CP have the best (almost normal) intellectual, linguistic, and cognitive outcomes, while those with bilateral spastic CP fare the worst. Language perception is often an individual strength, while language expression, and particularly speech, may be heavily impaired. Attention and executive functions are often impaired as compared to typically developing controls, even in those children with normal intellectual functioning. The same holds true for visual perceptual functions, which are impaired in almost half of all children and adolescents with CP. The potential neuropsychological dysfunctions are a risk factor for arithmetic functions and literacy. Obstacles to participation in society are high for individuals with CP and heavily dependent on their motor, language, intellectual, and cognitive functions. However, quality of life is good for most children and adolescents, and they develop a sound self-concept. On the other hand, being bullied is more common than among typically developing children and is associated with behavior problems and executive dysfunction.
The development of children and adolescents with CP is determined by a complex interplay between physical, intellectual, and neuropsychological functions.

Article

During World War I, soldiers from all warring countries suffered from mental disorders caused by the strains and shocks of modern warfare. Military psychiatrists in Germany and the Austro-Hungarian Empire were initially overwhelmed by the unexpected numbers of psychiatric patients, and they soon engaged in fierce debates about the etiology and therapy of "war neuroses." After early therapeutic approaches relying on rest and occupational therapy had failed to yield the necessary results, psychiatry faced increasing pressure from the state and the military. After 1916, the etiological debate coalesced around the diagnosis of "war hysteria," and psychiatric treatment of war neurotics became dominated by so-called active therapies, which promised to return patients to the frontline or the war industry as quickly and efficiently as possible. War psychiatry became characterized by an unprecedented rationalization of medical treatment, which subordinated the goals of medicine to the needs of the military and the wartime economy. Brutal treatment methods and struggles over pensions led to conflicts between patients and doctors that continued after the war ended.

Article

Exercise is known to exert an influence on pain. Specifically, sensitivity to pain decreases both during and following a single bout of exercise—a phenomenon that has been termed exercise-induced hypoalgesia (EIH). EIH has been shown to occur following a variety of types of exercise, including aerobic exercise, dynamic resistance exercise, and both intermittent and continuous isometric exercise, and with a variety of types of pain stimuli, including pressure, thermal, and electrical stimuli, among others. Depending upon the type of exercise, the intensity and duration of the exercise bout may affect the magnitude of EIH observed. EIH may also be influenced by the presence of chronic pain. In individuals with chronic pain conditions, exercise can have both hypo- and hyperalgesic effects, again depending on the specifics of the exercise stimulus itself. The mechanisms underlying EIH have not been definitively established. However, a number of potentially viable mechanisms have been examined, including the release of stress mediators such as adrenocorticotrophic hormone and growth hormone (GH), stimulation of the endogenous opioid system, interactions between the pain modulatory system and the cardiovascular system resulting from shared neurological pathways, activation of the endocannabinoid (eCB) system, and engagement of supraspinal pain inhibitory mechanisms via conditioned pain modulation (CPM). There is also some evidence that psychosocial factors, including pain-related beliefs such as catastrophizing and expectation, may influence EIH. Research on EIH has several important implications for research and practice. In healthy adults, reduced sensitivity to pain is a salient benefit of exercise, and EIH responses may play a role in exercise adherence. For chronic pain patients, research on EIH has the potential to uncover mechanisms related to the maintenance of chronic pain.
Improving our understanding of how and why hyperalgesia occurs following exercise in these patients can aid in understanding the central nervous system mechanisms of disease maintenance and may ultimately help to avoid symptom exacerbation with exercise. However, practical and mechanistic questions remain to be examined. Translating the reductions in pain sensitivity that occur with exercise under controlled laboratory conditions to more naturalistic situations will be an important next step in promoting physical activity as a treatment for pain.