In the 21st century, there has been a significant increase in the use of alcohol and other drugs (AODs) among older adults in most first world countries. In addition, people are living longer. Consequently, the number of older adults at risk of experiencing alcohol-related harm and substance use disorders (SUDs) is rising. Between 1992 and 2010, men in the United Kingdom aged 65 years or older increased their average weekly drinking from 77.6 grams to 97.6 grams. Data from Australia show a 17% increase in risky drinking among those aged 60–69 between 2007 and 2016, and among Australians aged 60 or older, recent cannabis use increased by 280% from 2001 to 2016. In the United States, rates of older people seeking treatment for cocaine, heroin, and methamphetamine use have doubled in the past 10 years. This trend is expected to continue.

Despite these alarming statistics, this population has been deemed “hidden”: older adults often do not present to treatment with an SUD as a primary concern, and many healthcare professionals do not adequately screen for AOD use. With age, physiological changes alter the way alcohol is metabolized and heighten its subjective effects. Older adults are also prone to greater use of medications and more medical comorbidities. As such, drinking patterns that would previously not have been considered hazardous can become dangerous without any increase in alcohol consumption. This highlights the need for age-specific screening of all older patients within all healthcare settings.

The etiology of AOD-related issues among older adults can differ from that of younger adults. For example, as a result of issues that become more common with age (e.g., loss and grief, identity crisis, and boredom), there is a distinct cohort of older adults who develop SUDs later in life despite no history of previous problematic AOD use.
For some older adults who might have experimented with drugs in their youth, these age-specific issues precipitate the onset of an SUD. Meanwhile, there is a larger cohort of older adults with an extensive history of SUDs. Consequently, assessments need to be tailored to explore the issues unique to older adults who use AODs, so that they can inform the development of age-specific formulations and treatment plans. In doing so, individualized treatments can be delivered that meet the needs of older adults. Such treatments must address issues associated with aging (e.g., reduced mobility) and may require multidisciplinary input from medical practitioners and occupational therapists.
Stephen J. Bright
Vanessa L. Burrows
Stress has not always been accepted as a legitimate medical condition. The biomedical concept of stress grew from the tangled roots of varied psychosomatic theories of health that examined (a) the relationship between the mind and the body, (b) the relationship between an individual and his or her environment, (c) the capacity for human adaptation, and (d) biochemical mechanisms of self-preservation, and how these functions are altered during acute shock or chronic exposure to harmful agents.

From disparate 19th-century origins in the fields of neurology, psychiatry, and evolutionary biology, a biological disease model of stress was first conceived in the mid-1930s by Canadian endocrinologist Hans Selye, who correlated adrenocortical functions with the regulation of chronic disease. At the same time, the mid-20th-century epidemiological transition signaled the emergence of a pluricausal perspective on degenerative, chronic diseases such as cancer, heart disease, and arthritis: conditions produced not by a specific etiological agent but by a process of maladaptation that occurred over time under the conditioning influence of multiple risk factors. Mass awareness of the therapeutic impact of adrenocortical hormones in the treatment of these prevalent diseases lent greater cultural currency to the biological disease model of stress.

By the end of the Second World War, military neuropsychiatric research on combat fatigue promoted cultural acceptance of a dynamic and universal concept of mental illness that normalized the phenomenon of mental stress. This cultural shift encouraged the medicalization of anxiety, which stimulated the emergence of a market for anxiolytic drugs in the 1950s and helped to link psychological and physiological health.
By the 1960s, a growing psychosomatic paradigm of stress focused on behavioral interventions and encouraged the belief that individuals could control their own health through responsible decision-making. The implication that mental power can affect one’s physical health reinforced the psycho-socio-biological ambiguity that has been an enduring legacy of stress ever since.

This article examines the medicalization of stress, that is, the historical process by which stress became medically defined. It spans the mid-19th century to the mid-20th century, focusing on nine distinct phases:

1. 19th-century psychosomatic antecedent disease concepts
2. the emergence of shell shock as a medical diagnosis during World War I
3. Hans Selye’s theorization of the General Adaptation Syndrome in the 1930s
4. neuropsychiatric research on combat stress during World War II
5. contemporaneous military research on stress hormones during World War II
6. the emergence of a risk factor model of disease in the post-World War II era
7. the development of a professional cadre of stress researchers in the 1940s and 1950s
8. the medicalization of anxiety in the early post-World War II era
9. the popularization of stress in the 1950s, marked by the cultural assimilation of paradigmatic stress behaviors and deterrence strategies, as well as pharmaceutical treatments for stress