
Printed from Oxford Research Encyclopedias, Neuroscience. Under the terms of the licence agreement, an individual user may print out a single article for personal use (for details see Privacy Policy and Legal Notice).

date: 26 June 2022

Neuroendocrinology of Stress and Addiction

  • Steven Kinsey, School of Nursing, University of Connecticut
  • Olivia Vanegas, Behavioral Neuroscience, University of Connecticut
  • Kristen Trexler, Department of Psychology, West Virginia University
  • Floyd Steele, Department of Psychology, West Virginia University
  • and Matthew Eckard, Department of Psychology, West Virginia University


The stress response evolved as a series of neural and endocrine mechanisms that protect the host organism from threats to homeostasis. Repeated use of psychotropic drugs can lead to the development of tolerance (i.e., decreased drug activity at a given dose) and drug dependence, as indicated by withdrawal syndromes following drug abstinence. Drug withdrawal is often overtly stressful, although acute drug exposure may also represent a threat to homeostasis. This article explores the neuroendocrine effects of drugs of abuse and some of the ways in which stress and appetitive mechanisms interact.


Neuroendocrine and Autonomic Systems

The stress response is one of the most important physiological processes. Stress can broadly be defined as a complex condition wherein homeostasis is threatened or disrupted (Finn, 2010; Wilder, 1995). Animals can be threatened by acute or chronic psychological, physiological, external, or internal stressors (Finn, 2010). The ability to appropriately respond to incoming threats, whether real or perceived, is necessary for survival. Given the critical role of the stress response in promoting survival, it is not surprising that the catecholaminergic stress response and the hypothalamic–pituitary–adrenal (HPA) axis are widely conserved across species (Schulkin, 2011).

Stressors can be broadly defined as either physical, which originate in sensory fibers that feed to the spinal cord and hindbrain, or psychological, which originate in the prefrontal cortex and hippocampus (Herman et al., 2003; Riebe & Wotjak, 2011). Though the areas initially activated by stressors may be different, the resulting stress response is typically the same. In the HPA stress response, inputs from the amygdala and hippocampus exert regulatory control over the paraventricular nucleus (PVN) of the hypothalamus (Finn, 2010; Riebe & Wotjak, 2011). When stress is introduced, a cascade effect from the amygdala causes the PVN to synthesize and release corticotropin-releasing hormone (CRH) and arginine vasopressin (AVP; Herman et al., 2005; Patel et al., 2005; Pecoraro et al., 2006; Rademacher et al., 2008). CRH and AVP then travel from the PVN to the anterior pituitary along the median eminence of the hypothalamus (Finn, 2010; Riebe & Wotjak, 2011).

When stimulated by CRH and AVP, the anterior pituitary releases adrenocorticotropic hormone (ACTH) into the bloodstream (Kloet et al., 2005; Pecoraro et al., 2006). Once in the bloodstream, ACTH travels to the adrenal cortex, the outer layer of the adrenal gland nestled on top of the kidney, which releases glucocorticoids (GCs) into the bloodstream (Kloet et al., 2005; Pecoraro et al., 2006). The GCs then circulate throughout the body, including to the brain, acting on GC receptors in both the peripheral and central nervous systems to produce sympathetic nervous system responses like increased heart rate, dampened immune responding, and focused attention (Chrousos & Gold, 1992; Tsigos & Chrousos, 2002). GCs also activate mineralocorticoid receptors that may affect memory performance (Joëls et al., 2006; Roozendaal et al., 2009; Sandi & Pinelo-Nava, 2007). The majority of studies, however, focus on the more widespread effects of GC receptors (Wolf et al., 2016).
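The cascade above (stress → CRH → ACTH → GC, with circulating GCs feeding back to restrain CRH release) can be caricatured as a toy discrete-time simulation. This is a minimal sketch: the equations and every rate parameter are illustrative assumptions, not fitted physiological values.

```python
# Toy model of the HPA cascade: stress drives CRH, CRH drives ACTH,
# ACTH drives glucocorticoid (GC) release, and GC negatively feeds
# back onto CRH. All units and parameters are arbitrary/illustrative.

def simulate_hpa(stress, steps=200, dt=0.1):
    """Return (crh, acth, gc) trajectories for a stress input function."""
    crh = acth = gc = 0.0
    traj = []
    for n in range(steps):
        s = stress(n * dt)
        crh += dt * (s / (1.0 + gc) - crh)   # stress drive, GC feedback
        acth += dt * (crh - acth)            # pituitary follows CRH
        gc += dt * (acth - gc)               # adrenal cortex follows ACTH
        traj.append((crh, acth, gc))
    return traj

# A brief stressor: present for the first 2 time units, then removed.
traj = simulate_hpa(lambda t: 1.0 if t < 2.0 else 0.0)
peak_gc = max(g for _, _, g in traj)
final_gc = traj[-1][2]
```

Running this shows the qualitative shape the text describes: GC levels rise after stress onset with a lag (the cascade has two intervening stages) and then return toward baseline once the stressor ends.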

Although stress can be unpleasant acutely, its essential role in our survival cannot be overstated. The activation of the HPA axis and stress response has been conserved throughout evolution because of their critical importance in our ability to respond to threats (Dantzer et al., 2014). In the relatively low-threat environment of developed parts of the world, however, frequent HPA axis activation contributes to many health issues that can be very difficult to effectively treat (Finn, 2010). Similarly, use of, and withdrawal from, various psychotropic drugs can activate and alter the stress response, leading to increased risk of prolonged drug use and relapse after abstinence.


Opioids

Opioids exert their effects on inhibitory G protein-coupled receptors that have three major subtypes: mu, kappa, and delta (Pathan & Williams, 2012). Most of the therapeutic effects of opioid drugs are mediated through opioid receptor activation in the central and peripheral nervous systems, but mu opioid receptors are also expressed throughout the gastrointestinal tract (Sobczak et al., 2014). While mu activation is the primary mechanism of opioid antinociception, a lack of intracellular pathway selectivity or nervous tissue region specificity contributes to adverse side effects, such as constipation, itching, nausea, somnolence, and respiratory failure (Lawlor, 2002). Long-term side effects of opioid use include tolerance, dependence, respiratory depression, constipation, and withdrawal effects, such as diarrhea and chronic pain (Seyfried & Hester, 2012). These withdrawal effects are stressors that contribute to continued opioid use. Opioids also exert direct effects on endocrine tissue. For example, opioid use can worsen a multitude of endocrine disorders, including gonadal and reproductive dysfunction, stress disorders, hypothyroidism, diabetes, and hypertension (Vuong et al., 2010).

Opioids can also exert negative effects on the HPA axis. Paradoxical effects on ACTH secretion are observed in rodents depending on the opioid administration method. Rats receiving acute opioid treatments show enhanced HPA reactivity and increased levels of both ACTH and corticosterone (Jezova et al., 1982; Vuong et al., 2010). In contrast, rats receiving repeated morphine for 1 week have low ACTH plasma levels (Daly, 1996). Rats treated repeatedly with morphine for 16 days and then subjected to 12-hour withdrawal show a greater corticosterone response to restraint stress compared to controls (Houshyar et al., 2001). This exaggerated corticosterone response returned to normal after a week, but blood ACTH levels remained depressed for at least 16 days after morphine cessation.

Morphine exposure itself is a stressor, and chronic morphine administration may downregulate mechanisms shared between the opioid and HPA systems. Chronic morphine use not only blunts the stress response to restraint, but also blunts stress responses to adrenocorticotropic drug challenges (Seyfried & Hester, 2012). In humans with heroin dependence, the circadian rhythms of cortisol and endogenous opioids are dysregulated. Plasma levels of endogenous opioids and cortisol typically peak in the morning and are lowest in the evening. While overall plasma levels of endogenous opioids and cortisol are not different between heroin users and controls, heroin-dependent patients showed no change in opioid or cortisol levels as a function of time (Facchinetti et al., 1984). The increase in evening ACTH and cortisol levels in chronic heroin users can lead to a blunting of the stress response (Seyfried & Hester, 2012).

While chronic opioid treatment suppresses the stress response in both humans and rodents, the human stress response reacts differently to acute opioid challenge, showing reduced levels of ACTH, cortisol, and endogenous opioids (Vuong et al., 2010). Just as acute naloxone administration increases human luteinizing hormone (LH) levels, it also increases levels of ACTH and endogenous opioids in humans (Baranowska et al., 1985). Former opioid users maintained on methadone showed a blunted stress response to a dexamethasone challenge (Aouizerate et al., 2006). Dose-response curves were generated for ACTH and cortisol response to dexamethasone in healthy individuals, methadone-maintained heroin users, and methadone-maintained heroin users who also used cocaine. Both groups containing chronic methadone users showed decreased ACTH and cortisol in response to dexamethasone when compared to healthy individuals. This suggests that chronic opioid intake can enhance the negative feedback effects of dexamethasone, blunting the activity of the HPA axis.
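Dose-response curves of the kind generated in these dexamethasone-suppression studies typically take the standard sigmoidal form of the Hill equation. The sketch below is purely illustrative: the ED50 values and doses are invented, and modeling chronic opioid use as a leftward shift (lower ED50, i.e., enhanced negative feedback) is an assumption for demonstration, not data from the cited studies.

```python
# Illustrative Hill-equation dose-response for dexamethasone suppression
# of cortisol. ED50 values and doses are hypothetical.

def cortisol_suppression(dose, ed50, hill=1.0, baseline=100.0):
    """Percent of baseline cortisol remaining after a dexamethasone dose."""
    inhibition = dose**hill / (dose**hill + ed50**hill)
    return baseline * (1.0 - inhibition)

# Enhanced negative feedback (e.g., chronic opioid exposure) is modeled
# here as a lower ED50: the same dose suppresses cortisol more.
control = cortisol_suppression(dose=1.0, ed50=1.0)    # 50.0% of baseline
enhanced = cortisol_suppression(dose=1.0, ed50=0.25)  # 20.0% of baseline
```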


Alcohol

Unlike opioids, ethyl alcohol is used broadly and with limited social stigma. Alcohol has widespread actions in tissues throughout the body. However, there is considerable evidence that, like benzodiazepines’ effects, alcohol’s major psychoactive effects are associated with positive allosteric activation of γ-aminobutyric acid type A (GABAA) receptors and suppression of glutamate release (Wallner & Olsen, 2008). The rewarding effects of alcohol are mediated by the mesolimbic dopamine (DA) pathway and altered serotonergic signaling. Like humans, other animals can be trained to self-administer alcohol. Pioneering work performed in alcohol-dependent rats demonstrated that alcohol self-administration dropped dramatically after a DA antagonist was microinjected into the nucleus accumbens (NAcc), indicating that DA signaling is important for the reinforcing effects of alcohol (Rassnick et al., 1992). The same paradigm was used to demonstrate that an NMDA receptor antagonist also reduced alcohol self-administration, although neither antagonist had an effect on water consumption, again suggesting that the reinforcing effects of alcohol are processed in the NAcc (Rassnick et al., 1992). Alcohol withdrawal is a stressor that decreases serotonin levels in the NAcc of rats, and alcohol reinstatement restores serotonin levels in the NAcc (Weiss et al., 1996). Serotonin deficiency and dysfunction in CRH contribute to anxiety, impulsivity, learned helplessness, and substance use (Maier & Watkins, 2005).


Cannabinoids

Cannabinoids are molecules that either bind to and activate the cannabinoid (CB) receptors or share structural homology with known CB receptor ligands (Mechoulam & Parker, 2012). The two primary endocannabinoids are 2-arachidonoyl glycerol (2-AG) and anandamide (Devane et al., 1992; Mechoulam et al., 1995). The two are present at roughly comparable levels throughout the body, except within the brain, where 2-AG is present in 100-fold higher amounts than anandamide (Long et al., 2009). Anandamide is primarily catabolized by fatty acid amide hydrolase (FAAH), whereas 2-AG is primarily catabolized by monoacylglycerol lipase (MAGL; Blankman et al., 2007; Cravatt et al., 1996; McKinney & Cravatt, 2005). Inhibiting FAAH or MAGL increases endogenously available anandamide or 2-AG, respectively (Blankman et al., 2007).

In the central nervous system, CB1 receptors are expressed primarily on GABAergic and glutamatergic interneurons (Jacob et al., 2009; Steiner & Wotjak, 2008), whereas CB2 is expressed primarily in the periphery and is also commonly expressed on glial cells and in the brainstem (Finn, 2010). CB1 agonism is associated with psychoactive effects, including mild euphoria, relaxation, motor function disruption, and analgesia, typically reported during cannabis use. CB2 is implicated in anti-inflammatory and immunosuppressive effects that contribute to analgesia, which is a decrease in pain response (Lombard et al., 2007). Both receptors affect intracellular signaling through the inhibition of adenylate cyclase (Howlett, 2005). Both exogenous (i.e., externally administered) and endogenous (i.e., internally produced) cannabinoids bind to CB1 and CB2 receptors with moderate to high affinity (Lombard et al., 2007; Singh et al., 2012).

The endocannabinoid system serves principally as an inhibitory system: cannabinoid receptor activation inhibits adenylyl cyclase, the enzyme that converts adenosine triphosphate (ATP) into cyclic adenosine monophosphate (cAMP), thereby blunting cAMP production and its downstream intracellular cascades (for a more complete description, see Flores et al., 2013; Howlett, 2002, 2005; Howlett et al., 1986). Activation of cannabinoid receptors also engages mitogen-activated protein kinases, and together these signals inhibit responses to stimuli that would normally depolarize the cell and can decrease neurotransmitter release (Freund et al., 2003; Howlett, 2005; Mackie, 2008). Simultaneously, cannabinoid receptor activation closes N- and P/Q-type calcium (Ca2+) ion channels (Flores et al., 2013; Steiner & Wotjak, 2008), and, due to reduced cAMP, inward rectifying potassium (K+) channels are also activated (Deadwyler et al., 1995; Mu et al., 1999). Together, these effects contribute to hyperpolarization of the presynaptic neuron, which inhibits neurotransmitter release and decreases excitatory postsynaptic potentials in the postsynaptic neuron.
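The presynaptic logic above can be caricatured in a few lines. This is a toy model under stated assumptions: the fraction of available Ca2+ channels and the extra K+ conductance are hypothetical knobs, and the steep (roughly fourth-power) dependence of transmitter release on Ca2+ influx is a standard approximation from synaptic physiology, not a value from the cited studies.

```python
# Toy model of CB1-mediated presynaptic inhibition: closing Ca2+ channels
# and opening K+ channels (hyperpolarization) both reduce Ca2+ influx,
# and release depends steeply (~4th power) on that influx.
# All numbers are illustrative, not measured values.

def epsp_amplitude(ca_open_frac, k_extra_conductance, base_epsp=1.0):
    """Relative EPSP given Ca2+ channel availability and extra K+ conductance."""
    # Extra K+ conductance hyperpolarizes the terminal, further reducing influx
    ca_influx = ca_open_frac / (1.0 + k_extra_conductance)
    # Cooperative Ca2+ dependence of vesicle release (~4th power)
    return base_epsp * ca_influx**4

baseline = epsp_amplitude(ca_open_frac=1.0, k_extra_conductance=0.0)
cb1_on = epsp_amplitude(ca_open_frac=0.6, k_extra_conductance=0.5)
```

Because the two effects multiply before the steep power law, even moderate CB1 activation in this sketch collapses the modeled EPSP to a few percent of baseline, which is why the text describes the system as principally inhibitory.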

One of the many processes modulated by the endocannabinoid system is the stress response (Finn, 2010; Riebe & Wotjak, 2011; Wolf et al., 2016). For many years, it was believed that only slow-acting transcriptional changes and intracellular GC receptors acted as feedback mechanisms for GC release (Kloet et al., 2005). It was hypothesized that the effects of GC receptors were, at least in part, also mediated by some kind of retrograde messenger (Di et al., 2003). Because endocannabinoids act as retrograde messengers, they were a likely target. Indeed, recent studies have revealed that the endocannabinoid system acts as a fast feedback mechanism in the stress response (Hill et al., 2009; Keller-Wood & Dallman, 1984; Patel et al., 2005). Anandamide plays a tonic role in the control of stress-induced activation of the HPA axis, such that anandamide levels must be lowered for the HPA response to initiate (Hill & Gorzalka, 2005).

The most commonly used exogenous cannabinoid, Δ9-tetrahydrocannabinol (THC), also binds to and activates CB1 and CB2 cannabinoid receptor subtypes. Like the endocannabinoids, THC has broadly inhibitory effects via CB1 and anti-inflammatory effects via CB2. The euphoria, sleepiness, stress relief, analgesia, and appetitive effects (i.e., “the munchies”) that cannabis users commonly report occur when THC molecules bind to CB1 receptors in the brain. Blockade of CB1, for example, with a CB1 receptor antagonist, blocks these psychotropic effects of THC (Pério et al., 1996; Wiley et al., 1995). The second most studied exogenous cannabinoid is cannabidiol (CBD), which may have some anti-inflammatory, analgesic, and antinausea properties (Rock et al., 2021). Strong evidence for CBD activity in humans is sparse, so much of what is known is based on animal and cell culture studies, which reveal that CBD activates 5-HT1A serotonergic receptors (Mishima et al., 2005) and is unlikely to activate cannabinoid receptors in live animals.

Cannabinoids act through glutamatergic neurons in the PVN to inhibit glutamate release, thus reducing activity in the PVN and ultimately reducing CRH release (Di et al., 2003; Evanson et al., 2010; Malcher-Lopes et al., 2006; Steiner et al., 2008). This decrease in CRH reduces ACTH secretion and, thus, reduces GC release from the adrenal gland (Keller-Wood & Dallman, 1984; Pecoraro et al., 2006). Further, endocannabinoids contribute strongly to stress habituation (Hill et al., 2010; Patel et al., 2005). Repeated stress increases anandamide and decreases 2-AG in the amygdala, which results in increased basal secretion of GCs and HPA habituation, respectively (Hill et al., 2010). Decreased levels of 2-AG may also indicate a loss of depolarization-induced suppression of inhibition, which occurs following strong activation, and inhibitory feedback in the basolateral amygdala and PVN (Patel et al., 2009; Pitler & Alger, 1992; Vincent et al., 1992; Wamsteeker et al., 2010).

The endocannabinoid mediation of the stress response occurs through CB1. The synthetic GC dexamethasone decreases the likelihood of a miniature excitatory postsynaptic potential (mEPSP; i.e., it makes the cell less likely to fire), an effect that is blocked by the CB1-selective antagonist AM281 (Di et al., 2003). Moreover, when a synthetic CB agonist, WIN55-212, was used to activate CB1, the reduction of EPSPs was equal to that caused by dexamethasone, such that the addition of dexamethasone did not decrease mEPSPs further (Di et al., 2003). This confirms that the depressive activity of GC receptors is mediated by a mechanism that requires CB1 activation.

Many studies have examined the role of CB1 in mediating the stress response. CB1-deficient mice exhibit increased HPA axis activity, as measured through increased CRH and ACTH (Steiner & Wotjak, 2008). This is further supported by the finding that rats exposed to chronic stress show decreased anxiety-like behavior when given a drug that increases endocannabinoid levels, whereas unstressed rats exhibit increased anxiety-like behavior when given the same compound (Hill & Gorzalka, 2004). Taken together, the findings of these studies indicate that the endocannabinoid system is involved in mediation of the HPA axis stress response.

Our growing knowledge of the endocannabinoid system and its effects on stress indicate that it holds promising targets for treating stress-related disorders. Indeed, humans have self-medicated with cannabis for thousands of years (Mechoulam & Parker, 2012), although chronic cannabis use presents its own challenges, including cannabis use disorder in some users.



Cocaine

Cocaine is a psychomotor stimulant that produces its psychoactive effects by binding to, and blocking, protein transporters responsible for removing DA, serotonin, and norepinephrine from the synapse (Pomara et al., 2012). This inhibition of neurotransmitter reuptake prolongs the action of these neurotransmitters in brain regions important for reward processing, including the ventral tegmental area (VTA) and NAcc, which contribute to its euphoric effects. In 2017, approximately 2% of the U.S. population reported using cocaine within the past year, with approximately 1% being current cocaine users (within the last month; Substance Abuse and Mental Health Services Administration [SAMHSA], 2018). Given the overlap between the neural circuits regulating food intake and those engaged by psychostimulant drugs (e.g., mesolimbic structures, including the VTA and NAcc), recent research has focused on using appetitive hormones and their analogs to dissect how they influence behavioral and neural responses to cocaine and other psychostimulants.

Several preclinical animal studies suggest that the reinforcing effects of cocaine are intimately related to mobilization of appetitive hormones or activation of their receptors. In these studies, administration of drugs that selectively activate appetitive hormone receptors reduces cocaine intake and reward. For example, cocaine infusion increases DA signaling in the NAcc (Fortin & Roitman, 2018). However, direct infusion of Exendin-4 (ex-4), a synthetic agonist for the glucagon-like peptide 1 (GLP-1) receptor, into the NAcc core suppressed cocaine-induced increases in DA, suggesting that stimulation of appetite-reducing pathways also blunts the pharmacological effect of cocaine. Similar findings have been reported for behavioral outcomes of rats self-administering cocaine. In rats trained to self-administer intravenous cocaine, infusions of ex-4 into the VTA dose-dependently reduced lever pressing for cocaine. Importantly, ex-4 had no effect in rats responding for sucrose, suggesting that the response to GLP-1 activation is selective to cocaine-induced DA increases (Schmidt et al., 2016). Furthermore, in another group of rats injected with corticosterone into the fourth ventricle, which decreased responding for cocaine, the synthetic GLP-1 antagonist Exendin-9-39 (ex-9) reversed corticosterone-induced decreases in cocaine responding. In other words, GLP-1 antagonism restored responding for cocaine to baseline levels. Finally, when a viral vector was used to reduce GLP-1 receptor expression in the VTA, rats also had increased responding for cocaine. Thus, activation of GLP-1 receptors decreases motivation for cocaine, blockade of GLP-1 receptors normalizes motivation for cocaine after corticosterone treatment (Bouhlal et al., 2018), and underexpression of GLP-1 receptors is sufficient to increase cocaine-seeking in rats (Schmidt et al., 2016).
GLP-1 activation in the NAcc (Hernandez et al., 2019) or the VTA (Hernandez et al., 2018) also dose-dependently reduced cocaine-seeking in rats in a model of cocaine relapse known as cocaine reinstatement. From these data, it is clear that the neural circuits controlling rewarding effects of cocaine can be modulated by GLP-1 activity.
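The selectivity pattern described above, in which ex-4 dose-dependently reduces responding for cocaine but leaves responding for sucrose untouched, can be sketched as hypothetical dose-response data. All doses, response rates, and the IC50 below are invented for illustration; only the qualitative shape reflects the reported findings.

```python
# Hypothetical dose-response sketch of the GLP-1 agonist (ex-4) effect:
# cocaine responding falls with dose; sucrose responding does not.
# All numeric values are illustrative.

def cocaine_responses(ex4_dose, baseline=100.0, ic50=1.0):
    """Hypothetical lever presses per session under GLP-1 agonism."""
    return baseline * ic50 / (ic50 + ex4_dose)

def sucrose_responses(ex4_dose, baseline=100.0):
    """Per the selectivity reported above, sucrose responding is flat."""
    return baseline  # dose argument intentionally unused

doses = [0.0, 0.5, 1.0, 2.0]
cocaine = [cocaine_responses(d) for d in doses]  # monotonically decreasing
sucrose = [sucrose_responses(d) for d in doses]  # unchanged at baseline
```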

Appetitive hormone action can also be manipulated by altering animals’ daily diet to determine effects on the response to cocaine. For example, female rats fed a high-fat diet for 4 weeks showed increased sensitivity to cocaine, an effect that emerged during the first week of the high-fat diet and continued through its duration (Baladi et al., 2012). Similar effects have been observed in male rats, although they appear most sensitized to cocaine following a high-fat diet during adolescence (Baladi et al., 2015). Ghrelin has been implicated as a possible mechanism in the reinforcing effects of cocaine following high-fat diets. Mice fed a high-fat binge diet for 2 hours/day across 18 days during adolescence showed increased cocaine self-administration, which was correlated with increased expression of ghrelin receptors in the VTA in postmortem analyses (Blanco-Gandía et al., 2017). Thus, not only does diet influence sensitivity to cocaine, but it appears to do so particularly during adolescence, when the brain is still developing and recreational drug use is likely to begin in human populations.

In addition to appetitive hormones’ modulating cocaine’s effects, cocaine administration can alter levels of appetitive hormones. In healthy cocaine users, intravenous cocaine administration decreased blood levels of GLP-1 and partially decreased levels of insulin and amylin (Bouhlal et al., 2017). Leptin—an appetite-suppressing hormone—also appears to be affected by cocaine, such that cocaine users show elevated blood leptin levels following a 14-day abstinence from cocaine use (Martinotti et al., 2017). Similar effects have been reported after cocaine self-administration in rats (You et al., 2019). Furthermore, replacing cocaine infusions with saline still elicited increases or decreases in appetitive hormones, including ghrelin, GLP-1, and insulin 15 days after termination of cocaine administration, suggesting that these hormones are sensitive to cocaine as well as expectancy cues of cocaine. Together, these data suggest a bidirectional effect of cocaine on appetitive hormones in which activation of these hormone receptors can modulate cocaine intake, and cocaine intake can modulate circulating blood levels of these hormones. Drugs that target appetitive hormone receptors are being considered as treatments of stimulant drug abuse (Engel & Jerlhag, 2014).


Amphetamines

Amphetamine-type stimulants, including amphetamine sulfate and methamphetamine, are synthetically produced drugs that, like cocaine, increase DA signaling in the NAcc. The behavioral effects of amphetamines are similar to those of cocaine, because both are DA reuptake inhibitors. However, amphetamines are also DA-releasing agents, which contributes to their longer time course of action relative to cocaine (Chiu & Schenk, 2012). In addition to their effects on DA, amphetamines inhibit GABAergic signaling, which is proposed to contribute strongly to the withdrawal syndrome following repeated amphetamine use (Jiao et al., 2015).

Amphetamines have mixed effects on the HPA axis that are dependent on various factors, including sex, frequency of exposure, and even psychopathology. For example, acute methamphetamine exposure activates the PVN and amygdala similarly in male and female mice; however, females have elevated plasma corticosterone levels for longer periods (Zuloaga et al., 2014). Whereas acute exposure to amphetamines activates the HPA axis, chronic use sensitizes the stress response (Barr et al., 2002). Interestingly, amphetamine withdrawal has no effect on corticosterone and ACTH levels in rats following prolonged restraint, although sensitization to behavioral measures of stress, for example the forced swim test, has been reported (Russig et al., 2006). In humans, childhood trauma and depression are better predictors of increased basal cortisol levels than amphetamine dependence alone, suggesting that the effects of amphetamine use on stress may be augmented in mental disorders (Pirnia et al., 2020).


Nicotine

Nicotine is a psychostimulant that produces its psychoactive effects primarily by binding to nicotinic acetylcholine receptors that contain the β2 receptor subunit in the VTA (Picciotto et al., 1998). Stimulation of these receptors leads to DA release in the mesolimbic pathway, resulting in a downstream reward mechanism similar to that of cocaine (Picciotto et al., 1998). It is perhaps no surprise, then, that nicotine remains one of the most abused drugs in the United States, with approximately 16% of the U.S. population reported to be regular smokers (Jamal et al., 2018). While cigarette use is declining, electronic cigarettes are rising in popularity and represent a mechanism by which individuals can become, or continue being, dependent on nicotine.

One of the first studies to suggest a relation between appetitive hormones and nicotine’s rewarding effects was conducted by Egecioglu et al. (2013). They gave different groups of mice systemic injections of saline, nicotine, or nicotine + ex-4 (an agonist for the GLP-1 receptor), and measured nicotine-induced locomotor activity, conditioned place preference (CPP) for an environment paired with nicotine, and nicotine-induced DA release in the NAcc. As expected, nicotine increased locomotor activity, induced preference for a nicotine-paired environment, and increased NAcc DA release. Similar to its cocaine-blunting effects, ex-4 blocked the effects of nicotine in nicotine-treated mice without producing effects in vehicle-treated mice given ex-4 (Egecioglu et al., 2013). In addition to nicotine’s rewarding properties, high doses of nicotine can also be aversive (Fowler & Kenny, 2014). The habenula—a brain structure near the pineal gland believed to be heavily involved in negative reward (Matsumoto & Hikosaka, 2007)—plays an integral role in nicotine intake (Tuesta et al., 2017). Artificial GLP-1 receptor activation, via designer receptors exclusively activated by designer drugs (DREADDs) and direct infusion of ex-4 into GLP-1 circuits that project to the habenula, decreased nicotine self-administration. Thus, GLP-1 acts exclusively within the brain to reduce nicotine’s rewarding effects. Furthermore, mice lacking GLP-1 receptors showed reduced cFos reactivity in brain areas that receive projections from the habenula following nicotine administration, further highlighting GLP-1 receptor involvement in nicotine’s activation of these neurons (Tuesta et al., 2017). Overall, these studies suggest that GLP-1 receptors within both reward and aversion circuits can drive nicotine’s neurological and behavioral effects.

Orexin/hypocretin is another appetitive hormone thought to mediate nicotine’s rewarding effects. For example, rats trained to self-administer nicotine decrease their nicotine intake in a dose-dependent fashion after direct infusion of a hypocretin-1 receptor (Hcrt-1) antagonist (Hollander et al., 2008). At doses equivalent to those used to suppress nicotine self-administration, Hcrt-1 antagonism also blocked nicotine-induced lowering of brain reward thresholds in these rats, suggesting that hypocretin antagonism decreases the rewarding properties of nicotine. Importantly, this effect of hypocretin is specific to the Hcrt-1 receptor. In mice, blockade of Hcrt-1 receptors prevents reinstatement of nicotine seeking, but blockade of Hcrt-2 receptors does not (Plaza-Zabala et al., 2013).

A third commonly used method to investigate appetitive hormone modulation of nicotine’s effects is to study nicotine dependence in rodent models of diabetes. These models (see O’Dell et al., 2014) often use streptozotocin (STZ) administration to destroy the pancreatic beta cells that produce insulin, thereby decreasing insulin production and leading to consistently high blood glucose (i.e., hyperglycemia). STZ-treated rats not only respond more frequently for nicotine infusions across a 10-day span (O’Dell et al., 2014), but also display preference for nicotine-paired environments more readily than control rats (Pipkin et al., 2017). Furthermore, STZ-treated rats also show increased somatic signs during antagonist-precipitated withdrawal from nicotine (Pipkin et al., 2017). These findings indicate that diabetic individuals who smoke may be at greater risk of failed quit attempts because of enhanced withdrawal symptoms during nicotine abstinence.

Anxiolytics and Antidepressants


Benzodiazepines

Benzodiazepines are the most commonly prescribed drugs for the treatment of anxiety. Benzodiazepines bind to an allosteric site on GABAA receptors in both the central and peripheral nervous systems (Beurdeley-Thomas et al., 2000). Benzodiazepines produce their therapeutic effects through binding to GABAA receptors located on excitatory neurons in the central nervous system. Once bound, benzodiazepines increase the frequency with which chloride (Cl-) channels open when GABA is bound, thus hyperpolarizing the cell and inhibiting cell firing (Edwards & Preuss, 2020). This mechanism is responsible for the sedative and anxiolytic effects of benzodiazepines.
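The frequency-based mechanism above lends itself to a back-of-the-envelope calculation. This is a minimal sketch: the opening frequencies, open duration, and unitary current below are invented round numbers, not measured channel properties.

```python
# Toy calculation of relative Cl- charge transfer through GABAA channels
# (arbitrary units). Benzodiazepines increase the *frequency* of channel
# openings when GABA is bound, so total inhibitory charge rises even
# though each individual opening is unchanged. Numbers are illustrative.

def chloride_charge(open_freq_hz, open_duration_ms, unitary_current_pa):
    """Relative Cl- charge transferred per second."""
    return open_freq_hz * open_duration_ms * unitary_current_pa

gaba_alone = chloride_charge(open_freq_hz=10, open_duration_ms=5, unitary_current_pa=2)
with_benzo = chloride_charge(open_freq_hz=25, open_duration_ms=5, unitary_current_pa=2)
# More frequent openings -> more Cl- influx -> stronger hyperpolarization
```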

The relation between the HPA axis and GABA has been well documented in preclinical and clinical research. GABAA receptors are densely expressed throughout the PVN; thus, inhibition of the HPA axis occurs primarily through GABAergic mechanisms (Gunn et al., 2015; Miklo & Kovaks, 2002). Acute stress stimulates the production of neurosteroids from the adrenal gland, such as progesterone, deoxycorticosterone, and testosterone and their metabolites, which act as positive allosteric modulators at GABAA (Barbaccia et al., 2001; Morrow et al., 2009; Purdy et al., 1991). Additionally, neurosteroids produce anxiolytic effects in several rodent models of anxiety, further supporting the significance of GABAA-mediated inhibition of the HPA axis (Bitran et al., 1999; Carboni et al., 1996; Crawley et al., 1986; Patchev et al., 1994, 1996). Although benzodiazepines have a binding site on GABAA separate from that of endogenous neurosteroids, they act similarly in reducing HPA activity.

Like many of the drugs discussed herein, benzodiazepines can lead to dependence if taken chronically. However, abuse of benzodiazepine medications most often occurs concomitantly with other substance use disorders. For instance, benzodiazepines were involved in over 30% of opioid-related overdoses in 2017 (Tori et al., 2020). Additionally, studies show that up to 40% of people with alcoholism report having abused benzodiazepines at some point (Uchida et al., 2019). Benzodiazepines, along with opioids and alcohol, increase GABA activity in the central nervous system. One reason why the concurrent use of these drugs occurs so frequently is that they work synergistically to potentiate sedative and psychoactive effects. In other words, when a person develops tolerance to opioids or alcohol, they may then use benzodiazepines to augment the effects of these drugs.

Studies have detailed the potential for benzodiazepines to be used in the treatment of dependence on other sedative-hypnotic drugs by alleviating withdrawal symptoms and reducing the likelihood of relapse. In fact, benzodiazepines are the first line of pharmacologic treatment for the management of alcohol withdrawal and are effective in reducing the seizures and delirium tremens that can result from alcohol abstinence (Dixit et al., 2016). Not all benzodiazepines produce equivalent therapeutic effects in withdrawal states, and drugs with longer and more potent action on GABAA are more effective in attenuating severe withdrawal symptoms (Saitz & O’Malley, 1997). Because of dangerous drug interactions between benzodiazepines and other depressant drugs, these medications carry a prominent warning label. Risky side effects of long-term benzodiazepine use, including drug interactions, respiratory depression, psychomotor retardation, and memory impairments, have been well documented for decades (Longo & Johnson, 2000). For this reason, physicians are progressively leaning toward prescribing safer alternatives, such as selective serotonin reuptake inhibitors (SSRIs), for the treatment of anxiety disorders.


SSRIs are considered atypical anxiolytics because historically they were used to treat depression, but it has become apparent in recent decades that they also produce robust anxiolytic effects (Rausch et al., 2001). As their name suggests, SSRIs produce their effects by blocking the serotonin transporter on presynaptic neurons, thus preventing the reuptake of serotonin (5-HT) into the cell. This increases the availability of 5-HT in the synapse to bind postsynaptic receptors. The anxiolytic effects of SSRIs have been associated with prolonged action of 5-HT at 5-HT1A autoreceptors (Rahn et al., 2015). Eventually, chronic activation downregulates these autoreceptors, further prolonging 5-HT activity at other receptors.
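The reuptake-blockade logic described above can be illustrated with a deliberately simple, hypothetical one-compartment sketch (the rate constants below are arbitrary illustrations, not empirical values, and the model is not from the article): if synaptic 5-HT follows ds/dt = R − k·s, where R is the release rate and k the reuptake rate constant, then 5-HT settles where release balances reuptake, and reducing k (SERT blockade) raises the steady-state level.

```python
# Toy one-compartment model of synaptic 5-HT (illustrative only):
#   ds/dt = R - k*s,  R = release rate, k = reuptake (SERT) rate constant.
# An SSRI is modeled simply as a reduction in k.

def steady_state_5ht(release_rate, reuptake_k):
    """Analytic steady state where release balances reuptake: s* = R / k."""
    return release_rate / reuptake_k

def simulate_5ht(release_rate, reuptake_k, s0=0.0, dt=0.01, steps=5000):
    """Euler integration of ds/dt = R - k*s toward steady state."""
    s = s0
    for _ in range(steps):
        s += (release_rate - reuptake_k * s) * dt
    return s

baseline  = simulate_5ht(release_rate=1.0, reuptake_k=1.0)   # SERT fully active
with_ssri = simulate_5ht(release_rate=1.0, reuptake_k=0.25)  # partial SERT blockade
# Reducing the reuptake rate constant raises steady-state synaptic 5-HT,
# mirroring the increased synaptic availability described in the text.
```

The sketch captures only the immediate pharmacology; the slower autoreceptor downregulation that develops with chronic treatment is a separate adaptation not modeled here.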

Chronic administration of SSRIs is needed to produce therapeutic effects. Evidence from preclinical and clinical studies suggests that the chronic effects of SSRIs can partly be attributed to modulation of neuropeptides, such as CRH, galanin (GAL), oxytocin (OT), AVP, and neuropeptide Y (NPY; Golyszny & Obuchowicz, 2019). These neuropeptides play a role in many stress-related mood disorders (Kormos & Gaszner, 2013) and are integrated with the serotonergic system. The co-localization of CRH and 5-HT receptors on neurons in the dorsal raphe nucleus (DRN) is believed to be an important link between the pharmacological actions of SSRIs and the stress response (Valentino et al., 2009). However, the relation between CRH receptors and 5-HT activity is complex due to the competing functions of CRH1 and CRH2 receptors. Specifically, CRH1 activation inhibits the release of 5-HT, while CRH2 activation has been shown to increase serotonergic signaling. CRH1-deficient mice display reduced HPA activity, anxiety-like behavior, and cognitive function (Timpl et al., 1998). A later study linked this effect with upregulated levels of 5-HT in the raphe-hippocampal system (Peñalva et al., 2002). Conversely, selective activation of CRH2 receptors increased 5-HT levels in projection regions of the DRN in mice (Amat et al., 2004). Notably, in a study examining forced-swim-induced stress responses in mice, both receptors were implicated in increased 5-HT transmission, suggesting that CRH produces contrasting effects under basal and stress conditions (Linthorst et al., 2002). Furthermore, CRH receptors seem to be differentially activated under acute and repeated stress conditions. Initial exposure to the forced-swim test predominantly activated CRH1 receptors in mice, but CRH bound preferentially to CRH2 following repeated exposure to the paradigm (Waselus et al., 2009). This distinction in CRH receptor activation may help explain the onset of drug-seeking behavior.

One explanation for the development of substance use disorders is that, following acute stress, activation of CRH1 receptors inhibits 5-HT transmission in the DRN, evoking downstream effects that increase impulsive and drug-seeking behavior. In support of this theory, several studies have illustrated the influence of serotonergic function on substance abuse. In rats, decreased levels of 5-HT were associated with increased alcohol preference and consumption (Lemarquand et al., 1994b). Additionally, genetic deletion of CRH1, but not CRH2, in mice prevented binge-drinking behavior in the “drinking in the dark” paradigm, linking CRH1 to the initiation of alcohol abuse (Kaur et al., 2012). Impaired serotonergic function has also been identified as a risk factor for early-onset alcoholism in humans (Johnson, 2000; Lemarquand et al., 1994a). Many drugs, such as cocaine and alcohol, increase 5-HT transmission, so it is possible that some individuals seek out and consequently crave certain substances because those drugs compensate for dysfunction of the 5-HT system.

Given that impairments in serotonergic functioning are linked to drug-seeking, researchers once hypothesized that SSRIs could be effective in treating addiction, but supporting evidence has been sparse. Many studies have found no effect of SSRI treatment on drug-related behavior. In fact, one clinical study investigating treatments for drinking behavior in individuals with severe alcohol use disorder found that fluoxetine had less therapeutic effect than placebo (Kranzler et al., 2009). Thus, SSRIs are effective anxiolytics in part due to their influence on the HPA axis, but they appear to be less efficacious in altering existing drug-seeking patterns. Further investigation of drugs that augment serotonergic functioning could generate novel pharmacological interventions for substance use disorders with improved treatment outcomes.


In addition to drugs of abuse, other environmental stimuli can alter the stress response. For example, blood-borne factors that regulate food intake, metabolism, and energy storage, commonly referred to as appetitive hormones, can become dysregulated in response to stress and can influence behavior. These hormones include insulin, ghrelin, leptin, orexin, NPY, and GLP-1, among others. In addition to the well-documented activity of these hormones in the hypothalamus and peripheral target tissues, several of them (e.g., ghrelin, insulin, and GLP-1) can modulate activity in midbrain reward circuits involved in motivation and reward-seeking (Dickson et al., 2011; Engel & Jerlhag, 2014; Hayes & Schmidt, 2016). For example, subsets of GLP-1-expressing neurons in the nucleus of the solitary tract (NST), a brainstem structure that relays visceral information, project to the ventral tegmental area (VTA) and NAcc of the midbrain and, when activated, reduce food intake (Alhadeff et al., 2012; Dickson et al., 2012). Conversely, direct blockade of GLP-1 receptors in the NAcc with GLP-1 antagonists increases food intake (Dossat et al., 2011). Furthermore, direct infusion of ghrelin, an appetite-stimulating gastrointestinal hormone, into the VTA or NAcc promotes food intake (Skibicka et al., 2011). These findings are important given that the VTA and NAcc are prominent brain regions in the mesolimbic dopaminergic pathway (Feltenstein & See, 2008) and can help explain how stress may influence maladaptive dietary choices.

It is well established that stress influences dietary habits (Oliver & Wardle, 1999). However, in the wild, the acute stress response typically discourages food intake by directing energy away from the visceral organs and toward skeletal muscle and the brain to support the fight-or-flight response. One key to understanding stress and food intake is the abundant availability of palatable foods in Western culture and the way food-mediated activation of reward circuits can alleviate the psychological discomfort that accompanies HPA axis activity (Adam & Epel, 2007). Similar patterns appear in rodents: stress reduces weight gain under standard conditions but increases sucrose consumption (Dallman et al., 2003). The short-term rewarding properties of palatable food are thought to be modulated by cannabinoid and opioid receptor activity in midbrain DA reward circuits (Cota et al., 2006). However, the long-term influences of chronic stress on food intake are most likely due to HPA axis–appetitive hormone interactions affecting satiety signals, along with the short-term rewarding properties of food (Cavagnini et al., 2000). Typically, appetitive hormones like leptin send satiety signals to the brain to terminate eating; during chronic stress, however, resistance to these satiety signals contributes to metabolic diseases like obesity (Björntorp, 2001). Preclinical mouse models likewise show greater weight gain and fat mass under combined stress and high-fat diet conditions than with a high-fat diet alone (Ip et al., 2019). Additionally, signaling by NPY, an appetite-stimulating peptide, can be amplified during stress-promoted eating, an action thought to be specific to the central nucleus of the amygdala under stressful conditions (Ip et al., 2019). After progression to obesity, these signals can change how the brain responds to stress and food stimuli. For example, thalamic and hypothalamic activity in response to food and stress cues is higher among obese individuals than in lean individuals (Jastreboff et al., 2013). Furthermore, striatal activity was moderately correlated with within-session food craving in response to a stress cue in obese individuals, with no such association in lean individuals. Thus, the interplay between chronic stress, disrupted appetitive hormone signaling, a lack of dietary suppression, and the availability of rewarding foodstuffs can contribute to maladaptive food choice, increasing the likelihood of weight gain and metabolic disease.


The effects of stress on behaviors related to drug use and feeding are reflected in shared receptor signaling systems and neural circuitry. Stress increases release of CRH, which not only induces the HPA stress response but also increases opioid self-administration in humans and other animals. Similarly, because its activation promotes sleep, feeding, and fat deposition while decreasing anxiety, the endocannabinoid system has been theorized to have evolved as a compensatory system that works in opposition to the stress response (Matias & Di Marzo, 2007). Hormone systems related to feeding also affect drug use; for example, blocking GLP-1 receptors decreases stimulant self-administration. Research on the common circuits of stress and drug use not only improves basic knowledge of physiology but is also yielding new targets for the treatment of emotional and drug use disorders.