The process of brain development begins shortly after conception and in humans takes decades to complete. Indeed, it has been argued that brain development occurs over the lifespan. A complex genetic blueprint provides the intricate details of the process of brain construction. Additional operational instructions that control gene and protein expression are derived from experience, and these operational instructions allow an individual to meet and uniquely adapt to the environmental demands they face. The science of epigenetics provides an explanation of how an individual’s experience adds a layer of instruction to the existing DNA that ultimately controls the phenotypic expression of that individual and can contribute to gene and protein expression in their children, grandchildren, and ensuing generations. Experiences that contribute to alterations in gene expression include gonadal hormones, diet, toxic stress, microbiota, and positive nurturing relationships, to name but a few. There are seven phases of brain development, and each phase is defined by its timing and purpose. As the brain proceeds through these genetically predetermined steps, various experiences have the potential to alter its final form and behavioral output. Brain plasticity refers to the brain’s ability to change in response to environmental cues or demands. Sensitive periods in brain development are times during which a part of the brain is particularly malleable and dependent on the occurrence of specific experiences in order for the brain to tune its connections and optimize its function. These periods open at different time points for various brain regions, and the closing of a sensitive period is dependent on the development of inhibitory circuitry. Some experiences have negative consequences for brain development, whereas other experiences promote positive outcomes. It is the accumulation of these experiences that shapes the brain and determines the behavioral outcomes for an individual.
Robert J. McDonald and Ellen G. Fraser
One view of the organization of learning and memory functions in the mammalian brain is that there are multiple learning and memory networks that acquire and store different kinds of information. Each neural network is thought to have a central structure. The hippocampus, amygdala, perirhinal cortex, and dorsal striatum are thought to be the central structures of different learning and memory networks important for spatial/relational, emotional, visual object, and instrumental memory, respectively. These central structures are part of a complex network, including cortical and subcortical brain regions, containing areas important for sensory, motivational, modulatory, and output functions.
These networks are thought to encode and store information obtained during experiences via a general plasticity mechanism in which the relationships between synapses in these regions are changed. This view suggests that memory has a physical manifestation in the brain, which allows synapses to communicate more effectively as a result of activation. One form of synaptic plasticity, long-term potentiation (LTP), is considered a fundamental change in synaptic efficacy mediating learning and long-term memory functions.
One of the biochemical mechanisms for initiating LTP is triggered when a type of glutamate receptor, N-methyl-D-aspartate receptor (NMDAR), found in all of these memory networks is activated and various biochemical pathways that can produce long-term enhancements to the efficacy of that synapse are recruited.
NMDAR-mediated LTP processes appear to be important for learning and memory in these different networks, but there are clear differences. None of the networks requires NMDAR function during the expression of new learning. All of the networks require NMDAR function during the encoding of new information, except the network centered on the perirhinal cortex. Finally, all of the networks require NMDAR-mediated plasticity processes for the long-term consolidation of new information, except the one centered on the amygdala.
The sensation of vision arises from the detection of photons of light at the eye, but in order to produce a percept of the world, extensive regions of the brain are required to process the visual information. The majority of the information entering the brain from the eye via the optic nerve projects through the lateral geniculate nucleus (LGN) of the thalamus to the primary visual cortex, the largest visual area; along the way the input is reorganized such that each side of the brain represents the opposite side of the visual world.
Damage to the primary visual cortex in one hemisphere therefore leads to a loss of conscious vision on the opposite side of the world, known as hemianopia. Despite this cortical blindness, many patients are still able to detect visual stimuli that are presented in the blind region if forced to guess whether a stimulus is present or absent. This is known as “blindsight.” For patients to gain any information (conscious or unconscious) about the visual world, the input from the eye must be processed by the brain. Indeed, there is considerable evidence from functional brain imaging that several visual areas continue to respond to visual stimuli presented within the blind region, even when the patient is unaware of the stimulus. Furthermore, the use of diffusion imaging allows the microstructure of white matter pathways within the visual system to be examined to see whether they are damaged or intact. By comparing patients who have hemianopia with and without blindsight it is possible to determine the pathways that are linked to blindsight function. Through understanding the brain areas and pathways that underlie blindsight in humans and non-human primates, the aim is to use modern neuroscience to guide rehabilitation programs for use after stroke.
Healthy and Pathological Neurocognitive Aging: Spectral and Functional Connectivity Analyses Using Magnetoencephalography
Gianluca Susi, Jaisalmer de Frutos-Lucas, Guiomar Niso, Su Miao Ye-Chen, Luis Antón Toro, Brenda Nadia Chino Vilca, and Fernando Maestú
Oscillatory activity present in brain signals reflects the underlying time-varying electrical discharges within and between ensembles of neurons. Among the variety of non-invasive techniques available for measuring the brain’s oscillatory activity, magnetoencephalography (MEG) offers a remarkable combination of spatial and temporal resolution, and can be used in resting-state or task-based studies, depending on the goals of the experiment.
Two important kinds of analysis can be carried out with the MEG signal: spectral analysis and functional connectivity (FC) analysis. While the former provides information on the distribution of the frequency content within distinct brain areas, FC tells us about the dependence or interaction between the signals stemming from two (or among many) different brain areas.
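As an illustrative sketch only (not the authors' analysis pipeline), the two kinds of analysis can be demonstrated on synthetic data: Welch's method gives the power spectrum of one simulated "area," and magnitude-squared coherence serves as one simple frequency-resolved FC metric between two "areas." The signals, sampling rate, and parameters below are all hypothetical.

```python
import numpy as np
from scipy.signal import welch, coherence

# Two synthetic "sensor" time series sharing a 10 Hz (alpha-band)
# component, sampled at a hypothetical 600 Hz for 10 seconds.
fs = 600.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
alpha = np.sin(2 * np.pi * 10 * t)
x = alpha + 0.5 * rng.standard_normal(t.size)   # "area A"
y = alpha + 0.5 * rng.standard_normal(t.size)   # "area B"

# Spectral analysis: distribution of frequency content within one area.
f_psd, psd_x = welch(x, fs=fs, nperseg=1024)

# Functional connectivity (one simple metric): magnitude-squared
# coherence between the two areas as a function of frequency.
f_coh, coh_xy = coherence(x, y, fs=fs, nperseg=1024)

peak_freq = f_psd[np.argmax(psd_x)]
alpha_coh = coh_xy[np.argmin(np.abs(f_coh - 10.0))]
print(f"PSD peak near {peak_freq:.1f} Hz; coherence at 10 Hz = {alpha_coh:.2f}")
```

Real MEG studies would use dedicated toolboxes and many more sensors, but the distinction the paragraph draws is visible even here: the spectrum characterizes one signal, whereas coherence characterizes a relationship between two.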
The large frequency range combined with the good resolution offered by MEG enables MEG-based spectral and FC analyses to highlight distinct patterns of neurophysiological alteration during the aging process in both healthy and pathological conditions. Since disruptions in spectral content and in functional interactions between brain areas could be accounted for by early neuropathological changes, MEG could be a useful tool for unveiling the neurobiological mechanisms underlying the cognitive decline observed during aging. It is particularly suitable for detecting functional alterations, and thus for discovering potential biomarkers in the case of pathology.
The aging process is characterized by alterations in the spectral content across the brain. At the network level, FC studies reveal that older adults experience a series of changes that make them more vulnerable to cognitive interferences.
While special attention has been dedicated to the study of pathological conditions (in particular, mild cognitive impairment and Alzheimer’s disease), the lack of studies addressing the features of FC in healthy aging is noteworthy. This area of research calls for future attention because it would establish the baseline from which to draw comparisons with different pathological conditions.
Shellie-Anne T. Levy and Glenn E. Smith
Dementia, also now known as major neurocognitive disorder, is a syndrome involving decline in two or more areas of cognitive function sufficient to disrupt a person’s daily function. Mild cognitive impairment (MCI), also known as minor neurocognitive disorder, represents a syndrome on the continuum of cognitive decline at a stage prior to the development of functional deficits. It involves decline in one or more areas of cognitive function with preserved independence in instrumental activities of daily living, even though these activities may require greater effort or compensation on the part of the individual. Neuropsychological assessment of cognition and behavior provides the most powerful biomarkers for MCI and dementia syndromes associated with neurodegenerative diseases. Discrete cognitive and behavioral patterns that occur early in the course of cognitive decline aid in differential clinical diagnosis. Additionally, all diagnostic schemes for dementia syndromes include criteria that require the appraisal of functional status, which tests an individual’s capacity to engage in decision making and carry out activities of daily living independently. Methods for assessing functional status have historically had poor reliability and validity. Nevertheless, in a clinical setting, neuropsychologists rely on a combination of self-report, collateral informants, caregiver questionnaires, and objective performance-based measures to better assess functional status. Revisions to clinical criteria for dementia reflect the adoption of new research diagnostic criteria for neurodegenerative diseases, largely driven by the 2011 research criteria for Alzheimer’s disease (AD) from the National Institute on Aging (NIA) and the Alzheimer’s Association. The new approach differentiates the syndromic presentations common to most neurodegenerative diseases from their etiologies (AD, LBD, VaD, etc.) based on biomarkers.
In the preclinical stage, biomarker abnormalities are present years before clinical symptoms manifest. In the mild cognitive impairment stage, there is a report of, or concern about, cognitive change by the patient, an informant, or a clinician. There is objective cognitive decline from estimated premorbid functioning and preserved independence in functional abilities. In the dementia stage, in the context of impaired functional status, there may be prominent cognitive and behavioral symptoms that may involve impairment in memory, executive function, visuospatial functioning, and language, as well as changes in personality and behavior. The most common dementias are AD, dementia with Lewy bodies (DLB), frontotemporal dementia (FTD), and vascular dementia (VaD). All can follow a trajectory of cognitive decline similar to the aforementioned stages, and all are associated with neuropathogenic mechanisms that may or may not be distinctive for a particular syndrome. Briefly, Alzheimer’s dementia is associated with the accumulation of amyloid plaques and tau neurofibrillary tangles. Lewy body dementias (i.e., Parkinson’s disease dementia and DLB) are characterized by Lewy bodies (alpha-synuclein aggregates) and Lewy neurites in the brainstem, limbic system, and cortical regions; DLB is also associated with diffuse amyloid plaques. Frontotemporal dementia is a conglomerate of syndromes that may overlap and include behavioral variant FTD, semantic dementia, and primary progressive aphasia (PPA). FTD syndromes are marked by frontotemporal lobar degeneration (FTLD) caused by pathophysiological processes involving FTLD-tau, FTLD-TDP, FTLD-FUS, or their combination, as well as beta amyloid. Lastly, vascular dementia is associated with cerebrovascular disease that can include large artery occlusions, microinfarcts, brain hemorrhages, and silent brain infarcts; comorbid AD pathology may lower the threshold for dementia conversion.
There is an emerging shift in the field toward exploring prevention strategies for dementia. Given the lack of precision in our language regarding the distinction between dementia syndromes and etiologies, we can reallocate some of our efforts to preventing dementia more broadly rather than intervening on a particular pathology. Research already shows that many individuals have biomarker evidence of brain pathology without showing cognitive impairment, or even harbor levels of pathology sufficient to warrant a diagnosis without ever displaying the clinical syndrome of dementia. That said, building cognitive reserve or resilience through lifestyle and behavioral factors may slow the rate of cognitive decline and reduce the risk of a future dementia epidemic.
Anthony Randal McIntosh
Brain organization can be measured across multiple spatial and temporal scales where each scale affects the other in the emergent functions that are known as cognition. As a complex adaptive system, the interplay of these scales in the brain represents the information that ultimately supports what one thinks and does. The dynamics of these multiscale operations can be quantified with measures of complexity, which are sensitive to the balance between information that is coded in local cell populations and that is captured in the network interactions between populations. This local versus global balance has its foundation in the structural connectivity of the brain, which is then realized through the dynamics of cell populations and their ensuing interactions with other populations. Considering brain function and cognition in this way enables a different perspective on the changes in cognitive function in aging.
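One widely used family of complexity measures is sample entropy, which indexes the irregularity of a signal; choosing it here is an assumption for illustration only, not necessarily the specific metric used in the studies discussed. A minimal sketch: a highly regular signal (a sine wave) should yield lower sample entropy than an irregular one (white noise).

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy: -ln(A/B), where B counts template matches of
    length m and A matches of length m + 1, within tolerance r
    (Chebyshev distance). Simple O(n^2) reference implementation."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    n = x.size

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(n - mm)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to all later templates.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
white = rng.standard_normal(1000)                 # irregular signal
sine = np.sin(np.linspace(0, 20 * np.pi, 1000))   # highly regular signal
print(sample_entropy(white), sample_entropy(sine))
```

Extending such a measure across timescales (by coarse-graining the signal before computing entropy at each scale) is one way the scale-dependent, local-versus-global effects described above can be quantified.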
Changes in brain signal complexity from childhood to adulthood were assessed in two independent studies. Both showed that maturation is accompanied by an overall increase in signal complexity, which also correlated with more stable and accurate cognitive performance. There was some suggestion that the maximal change occurs in medial posterior cortical areas, which have been considered “network hubs” of the brain. Extending this work to healthy aging, a scale-dependent change in brain complexity was observed across three independent studies. Healthy aging brings a shift in the local versus global balance, with more information coded in local dynamics and less in global interactions. This balance is associated with better cognitive performance and, interestingly, with a more active lifestyle. It also seems that the lack of this shift in the local versus global balance is predictive of worse cognitive performance and potentially of additional decline indicative of dementia.
Nadeeka N. Dissanayaka
Progressive neurological disorders are incurable disorders involving gradual deterioration that affect patients for life. Two common progressive neurological disorders of late life are Parkinson’s disease (PD) and motor neuron disease (MND). Psychological complications such as depression and anxiety are prevalent in people living with PD and MND, yet they are underdiagnosed and poorly treated.
PD is classified as a movement disorder and is predominantly characterized by motor symptoms such as tremor, bradykinesia, gait problems, and postural instability; however, neuropsychiatric complications such as anxiety and depression are common and detract from quality of life, even more so than motor disability. The average prevalence of depression in PD is approximately 35%, and that of anxiety approximately 31%. Depression and anxiety often coexist. Symptoms of depression and anxiety overlap with symptoms of PD, making them difficult to recognize. In PD, daily fluctuations in anxiety and mood disturbances are observed, with clear synchronized relationships to the wearing off of PD medication in some individuals. Such unique characteristics must be addressed when treating PD depression and anxiety. There is a growing evidence base for psychotherapeutic approaches, such as cognitive behavior therapy, to treat depression and anxiety in PD.
Motor neuron disease (MND) is classified as a neuromuscular disease; progressive degeneration of upper and lower motor neurons is its primary characteristic. The most common form of MND is amyotrophic lateral sclerosis (ALS), and the terms ALS and MND are used interchangeably in the literature. Given the short life expectancy (on average 4 years), rapid deterioration, paralysis, nonmotor dysfunctions, and resulting incapacity, psychological factors clearly play a major role in MND. Depression and suicide are common psychological concerns in persons with MND. While there is an ALS-specific instrument to assess depression, the evaluation of anxiety is poorly studied, although emerging studies suggest that anxiety is highly prevalent in MND. Unfortunately, there is no substantial evidence base for the treatment of anxiety and depression in MND.
Caregivers play a major role in the management of progressive neurological diseases. Therefore, evaluating caregiver burden and caregiver psychological health is essential to improve the quality of care provided to the patient, as well as to improve quality of life for carers. In progressive neurological diseases, caregiving is often provided by family members and spouses, with professional care introduced at advanced stages of disease. Psychological interventions for PD carers that address the unique characteristics of PD and its care needs are required. The heterogeneous clinical features, rapid functional decline, and short trajectory of MND call for a multidisciplinary framework of carer services, including psychological interventions, to mitigate its impact. A Supportive Care Needs Framework has recently been proposed encompassing the practical, informational, social, psychological, physical, emotional, and spiritual needs of both MND patients and carers.
David Bunce and Sarah Bauermeister
Intraindividual variability in the present context refers to the moment-to-moment variation in attentional or executive engagement over a given time period. Typically, it is measured using the response latencies collected across the trials of a behavioral neurocognitive task. In aging research, the measure has attracted considerable recent interest, as it may provide important insights into age-related cognitive decline and neuropathology, as well as having potential as a neurocognitive assessment tool in healthcare settings. In the present chapter, we begin by reviewing the key empirical findings relating to age and intraindividual variability. Here, research shows that intraindividual variability increases with age and predicts a range of age-related outcomes including gait impairment, falls and errors more broadly, mild cognitive impairment, dementia, and mortality. Brain imaging research suggests that greater variability is associated with age-related or neuropathological changes to a frontal–cingulate–parietal network and that white matter compromise and dopamine depletion may be key underlying mechanisms. We then consider the cognitive and neurobiological theoretical underpinnings of the construct before providing a description of the various methods and metrics that have been used to compute measures of variability: reaction time cut-offs, raw and residualized intraindividual standard deviations, the coefficient of variation, ex-Gaussian curve fitting, and fast Fourier transformation. A further section considers the range of neurocognitive tasks that have been used to assess intraindividual variability. Broadly, these tasks can be classified on a continuum of cognitive demands as psychomotor, executive control, or higher-order cognitive tasks (e.g., episodic memory). Finally, we provide some pointers concerning the pressing issues that future research needs to address in the area.
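To make two of the simpler metrics concrete, the sketch below computes a raw intraindividual standard deviation (ISD) and a coefficient of variation from hypothetical reaction-time data, after simple cut-off trimming. The data, cut-off values, and distribution parameters are illustrative assumptions, not values from the chapter.

```python
import numpy as np

# Hypothetical reaction times (ms) across 200 trials of a choice RT task.
rng = np.random.default_rng(1)
rts = rng.normal(450, 60, size=200)

# Reaction time cut-offs: trim implausible latencies
# (illustrative bounds of 150-1500 ms).
rts = rts[(rts > 150) & (rts < 1500)]

# Raw intraindividual standard deviation (ISD) across trials.
isd = rts.std(ddof=1)

# Coefficient of variation: ISD scaled by mean latency, so that slower
# responders are not penalized for proportionally larger spread.
cv = isd / rts.mean()

print(f"mean RT = {rts.mean():.0f} ms, ISD = {isd:.1f} ms, CV = {cv:.3f}")
```

A residualized ISD, by contrast, would first regress out systematic trends (e.g., practice effects across trials or group mean differences) and compute the standard deviation of the residuals; the ex-Gaussian and Fourier approaches model the shape and temporal structure of the latency distribution rather than its spread alone.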
We conclude that the existing body of theoretical and empirical work underlines the potential of intraindividual reaction time variability measures as additions to the neuropsychological test batteries that are used in the early detection of a range of age-related neurocognitive disorders in healthcare settings.
David J. Madden and Zachary A. Monge
Age-related decline occurs in several aspects of fluid, speed-dependent cognition, particularly those related to attention. Empirical research on visual attention has determined that attention-related effects occur across a range of information-processing components, including the sensory registration of features, the selection of information from working memory, the control of motor responses, and the coordination of multiple perceptual and cognitive tasks. Thus, attention is a multifaceted construct that is relevant at virtually all stages of object identification. A fundamental theme of attentional functioning is the interaction between the bottom-up salience of visual features and the top-down allocation of processing based on the observer’s goals. An underlying age-related slowing is prominent throughout visual processing stages, which in turn contributes to age-related decline in some aspects of attention, such as the inhibition of irrelevant information and the coordination of multiple tasks. However, some age-related preservation of attentional functioning is also evident, particularly in the top-down allocation of attention. Neuroimaging research has identified networks of frontal and parietal brain regions relevant for top-down and bottom-up attentional processing. Disconnection among these networks contributes to age-related decline in attention, but preservation, and perhaps even increased patterns, of functional brain activation and connectivity also contribute to preserved attentional functioning.
Benjamin Boller and Sylvie Belleville
Individuals with mild cognitive impairment (MCI) experience cognitive difficulties and many find themselves in a transitional stage between aging and dementia, making this population a suitable target for cognitive intervention. In MCI, not all cognitive functions are impaired and preserved functions can thus be recruited to compensate for the impact of cognitive impairment. Improving cognition may have a tremendous impact on quality of life and help delay the loss of autonomy that comes with dementia. Several studies have reported evidence of cognitive benefits following cognitive intervention in individuals with MCI. Studies that relied on training memory and attentional control have provided the most consistent evidence for cognitive gains. A few studies have investigated the neurophysiological processes by which these training effects occur. More research is needed to draw clear conclusions on the type of brain processes that are engaged in cognitive training and there are insufficient findings regarding transfer to activities of daily life. Results from recent studies using new technologies such as virtual reality provide encouraging evidence of transfer effects to real-life situations.