1–20 of 25 Results for: Cognitive Neuroscience

Article

Aging and Olfaction  

Richard L. Doty

Decreased ability to smell is common in older persons. Some demonstrable smell loss is present in more than 50% of those 65 to 80 years of age, with up to 10% having no smell at all (anosmia). Over the age of 80, 75% exhibit some loss, with up to 20% being totally anosmic. The causes of these decrements appear multifactorial and likely include altered intranasal airflow patterns, cumulative damage to the olfactory receptor cells from viruses and other environmental insults, decrements in mucosal metabolizing enzymes, closure of the cribriform plate foramina through which olfactory receptor cell axons project to the brain, loss of selectivity of receptor cells to odorants, and altered neurotransmission, including alterations exacerbated by some age-related neurodegenerative diseases.

Article

Behavioral, Cognitive, and Neural Mechanisms of Human Social Interaction  

Antonia F. de C. Hamilton

Social interaction is a fundamental part of what makes humans human and draws on a wide range of neural and cognitive mechanisms. This review summarizes research in terms of four suggested brain networks. First, the social perception network responds selectively to viewing and interpreting other people’s faces and bodies. Second, the theory of mind network is engaged when people think about other people’s beliefs and knowledge states. Third, the mirror neuron network has a role in understanding and imitating actions. Fourth, the emotion network shows some selective responses to emotional facial expressions and when people empathize with others’ pain. The role of these four networks in dynamic social interactions and real-world communication is also considered.

Article

Caenorhabditis elegans Learning and Memory  

James S.H. Wong and Catharine H. Rankin

The nematode Caenorhabditis elegans (C. elegans) is an organism useful for the study of learning and memory at the molecular, cellular, neural circuitry, and behavioral levels. Its genetic tractability, transparency, connectome, and accessibility for in vivo cellular and molecular analyses are a few of the characteristics that make the organism such a powerful system for investigating mechanisms of learning and memory. It is able to learn and remember across many sensory modalities, including mechanosensation, chemosensation, thermosensation, oxygen sensing, and carbon dioxide sensing. C. elegans habituates to mechanosensory stimuli and shows short-, intermediate-, and long-term memory, as well as context conditioning, for mechanosensory habituation. The organism also displays chemotaxis to various chemicals, such as diacetyl and sodium chloride. This behavior is associated with several forms of learning, including state-dependent learning, classical conditioning, and aversive learning. C. elegans also shows thermotactic learning, in which it learns to associate a particular temperature with the presence or absence of food. In addition, both oxygen preference and carbon dioxide avoidance in C. elegans can be altered by experience, indicating that the worms retain a memory of the oxygen or carbon dioxide environment in which they were reared. Many of the genes found to underlie learning and memory in C. elegans are homologous to genes involved in learning and memory in mammals; two examples are crh-1, which is the C. elegans homolog of the cAMP response element-binding protein (CREB), and glr-1, which encodes an AMPA glutamate receptor subunit. Both of these genes are involved in long-term memory for tap habituation, context conditioning in tap habituation, and chemosensory classical conditioning. C. elegans offers the advantage of having a very small nervous system (302 neurons), making it possible to understand what these conserved genes are doing at the level of single identified neurons. As many mechanisms of learning and memory in C. elegans appear to be similar in more complex organisms, including humans, research with C. elegans aids our ever-growing understanding of the fundamental mechanisms of learning and memory across the animal kingdom.

Article

Camillo Golgi  

Paolo Mazzarello

Camillo Golgi (1843–1926), a physician and researcher from Lombardy, was a leading figure in Italian science in the second half of the 19th century. His name is linked to several fundamental contributions: the invention of the “black reaction,” a method that made it possible to highlight, for the first time in history, the fine structure of the central nervous system; the discovery of the Golgi apparatus or complex, one of the fundamental components of the cell; the discovery of the perineural net (an extracellular matrix meshwork that wraps around some neurons and has important physiological functions); the identification of the Golgi tendon organ (a proprioceptor that senses tension in the muscle); and the description of the malaria plasmodium cycle in the “tertian” and “quartan” forms of the disease, with the identification of the correspondence between the multiplication of the parasite and the febrile attacks (Golgi law). These are major scientific contributions that have profoundly changed basic areas of biology and medicine. To these must be added many other minor contributions, any one of which would have secured the reputation of a researcher.

Article

Cognitive Neuroscience of Aging  

Jessica A. Bernard and Tracey H. Hicks

The cognitive neuroscience of aging is a large and diverse field seeking to understand cross-sectional differences and longitudinal changes in brain structure and function. Research in this field also investigates how brain differences or changes influence behavior in later life. There is consistent evidence that, cross-sectionally, the brain is smaller in older adults (OA) relative to younger adults (YA), and that this reflects longitudinal change over time. Furthermore, there are differences in functional activation patterns and in the functional network architecture of the aging brain, both of which may contribute to the behavioral differences experienced by OA. Most notably, the differences in functional activation patterns suggest that the aging brain may compensate for the impacts of aging in an attempt to preserve performance. As such, several frameworks for understanding the processes of aging have taken hold, resulting in testable hypotheses that link brain function and structure to behavior. Finally, in Alzheimer’s disease, cognitive neuroscience methodologies have provided additional insights into the impacts of the disease on brain structure, function, and behavior.

Article

Confidence in Decision-Making  

Megan A.K. Peters

The human brain processes noisy information to help make adaptive choices under uncertainty. Accompanying these decisions about incoming evidence is a sense of confidence: a feeling about whether a decision is correct. Confidence typically covaries with the accuracy of decisions, in that higher confidence is associated with higher decisional accuracy. In the laboratory, decision confidence is typically measured by asking participants to make judgments about stimuli or information (type 1 judgments) and then to rate their confidence on a rating scale or by engaging in wagering (type 2 judgments). The correspondence between confidence and accuracy can be quantified in a number of ways, some based on probability theory and signal detection theory. But decision confidence does not always reflect only the probability that a decision is correct; confidence can also reflect many other factors, including other estimates of noise, evidence magnitude, nearby decisions, decision time, and motor movements. Confidence is thought to be computed by a number of brain regions, most notably areas in the prefrontal cortex. And, once computed, confidence can be used to guide other processes, such as adjusting learning rates or shaping social interaction.
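One common signal-detection-based quantification is the area under the type 2 ROC curve, which measures how well trial-by-trial confidence discriminates correct from incorrect type 1 judgments. The following is a minimal sketch (Python with NumPy; the function name and example data are hypothetical), not a definitive implementation:

```python
import numpy as np

def type2_auroc(confidence, correct):
    """Area under the type 2 ROC: how well confidence ratings discriminate
    correct from incorrect type 1 decisions (0.5 = chance, 1.0 = perfect)."""
    confidence = np.asarray(confidence, dtype=float)
    correct = np.asarray(correct, dtype=bool)
    conf_correct = confidence[correct]   # confidence on correct trials
    conf_error = confidence[~correct]    # confidence on error trials
    if conf_correct.size == 0 or conf_error.size == 0:
        return np.nan
    # Probability that a randomly chosen correct trial carries higher
    # confidence than a randomly chosen error trial (ties count half).
    diffs = conf_correct[:, None] - conf_error[None, :]
    return np.mean(diffs > 0) + 0.5 * np.mean(diffs == 0)

# Hypothetical example: confidence ratings (1-4) and accuracy for ten trials.
conf = [4, 3, 2, 4, 1, 3, 2, 1, 4, 2]
acc = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
print(type2_auroc(conf, acc))
```

Higher values indicate that confidence tracks accuracy more closely; measures of this kind are one of several ways the confidence–accuracy correspondence can be summarized.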

Article

Crossmodal Plasticity, Sensory Experience, and Cognition  

Valeria Vinogradova and Velia Cardin

Crossmodal plasticity occurs when sensory regions of the brain adapt to process sensory inputs from different modalities. This is seen in cases of congenital and early deafness and blindness, where, in the absence of their typical inputs, auditory and visual cortices respond to other sensory information. Crossmodal plasticity in deaf and blind individuals impacts several cognitive processes, including working memory, attention, switching, numerical cognition, and language. Crossmodal plasticity in cognitive domains demonstrates that brain function and cognition are shaped by the interplay between structural connectivity, computational capacities, and early sensory experience.

Article

Deep Neural Networks in Computational Neuroscience  

Tim C. Kietzmann, Patrick McClure, and Nikolaus Kriegeskorte

The goal of computational neuroscience is to find mechanistic explanations of how the nervous system processes information to give rise to cognitive function and behavior. At the heart of the field are its models, that is, mathematical and computational descriptions of the system being studied, which map sensory stimuli to neural responses and/or neural to behavioral responses. These models range from simple to complex. Recently, deep neural networks (DNNs) have come to dominate several domains of artificial intelligence (AI). As the term “neural network” suggests, these models are inspired by biological brains. However, current DNNs neglect many details of biological neural networks. These simplifications contribute to their computational efficiency, enabling them to perform complex feats of intelligence, ranging from perceptual (e.g., visual object and auditory speech recognition) to cognitive tasks (e.g., machine translation), and on to motor control (e.g., playing computer games or controlling a robot arm). In addition to their ability to model complex intelligent behaviors, DNNs excel at predicting neural responses to novel sensory stimuli with accuracies well beyond any other currently available model type. DNNs can have millions of parameters, which are required to capture the domain knowledge needed for successful task performance. Contrary to the intuition that this renders them into impenetrable black boxes, the computational properties of the network units are the result of four directly manipulable elements: input statistics, network structure, functional objective, and learning algorithm. With full access to the activity and connectivity of all units, advanced visualization techniques, and analytic tools to map network representations to neural data, DNNs represent a powerful framework for building task-performing models and will drive substantial insights in computational neuroscience.
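As a minimal illustration of a model that maps sensory stimuli to neural responses, the sketch below (PyTorch; the data, layer sizes, and training setup are hypothetical and far simpler than the task-trained DNNs used in practice) fits a small feedforward network to synthetic data:

```python
import torch
import torch.nn as nn

# Hypothetical data: 1,000 stimuli described by 256 features,
# each paired with the responses of 64 recorded units (synthetic here).
stimuli = torch.randn(1000, 256)
responses = torch.randn(1000, 64)

# A small feedforward network mapping stimulus features to responses.
model = nn.Sequential(
    nn.Linear(256, 128),
    nn.ReLU(),
    nn.Linear(128, 64),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(100):
    optimizer.zero_grad()
    predicted = model(stimuli)            # predicted responses to each stimulus
    loss = loss_fn(predicted, responses)  # prediction error
    loss.backward()
    optimizer.step()

# Predictions for novel stimuli can then be compared against held-out recordings.
with torch.no_grad():
    novel = torch.randn(10, 256)
    print(model(novel).shape)             # torch.Size([10, 64])
```

The four manipulable elements named in the abstract map directly onto such a sketch: the input statistics (the stimulus set), the network structure (the layers), the functional objective (the loss), and the learning algorithm (the optimizer).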

Article

Diagnosis and Treatment of Gambling Addiction  

Gemma Mestre-Bach and Marc N. Potenza

Gambling disorder (GD) is a relatively rare psychiatric concern that may carry substantial individual, familial, and societal harms. GD often presents complex challenges, with high prevalence in adolescents and young adults. GD often co-occurs with other psychiatric disorders, complicating treatment. GD has multiple biopsychosocial contributions, with genetic, environmental, and psychological factors implicated. Advances in neuroimaging and neurochemistry offer insights into the neurobiology of GD. GD diagnostic criteria have evolved, although identification often remains challenging given shame, stigma, ambivalence regarding treatment, and limited screening. Because many people with GD do not receive treatment, identification (screening and treatment outreach) and therapeutic (behavioral, neuromodulatory, and pharmacological) approaches warrant increased consideration and development.

Article

High-Density Electrophysiological Recordings to Assess the Dynamic Properties of Attention  

Corentin Gaillard and Suliann Ben Hamed

The brain has limited processing capacities, and attentional selection processes continuously shape our perception of the world. Understanding the mechanisms underlying such covert cognitive processes requires combining psychophysical and electrophysiological investigation methods. This combination allows researchers to describe how individual neurons and neuronal populations encode attentional function. In addition, direct access to neuronal information through innovative electrophysiological approaches allows covert attention to be tracked in real time. Together, these converging approaches capture a comprehensive view of attentional function.
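As an illustration of the kind of population read-out such recordings make possible, the sketch below (Python with scikit-learn; the synthetic data and the choice of a simple logistic-regression decoder are assumptions for illustration, not the specific method of the work summarized here) decodes the attended location from trial-wise firing rates:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: 200 trials x 96 recorded channels (firing rates),
# with attention directed to one of two locations on each trial.
n_trials, n_channels = 200, 96
attended = rng.integers(0, 2, size=n_trials)      # 0 = left, 1 = right
rates = rng.normal(size=(n_trials, n_channels))
rates[:, :10] += attended[:, None] * 0.8          # channels carrying an attention signal

# Linear decoder: predict the attended location from population activity.
decoder = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(decoder, rates, attended, cv=5)
print(f"cross-validated decoding accuracy: {accuracy.mean():.2f}")
```

Applied to successive time windows rather than whole trials, the same logic underlies real-time tracking of the locus of covert attention.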

Article

Hormones and Animal Communication  

Eliot A. Brenowitz

Animals produce communication signals to attract mates and deter rivals during their breeding season. The coincidence in timing results from the modulation of signaling behavior and neural activity by sex steroid hormones associated with reproduction. Adrenal steroids can influence signaling for aggressive interactions outside the breeding season. Androgenic and estrogenic hormones act on brain circuits that regulate the motivation to produce and respond to signals, the motor production of signals, and the sensory perception of signals. Signal perception, in turn, can stimulate gonadal development.

Article

The Interaction of Perception and Memory  

Emma Megla and Wilma A. Bainbridge

Whereas visual perception is the interpretation of the light that enters the retina of the eye, long-term memory is the encoding, storage, and retrieval of perceptual experiences and learned information. Although these are separable processes, they continuously interact and influence each other. For example, the underlying perceptual features of an image produce striking consistency in whether people will remember or forget it, and the visual similarities that images share can influence how well they will be remembered. The exaggeration of visual features, such as enlarged eyes on a face, can lead to enhanced memory, and a buildup in perceptual experience can also improve memory. In addition to perception influencing memory, memory also influences perception. Familiarity with an object or object category can result in enhanced perceptual processing, or even lead to the stimuli “looking” different from how they otherwise would. Additionally, learning a new category of objects changes how we perceive its categorical members, and even members of different, related categories. Perception and memory are closely intertwined in the brain as well, with mechanisms that allow similar perceptual items to be distinguished in memory, but also support incomplete perceptual details to be filled in from memory. Additionally, there are divisions in the brain dedicated to the perceptual and mnemonic processing of different object categories, such as faces and scenes. In other words, there are widespread examples in which memory and perception influence each other, with neural mechanisms and areas set in place to deal with these complex interactions.

Article

Investigating Learning and Memory in Humans  

Evangelia G. Chrysikou, Elizabeth Espinal, and Alexandra E. Kelly

Memory refers to the set of cognitive systems, and the neural structures that support them, that allow humans to learn from experience, leverage this knowledge to understand and guide behavior in the present, and use past memories to think about and plan for the future. Neuroscience research on learning and memory has leveraged advances in behavioral methods, structural and functional brain imaging, noninvasive brain stimulation, and lesion studies to evaluate synergies and dissociations among small- and large-scale neural networks in support of memory performance. Overall, this work has converged on a conceptualization of new memories as representations of distributed patterns of neural activity across cortical and subcortical brain systems that provide neural grounding of sensorimotor and perceptual experiences, actions, thoughts, and emotions, and which can be reinstated as a result of internal or external cues. Most of this literature has supported dissociations between working and long-term memory, as well as among procedural, episodic, and semantic memories. On the other hand, progress in human neuroscience methodologies has revealed the interdependence of these memory systems in the context of complex cognitive tasks and suggests a dynamic and highly interactive neural architecture underlying human learning and memory. Future neuroscience research is anticipated to focus on understanding the neural mechanisms supporting this interactivity at the cellular and systems levels, as well as investigating the time course of their engagement.

Article

Models of Decision-Making Over Time  

Paul Cisek and David Thura

Making a good decision often takes time, and in general, taking more time improves the chances of making the right choice. During the past several decades, the process of making decisions in time has been described through a class of models in which sensory evidence about choices is accumulated until the total evidence for one of the choices reaches some threshold, at which point commitment is made and movement is initiated. Thus, if sensory evidence is weak (and noise in the signal increases the probability of an error), then it takes longer to reach that threshold than if sensory evidence is strong (thus helping filter out the noise). Crucially, the threshold can be raised to emphasize accuracy or lowered to emphasize speed. Such accumulation-to-bound models have been highly successful in explaining behavior in a very wide range of tasks, from perceptual discrimination to deliberative thinking, and in providing a mechanistic explanation for the observation that neural activity during decision-making tends to build up over time. However, like any model, they have limitations, and recent studies have motivated several important modifications to their basic assumptions. In particular, recent theoretical and experimental work suggests that the process of accumulation favors novel evidence, that the threshold decreases over time, and that these modifications yield improved decision-making in real, natural situations.
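A minimal simulation of an accumulation-to-bound process, including the kind of collapsing (time-dependent) threshold that recent work motivates, might look like the following sketch (Python; all parameter values are illustrative assumptions, not fitted to data):

```python
import numpy as np

def simulate_trial(drift, noise_sd=1.0, dt=0.001, bound0=1.0,
                   collapse_rate=0.5, max_time=3.0, rng=None):
    """Accumulate noisy evidence until it crosses a bound that collapses
    linearly over time. Returns (choice, reaction_time)."""
    rng = rng or np.random.default_rng()
    evidence, t = 0.0, 0.0
    while t < max_time:
        bound = max(bound0 - collapse_rate * t, 0.05)  # collapsing threshold
        evidence += drift * dt + rng.normal(0, noise_sd * np.sqrt(dt))
        t += dt
        if evidence >= bound:
            return 1, t      # choice 1 (correct when drift > 0)
        if evidence <= -bound:
            return 0, t      # choice 0
    return int(evidence > 0), t  # forced choice at the deadline

# Weak vs. strong evidence: stronger drift yields faster, more accurate choices.
rng = np.random.default_rng(1)
for drift in (0.5, 2.0):
    results = [simulate_trial(drift, rng=rng) for _ in range(500)]
    acc = np.mean([choice for choice, _ in results])
    rt = np.mean([t for _, t in results])
    print(f"drift={drift}: accuracy={acc:.2f}, mean RT={rt:.2f}s")
```

Raising bound0 in this sketch trades speed for accuracy, while the collapse_rate term implements the time-dependent lowering of the threshold described above.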

Article

Multisensory Integration and the Perception of Self-Motion  

Kathleen E. Cullen

As we go about our everyday activities, our brain computes accurate estimates of both our motion relative to the world, and of our orientation relative to gravity. Essential to this computation is the information provided by the vestibular system; it detects the rotational velocity and linear acceleration of our heads relative to space, making a fundamental contribution to our perception of self-motion and spatial orientation. Additionally, in everyday life, our perception of self-motion depends on the integration of both vestibular and nonvestibular cues, including visual and proprioceptive information. Furthermore, the integration of motor-related information is also required for perceptual stability, so that the brain can distinguish whether the experienced sensory inflow was a result of active self-motion through the world or was instead externally generated. To date, understanding how the brain encodes and integrates sensory cues with motor signals for the perception of self-motion during natural behaviors remains a major goal in neuroscience. Recent experiments have (i) provided new insights into the neural code used to represent sensory information in vestibular pathways, (ii) established that vestibular pathways are inherently multimodal at the earliest stages of processing, and (iii) revealed that self-motion information processing is adjusted to meet the needs of specific tasks. Our current level of understanding of how the brain integrates sensory information and motor-related signals to encode self-motion and ensure perceptual stability during everyday activities is reviewed.
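One standard way to formalize the integration of vestibular and visual cues is reliability-weighted averaging, in which each cue is weighted by the inverse of its variance. The sketch below (Python; the heading estimates and variances are hypothetical) illustrates that idea as a simplified, assumed account rather than the specific model discussed in the article:

```python
import numpy as np

def combine_cues(estimates, variances):
    """Reliability-weighted combination of independent cues:
    each cue is weighted by the inverse of its variance."""
    estimates = np.asarray(estimates, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    combined = np.sum(weights * estimates) / np.sum(weights)
    combined_var = 1.0 / np.sum(weights)
    return combined, combined_var

# Hypothetical heading estimates (degrees) from vestibular and visual cues.
heading, variance = combine_cues(estimates=[12.0, 8.0], variances=[4.0, 1.0])
print(heading, variance)  # the more reliable (visual) cue dominates the estimate
```

The combined variance is lower than either single-cue variance, which is why integrating cues improves self-motion estimates.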

Article

The Natural Scene Network  

Diane Beck and Dirk B. Walther

Interest in the neural representations of scenes centered first on the idea that the primate visual system evolved in the context of natural scene statistics, but with the advent of functional magnetic resonance imaging, interest turned to scenes as a category of visual representation distinct from that of objects, faces, or bodies. Research comparing such categories revealed a scene network comprising the parahippocampal place area, the medial place area, and the occipital place area. The network has been linked to a variety of functions, including navigation, categorization, and contextual processing. Moreover, much is known both about the visual representations of scenes within the network and about its role in, and connections to, the brain’s semantic system. To fully understand the scene network, however, more work is needed both to break it down into its constituent parts and to integrate what is known into a coherent system or systems.

Article

The Neural Basis of Behavioral Sequences in Cortical and Subcortical Circuits  

Katherine E. Conen and Theresa M. Desrochers

Sequences of actions and experiences are a central part of daily life in many species. Sequences consist of a set of ordered steps with a distinct beginning and end. They are defined by the serial order and relationships between items, though not necessarily by precise timing intervals. Sequences can be composed from a wide range of elements, including motor actions, perceptual experiences, memories, complex behaviors, or abstract goals. However, despite this variation, different types of sequences may share common features in neural coding. Examining the neural responses that support sequences is important not only for understanding the sequential behavior in daily life but also for investigating the array of diseases and disorders that impact sequential processes and the impact of therapeutics used to treat them. Research into the neural coding of sequences can be organized into the following broad categories: responses to ordinal position, coding of adjacency and inter-item relationships, boundary responses, and gestalt coding (representation of the sequence as a whole). These features of sequence coding have been linked to changes in firing rate patterns and neuronal oscillations across a range of cortical and subcortical brain areas and may be integrated in the lateral prefrontal cortex. Identification of these coding schemes has laid out an outline for understanding how sequences are represented at a neural level. Expanding from this work, future research faces fundamental questions about how these coding schemes are linked together to generate the complex range of sequential processes that influence cognition and behavior across animal species.

Article

Neural Oscillations in Audiovisual Language and Communication  

Linda Drijvers and Sara Mazzini

How do neural oscillations support human audiovisual language and communication? Considering the rhythmic nature of audiovisual language, in which stimuli from different sensory modalities unfold over time, neural oscillations represent an ideal candidate to investigate how audiovisual language is processed in the brain. Modulations of oscillatory phase and power are thought to support audiovisual language and communication in multiple ways. Neural oscillations synchronize by tracking external rhythmic stimuli or by resetting their phase at the presentation of relevant stimuli, resulting in perceptual benefits. In particular, synchronized neural oscillations have been shown to subserve the processing and the integration of auditory speech, visual speech, and hand gestures. Furthermore, synchronized oscillatory modulations have been studied and reported between brains during social interaction, suggesting that their contribution to audiovisual communication goes beyond the processing of single stimuli and applies to natural, face-to-face communication. There are still some outstanding questions that need to be answered to reach a better understanding of the neural processes supporting audiovisual language and communication. In particular, it is not entirely clear yet how the multitude of signals encountered during audiovisual communication are combined into a coherent percept and how this is affected during real-world dyadic interactions. In order to address these outstanding questions, it is essential to consider language as a multimodal phenomenon, involving the processing of multiple stimuli unfolding at different rhythms over time, and to study language in its natural context: social interaction. Other outstanding questions could be addressed by implementing novel techniques (such as rapid invisible frequency tagging, dual-electroencephalography, or multi-brain stimulation) and analysis methods (e.g., using temporal response functions) to better understand the relationship between oscillatory dynamics and efficient audiovisual communication.
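As a simple illustration of quantifying oscillatory tracking of a rhythmic stimulus, the sketch below (Python with SciPy; the simulated signals and the choice of a phase-locking-value measure are assumptions for illustration, not the specific analysis pipeline of the studies reviewed) estimates phase synchronization between a 4-Hz stimulus rhythm and a neural signal:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500                      # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)  # 10 s of simulated data

# Hypothetical signals: a 4-Hz stimulus rhythm (e.g., a speech envelope)
# and a neural signal that partially tracks it, plus noise.
stimulus = np.sin(2 * np.pi * 4 * t)
neural = 0.5 * np.sin(2 * np.pi * 4 * t + 0.4) + np.random.randn(len(t))

# Band-pass both signals around 4 Hz before extracting instantaneous phase.
b, a = butter(4, [3, 5], btype="band", fs=fs)
phase_stim = np.angle(hilbert(filtfilt(b, a, stimulus)))
phase_neur = np.angle(hilbert(filtfilt(b, a, neural)))

# Phase-locking value: 1 = perfectly consistent phase lag, 0 = no relation.
plv = np.abs(np.mean(np.exp(1j * (phase_stim - phase_neur))))
print(f"phase-locking value at 4 Hz: {plv:.2f}")
```

Measures of this kind, alongside temporal response functions, are among the tools used to relate oscillatory dynamics to the rhythms of audiovisual language.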

Article

Neural Processing of Speech Using Intracranial Electroencephalography: Sound Representations in the Auditory Cortex  

Liberty S. Hamilton

When people listen to speech and other natural sounds, their brains must take in a noisy acoustic signal and transform it into a robust mapping that eventually helps them communicate and understand the world around them. People hear what was said, who said it, and how they said it, and each of these aspects is encoded in brain activity across different auditory regions. Intracranial recordings in patients with epilepsy, also called electrocorticography or stereoelectroencephalography, have provided a unique window into understanding these processes at a high spatiotemporal resolution. These intracranial recordings are typically performed during clinical treatment for drug-resistant epilepsy or to monitor brain function during neurosurgery. The access to direct recordings of activity in the human brain is a benefit of this method, but it comes with important caveats. Research using intracranial recordings has uncovered how the brain represents acoustic information, including frequency, spectrotemporal modulations, and pitch, and how that information progresses to more complex representations, including phonological information, relative pitch, and prosody. In addition, intracranial recordings have been used to uncover the role of attention and context on top-down modification of perceptual information in the brain. Finally, research has shown both overlapping and distinct brain responses for speech and other natural sounds such as music.

Article

Phantom Limbs and Brain Plasticity in Amputees  

Tamar Makin and London Plasticity Lab

Phantom sensations are experienced by almost every person who has lost their hand in adulthood. This mysterious phenomenon spans the full range of bodily sensations, including the sense of touch, temperature, movement, and even the sense of wetness. For a majority of upper-limb amputees, these sensations are also at times unpleasant or painful, and for some even excruciating to the point of being debilitating, constituting a serious clinical problem termed phantom limb pain (PLP). Because the sensory organs (the receptors in the skin, muscles, or tendons) are physically missing, the origins of phantom sensations and pain must be sought at the level of the nervous system, and the brain in particular. This raises the question of what happens to a fully developed part of the brain that becomes functionally redundant (e.g., the sensorimotor hand area after arm amputation). Relatedly, what happens to the brain representation of a body part that becomes overused (e.g., the intact hand, on which most amputees heavily rely for completing daily tasks)? Classical studies in animals show that the brain territory in primary somatosensory cortex (S1) that was “freed up” due to input loss (hereafter deprivation) becomes activated by representations of other body parts, particularly those neighboring the deprived cortex. If neural resources in the deprived hand area are redistributed to facilitate the representation of other body parts following amputation, how does this process relate to persistent phantom sensation arising from the amputated hand? Subsequent work in humans, mostly with noninvasive neuroimaging and brain stimulation techniques, has expanded on the initial observations of cortical remapping in two important ways. First, research with humans allows us to study the perceptual consequence of remapping, particularly with regard to phantom sensations and pain. Second, by considering the various compensatory strategies amputees adopt to cope with their disability, including overuse of their intact hand and learning to use an artificial limb, use-dependent plasticity, and its relationship to deprivation-triggered plasticity, can also be studied in amputees. Both of these topics are of great clinical value, as they could inform clinicians on how to treat PLP and, in particular, how to facilitate rehabilitation and prosthesis usage. Moreover, research in humans provides new insight into the role of remapping and persistent representation in facilitating (or hindering) the realization of emerging technologies for artificial limb devices, with special emphasis on the role of embodiment. Together, this research affords a more comprehensive view of the functional consequences of cortical remapping in amputees’ primary sensorimotor cortex.