Neural Mechanisms of Tactile Texture Perception
Justin D. Lieber and Sliman J. Bensmaia
The ability to identify tactile objects depends in part on the perception of their surface microstructure and material properties. Texture perception can, to a first approximation, be described by a number of nameable perceptual axes, such as rough/smooth, hard/soft, sticky/slippery, and warm/cool, which exist within a complex perceptual space. The perception of texture relies on two different neural streams of information: coarser features, measured in millimeters, are primarily encoded by spatial patterns of activity across one population of tactile nerve fibers, while finer features, down to the micron level, are encoded by finely timed temporal patterns within two other populations of afferents. These two streams of information ascend the somatosensory neuraxis and are eventually combined and further elaborated in the cortex to yield a high-dimensional representation that accounts for our exquisite and stable perception of texture.
Neural Mechanisms of Tinnitus
Adam Hockley and Susan E. Shore
Tinnitus is the perception of sound that is independent of an external stimulus. Although the word tinnitus derives from tinnire, the Latin verb for "to ring," it can also present as buzzing, hissing, or clicking. Tinnitus is generated centrally in the auditory pathway; however, the neural mechanisms underlying this generation have been disputed for decades. Although it is well accepted that tinnitus is produced by damage to the auditory system by exposure to loud sounds, the level of damage required and how this damage results in tinnitus are unclear. Neural recordings in the auditory brainstem, midbrain, and forebrain of animal models of tinnitus have revealed increased spontaneous firing rates, which could be perceived as sound. There are many proposed mechanisms for how this increase is produced, including spike-timing-dependent plasticity, homeostatic plasticity, central gain, reduced inhibition, thalamocortical dysrhythmia, and increased inflammation. Animal studies are highly useful for testing these potential mechanisms because the noise damage can be carefully titrated and recordings can be made directly from neural populations of interest. These studies have advanced the field greatly; however, a limitation is that the models used for tinnitus induction and quantification are not well standardized, which may explain some of the variability seen across studies. Human studies use patients with tinnitus (but an unknown level of cochlear damage) to probe neural mechanisms of tinnitus. They use noninvasive methods, often recording gross evoked potentials, oscillations, or imaging brain activity, to determine whether tinnitus sufferers show altered processing of sounds or silence. These studies have also revealed putative neural mechanisms of tinnitus, such as increased delta- or gamma-band cortical activity, altered Bayesian prediction of incoming sound, and changes to limbic system activity.
Translation between animal and human studies has allowed some neural correlates of tinnitus to become more widely accepted, which has in turn allowed deeper research into the underlying mechanism of the correlates. As the understanding of neural mechanisms of tinnitus grows, the potential for treatments is also improved, with the ultimate goal being a true treatment for tinnitus perception.
Neural Population Coding of Natural Sounds in Non-flying Mammals
Understanding the principles by which sensory systems represent natural stimuli is one of the holy grails of neuroscience. In the auditory system, the study of the coding of natural sounds has a particular prominence. Indeed, the relationship between neural responses to the simple stimuli (usually pure tone bursts) often used to characterize auditory neurons and responses to complex sounds, natural sounds in particular, can be far from straightforward. Many different classes of natural sounds have been used to study the auditory system. Sound families that researchers have used to good effect in this endeavor include human speech, species-specific vocalizations, an "acoustic biotope" selected in one way or another, and sets of artificial sounds that mimic important features of natural sounds. Peripheral and brainstem representations of natural sounds are relatively well understood. The properties of the peripheral auditory system play a dominant role, and further processing occurs mostly within the frequency channels determined by these properties. At the level of the inferior colliculus, the highest brainstem station, representational complexity increases substantially due to the convergence of multiple processing streams. Undoubtedly, the most explored part of the auditory system, in terms of responses to natural sounds, is the primary auditory cortex. In spite of over 50 years of research, there is still no commonly accepted view of the nature of the population code for natural sounds in the auditory cortex. Neurons in the auditory cortex are believed by some to be primarily linear spectro-temporal filters, by others to respond to conjunctions of important sound features, or even to encode perceptual concepts such as "auditory objects." Whatever the exact mechanism is, many studies consistently report a substantial increase in the variability of the response patterns of cortical neurons to natural sounds.
The generation of such variation may be the main contribution of auditory cortex to the coding of natural sounds.
Neural Processing of Pain and Itch
Taylor Follansbee, Mirela Iodi Carstens, and E. Carstens
Pain is defined as “An unpleasant sensory and emotional experience associated with, or resembling that associated with, actual or potential tissue damage,” while itch can be defined as “an unpleasant sensation that evokes the desire to scratch.” These sensations are normally elicited by noxious or pruritic stimuli that excite peripheral sensory neurons connected to spinal circuits and ascending pathways involved in sensory discrimination, emotional aversiveness, and respective motor responses. Specialized molecular receptors expressed by cutaneous nerve endings transduce stimuli into action potentials conducted by C- and Aδ-fiber nociceptors and pruriceptors into the outer lamina of the dorsal horn of the spinal cord. Here, neurons selectively activated by nociceptors, or by convergent input from nociceptors, pruriceptors, and often mechanoreceptors, transmit signals to ascending spinothalamic and spinoparabrachial pathways. The spinal circuitry for itch requires interneurons expressing gastrin-releasing peptide and its receptor, while spinal pain circuitry involves other excitatory neuropeptides; both itch and pain are transmitted by ascending pathways that express the receptor for substance P. Spinal itch- and pain-transmitting circuitry is segmentally modulated by inhibitory interneurons expressing dynorphin, GABA, and glycine, which mediate the antinociceptive and antipruritic effects of noxious counterstimulation. Spinal circuits are also under descending modulation from the brainstem rostral ventromedial medulla. Opioids like morphine inhibit spinal pain-transmitting circuits segmentally and via descending inhibitory pathways, while having the opposite effect on itch. The supraspinal targets of ascending pain and itch pathways exhibit extensive overlap and include the somatosensory thalamus, parabrachial nucleus, amygdala, periaqueductal gray, and somatosensory, anterior cingulate, insular, and supplementary motor cortical areas. 
Following tissue injury, enhanced pain is evoked near the injury (primary hyperalgesia) due to release of inflammatory mediators that sensitize nociceptors. Within a larger surrounding area of secondary hyperalgesia, innocuous mechanical stimuli elicit pain (allodynia) due to central sensitization of pain pathways. Pruriceptors can also become sensitized in pathophysiological conditions, such as dermatitis. Under chronic itch conditions, low-threshold tactile stimulation can elicit itch (alloknesis), presumably due to central sensitization of itch pathways, although this has not been extensively studied. There is considerable overlap in pain- and itch-signaling pathways and it remains unclear how these sensations are discriminated. Specificity theory states that itch and pain are separate sensations with their own distinct pathways (“labeled lines”). Selectivity theory is similar but incorporates the observation that pruriceptive neurons are also excited by algogenic stimuli that inhibit spinal itch transmission. In contrast, intensity theory states that itch is signaled by low firing rates, and pain by high firing rates, in a common sensory pathway. Finally, the spatial contrast theory proposes that itch is elicited by focal activation of a few nociceptors while activation of more nociceptors over a larger area elicits pain. There is evidence supporting each theory, and it remains to be determined how the nervous system distinguishes between pain and itch.
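As a purely illustrative aside, the contrast between the intensity and spatial contrast theories can be caricatured as two toy decoders (the rate threshold and nociceptor counts below are invented for illustration and have no physiological standing):

```python
def intensity_theory(firing_rate_hz, threshold_hz=20.0):
    """Intensity theory: itch and pain share one common pathway; low
    firing rates signal itch, high firing rates signal pain."""
    return "pain" if firing_rate_hz >= threshold_hz else "itch"

def spatial_contrast_theory(n_active_nociceptors, focal_max=5):
    """Spatial contrast theory: focal activation of a few nociceptors
    signals itch; activation of many over a larger area signals pain."""
    return "itch" if n_active_nociceptors <= focal_max else "pain"

# Under intensity coding the percept depends only on rate; under
# spatial contrast coding, only on the spatial extent of activation.
print(intensity_theory(5.0))          # low rate -> itch
print(intensity_theory(50.0))         # high rate -> pain
print(spatial_contrast_theory(3))     # focal -> itch
print(spatial_contrast_theory(200))   # widespread -> pain
```

Under specificity or selectivity theory, by contrast, the percept would be read out from which labeled line is active, not from rate or extent.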
Neural Processing of Speech Using Intracranial Electroencephalography: Sound Representations in the Auditory Cortex
Liberty S. Hamilton
When people listen to speech and other natural sounds, their brains must take in a noisy acoustic signal and transform it into a robust mapping that eventually helps them communicate and understand the world around them. People hear what was said, who said it, and how they said it, and each of these aspects is encoded in brain activity across different auditory regions. Intracranial recordings in patients with epilepsy, also called electrocorticography or stereoelectroencephalography, have provided a unique window into understanding these processes at a high spatiotemporal resolution. These intracranial recordings are typically performed during clinical treatment for drug-resistant epilepsy or to monitor brain function during neurosurgery. The access to direct recordings of activity in the human brain is a benefit of this method, but it comes with important caveats. Research using intracranial recordings has uncovered how the brain represents acoustic information, including frequency, spectrotemporal modulations, and pitch, and how that information progresses to more complex representations, including phonological information, relative pitch, and prosody. In addition, intracranial recordings have been used to uncover the role of attention and context on top-down modification of perceptual information in the brain. Finally, research has shown both overlapping and distinct brain responses for speech and other natural sounds such as music.
Neural Processing of Taste Information
Alfredo Fontanini and Lindsey Czarnecki
The gustatory system has evolved to detect molecules dissolved in the saliva. It is responsible for the perception of taste and flavor, for mediating the interaction between perception and internal homeostatic states, and for driving ingestive decisions. The widely recognized five basic taste categories (sweet, salty, bitter, sour, and umami) provide information about the nutritional or potentially harmful content of what is being consumed. Sweetness is typical of carbohydrate-dense sugars; saltiness is the percept of ions, which are necessary for physiological function and electrolytic homeostasis; bitterness is associated with alkaloids and other potential toxins; sourness is the percept of acidity signaling spoiling foods; and umami is the sensation associated with amino acids in protein-rich foods. In addition to taste, the act of eating also engages sensations of temperature, texture, and odor; the integration of all these sensations leads to the unitary percept of flavor. These same senses, and others such as vision and audition, are also engaged before an ingestive event. Sights, sounds, and smells can alert organisms to the presence of food as well as inform the organism as to which taste(s) to expect. As such, the neurophysiology of taste is necessarily intertwined with that of other senses and with that of cognitive and homeostatic systems.
Neurobiology of Auditory Hallucinations
Judith M. Ford, Holly K. Hamilton, and Alison Boos
Auditory verbal hallucinations (AVH), also referred to as "hearing voices," are vivid perceptions of speech that occur in the absence of any corresponding external stimulus but seem very real to the voice hearer. They are experienced by the majority of people with schizophrenia, less frequently in other psychiatric and neurological conditions, and are relatively rare in the general population. Because antipsychotic medications are not always successful in reducing the severity or frequency of AVH, a better understanding is needed of their neurobiological basis, which may ultimately lead to more precise treatment targets. What voices say and how the voices sound, or their phenomenology, varies widely within and across groups of people who hear them. In help-seeking populations, such as people with schizophrenia, the voices tend to be threatening and menacing, typically spoken in a non-self-voice, often commenting and sometimes commanding the voice hearers to do things they would not otherwise do. In psychotic populations, voices differ from normal inner speech by being unbidden and unintended, co-opting the voice hearer's attention. In healthy voice-hearing populations, voices are typically neither distressing nor disabling, and are sometimes comforting and reassuring. Regardless of content and valence, voices tend to activate some speech and language areas of the brain. Efforts to silence these brain areas with neurostimulation have had mixed success in reducing the frequency and salience of voices. Progress with this treatment approach would likely benefit from more precise anatomical targets and more precisely dosed neurostimulation. Neural mechanisms that may underpin the experience of voices are being actively investigated and include mechanisms enabling context-based predictions and distinctions between experiences coming from self and other.
Both these mechanisms can be studied in non-human animal “models” and both can provide new anatomical targets for neurostimulation.
Neurobiology of Obstructive Sleep Apnea
Steven Holfinger, M. Melanie Lyons, Nitin Bhatt, and Ulysses Magalang
Obstructive sleep apnea is recognized as a heterogeneous disease presenting with varying underlying risk factors, phenotypes, and responses to therapy. This clinical variance is in part due to the complex pathophysiology of sleep apnea. While multiple anatomical issues can predispose to the development of sleep apnea, factors that control the airway musculature also contribute via different pathophysiologic mechanisms. As sleep apnea does not occur during wakefulness, the impact of sleep stages on respiration is of critical importance. Altogether, understanding sleep apnea pathophysiology helps to guide current treatment modalities and helps identify potential targets for future therapies.
Neuroinflammation and Neuroplasticity in Pain
Much progress has been made in unraveling the mechanisms that underlie the transition from acute to chronic pain. Traditional beliefs are being replaced by novel, more powerful concepts that consider the mutual interplay of neuronal and non-neuronal cells in the nervous system during the pathogenesis of chronic pain. The new focus is on the role of neuroinflammation for neuroplasticity in nociceptive pathways and for the generation, amplification, and mislocation of pain. The latest insights are reviewed here and provide a basis for understanding the interdependence of chronic pain and its comorbidities. The new concepts will guide the search for future therapies to prevent and reverse chronic pain. Long-term changes in the properties and functions of nerve cells, including changes in synaptic strength, membrane excitability, and the effects of inhibitory neurotransmitters, can result from a wide variety of conditions. In the nociceptive system, painful stimuli, peripheral inflammation, nerve injuries, the use of or withdrawal from opioids—all can lead to enhanced pain sensitivity, to the generation of pain, and/or to the spread of pain to unaffected sites of the body. Non-neuronal cells, especially microglia and astrocytes, contribute to changes in nociceptive processing. Recent studies revealed not only that glial cells support neuroplasticity but also that their activation can trigger long-term changes in the nociceptive system.
Nociceptors and Chronic Pain
Edgar T. Walters
Chronic pain lasting months or longer is very common, poorly treated, and sometimes devastating. Nociceptors are sensory neurons that usually are silent unless activated by tissue damage or inflammation. In humans their peripheral activation evokes conscious pain, and their spontaneous activity is highly correlated with spontaneous pain. Persistently hyperactive nociceptors mediate increased responses to normally painful stimuli (hyperalgesia) in chronic conditions and promote the sensitization of central pain pathways that allows low-threshold mechanoreceptors to elicit painful responses to innocuous stimuli (allodynia). Investigations of rodent models of neuropathic pain and hyperalgesic priming have revealed many alterations in nociceptors and associated cells that are implicated in the development and maintenance of chronic pain. These include chronic nociceptor hyperexcitability and spontaneous activity, sprouting, synaptic plasticity, changes in intracellular signaling, and modified responses to opioids, along with alterations in the expression and translation of thousands of genes in nociceptors and closely linked cells.
Daniel W. Wesson, Sang Eun Ryu, and Hillary L. Cansler
The perception of odors exerts powerful influences on moods, decisions, and actions. Indeed, odor perception is a major driving force underlying some of the most important human behaviors. How is it that the simple inhalation of airborne molecules can exert such strong effects on complex aspects of human functions? Certainly, just as in the case of vision and audition, the perception of odors is dictated by the ability to transduce environmental information into an electrical "code" for the brain to use. However, the use of that information, including whether or not the information is used at all, is governed strongly by many emotional and cognitive factors, including learning and experience, as well as states of arousal and attention. Understanding how these factors regulate both the perception of odors and the way an individual responds to those percepts is paramount for appreciating the orchestration of behavior.
Pain and Its Modulation
Sensory perceptions are inherently subjective, being influenced by factors such as expectation, attention, affect, and past experiences. Nowhere is this more commonly experienced than with the perception of pain, whose perceived intensity and emotional impact can fluctuate rapidly. The perception of pain in response to the same nociceptive signal can also vary substantially between individuals. Pain is not only a sensory experience. It also involves profound affective and cognitive dimensions, reflecting the activation of and interactions among multiple brain regions. The modulation of pain perception by such interactions has been most extensively characterized in the context of the “descending pain modulatory system.” This system includes a variety of pathways that directly or indirectly modulate the activity of neurons in the spinal dorsal horn, the second-order neurons that receive inputs directly from nociceptors. Less understood are the interactions among brain regions that modulate the affective and cognitive aspects of pain perception. Emerging data suggest that certain pain conditions result from dysfunction in pain modulation, suggesting that targeting these dysfunctions might have therapeutic value. Some therapies that are thought to target pain modulation pathways—such as cognitive behavior therapy, mindfulness-based stress reduction, and placebo analgesia—are safer and less expensive than pharmacologic or surgical approaches, further emphasizing the importance of understanding these modulatory mechanisms. Understanding the mechanisms through which pain modulation functions may also illuminate fundamental mechanisms of perception and consciousness.
Phantom Limbs and Brain Plasticity in Amputees
Tamar Makin and London Plasticity Lab
Phantom sensations are experienced by almost every person who has lost their hand in adulthood. This mysterious phenomenon spans the full range of bodily sensations, including the sense of touch, temperature, movement, and even the sense of wetness. For a majority of upper-limb amputees, these sensations will also be at times unpleasant, painful, and for some even excruciating to the point of debilitating, causing a serious clinical problem, termed phantom limb pain (PLP). Because the sensory organs (the receptors in the skin, muscles, or tendons) are physically missing, the origins of phantom sensations and pain must be sought at the level of the nervous system, and the brain in particular. This raises the question of what happens to a fully developed part of the brain that becomes functionally redundant (e.g., the sensorimotor hand area after arm amputation). Relatedly, what happens to the brain representation of a body part that becomes overused (e.g., the intact hand, on which most amputees heavily rely for completing daily tasks)? Classical studies in animals show that the brain territory in primary somatosensory cortex (S1) that was "freed up" due to input loss (hereafter deprivation) becomes activated by other body part representations, those neighboring the deprived cortex. If neural resources in the deprived hand area get redistributed to facilitate the representation of other body parts following amputation, how does this process relate to persistent phantom sensation arising from the amputated hand? Subsequent work in humans, mostly with noninvasive neuroimaging and brain stimulation techniques, has expanded on the initial observations of cortical remapping in two important ways. First, research with humans allows us to study the perceptual consequence of remapping, particularly with regard to phantom sensations and pain.
Second, by considering the various compensatory strategies amputees adopt in order to account for their disability, including overuse of their intact hand and learning to use an artificial limb, use-dependent plasticity can also be studied in amputees, as well as its relationship to deprivation-triggered plasticity. Both of these topics are of great clinical value, as they could inform clinicians on how to treat PLP and how to facilitate rehabilitation, particularly prosthesis usage. Moreover, research in humans provides new insight into the role of remapping and persistent representation in facilitating (or hindering) the realization of emerging technologies for artificial limb devices, with special emphasis on the role of embodiment. Together, this research affords a more comprehensive view of the functional consequences of cortical remapping in amputees' primary sensorimotor cortex.
Physiology of Color Vision in Primates
Color perception in macaque monkeys and humans depends on the visually evoked activity in three cone photoreceptors and on neuronal post-processing of cone signals. Neuronal post-processing of cone signals occurs in two stages in the pathway from retina to the primary visual cortex. The first stage, in P (midget) ganglion cells in the retina, is a single-opponent subtractive comparison of the cone signals. The single-opponent computation is then sent to neurons in the parvocellular layers of the lateral geniculate nucleus (LGN), the main visual nucleus of the thalamus. The second stage of processing of color-related signals is in the primary visual cortex, V1, where multiple comparisons of the single-opponent signals are made. The diversity of neuronal interactions in V1 causes the cortical color cells to be subdivided into classes of single-opponent cells and double-opponent cells. Double-opponent cells have visual properties that can be used to explain most of the phenomenology of color perception of surface colors; they respond best to color edges and spatial patterns of color. Single-opponent cells, in retina, LGN, and V1, respond to color modulation over their receptive fields, responding best to color modulation over a large area of the visual field.
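The two-stage opponent computation described above can be sketched numerically. This is a deliberately minimal caricature (the cone activation values and the pure L − M weighting are invented for illustration; real P-cell receptive fields weight the cones in a center-surround arrangement):

```python
# Hypothetical cone activations (L, M, S) for a reddish patch and a
# greenish patch on either side of a color edge (values invented).
reddish = (0.8, 0.3, 0.1)
greenish = (0.3, 0.8, 0.1)

def single_opponent(cone):
    """Stage 1: subtractive L - M comparison of cone signals, as in
    P (midget) ganglion cells and parvocellular LGN neurons."""
    L, M, _S = cone
    return L - M

# Single-opponent cells signal chromatic modulation over their whole
# receptive field.
so_red = single_opponent(reddish)     # positive: "redness" signal
so_green = single_opponent(greenish)  # negative: "greenness" signal

# Stage 2 (in V1): a double-opponent cell compares single-opponent
# signals across space, so it responds strongly at the color edge but
# not within a uniformly colored region.
edge_response = so_red - so_green      # large at the red/green border
uniform_response = so_red - so_red     # zero inside the patch
```

The spatial comparison in the second stage is what makes double-opponent cells edge detectors for color, consistent with their proposed role in the perception of surface colors.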
Plasticity of Information Processing in the Auditory System
Andrew J. King
Information processing in the auditory system shows considerable adaptive plasticity across different timescales. This ranges from very rapid changes in neuronal response properties—on the order of hundreds of milliseconds when the statistics of sounds vary or seconds to minutes when their behavioral relevance is altered—to more gradual changes that are shaped by experience and learning. Many aspects of auditory processing and perception are sculpted by sensory experience during sensitive or critical periods of development. This developmental plasticity underpins the acquisition of language and musical skills, matches neural representations in the brain to the statistics of the acoustic environment, and enables the neural circuits underlying the ability to localize sound to be calibrated by the acoustic consequences of growth-related changes in the anatomy of the body. Although the length of these critical periods depends on the aspect of auditory processing under consideration, varies across species and brain level, and may be extended by experience and other factors, it is generally accepted that the potential for plasticity declines with age. Nevertheless, a substantial degree of plasticity is exhibited in adulthood. This is important for the acquisition of new perceptual skills; facilitates improvements in the detection or discrimination of fine differences in sound properties; and enables the brain to compensate for changes in inputs, including those resulting from hearing loss. In contrast to the plasticity that shapes the developing brain, perceptual learning normally requires the sound attribute in question to be behaviorally relevant and is driven by practice or training on specific tasks. 
Progress has recently been made in identifying the brain circuits involved and the role of neuromodulators in controlling plasticity, and an understanding of plasticity in the central auditory system is playing an increasingly important role in the treatment of hearing disorders.
The Processing of Hydrodynamic Stimuli With the Fish Lateral Line System
All fish have a mechanosensory lateral line system for the detection of hydrodynamic stimuli. It is thus not surprising that the lateral line system is involved in numerous behaviors, including obstacle avoidance, localization of predators and prey, social communication, and orientation in laminar and turbulent flows. The sensory units of the lateral line system are the neuromasts, which occur freestanding on the skin (superficial neuromasts) and within subdermal canals (canal neuromasts). The canals are in contact with the surrounding water through a series of canal pores. Neuromasts consist of a patch of sensory hair cells covered by a gelatinous cupula. Water flow causes cupula motion, which in turn leads to a change in the hair cells’ receptor potentials and a subsequent change in the firing rate of the innervating afferent nerve fibers. These fibers encode velocity, direction, and vorticity of water motions by means of spike trains. They project predominantly to lateral line neurons in the brainstem for further processing of the received hydrodynamic signals. From the brainstem, lateral line information is transferred to the cerebellum and to midbrain and forebrain nuclei, where lateral line information is integrated with information from other sensory modalities to create a three-dimensional image of the hydrodynamic world surrounding the animal. For fish to determine spatial location and identity of a wave source as well as direction and velocity of water movements, the lateral line system must analyze the various types of hydrodynamic stimuli that fish are exposed to in their natural habitat. Natural hydrodynamic stimuli include oscillatory water motions generated by stationary vibratory sources, such as by small crustaceans; complex water motions produced by animate or inanimate moving objects, such as by swimming fish; bulk water flow in rivers and streams; and water flow containing vortices generated at the edges of objects in a water flow. 
To uncover the mechanisms that underlie the coding of hydrodynamic information by the lateral line system, neurophysiological experiments have been performed at the level of the primary afferent nerve fibers, but also in the central nervous system, predominantly in the brainstem and midbrain, using sinusoidally vibrating spheres, moving objects, vortex rings, bulk water flow, and Kármán vortex streets as wave sources. Unravelling these mechanisms is fundamental to understanding how the fish brain uses hydrodynamic information to adequately guide behavior.
Mindaugas Mitkus, Simon Potier, Graham R. Martin, Olivier Duriez, and Almut Kelber
Diurnal raptors (birds of the orders Accipitriformes and Falconiformes), renowned for their extraordinarily sharp eyesight, have fascinated humans for centuries. The high visual acuity in some raptor species is made possible by their large eyes, in both relative and absolute terms, and a high density of cone photoreceptors. Some large raptors, such as wedge-tailed eagles and the Old World vultures, have visual acuities twice as high as humans and six times as high as ostriches—the animals with the largest terrestrial eyes. The raptor retina has rods, double cones, and four spectral types of single cones. The highest density of single cones occurs in one or two specialized retinal regions: the foveae, where, at least in some species, rods and double cones are absent. The deep central fovea allows for the highest acuity in the lateral visual field, which is probably used for detecting prey from a large distance. Pursuit-hunting raptors have a second, shallower, temporal fovea that allows for sharp vision in the frontal field of view. Scavenging carrion eaters do not possess a temporal fovea, which may indicate different needs in foraging behavior. Moreover, pursuit-hunting and scavenging raptors also differ in the configuration of their visual fields, with a more extensive field of view in scavengers. The eyes of diurnal raptors, unlike those of most other birds, are not very sensitive to ultraviolet light, which is strongly absorbed by their cornea and lens. As a result of the low density of rods, and the narrow and densely packed single cones in the central fovea, the visual performance of diurnal raptors drops dramatically as light levels decrease. These and other visual properties underpin prey detection and pursuit and show how these birds' vision is adapted to make them successful diurnal predators.
Retinal Mechanisms for Motion Detection
Mathew T. Summers, Malak El Quessny, and Marla B. Feller
Motion is a key feature of the sensory experience of visual animals. The mammalian retina has evolved a number of diverse motion sensors to detect and parse visual motion into behaviorally relevant neural signals. Extensive work has identified retinal outputs encoding directional and nondirectional motion, and the intermediate circuitry underlying this tuning. Detailed circuit mechanism investigation has established retinal direction selectivity in particular as a model system of neural computation.
Sensing Polarized Light in Insects
Thomas F. Mathejczyk and Mathias F. Wernet
Evolution has produced vast morphological and behavioral diversity amongst insects, including very successful adaptations to a diverse range of ecological niches spanning the invasion of the sky by flying insects, the crawling lifestyle on (or below) the earth, and the (semi-)aquatic life on (or below) the water surface. Developing the ability to extract a maximal amount of useful information from their environment was crucial for ensuring the survival of many insect species. Navigating insects rely heavily on a combination of different visual and non-visual cues to reliably orient under a wide spectrum of environmental conditions while avoiding predators. The pattern of linearly polarized skylight that results from scattering of sunlight in the atmosphere is one important navigational cue that many insects can detect. Here we summarize progress made toward understanding how different insect species sense polarized light. First, we present behavioral studies with "true" insect navigators (central-place foragers, like honeybees or desert ants), as well as insects that rely on polarized light to improve more "basic" orientation skills (like dung beetles). Second, we provide an overview of the anatomical basis of the polarized light detection system that these insects use, as well as the underlying neural circuitry. Third, we emphasize the importance of physiological studies (electrophysiology, as well as genetically encoded activity indicators, in Drosophila) for understanding both the structure and function of polarized light circuitry in the insect brain. We also discuss the importance of an alternative source of polarized light that can be detected by many insects: linearly polarized light reflected off shiny surfaces like water represents an important environmental factor, yet the anatomy and physiology of the underlying circuits remain incompletely understood.
Sensing the Environment With Whiskers
Mathew H. Evans, Michaela S.E. Loft, Dario Campagner, and Rasmus S. Petersen
Whiskers (vibrissae) are prominent on the snout of many mammals, both terrestrial and aquatic. The defining feature of whiskers is that they are rooted in large follicles with dense sensory innervation, surrounded by doughnut-shaped blood sinuses. Some species, including rats and mice, have elaborate muscular control of their whiskers and explore their environment by making rhythmic back-and-forth “whisking” movements. Whisking movements are purposefully modulated according to specific behavioral goals (“active sensing”). The basic whisking rhythm is controlled by a premotor complex in the intermediate reticular formation. Primary whisker neurons (PWNs), with cell bodies in the trigeminal ganglion, innervate several classes of mechanoreceptive nerve endings in the whisker follicle. Mechanotransduction involving Piezo2 ion channels establishes the fundamental physical signals that the whiskers communicate to the brain. PWN spikes are triggered by mechanical forces associated with both the whisking motion itself and whisker-object contact. Whisking generates inertial and muscle contraction forces that drive PWN activity. Whisker-object contact causes whiskers to bend, and PWN activity is driven primarily by the associated rotatory force (“bending moment”). Sensory signals from the PWNs are routed to many parts of the hindbrain, midbrain, and forebrain. Parallel ascending pathways transmit information about whisker forces to sensorimotor cortex. At each brainstem, thalamic, and cortical level of these pathways, there are one or more maps of the whisker array, consisting of cell clusters (“barrels” in the primary somatosensory cortex) whose spatial arrangement precisely mirrors that of the whiskers on the snout. However, the overall architecture of the whisker-responsive regions of the brain is best characterized by multilevel sensory-motor feedback loops.
Its intriguing biology, in combination with advantageous properties as a model sensory system, has made the whisker system the platform for seminal insights into brain function.