21–40 of 71 results for: Sensory Systems

Article

The neocortex is a part of the mammalian forebrain that arose as an innovation in the mammal-like "reptilian" synapsid ancestors of early mammals. It emerged from a small region of dorsal cortex that was present in earlier ancestors and is still found in the forebrain of present-day reptiles. In contrast to the thick, six-layered neocortex of mammals (five layers of cells and one of fibers), the dorsal cortex was characterized by a single layer of pyramidal neurons and a scattering of small, largely inhibitory neurons. In reptiles, the dorsal cortex is dominated by visual inputs, with outputs that relate to behavior and memory. The thicker, six-layered neocortex of early mammals was already divided into a number of functionally specialized zones, called cortical areas, that were predominantly sensory in function while relating to important aspects of motor behavior via subcortical projections. These early sensorimotor areas became modified in various ways as different branches of the mammalian radiation evolved, and neocortex often increased in size and in number of cortical areas, likely through functional specializations within areas that led those areas to subdivide. At least some areas, perhaps most, also subdivided in another way, by evolving two or more alternating types of small regions with different functional specializations, now referred to as cortical modules or columns. The specializations within and across cortical areas included differences in the sizes of neurons and in the extents of their processes, the dendrites and axons, and thus in their connections with other neurons. As a result, the neocortex of present-day mammals varies greatly within and across phylogenetically related groups (clades), while retaining basic features of organization from early ancestral mammals. In a number of present-day (extant) mammals, brains are relatively small and have little neocortex, with few areas and little structural differentiation, thus resembling early mammals. Other small mammals with little neocortex have specialized some part of it, via selective enlargement and structural modification, to promote certain sensory abilities. Still other mammals have a neocortex that is moderately to greatly expanded, with more cortical areas related to sensory processing, cognition, and memory. The human brain is extreme in this way, having more neocortex in proportion to the rest of the brain, more cortical neurons, and likely more cortical areas.

Article

Tom Baden, Timm Schubert, Philipp Berens, and Thomas Euler

Visual processing begins in the retina—a thin, multilayered neuronal tissue lining the back of the vertebrate eye. The retina does not merely read out the constant stream of photons impinging on its dense array of photoreceptor cells. Instead, it performs a first, extensive analysis of the visual scene, while constantly adapting its sensitivity range to the input statistics, such as the brightness or contrast distribution. The functional organization of the retina follows several key principles. These include overlapping and repeating instances of both divergence and convergence, constant and dynamic range adjustments, and (perhaps most importantly) the decomposition of image information into parallel channels, often referred to as "parallel processing." To support this, the retina features a large diversity of neurons organized in functionally overlapping microcircuits that typically sample the retinal surface uniformly in a regular mosaic. Ultimately, each circuit drives spike trains in the retina's output neurons, the retinal ganglion cells. Their axons form the optic nerve and convey multiple, distinctive, and often already heavily processed views of the world to higher visual centers in the brain. From an experimental point of view, the retina is a neuroscientist's dream. While part of the central nervous system, the retina is largely self-contained and, depending on the species, receives little feedback from downstream stages. This means that the tissue can be disconnected from the rest of the brain and studied in a dish for many hours without losing its functional integrity, all while retaining excellent experimental control over the exclusive natural network input: the visual stimulus. Once removed from the eyecup, the retina can be flattened, so that its neurons are easily accessed optically or with visually guided electrodes. Retinal tiling means that function studied at any one place can usually be considered representative of the entire tissue. At the same time, species-dependent specializations offer the opportunity to study circuits adapted to different visual tasks: for example, in the case of our fovea, high-acuity vision. Taken together, the retina is today among the best-understood complex neuronal tissues of the vertebrate brain.

Article

It is conceptually reasonable to explore how the evolution of behavior involves changes in neural circuitry. Progress in determining this evolutionary relationship has been limited in neuroscience by the difficulty of identifying the individual neurons that contribute to the evolution of behaviors across species. However, results from the feeding systems of gastropod mollusks provide evidence for such co-evolution: the evolution of different types of feeding behaviors in this diverse group of mollusks is mirrored by species-specific changes in neural circuitry. The evolution of feeding behaviors involves changes in the motor actions that allow diverse food items to be acquired and ingested. Evolutionary change in neural control accompanies this variation in food and the associated changes in the flexibility of feeding behavior. Such change is evident in components of the feeding network involved in decision making, rhythm generation, and behavioral switching, but absent from background mechanisms that are conserved across species, such as those controlling arousal state. These findings show how evolutionary changes, even at the level of single neurons, closely reflect the details of behavioral evolution.

Article

Sensory systems exist to provide an organism with information about the state of the environment that can be used to guide future actions and decisions. Remarkably, two conceptually simple yet general theorems from information theory can be used to evaluate the performance of any sensory system. One theorem states that there is a minimal amount of energy that an organism has to spend in order to capture a given amount of information about the environment. The second states that the maximum rate at which an organism can acquire resources from the environment, relative to its competitors, is limited by the information this organism collects about the environment, also relative to its competitors. These two theorems provide a scaffold for formulating and testing general principles of sensory coding but leave unanswered many important practical questions of implementation in neural circuits. These implementation questions have guided thinking in entire subfields of sensory neuroscience and include the following: What features in the sensory environment should be measured? Given that we make decisions on a variety of time scales, how should one resolve the trade-off between making simpler measurements to guide minimal decisions and building more elaborate sensory systems that must overcome multiple delays between sensation and action? Once we agree on the types of features that are important to represent, how should they be represented? How should resources be allocated between different stages of processing, and where is the impact of noise most damaging? Finally, one should consider trade-offs between implementing a fixed strategy and an adaptive scheme that readjusts resources based on current needs; where adaptation is considered, under what conditions does it become optimal to switch strategies? Research over the past 60 years has provided answers to almost all of these questions, but primarily in early sensory systems. Joining these answers into a comprehensive framework is a challenge that will help us understand who we are and how we can make better use of limited natural resources.
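To make the two theorems concrete, here is one common way such bounds are written, assuming a Landauer-type thermodynamic cost for the first and a Kelly-type growth bound for the second; the abstract does not commit to these particular formalizations, so the expressions below are an illustrative reading rather than the authors' own statement.

```latex
% Energy needed to acquire I bits of information about the environment,
% at absolute temperature T (Landauer-type lower bound):
E_{\min} \;\ge\; k_{B}\, T \ln 2 \cdot I
% Resource-acquisition (growth-rate) advantage over competitors, bounded
% by the mutual information between the environmental state X and the
% organism's sensory signal S, both measured relative to the competitors:
\Delta r \;\le\; I(X;\, S)
```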

Article

Color is a central feature of human perceptual experience, where it functions as a critical component in the detection, identification, evaluation, placement, and appreciation of objects in the visual world. Its role is significantly enhanced by the fact that humans evolved a dimension of color vision beyond that available to most other mammals. Many fellow primates followed a similar path, and in recent years the basic mechanisms that support color vision—the opsin genes, photopigments, cone signals, and central processing—have been the subjects of hundreds of investigations. Because of the tight linkage between opsin gene structure and the spectral sensitivity of cone photopigments, it is possible to trace pathways along which color vision may have evolved in primates. In turn, such information allows the development of hypotheses about the nature of color vision and its utility in nonhuman primates. These hypotheses are being critically evaluated in field studies where primates solve visual problems in the presence of the full panoply of photic cues. The intent of this research is to determine which aspects of these cues are critically linked to color vision and how their presence facilitates, impedes, or fails to influence the solutions. These investigations are challenging undertakings, and the emerging literature is replete with contradictory conclusions. But steady progress is being made, and it appears that (a) some of the original ideas that there is only a restricted number of tasks for which color vision might be optimally utilized by nonhuman primates (e.g., fruit harvest) were too simplistic, and (b) depending on circumstances that can include both features of proximate visual stimuli (spectral cues, luminance cues, size cues, motion cues, overall light levels) and situational variables (social cues, developmental status, species-specific traits), the utilization of color vision by nonhuman primates is apt to be complex and varied.

Article

The brain has limited processing capacity. Attentional selection processes continuously shape how humans perceive the world. Understanding the mechanisms underlying such covert cognitive processes requires combining psychophysical and electrophysiological investigation methods. This combination allows researchers to describe how individual neurons and neuronal populations encode attentional function. In addition, direct access to neuronal information through innovative electrophysiological approaches allows covert attention to be tracked in real time. Together, these converging approaches capture a comprehensive view of attentional function.

Article

Eliot A. Brenowitz

Animals produce communication signals to attract mates and deter rivals during their breeding season. The coincidence in timing results from the modulation of signaling behavior and neural activity by sex steroid hormones associated with reproduction. Adrenal steroids can influence signaling for aggressive interactions outside the breeding season. Androgenic and estrogenic hormones act on brain circuits that regulate the motivation to produce and respond to signals, the motor production of signals, and the sensory perception of signals. Signal perception, in turn, can stimulate gonadal development.

Article

Natalie Hempel de Ibarra and Misha Vorobyev

Color plays an important role in insect life—many insects forage on colorful flowers and/or have colorful bodies. Accordingly, most insects have multiple spectral types of photoreceptors in their eyes, which gives them the capability to see colors. However, insects cannot perceive colors in the same way as human beings do, because their eyes and brains differ substantially. An insect was the first nonhuman animal whose ability to discriminate colors was demonstrated: at the beginning of the 20th century, von Frisch showed that the honeybee, Apis mellifera, can discriminate blue from any shade of gray. This method, called the gray-card experiment, remains an accepted "gold standard" for demonstrating color vision in animals. Insect species differ in the combinations of photoreceptors in their eyes, with peak sensitivities in the ultraviolet (UV) and/or blue, green, and sometimes red parts of the spectrum. The number of photoreceptor spectral types can be as few as one or two, as in the grasshopper Phlaeoba and the beetle Tribolium, and as many as 10 or more in some species of butterflies and dragonflies. However, not all spectral receptor types are necessarily used for color vision. For example, the butterfly Papilio xuthus uses only four of its eight photoreceptor types for color vision. Some insects have separate channels for processing chromatic and achromatic (lightness) information. In the honeybee, the achromatic channel has high spatial resolution and is mediated by long-wavelength-sensitive, or "green," photoreceptors alone, whereas chromatic vision has low spatial resolution and is mediated by all three spectral types of photoreceptors. Whether other insects have a similar separation of chromatic and achromatic vision remains uncertain. In contrast to vertebrates, insects do not use distinct sets of photoreceptors for nocturnal vision, and some nocturnal insects can see color at night. Insect photoreceptors are inherently polarization sensitive because of their microvillar organization. Therefore, some insects cannot discriminate changes in the polarization of light from changes in its spectral composition. However, many insects sacrifice polarization sensitivity to retain reliable color vision. For example, in the honeybee, polarization sensitivity is eliminated by twisting of the rhabdom in most parts of the compound eye, except for the dorsal rim area, which is specialized for polarization vision. Insects show color constancy and color-contrast phenomena. Although in humans these aspects of vision are often attributed to cortical processing of color, simple models based on photoreceptor adaptation may explain color constancy and color induction in insects. Color discrimination can be evaluated using a simple model that assumes discrimination is limited by photoreceptor noise. This model can help to predict the discriminability of ecologically relevant colors, such as flower colors for pollinating insects. However, although many insects forage on flowers, there is no evidence that insect pollinator vision coevolved with flower colors. The diverse color vision of butterflies appears to adaptively facilitate the recognition of their wing colors.
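The receptor-noise-limited approach mentioned at the end of the abstract can be illustrated with a short sketch. The Python below follows the standard trichromatic form of such a model (receptor contrasts weighted by channel noise); the quantum-catch and noise values are invented for illustration and are not taken from the article.

```python
import numpy as np

def rnl_distance_trichromat(q_a, q_b, noise):
    """Receptor-noise-limited (RNL) color distance for a trichromatic eye.

    q_a, q_b : quantum catches of the three receptor types for stimuli A and B
    noise    : noise (Weber fraction) of each receptor channel
    Returns a distance in 'just noticeable difference' units; values well
    above 1 predict that the two colors can be discriminated.
    """
    # Receptor contrasts: differences of log quantum catches.
    df = np.log(np.asarray(q_a, float)) - np.log(np.asarray(q_b, float))
    e1, e2, e3 = noise
    numerator = (e1 * (df[2] - df[1])) ** 2 \
              + (e2 * (df[2] - df[0])) ** 2 \
              + (e3 * (df[0] - df[1])) ** 2
    denominator = (e1 * e2) ** 2 + (e1 * e3) ** 2 + (e2 * e3) ** 2
    return np.sqrt(numerator / denominator)

# Hypothetical quantum catches of UV, blue, and green receptors for two
# flower colors, and illustrative noise values for the three channels.
flower_a = [0.20, 0.55, 0.80]
flower_b = [0.25, 0.45, 0.85]
print(rnl_distance_trichromat(flower_a, flower_b, noise=(0.13, 0.06, 0.12)))
```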

Article

Navigation is the ability of animals to move through their environment in a planned manner. Unlike directed but reflex-driven movements, it involves comparing the animal's current heading with its intended heading (i.e., the goal direction). When the two angles do not match, a compensatory steering movement must be initiated. This basic scenario can be described as an elementary navigational decision. Many elementary decisions chained together in specific ways form a coherent navigational strategy. With respect to navigational goals, there are four main forms of navigation: explorative navigation (exploring the environment for food, mates, shelter, etc.); homing (returning to a nest); straight-line orientation (getting away from a central place in a straight line); and long-distance migration (seasonal long-range movements to a location such as an overwintering site). The homing behavior of ants and bees has been examined in the most detail. These insects use several strategies to return to their nest after foraging, including path integration, route following, and potentially even the use of internal maps. Independent of the strategy used, insects can draw on global sensory information (e.g., skylight cues), local cues (e.g., the visual panorama), and idiothetic (i.e., internal, self-generated) cues to obtain information about their current and intended headings. How are these processes controlled by the insect brain? While many unanswered questions remain, much progress has been made in recent years in understanding the neural basis of insect navigation. Neural pathways encoding polarized-light information (a global navigational cue) target a brain region called the central complex, which is also involved in movement control and steering. Placed at this interface between sensory processing and motor control, the central complex has received much attention recently and has emerged as the navigational "heart" of the insect brain. It houses an ordered array of head-direction cells that use a wide range of sensory information to encode the current heading of the animal. At the same time, it receives information about the animal's movement speed and is thus suited to compute the home vector for path integration. With the help of neurons following highly stereotypical projection patterns, the central complex could in principle perform the comparison of current and intended heading that underlies most navigation processes. Examining the detailed neural circuits responsible for head-direction coding, representation of the intended heading, and initiation of steering in this brain area will likely lead to a solid understanding of the neural basis of insect navigation in the years to come.
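As an illustration of the computations described here, the following minimal Python sketch implements path integration of an outbound trip and the elementary navigational decision of turning by the mismatch between current and intended heading. All names and the simple proportional steering rule are illustrative assumptions, not a model from the article.

```python
import math

def wrap_angle(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

class PathIntegrator:
    """Minimal path integrator: accumulates the outbound journey so the
    direction back to the nest (the home vector) is always available."""
    def __init__(self):
        self.x = 0.0  # displacement from the nest
        self.y = 0.0

    def step(self, heading, speed, dt=1.0):
        self.x += speed * dt * math.cos(heading)
        self.y += speed * dt * math.sin(heading)

    def home_direction(self):
        """Intended heading for homing: straight back toward the nest."""
        return math.atan2(-self.y, -self.x)

def steering_command(current_heading, goal_heading, gain=0.5):
    """Elementary navigational decision: compare current and intended
    heading and return a compensatory turn (positive = counterclockwise)."""
    return gain * wrap_angle(goal_heading - current_heading)

# Illustrative outbound trip, then the turn needed to head home.
pi_state = PathIntegrator()
pi_state.step(heading=0.0, speed=1.0)          # 1 unit "east"
pi_state.step(heading=math.pi / 2, speed=2.0)  # 2 units "north"
turn = steering_command(current_heading=0.0,
                        goal_heading=pi_state.home_direction())
print(f"turn by {math.degrees(turn):.1f} degrees")
```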

Article

Synaptic connections in the brain can change their strength in response to patterned activity. This ability of synapses is termed synaptic plasticity. Long-lasting forms of synaptic plasticity, long-term potentiation (LTP) and long-term depression (LTD), are thought to mediate the storage of information about stimuli or features of stimuli in a neural circuit. Since its discovery in the early 1970s, synaptic plasticity has become a central subject of neuroscience, and many studies have centered on understanding its mechanisms as well as its functional implications.
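One common way to formalize how patterned activity changes synaptic strength is a pair-based spike-timing-dependent plasticity rule. The sketch below is a generic textbook-style example with illustrative constants, not a model taken from the article.

```python
import math

def stdp_weight_change(delta_t_ms, a_plus=0.01, a_minus=0.012, tau_ms=20.0):
    """Pair-based spike-timing-dependent plasticity (STDP) rule.

    delta_t_ms = t_post - t_pre. Pre-before-post pairings (delta_t > 0)
    strengthen the synapse (LTP-like); post-before-pre pairings weaken
    it (LTD-like). The constants are illustrative, not measured values.
    """
    if delta_t_ms > 0:
        return a_plus * math.exp(-delta_t_ms / tau_ms)    # potentiation
    return -a_minus * math.exp(delta_t_ms / tau_ms)       # depression

w = 0.5
for dt in (+5.0, +15.0, -10.0):   # three illustrative spike pairings
    w += stdp_weight_change(dt)
print(f"synaptic weight after pairings: {w:.4f}")
```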

Article

Roswitha Wiltschko and Wolfgang Wiltschko

The magnetic field of the Earth provides birds with navigational information, and birds have two different receptor systems, one for the direction and the other for the intensity of the geomagnetic field. The direction of the geomagnetic field is used as a compass; the avian magnetic compass is an inclination compass that does not record the polarity of the field. The respective directional information is perceived by light-dependent radical-pair processes in the eyes, with cryptochrome, a photopigment containing the chromophore flavin adenine dinucleotide, as the receptor molecule. It is transmitted by the optic nerve to the brain, where it is processed by parts of the visual system. The magnetic compass not only serves to orient avian flights but also acts as a reference system for route reversal, for calibrating the astronomical compass systems, and, in migratory birds, for the innate information on the migratory direction. Magnetic intensity and inclination, which show gradients from the poles to the magnetic equator, are part of the mechanisms that allow birds to determine their position. Intensity is perceived by receptors based on magnetite, a permanently magnetic material. The effect of a brief, strong magnetic pulse, and the duration of that effect, indicates that superparamagnetic particles are involved. The respective information is transmitted by the ophthalmic branch of the trigeminal nerve to the trigeminal brainstem complex in the brain. Testing birds in magnetic fields of a distant site, i.e., magnetically simulating a displacement, documents that magnetic intensity and inclination are among the most important components of the navigational "map" that enables birds to determine their position relative to the goal and thus derive the compass course leading to this goal. Furthermore, certain magnetic conditions act as signposts and elicit specific spontaneous responses.
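The defining property of an inclination compass, that it reads the dip of the field lines rather than their polarity, can be captured in a few lines. The Python sketch below illustrates that logic only; the field values are invented, and nothing here reflects the actual receptor mechanism.

```python
import numpy as np

def poleward_direction(b_field):
    """Inclination-compass reading from a 3-D magnetic field vector.

    b_field = (b_north, b_east, b_down), with 'down' positive toward the
    ground. Only the axis of the field lines and their dip are used, not
    their polarity: 'poleward' is the horizontal direction in which the
    field lines slope downward. Returns a unit vector (north, east), or
    None where the compass gives no reading.
    """
    b = np.asarray(b_field, dtype=float)
    if b[2] < 0:          # polarity is ignored: align the axis with 'down'
        b = -b
    horizontal = b[:2]
    norm = np.linalg.norm(horizontal)
    if norm == 0 or b[2] == 0:
        return None       # purely vertical field, or zero dip: ambiguous
    return horizontal / norm

# Reversing the polarity of the whole field leaves the reading unchanged,
# the signature of an inclination compass. Field values are illustrative.
field = (18.0, 2.0, 45.0)
print(poleward_direction(field))
print(poleward_direction(tuple(-c for c in field)))
```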

Article

Many mammals, including humans, rely primarily on vision to sense the environment. While a large proportion of the brain is devoted to vision in highly visual animals, there are not enough neurons in the visual system to support a neuron-per-object look-up table. Instead, visual animals evolved ways to rapidly and dynamically encode an enormous diversity of visual information using minimal numbers of neurons (merely hundreds of millions of neurons and billions of connections!). In the mammalian visual system, a visual image is essentially broken down into simple elements that are reconstructed through a series of processing stages, most of which occur beneath consciousness. Importantly, visual information processing is not simply a serial progression along the hierarchy of visual brain structures (e.g., retina to visual thalamus to primary visual cortex to secondary visual cortex, etc.). Instead, connections within and between visual brain structures exist in all possible directions: feedforward, feedback, and lateral. Additionally, many mammalian visual systems are organized into parallel channels, presumably to enable efficient processing of information about different and important features in the visual environment (e.g., color, motion). The overall operations of the mammalian visual system are to (1) combine unique groups of feature detectors in order to generate object representations and (2) integrate visual sensory information with cognitive and contextual information from the rest of the brain. Together, these operations enable individuals to perceive, plan, and act within their environment.

Article

Animals use their olfactory system for the procurement of food, the detection of danger, and the identification of potential mates. In vertebrates, the olfactory sensory neuron has a single apical dendrite that is exposed to the environment and a single basal axon that projects to the central nervous system (i.e., the olfactory bulb). The first odorant receptors to be discovered belong to an enormous gene family encoding G protein-coupled seven transmembrane domain proteins. Odorant binding to these classical odorant receptors initiates a GTP-dependent signaling cascade that uses cAMP as a second messenger. Subsequently, additional types of odorant receptors using different signaling pathways have been identified. While most olfactory sensory neurons are found in the olfactory sensory neuroepithelium, others are found in specialized olfactory subsystems. In rodents, the vomeronasal organ contains neurons that recognize pheromones, the septal organ recognizes odorant and mechanical stimuli, and the neurons of the Grüneberg ganglion are sensitive to cool temperatures and certain volatile alarm signals. Within the olfactory sensory neuroepithelium, each sensory neuron expresses a single odorant receptor gene out of the large gene family; the axons of sensory neurons expressing the same odorant receptor typically converge onto a pair of glomeruli at the periphery of the olfactory bulb. This results in the transformation of olfactory information into a spatially organized odortopic map in the olfactory bulb. The axons originating from the vomeronasal organ project to the accessory olfactory bulb, whereas the axons from neurons in the Grüneberg ganglion project to 10 specific glomeruli found in the caudal part of the olfactory bulb. Within a glomerulus, the axons originating from olfactory sensory neurons synapse on the dendrites of olfactory bulb neurons, including mitral and tufted cells. Mitral cells and tufted cells in turn project directly to higher brain centers (e.g., the piriform cortex and olfactory tubercle). The integration of olfactory information in the olfactory cortices and elsewhere in the central nervous system informs and directs animal behavior.
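The convergence described here, with each sensory neuron expressing one receptor gene and like neurons converging onto a pair of glomeruli, amounts to a simple many-to-few mapping. The sketch below illustrates that mapping with invented identifiers; it is a data-structure illustration, not a biological model.

```python
from collections import defaultdict

# Each olfactory sensory neuron expresses a single (hypothetical) receptor
# gene; axons of neurons expressing the same receptor converge onto a pair
# of glomeruli, yielding a spatially organized map. Identifiers are invented.
sensory_neurons = [
    {"id": 0, "receptor": "OR-A"},
    {"id": 1, "receptor": "OR-A"},
    {"id": 2, "receptor": "OR-B"},
    {"id": 3, "receptor": "OR-B"},
    {"id": 4, "receptor": "OR-C"},
]

glomeruli_for_receptor = {
    "OR-A": ("G1-medial", "G1-lateral"),
    "OR-B": ("G2-medial", "G2-lateral"),
    "OR-C": ("G3-medial", "G3-lateral"),
}

glomerular_inputs = defaultdict(list)
for neuron in sensory_neurons:
    for glomerulus in glomeruli_for_receptor[neuron["receptor"]]:
        glomerular_inputs[glomerulus].append(neuron["id"])

print(dict(glomerular_inputs))   # which neurons converge on which glomeruli
```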

Article

Tyler S. Manning and Kenneth H. Britten

The ability to see motion is critical to survival in a dynamic world. Decades of physiological research have established that motion perception is a distinct sub-modality of vision supported by a network of specialized structures in the nervous system. These structures are arranged hierarchically according to the spatial scale of the calculations they perform, with more local operations preceding those that are more global. The different operations serve distinct purposes, from the interception of small moving objects to the calculation of self-motion from image motion spanning the entire visual field. Each cortical area in the hierarchy has an independent representation of visual motion. These representations, together with computational accounts of their roles, provide clues to the functions of each area. Comparisons between neural activity in these areas and psychophysical performance can identify which representations are sufficient to support motion perception. Experimental manipulation of this activity can also define which areas are necessary for motion-dependent behaviors like self-motion guidance.

Article

As we go about our everyday activities, our brain computes accurate estimates of both our motion relative to the world and our orientation relative to gravity. Essential to this computation is the information provided by the vestibular system; it detects the rotational velocity and linear acceleration of our heads relative to space, making a fundamental contribution to our perception of self-motion and spatial orientation. Additionally, in everyday life, our perception of self-motion depends on the integration of both vestibular and nonvestibular cues, including visual and proprioceptive information. Furthermore, the integration of motor-related information is also required for perceptual stability, so that the brain can distinguish whether the experienced sensory inflow was the result of active self-motion through the world or was instead externally generated. To date, understanding how the brain encodes and integrates sensory cues with motor signals for the perception of self-motion during natural behaviors remains a major goal in neuroscience. Recent experiments have (i) provided new insights into the neural code used to represent sensory information in vestibular pathways, (ii) established that vestibular pathways are inherently multimodal at the earliest stages of processing, and (iii) revealed that self-motion information processing is adjusted to meet the needs of specific tasks. Our current level of understanding of how the brain integrates sensory information and motor-related signals to encode self-motion and ensure perceptual stability during everyday activities is reviewed here.
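The distinction between actively and externally generated self-motion is often described in terms of an internal forward model that predicts and subtracts the sensory consequences of one's own movement. The sketch below illustrates that idea with a deliberately simplified linear prediction; the function names, gain, and numbers are illustrative assumptions, not results from the article.

```python
import numpy as np

def exafference(vestibular_signal, motor_command, forward_model_gain=1.0):
    """Subtract the predicted sensory consequence of an active movement
    (reafference) from the total vestibular signal, leaving the externally
    generated (passive) component. The linear forward model is a toy."""
    predicted_reafference = forward_model_gain * motor_command
    return vestibular_signal - predicted_reafference

# Head velocity sensed during a voluntary 30 deg/s head turn while the
# platform the subject sits on also rotates at 10 deg/s (illustrative numbers).
sensed = np.array([40.0])     # total head velocity signaled by the canals
command = np.array([30.0])    # intended (active) head velocity
print(exafference(sensed, command))   # -> [10.], the passive component
```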

Article

Asymmetry between bilateral visual and auditory sensors has functional advantages for visual depth perception and the localization of auditory signals, respectively. To detect the spatial distribution of an odor, bilateral olfactory organs may compare side differences in odor intensity and timing using a simultaneous sampling mechanism; alternatively, they may use a sequential sampling mechanism to compare spatial and temporal input detected by one or several chemosensors. Extensive research on the strategies and mechanisms necessary for odor-source localization has focused mainly on invertebrates. Several recent studies in mammals such as moles, rodents, and humans suggest that there is an evolutionary advantage to using stereo olfaction for successful navigation toward an odor source. Smelling in stereo, or in a three-dimensional olfactory space, may significantly reduce the time needed to locate an odor source; it provides instantaneous information for both foraging and predator avoidance. However, since mammals are capable of finding odor sources and tracking odor trails with one sensor side blocked, they may also use an intriguing temporal mechanism that compares odor concentration from sniff to sniff. A particular focus of this article is the difference between insects and mammals in the use of unilateral versus bilateral chemosensors for odor-source localization.
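The two sampling strategies contrasted here can be caricatured in a few lines of code: a simultaneous (stereo) comparison across the two sensor sides versus a sequential, sniff-to-sniff comparison with a single sensor. The rules and thresholds below are illustrative assumptions only.

```python
def stereo_turn(left_conc, right_conc, threshold=0.05):
    """Simultaneous (stereo) sampling: turn toward the side that samples
    the higher odor concentration on the same sniff."""
    diff = left_conc - right_conc
    if abs(diff) < threshold:
        return "straight"
    return "left" if diff > 0 else "right"

def serial_decision(previous_sniff, current_sniff, threshold=0.05):
    """Sequential sampling: with one sensor, compare concentration from
    sniff to sniff and keep the current heading only if it is improving."""
    gain = current_sniff - previous_sniff
    return "keep heading" if gain > threshold else "turn"

print(stereo_turn(0.42, 0.31))        # -> "left"
print(serial_decision(0.31, 0.42))    # -> "keep heading"
```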

Article

Sequences of actions and experiences are a central part of daily life in many species. A sequence consists of a set of ordered steps with a distinct beginning and end; sequences are defined by the serial order of, and relationships between, their items, though not necessarily by precise timing intervals. Sequences can be composed from a wide range of elements, including motor actions, perceptual experiences, memories, complex behaviors, or abstract goals. Despite this variation, different types of sequences may share common features in their neural coding. Examining the neural responses that support sequences is important not only for understanding sequential behavior in daily life but also for investigating the array of diseases and disorders that impair sequential processes and the effects of the therapeutics used to treat them. Research into the neural coding of sequences can be organized into the following broad categories: responses to ordinal position, coding of adjacency and inter-item relationships, boundary responses, and gestalt coding (representation of the sequence as a whole). These features of sequence coding have been linked to changes in firing-rate patterns and neuronal oscillations across a range of cortical and subcortical brain areas and may be integrated in the lateral prefrontal cortex. Identification of these coding schemes has provided an outline for understanding how sequences are represented at the neural level. Building on this work, future research faces fundamental questions about how these coding schemes are linked together to generate the complex range of sequential processes that influence cognition and behavior across animal species.

Article

Giuliano Gaeta, Regina M. Sullivan, and Donald A. Wilson

Odor- or chemical-guided behavior is expressed in all species. Such behavioral responses to odors begin with transduction at olfactory receptors and, after initial processing in early stages of the olfactory system (e.g., the vertebrate olfactory bulb, the invertebrate antennal lobe), the information is rapidly (within one to two synapses) distributed to diverse brain regions controlling hedonics, metabolic balance, mating, and spatial navigation, among many other basic functions. Odors can not only drive or guide specific behavioral responses but also modulate behavioral choices and affective state, in humans in many cases without conscious awareness. Many of the specific neural circuits underlying odor-guided behaviors have been partially described, though much remains unknown. Neural processes underlying odor-guided reward and aversion, kin recognition, feeding, orientation, and navigation across diverse species are discussed here, as is odor modulation of human behavior and emotion.

Article

Justin D. Lieber and Sliman J. Bensmaia

The ability to identify tactile objects depends in part on the perception of their surface microstructure and material properties. Texture perception can, to a first approximation, be described by a number of nameable perceptual axes, such as rough/smooth, hard/soft, sticky/slippery, and warm/cool, which exist within a complex perceptual space. The perception of texture relies on two different neural streams of information: coarser features, measured in millimeters, are primarily encoded by spatial patterns of activity across one population of tactile nerve fibers, while finer features, down to the micron level, are encoded by finely timed temporal patterns within two other populations of afferents. These two streams of information ascend the somatosensory neuraxis and are eventually combined and further elaborated in the cortex to yield a high-dimensional representation that accounts for our exquisite and stable perception of texture.

Article

Adam Hockley and Susan E. Shore

Tinnitus is the perception of sound that is independent of an external stimulus. Although the word tinnitus derives from the Latin verb for ringing, tinnire, tinnitus can also present as buzzing, hissing, or clicking. Tinnitus is generated centrally in the auditory pathway; however, the neural mechanisms underlying this generation have been disputed for decades. Although it is well accepted that tinnitus is produced by damage to the auditory system from exposure to loud sounds, the level of damage required and how this damage results in tinnitus are unclear. Neural recordings in the auditory brainstem, midbrain, and forebrain of animal models of tinnitus have revealed increased spontaneous firing rates that are capable of being perceived as sound. Many mechanisms have been proposed for how this increase is produced, including spike-timing-dependent plasticity, homeostatic plasticity, central gain, reduced inhibition, thalamocortical dysrhythmia, and increased inflammation. Animal studies are highly useful for testing these potential mechanisms because the noise damage can be carefully titrated and recordings can be made directly from the neural populations of interest. These studies have advanced the field greatly; however, one limitation is that the models used for tinnitus induction and quantification are not well standardized, which may explain some of the variability seen across studies. Human studies use patients with tinnitus (but an unknown level of cochlear damage) to probe the neural mechanisms of tinnitus. They use noninvasive methods, often recording gross evoked potentials or oscillations, or imaging brain activity, to determine whether tinnitus sufferers show altered processing of sounds or of silence. These studies have also revealed putative neural mechanisms of tinnitus, such as increased delta- or gamma-band cortical activity, altered Bayesian prediction of incoming sound, and changes to limbic-system activity. Translation between animal and human studies has allowed some neural correlates of tinnitus to become more widely accepted, which has in turn allowed deeper research into the mechanisms underlying those correlates. As the understanding of the neural mechanisms of tinnitus grows, the potential for treatments also improves, with the ultimate goal being a true treatment for the tinnitus percept.