Thad E. Wilson and Kristen Metzler-Wilson
Thermoregulation is a key physiologic homeostatic process and is subdivided into autonomic, behavioral, and adaptive divisions. Autonomic thermoregulation is a neural process involving the sympathetic and parasympathetic nervous systems; it is controlled at the subcortical level and alters physiologic processes of heat production and heat loss to maintain internal temperature. Mammalian (including human) autonomic responses to acute heat or cold stress depend on environmental conditions and on species genotype and phenotype, but many similarities exist. Responses to an acute heat stress begin with the sensation of heat, leading to central processing of the information and sympathetic responses via end organs, which can include sweat glands, vasculature, and airway and cardiac tissues. Responses to an acute cold stress begin with the sensation of cold, leading to central processing of the information and sympathetic responses via end organs, which can include skeletal and piloerector muscles, brown adipose tissue, vasculature, and cardiac tissue. These autonomic responses allow homeostasis of internal temperature to be maintained across a wide range of external temperatures for most mammals, including humans. At times, uncompensable thermal challenges occur that can be tolerated for only limited periods before leading to pathophysiologic states of hyperthermia or hypothermia.
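The sensation-to-effector loop described above is, in engineering terms, negative feedback. The toy Python sketch below caricatures it as a set-point controller that recruits heat-loss or heat-production effectors depending on the sign of the error; the set point, threshold, and example temperatures are invented for illustration and have no physiological calibration.

```python
# Toy negative-feedback caricature of autonomic thermoregulation:
# sense internal temperature, compare it to a set point, and recruit
# heat-loss or heat-production effectors accordingly. All numbers are
# illustrative assumptions, not physiological values.

SET_POINT_C = 37.0  # assumed internal temperature set point

def autonomic_response(core_temp_c, threshold_c=0.3):
    """Return the class of effectors recruited for a given core temperature."""
    error = core_temp_c - SET_POINT_C
    if error > threshold_c:
        # Heat stress: sweating and cutaneous vasodilation increase heat loss.
        return "heat loss (sweating, cutaneous vasodilation)"
    elif error < -threshold_c:
        # Cold stress: shivering, brown adipose tissue, and
        # vasoconstriction produce and conserve heat.
        return "heat production/conservation (shivering, vasoconstriction)"
    return "thermoneutral (minimal autonomic drive)"

for temp in (36.2, 37.0, 38.1):
    print(f"{temp:.1f} C -> {autonomic_response(temp)}")
```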
Navigation is the ability of animals to move through their environment in a planned manner. Unlike directed but reflex-driven movements, it involves comparing the animal's current heading with its intended heading (i.e., the goal direction). When the two angles do not match, a compensatory steering movement must be initiated. This basic scenario can be described as an elementary navigational decision. Many elementary decisions chained together in specific ways form a coherent navigational strategy. With respect to navigational goals, there are four main forms of navigation: explorative navigation (exploring the environment for food, mates, shelter, etc.); homing (returning to a nest); straight-line orientation (moving away from a central place in a straight line); and long-distance migration (seasonal long-range movements to a location such as an overwintering place). The homing behavior of ants and bees has been examined in the most detail. These insects use several strategies to return to their nest after foraging, including path integration, route following, and potentially even the use of internal maps. Independent of the strategy used, insects can draw on global sensory information (e.g., skylight cues), local cues (e.g., the visual panorama), and idiothetic (i.e., internal, self-generated) cues to obtain information about their current and intended headings.
How are these processes controlled by the insect brain? While many unanswered questions remain, much progress has been made in recent years in understanding the neural basis of insect navigation. Neural pathways encoding polarized light information (a global navigational cue) target a brain region called the central complex, which is also involved in movement control and steering. Being thus placed at the interface of sensory information processing and motor control, this region has received much attention recently and emerged as the navigational “heart” of the insect brain. It houses an ordered array of head-direction cells that use a wide range of sensory information to encode the current heading of the animal. At the same time, it receives information about the movement speed of the animal and thus is suited to compute the home vector for path integration. With the help of neurons following highly stereotypical projection patterns, the central complex theoretically can perform the comparison of current and intended heading that underlies most navigation processes. Examining the detailed neural circuits responsible for head-direction coding, intended heading representation, and steering initiation in this brain area will likely lead to a solid understanding of the neural basis of insect navigation in the years to come.
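To make the elementary navigational decision and the home-vector computation concrete, here is a minimal Python sketch under stated assumptions: the function names, the steering gain, and the example path are illustrative choices, not drawn from the article or from any specific published model.

```python
import math

def steering_command(current_heading, goal_direction, gain=1.0):
    """One elementary navigational decision: return a turn command
    proportional to the signed angular error between current and
    intended heading (radians, wrapped to [-pi, pi])."""
    error = math.atan2(math.sin(goal_direction - current_heading),
                       math.cos(goal_direction - current_heading))
    return gain * error  # positive -> turn left, negative -> turn right

def integrate_path(steps):
    """Path integration: accumulate (heading, distance) steps into a
    home vector giving the direction and distance back to the nest."""
    x = y = 0.0
    for heading, distance in steps:
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
    return math.atan2(-y, -x), math.hypot(x, y)

# Example: an outbound foraging path (east 2 m, then north 1 m),
# followed by the steering command needed to head home.
outbound = [(0.0, 2.0), (math.pi / 2, 1.0)]
direction_home, distance_home = integrate_path(outbound)
turn = steering_command(current_heading=math.pi / 2,
                        goal_direction=direction_home)
print(f"home bearing: {direction_home:.2f} rad, "
      f"distance: {distance_home:.2f} m, turn: {turn:.2f} rad")
```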
Mindaugas Mitkus, Simon Potier, Graham R. Martin, Olivier Duriez, and Almut Kelber
Diurnal raptors (birds of the orders Accipitriformes and Falconiformes), renowned for their extraordinarily sharp eyesight, have fascinated humans for centuries. The high visual acuity of some raptor species is made possible by their large eyes, in both relative and absolute terms, and by a high density of cone photoreceptors. Some large raptors, such as wedge-tailed eagles and the Old World vultures, have visual acuities twice as high as those of humans and six times as high as those of ostriches, the animals with the largest terrestrial eyes. The raptor retina has rods, double cones, and four spectral types of single cones. The highest density of single cones occurs in one or two specialized retinal regions: the foveae, where, at least in some species, rods and double cones are absent. The deep central fovea allows for the highest acuity in the lateral visual field, which is probably used for detecting prey from a large distance. Pursuit-hunting raptors have a second, shallower, temporal fovea that allows for sharp vision in the frontal field of view. Scavenging carrion eaters do not possess a temporal fovea, which may indicate different needs in foraging behavior. Moreover, pursuit-hunting and scavenging raptors also differ in the configuration of their visual fields, with a more extensive field of view in scavengers.
The eyes of diurnal raptors, unlike those of most other birds, are not very sensitive to ultraviolet light, which is strongly absorbed by their cornea and lens. As a result of the low density of rods and the narrow, densely packed single cones in the central fovea, the visual performance of diurnal raptors drops dramatically as light levels decrease. These and other visual properties underpin prey detection and pursuit and show how these birds' vision is adapted to make them successful diurnal predators.
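To illustrate how eye size and cone density jointly set the acuity figures quoted above, the following back-of-envelope Python sketch estimates the Nyquist limit of a receptor mosaic; the human-like focal length and foveal cone spacing used here are rough textbook values, not measurements from the species discussed.

```python
import math

def nyquist_acuity_cpd(focal_length_mm, cone_spacing_um):
    """Highest resolvable spatial frequency (cycles/degree) for a row
    of receptors: each receptor subtends (spacing / focal length)
    radians, and resolving one cycle requires two receptors."""
    spacing_deg = math.degrees(cone_spacing_um * 1e-3 / focal_length_mm)
    return 1.0 / (2.0 * spacing_deg)

# A human-like eye (~17 mm focal length, ~2.5 um foveal cone spacing)
# lands near the classic ~60 cycles/degree limit; a larger eye and/or
# finer cone spacing pushes this limit proportionally higher.
print(f"{nyquist_acuity_cpd(17.0, 2.5):.0f} cycles/degree")
```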
Tom Baden, Timm Schubert, Philipp Berens, and Thomas Euler
Visual processing begins in the retina, a thin, multilayered neuronal tissue lining the back of the vertebrate eye. The retina does not merely read out the constant stream of photons impinging on its dense array of photoreceptor cells. Instead, it performs a first, extensive analysis of the visual scene while constantly adapting its sensitivity range to the input statistics, such as the brightness or contrast distribution. The functional organization of the retina follows several key principles. These include overlapping and repeating instances of both divergence and convergence, constant and dynamic range adjustments, and, perhaps most importantly, the decomposition of image information into parallel channels, often referred to as "parallel processing." To support this, the retina features a large diversity of neurons organized in functionally overlapping microcircuits, each of which typically samples the retinal surface uniformly in a regular mosaic. Ultimately, each circuit drives spike trains in the retina's output neurons, the retinal ganglion cells. Their axons form the optic nerve and convey multiple, distinctive, and often already heavily processed views of the world to higher visual centers in the brain.
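As a toy illustration of the decomposition into parallel channels, the sketch below filters the same "photoreceptor image" with a center-surround (difference-of-Gaussians) operator and splits the result into rectified ON and OFF channels. The filter widths and the two-channel split are simplifying assumptions for illustration, not a quantitative model of the circuits described here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def center_surround(image, sigma_center=1.0, sigma_surround=3.0):
    """Difference-of-Gaussians approximation of a center-surround
    receptive field, applied across the whole image."""
    return (gaussian_filter(image, sigma_center)
            - gaussian_filter(image, sigma_surround))

def parallel_channels(image):
    """Split the filtered image into rectified ON and OFF channels,
    mimicking the decomposition of one input into parallel pathways."""
    contrast = center_surround(image)
    on_channel = np.maximum(contrast, 0.0)    # responds to brightening
    off_channel = np.maximum(-contrast, 0.0)  # responds to darkening
    return on_channel, off_channel

rng = np.random.default_rng(0)
photoreceptor_image = rng.random((64, 64))
on, off = parallel_channels(photoreceptor_image)
print(on.shape, off.shape)  # two parallel views of the same input
```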
From an experimental point of view, the retina is a neuroscientist's dream. While part of the central nervous system, the retina is largely self-contained and, depending on the species, receives little feedback from downstream stages. This means that the tissue can be disconnected from the rest of the brain and studied in a dish for many hours without losing its functional integrity, all while retaining excellent experimental control over its exclusive natural network input: the visual stimulus. Once removed from the eyecup, the retina can be flattened, so that its neurons are easily accessed optically or with visually guided electrodes. Retinal tiling means that function studied at any one place can usually be considered representative of the entire tissue. At the same time, species-dependent specializations offer the opportunity to study circuits adapted to different visual tasks: for example, in the case of our fovea, high-acuity vision. Taken together, the retina is today among the best understood complex neuronal tissues of the vertebrate brain.
Cynthia M. Harley and Mark K. Asplen
Annelid worms are simultaneously an interesting and a difficult model system for understanding the evolution of animal vision. On the one hand, a wide variety of photoreceptor cells and eye morphologies are exhibited within a single phylum; on the other, annelid phylogenetics has been substantially re-envisioned within the last decade, suggesting the possibility of considerable convergent evolution. This article reviews the comparative anatomy of annelid visual systems within the context of the specific behaviors exhibited by these animals. Each of the major classes of annelid visual systems is examined, including both simple photoreceptor cells (including leech body eyes) and photoreceptive cells with pigment (trochophore larval eyes, ocellar tubes, complex eyes); behaviors examined include differential mobility and feeding strategies, similarities (or differences) between larval and adult visual behaviors within a species, visual signaling, and depth sensing. Based on our review, several major trends in the comparative morphology and ethology of annelid vision are highlighted: (1) eye complexity tends to increase with mobility and higher-order predatory behavior; (2) although annelid photoreceptive organs are simple sensors, they can relay complex information through sheer numbers or multimodality; (3) polychaete larval and adult eye morphology can differ strongly in many mobile species, but not in many sedentary species; and (4) annelids exhibiting visual signaling possess even more complex visual systems than expected, suggesting that complex eyes can be simultaneously well adapted to multiple visual tasks.
Synaptic connections in the brain can change their strength in response to patterned activity. This ability of synapses is defined as synaptic plasticity. Long-lasting forms of synaptic plasticity, long-term potentiation (LTP) and long-term depression (LTD), are thought to mediate the storage of information about stimuli or features of stimuli in a neural circuit. Since its discovery in the early 1970s, synaptic plasticity has become a central subject of neuroscience, and many studies have centered on understanding its mechanisms as well as its functional implications.
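As a toy illustration of activity-dependent changes in synaptic strength, the Python sketch below implements a pair-based spike-timing-dependent plasticity (STDP) rule, in which pre-before-post spike pairings potentiate a synapse (LTP-like) and the reverse ordering depresses it (LTD-like). This is one simple model among many, and the amplitudes, time constant, and spike times are arbitrary illustrative choices, not the biological mechanism itself.

```python
import math

def stdp_weight_change(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for a single pre/post spike pair.
    dt = t_post - t_pre in ms: positive dt (pre before post) gives
    potentiation; negative dt (post before pre) gives depression."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)    # LTP-like change
    return -a_minus * math.exp(dt / tau)       # LTD-like change

weight = 0.5
# Repeated pre-before-post pairings (patterned activity) strengthen
# the synapse; reversing the spike order would weaken it instead.
for t_pre, t_post in [(0, 5), (50, 55), (100, 105)]:
    weight += stdp_weight_change(t_post - t_pre)
print(f"weight after pairing: {weight:.3f}")
```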
Cynthia F. Moss
Echolocating bats have evolved an active sensing system, which supports 3D perception of objects in the surroundings and permits spatial navigation in complete darkness. Echolocating animals produce high-frequency sounds and use the arrival time, intensity, and frequency content of echo returns to determine the distance, direction, and features of objects in the environment. Over 1,000 species of bats echolocate with signals produced in their larynges. They use diverse sonar signal designs, operate in habitats ranging from tropical rain forest to desert, and forage for different foods, including insects, fruit, nectar, small vertebrates, and even blood. Specializations of the mammalian auditory system, coupled with high-frequency hearing, enable spatial imaging by echolocation in bats. Specifically, populations of neurons in the bat central nervous system respond selectively to the direction and delay of sonar echoes. In addition, premotor neurons in the bat brain are implicated in the production of sonar calls, along with movement of the head and ears. Audio-motor circuits, within and across brain regions, lay the neural foundation for acoustic orientation by echolocation in bats.
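As a simple illustration of how echo arrival time maps to target distance, the Python sketch below converts round-trip delay into range; the speed-of-sound constant and the example delays are illustrative assumptions.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (approximate)

def target_range(echo_delay_s):
    """Distance to a target from round-trip echo delay: the sound
    travels out and back, so range = c * t / 2."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# Example: echo delays of 1 ms and 10 ms correspond to targets at
# roughly 17 cm and 1.7 m, respectively.
for delay_ms in (1.0, 10.0):
    print(f"{delay_ms:.0f} ms delay -> {target_range(delay_ms / 1000.0):.2f} m")
```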
Quentin Gaudry and Jonathan Schenk
Olfactory systems are tasked with converting the chemical environment into electrical signals that the brain can use to optimize behaviors such as navigating toward resources, finding mates, or avoiding danger. Drosophila melanogaster has long served as a model system for many aspects of olfaction, including sensory coding, development, and attempts to link sensory perception to behavior. The strength of Drosophila as a model system for neurobiology lies in the myriad of genetic tools available to the experimentalist and, equally importantly, in the small number of cells within its olfactory circuit. Modern techniques have recently made it possible to target nearly all cell types in the antennal lobe to directly monitor their physiological activity or to alter their expression of endogenous proteins or transgenes.
Tatyana O. Sharpee
Sensory systems exist to provide an organism with information about the state of the environment that can be used to guide future actions and decisions. Remarkably, two conceptually simple yet general theorems from information theory can be used to evaluate the performance of any sensory system. One theorem states that there is a minimal amount of energy that an organism has to spend in order to capture a given amount of information about the environment. The second states that the maximum rate at which an organism can acquire resources from the environment, relative to its competitors, is limited by the information that organism collects about the environment, also relative to its competitors.
These two theorems provide a scaffold for formulating and testing general principles of sensory coding but leave unanswered many important practical questions of implementation in neural circuits. These implementation questions have guided thinking in entire subfields of sensory neuroscience and include the following. What features in the sensory environment should be measured? Given that we make decisions on a variety of time scales, how should one resolve the trade-off between simpler measurements that guide minimal decisions and more elaborate sensory systems that must overcome multiple delays between sensation and action? Once we agree on the types of features that are important to represent, how should they be represented? How should resources be allocated between different stages of processing, and where is the impact of noise most damaging? Finally, one should consider trade-offs between implementing a fixed strategy and an adaptive scheme that readjusts resources based on current needs. Where adaptation is considered, under what conditions does it become optimal to switch strategies? Research over the past 60 years has provided answers to almost all of these questions, but primarily in early sensory systems. Joining these answers into a comprehensive framework is a challenge that will help us understand who we are and how we can make better use of limited natural resources.
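For concreteness, here are hedged formal statements in the forms these two theorems are often given in the literature, namely a Landauer-type energy bound and a Kelly-type bound on the fitness value of information; the article may have different or more general formulations in mind.

```latex
% Assumed standard forms; notation: $k_B$ Boltzmann constant, $T$ temperature,
% $I(X;Y)$ mutual information between environmental state $X$ and sensory cue $Y$.

% (1) Energy cost of information (Landauer-type bound): acquiring and
%     ultimately erasing one bit costs at least $k_B T \ln 2$ of free energy.
\[
  E \;\ge\; k_B T \ln 2 \quad \text{per bit}
\]

% (2) Fitness value of information (Kelly-type bound): the gain in
%     long-term log growth rate from conditioning behavior on a cue $Y$
%     about $X$ is bounded by their mutual information (growth measured
%     in doublings per generation, $I$ in bits).
\[
  \Delta r \;\le\; I(X;Y)
\]
```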
Kathleen E. Cullen
As we go about our everyday activities, our brain computes accurate estimates of both our motion relative to the world and our orientation relative to gravity. Essential to this computation is the information provided by the vestibular system; it detects the rotational velocity and linear acceleration of our heads relative to space, making a fundamental contribution to our perception of self-motion and spatial orientation. Additionally, in everyday life, our perception of self-motion depends on the integration of vestibular and nonvestibular cues, including visual and proprioceptive information. Furthermore, the integration of motor-related information is also required for perceptual stability, so that the brain can distinguish whether the experienced sensory inflow was the result of active self-motion through the world or was instead externally generated. To date, understanding how the brain encodes and integrates sensory cues with motor signals for the perception of self-motion during natural behaviors remains a major goal in neuroscience. Recent experiments have (i) provided new insights into the neural code used to represent sensory information in vestibular pathways, (ii) established that vestibular pathways are inherently multimodal at the earliest stages of processing, and (iii) revealed that self-motion information processing is adjusted to meet the needs of specific tasks. This article reviews our current understanding of how the brain integrates sensory information and motor-related signals to encode self-motion and ensure perceptual stability during everyday activities.
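A minimal sketch of the active/passive distinction mentioned above, using the classic efference-copy idea: subtracting an internal prediction of the sensory consequences of one's own movement from the total vestibular signal leaves an estimate of externally generated motion. The signal shapes and the unity prediction gain below are illustrative assumptions, not a fitted model of vestibular processing.

```python
import numpy as np

# Self-generated rotation plus an unexpected external rotation both
# drive the vestibular sensors; only the self-generated part can be
# predicted from the motor command.
t = np.linspace(0.0, 1.0, 200)
active_head_turn = np.sin(2 * np.pi * 2 * t)   # self-generated rotation
external_motion = 0.5 * (t > 0.5)              # unexpected platform rotation
vestibular_afference = active_head_turn + external_motion

# Efference copy: internal prediction of the reafference produced by
# the active head turn (assumed accurate here, with gain = 1.0).
predicted_reafference = 1.0 * active_head_turn

# The residual ("exafference") isolates the externally generated motion.
exafference_estimate = vestibular_afference - predicted_reafference
print(np.allclose(exafference_estimate, external_motion))  # True
```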