Article

Justin D. Lieber and Sliman J. Bensmaia

The ability to identify tactile objects depends in part on the perception of their surface microstructure and material properties. Texture perception can, to a first approximation, be described by a number of nameable perceptual axes, such as rough/smooth, hard/soft, sticky/slippery, and warm/cool, which exist within a complex perceptual space. The perception of texture relies on two different neural streams of information: coarser features, on the scale of millimeters, are primarily encoded by spatial patterns of activity across one population of tactile nerve fibers, while finer features, down to the micron level, are encoded by finely timed temporal patterns within two other populations of afferents. These two streams of information ascend the somatosensory neuraxis and are eventually combined and further elaborated in the cortex to yield a high-dimensional representation that accounts for our exquisite and stable perception of texture.
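
The contrast between the two streams, a spatial rate code for coarse features and a fine-timescale temporal code for fine features, can be illustrated with a toy simulation. The sketch below is not the authors' model; the afferent counts, firing rates, and vibration frequencies are hypothetical choices made only to show why a rate code suffices for the coarse case but fails for the fine one.

```python
# Illustrative sketch: spatial rate code vs. fine temporal code.
# All parameters are hypothetical, chosen for demonstration only.
import numpy as np

rng = np.random.default_rng(0)

n_afferents = 64      # toy population of tactile nerve fibers
duration_s = 1.0
dt = 0.001            # 1 ms resolution

def spikes_from_rate(rate_hz):
    """Poisson spike train sampled at dt resolution."""
    return rng.random(int(duration_s / dt)) < rate_hz * dt

# --- Spatial code: coarse (mm-scale) features modulate WHICH afferents
# fire. Two coarse textures yield different spatial rate profiles.
x = np.arange(n_afferents)
profile_a = 20 + 30 * np.exp(-((x - 16) ** 2) / 50)
profile_b = 20 + 30 * np.exp(-((x - 48) ** 2) / 50)

rates_a = np.array([spikes_from_rate(r).sum() for r in profile_a])
rates_b = np.array([spikes_from_rate(r).sum() for r in profile_b])

# The coarse textures are separable from spatial rate patterns alone:
print("spatial pattern distance:", np.abs(rates_a - rates_b).sum())

# --- Temporal code: fine (micron-scale) features modulate WHEN spikes
# occur (e.g., locking to skin vibrations) at similar mean rates.
t = np.arange(0, duration_s, dt)
fine_a = spikes_from_rate(40 * (1 + np.sin(2 * np.pi * 50 * t)))   # 50 Hz
fine_b = spikes_from_rate(40 * (1 + np.sin(2 * np.pi * 120 * t)))  # 120 Hz

# Mean rates are nearly identical, so a rate code cannot separate them...
print("mean spike counts:", fine_a.sum(), fine_b.sum())
# ...but fine-timescale binning (the temporal pattern) can:
bins_a = fine_a.reshape(-1, 5).sum(axis=1)   # 5 ms bins
bins_b = fine_b.reshape(-1, 5).sum(axis=1)
print("temporal pattern correlation:", np.corrcoef(bins_a, bins_b)[0, 1])
```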

Article

As we go about our everyday activities, our brain computes accurate estimates of both our motion relative to the world and our orientation relative to gravity. Essential to this computation is the information provided by the vestibular system; it detects the rotational velocity and linear acceleration of our heads relative to space, making a fundamental contribution to our perception of self-motion and spatial orientation. Additionally, in everyday life, our perception of self-motion depends on the integration of both vestibular and nonvestibular cues, including visual and proprioceptive information. Furthermore, the integration of motor-related information is also required for perceptual stability, so that the brain can distinguish whether the experienced sensory inflow resulted from active self-motion through the world or was instead externally generated. Understanding how the brain encodes and integrates sensory cues with motor signals for the perception of self-motion during natural behaviors remains a major goal in neuroscience. Recent experiments have (i) provided new insights into the neural code used to represent sensory information in vestibular pathways, (ii) established that vestibular pathways are inherently multimodal at the earliest stages of processing, and (iii) revealed that self-motion information processing is adjusted to meet the needs of specific tasks. This article reviews our current understanding of how the brain integrates sensory information and motor-related signals to encode self-motion and ensure perceptual stability during everyday activities.
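
Two of the computations described above can be made concrete with a short sketch: reliability-weighted fusion of vestibular and visual cues (the standard maximum-likelihood account of multisensory integration), and subtraction of a forward model's prediction so that externally generated motion can be distinguished from active self-motion. This is a minimal illustration, not a published model; the function names, gains, variances, and signal values are hypothetical.

```python
# Minimal sketch of (1) reliability-weighted cue fusion and
# (2) reafference cancellation via a forward model.
# All numerical values are hypothetical, for illustration only.

def fuse_cues(x_vest, var_vest, x_vis, var_vis):
    """Maximum-likelihood combination of two noisy estimates:
    each cue is weighted by its inverse variance (reliability)."""
    w_vest = (1 / var_vest) / (1 / var_vest + 1 / var_vis)
    w_vis = 1 - w_vest
    x_hat = w_vest * x_vest + w_vis * x_vis
    var_hat = 1 / (1 / var_vest + 1 / var_vis)
    return x_hat, var_hat

# Noisy heading-velocity estimates (deg/s) from each modality:
x_hat, var_hat = fuse_cues(x_vest=10.0, var_vest=4.0,
                           x_vis=14.0, var_vis=1.0)
print(f"fused estimate: {x_hat:.1f} deg/s, variance {var_hat:.2f}")
# Note: the fused variance is lower than either single cue's variance.

def exafference(sensory_inflow, motor_command, forward_gain=1.0):
    """Subtract the forward model's prediction of the sensory
    consequences of the motor command (reafference), leaving only
    the externally generated component (exafference)."""
    predicted_reafference = forward_gain * motor_command
    return sensory_inflow - predicted_reafference

# Active head turn (8 deg/s) during a 3 deg/s platform rotation:
external = exafference(sensory_inflow=11.0, motor_command=8.0)
print(f"estimated externally generated motion: {external:.1f} deg/s")
```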