Sign phonetics is the study of how sign languages are produced and perceived, by native as well as by non-native signers. Most research on sign phonetics has focused on American Sign Language (ASL), but there are many different sign languages around the world, and several of these, including British Sign Language, Taiwan Sign Language, and Sign Language of the Netherlands, have been studied at the level of phonetics. Sign phonetics research can focus on individual lexical signs or on the movements of the nonmanual articulators that accompany those signs. The production and perception of a sign language can be influenced by phrase structure, linguistic register, the signer’s linguistic background, the visual perception mechanism, the anatomy and physiology of the hands and arms, and many other factors. What sets sign phonetics apart from the phonetics of spoken languages is that the two language modalities use different mechanisms of production and perception, which could in turn result in structural differences between modalities. Most studies of sign phonetics have been based on careful analyses of video data. Some studies have collected kinematic limb movement data during signing and carried out quantitative analyses of sign production related to, for example, signing rate, phonetic environment, or phrase position. Similarly, studies of sign perception have recorded participants’ ability to identify and discriminate signs, depending, for example, on slight variations in the signs’ forms or differences in the participants’ language background. Most sign phonetics research is quantitative and lab-based.
D. H. Whalen
Phonetics is the branch of linguistics that deals with the physical realization of meaningful distinctions in spoken language. Phoneticians study the anatomy and physics of sound generation, the acoustic properties of the sounds of the world’s languages, the features of the signal that listeners use to perceive the message, and the brain mechanisms involved in both production and perception. Phonetics therefore connects most directly to phonology and psycholinguistics, but it also engages a range of disciplines that are not unique to linguistics, including acoustics, physiology, biomechanics, hearing, and evolution, among many others. Early theorists assumed that the phonetic implementation of phonological features was universal, but it has become clear that languages differ in their phonetic spaces for phonological elements, with systematic differences in acoustics and articulation. Such language-specific details place phonetics solidly in the domain of linguistics; any complete description of a language must include its specific phonetic realization patterns. The description of what phonetic realizations are possible in human language continues to expand as more languages are described; many under-documented languages are endangered, lending urgency to the phonetic study of the world’s languages. Phonetic analysis can consist of transcription, acoustic analysis, measurement of the speech articulators, and perceptual tests, with recent advances in brain imaging adding detail at the level of neural control and processing. Because of its dual nature as a component of a linguistic system and a set of actions in the physical world, phonetics has connections to many other branches of linguistics, including not only phonology but also syntax, semantics, sociolinguistics, and clinical linguistics. Speech perception has been shown to integrate information from both vision and tactile sensation, indicating an embodied system.
Sign language, though primarily visual, has adopted the term “phonetics” for its realization component, highlighting the linguistic nature both of phonetics and of sign language. This diversity of modalities, methods, and disciplinary connections offers many avenues for studying phonetics, but it also presents challenges for forming a comprehensive account of any language’s phonetic system.