Sign phonetics is the study of how sign languages are produced and perceived by native as well as non-native signers. Most research on sign phonetics has focused on American Sign Language (ASL), but there are many sign languages around the world, and several of these, including British Sign Language, Taiwan Sign Language, and Sign Language of the Netherlands, have been studied at the phonetic level. Sign phonetics research can focus on individual lexical signs or on the movements of the nonmanual articulators that accompany those signs. The production and perception of a sign language can be influenced by phrase structure, linguistic register, the signer’s linguistic background, the visual perception mechanism, the anatomy and physiology of the hands and arms, and many other factors. What sets sign phonetics apart from the phonetics of spoken languages is that the two modalities use different mechanisms of production and perception, which could in turn result in structural differences between them. Most studies of sign phonetics have been based on careful analyses of video data. Some studies have collected kinematic limb-movement data during signing and carried out quantitative analyses of sign production related to, for example, signing rate, phonetic environment, or phrase position. Similarly, studies of sign perception have measured participants’ ability to identify and discriminate signs depending, for example, on slight variations in the signs’ forms or on differences in the participants’ language backgrounds. Most sign phonetics research is quantitative and lab-based.
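
To make the kinematic approach concrete, the sketch below illustrates one way limb-movement data of this kind might be quantified. It assumes a hypothetical motion-capture recording of 3D wrist positions sampled at a fixed rate (here 100 Hz) and computes two measures often reported in kinematic studies of sign production: peak tangential velocity and movement duration. The data, the velocity threshold, and the variable names are illustrative assumptions, not values or conventions taken from any particular study; measures like these could then be compared across signing rates, phonetic environments, or phrase positions.

```python
import numpy as np


def movement_measures(positions, fs=100.0, vel_threshold=0.05):
    """Compute peak velocity and movement duration from 3D wrist positions.

    positions     : (N, 3) array of x, y, z coordinates in metres
                    (hypothetical motion-capture output sampled at fs Hz).
    fs            : sampling rate in Hz.
    vel_threshold : speed (m/s) above which the wrist counts as "moving";
                    the value here is purely illustrative.
    """
    positions = np.asarray(positions, dtype=float)
    # Tangential velocity: frame-to-frame displacement scaled by the sampling rate.
    velocity = np.linalg.norm(np.diff(positions, axis=0), axis=1) * fs
    peak_velocity = velocity.max()
    # Movement duration: total time the wrist spends above the velocity threshold.
    duration = np.count_nonzero(velocity > vel_threshold) / fs
    return peak_velocity, duration


if __name__ == "__main__":
    # Simulated sign movement: the wrist traces a short arc, then holds still.
    t = np.linspace(0, 1, 100)
    arc = np.column_stack([
        0.2 * np.sin(np.pi * t),        # lateral displacement
        0.1 * (1 - np.cos(np.pi * t)),  # vertical displacement
        np.zeros_like(t),
    ])
    hold = np.repeat(arc[-1:], 50, axis=0)  # post-movement hold
    wrist = np.vstack([arc, hold])

    peak, dur = movement_measures(wrist, fs=100.0)
    print(f"peak velocity: {peak:.2f} m/s, movement duration: {dur:.2f} s")
```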