Textual studies describes a range of fields and methodologies that evaluate how texts are constituted both physically and conceptually, document how they are preserved, copied, and circulated, and propose ways in which they might be edited to minimize error and maximize the text’s integrity. The vast temporal reach of the history of textuality—from oral traditions spanning thousands of years and written forms dating from the 4th millennium BCE to printed and digital text forms—is matched by its geographical range, covering every linguistic community around the globe. Methods of evaluating material text-bearing documents and the reliability of their written or printed content stem from antiquity, often paying closest attention to sacred texts as well as to legal documents and literary works that helped form linguistic and social group identity. With the advent of the printing press in the early modern West, the rapid reproduction of text matter in large quantities had the effect of corrupting many texts with printing errors while also providing the technical means of correcting such errors more cheaply and quickly than in the preceding scribal culture. From the 18th century, techniques of textual criticism were developed to attempt the systematic correction of textual error, again with an emphasis on scriptural and classical texts. This “golden age of philology” slowly widened its range to consider such foundational medieval texts as Dante’s Commedia as well as, in time, modern vernacular literature. The technique of stemmatic analysis—the establishment of family relationships between existing documents of a text—provided the means for scholars to choose between copies of a work in the pursuit of accuracy.
In the absence of original documents (manuscripts in the hand of Aristotle or the four Evangelists, for example), the choice between existing versions of a text was often made eclectically—that is, by drawing on multiple versions—and thus was subject to such considerations as the historic range and geographical diffusion of documents, the systematic identification of common scribal errors, and matters of translation. As the study of modern languages and literatures consolidated into modern university departments in the later 19th century, new techniques emerged with the aim of providing reliable literary texts free from obvious error. This aim had in common with the preceding philological tradition the belief that what a text means—discovered in the practice of hermeneutics—was contingent on what the text states—established by an accurate textual record that eliminates error by means of textual criticism. The methods of textual criticism took several paths through the 20th century: the Anglophone tradition centered on editing Shakespeare’s works by drawing on the earliest available documents—the printed Quartos and Folios—developing into the Greg–Bowers–Tanselle copy-text “tradition,” which was then deployed as a method by which to edit later texts. The status of variants in modern literary works with multiple authorial manuscripts—not to mention the existence of competing versions of several of Shakespeare’s plays—complicated matters sufficiently that editors looked to alternative editorial models. Genetic editorial methods draw in part on German editorial techniques, collating all existing manuscripts and printed texts of a work in order to provide a record of its composition process, including epigenetic processes following publication.
The French methods of critique génétique likewise place the documentary record at the center, giving the dossier priority over any one printed edition and drawing on poststructuralist theory to examine the process of “textual invention.” The inherently social aspects of textual production—the author’s interaction with agents, censors, publishers, and printers, and the way these interactions shape the content and presentation of the text—have prompted a reconception of how textual authority and variation are understood in the social and economic contexts of publication. And, finally, the advent of digital publication platforms has given rise to new developments in the presentation of textual editions and manuscript documents, displacing copy-text editing in some fields, such as modernism studies, in favor of genetic or synoptic models of composition and textual production.
Literacy is a measure of being literate, of the ability to read and write. The central activity of the humanities—its shared discipline—literacy has also become one of the field’s most powerful and diffuse metaphors, broadly applied to represent a fluency, a competency, or a skill in manipulating information. The word “literacy” is of recent coinage, being little more than a century old. Reading and writing, or effectively using letters (the word at the root of literacy), are ancient skills, but the word “literacy” likely springs from and reflects the emergence of mass public education at the end of the 19th and the turn of the 20th century. In this sense, then, “literacy” measures personal and demographic development. Literacy is mimetic. It is synesthetic—in some languages, it means hearing sounds (the phonemes) in what is seen (the letters); in others, it means linking a symbol to the thing symbolized. Although a recent word, “literacy” depends upon the emergence of symbolic sign systems in ancient times. Written symbolic systems, by contrast with spoken language, are relatively recent developments in human history. They bear a more complicated relationship to the spoken language, being in part a representation of it (and thus a recording of its contents) while also offering a representation of the world, the referent: that is, literacy involves an awareness of the representation of the world. Reading and writing are tied to millennia of changes in technologies of representation. As a term denoting fluency with letters, literacy has a history and a geography that follow the development and movement of the phonetic alphabet and subsequent systems of writing. If the alphabet encodes a shift from orality to literacy, HTML encodes a shift from verbal literacy to a kind of numerical literacy not yet theorized.