1–4 of 4 Results for:

  • Literary Theory
  • 19th Century (1800–1900)
  • British and Irish Literatures
  • Print Culture and Digital Humanities

Article

The Chapter  

Nicholas Dames

First known as a kephalaion in Greek, capitulum or caput in Latin, the chapter arose in antiquity as a finding device within long, often heterogeneous prose texts, prior even to the advent of the codex. By the 4th century CE, it was no longer unusual for texts to be composed in capitula; but it is with the advent of the fictional prose narratives we call the novel that the chapter, both ubiquitous and innocuous, developed into a compositional practice with a distinct way of thinking about biographical time. A technique of discontinuous reading or “consultative access” that finds a home in a form for continuous, immersive reading, the chapter is a case study in adaptive reuse and slow change. One of the primary ways the chapter became a narrative form rather than just an editorial practice is through the long history of the chaptering of the Bible, particularly the various systems for chaptering the New Testament, which culminated in the early 13th-century formation of the biblical chaptering system still in use across the West. Biblical chapters formed a template for how to segment ongoing plots or actions that was taken up by writers, printers, and editors from the late medieval period onward; pivotal examples include William Caxton’s chaptering of Thomas Malory’s Morte d’Arthur in his 1485 printing of the text, or the several mises en proses of Chrétien de Troyes’s poems carried out in the Burgundian court circle of the 15th century. By the 18th century, a vibrant set of discussions, controversies, and experiments with chapters was characteristic of the novel form, which increasingly used chapter titles and chapter breaks to meditate upon how different temporal units understand human agency in different ways.
With the eventual dominance of the novel in 19th-century literary culture, the chapter was honed into a way of thinking about the segmented nature of biographical memory, as well as the temporal frames—the day, the year, the episode or epoch—in which that segmenting occurs; chapters in this period were of an increasingly standard size, although still lacking any formal rules or definition. Modernist prose narratives often played with the chapter form, expanding it or drastically shortening it, but these experiments tended to reaffirm the unit of the chapter as a significant measure by which we make sense of human experience.

Article

Close Reading  

Mark Byron

Close reading describes a set of procedures and methods that distinguishes the scholarly apprehension of textual material from the more prosaic reading practices of everyday life. Its origins and ancestry are rooted in the exegetical traditions of sacred texts (principally from the Hindu, Jewish, Buddhist, Christian, Zoroastrian, and Islamic traditions) as well as the philological strategies applied to classical works such as the Homeric epics in the Greco-Roman tradition, or the Chinese 詩經 (Shijing) or Classic of Poetry. Cognate traditions of exegesis and commentary formed around Roman law and the canon law of the Christian Church, and they also find expression in the long tradition of Chinese historical commentaries and exegeses on the Five Classics and Four Books. As these practices developed in the West, they were adapted to medieval and early modern literary texts from which the early manifestations of modern secular literary analysis came into being in European and American universities. Close reading comprises the methodologies at the center of literary scholarship as it developed in the modern academy over the past one hundred years or so, and has come to define a central set of practices that dominated scholarly work in English departments until the turn to literary and critical theory in the late 1960s. This article provides an overview of these dominant forms of close reading in the modern Western academy. The focus rests upon close reading practices and their codification in English departments, although reference is made to non-Western reading practices and philological traditions, as well as to significant nonanglophone alternatives to the common understanding of literary close reading.

Article

Modern Manuscripts  

Dirk Van Hulle

The study of modern manuscripts to examine writing processes is termed “genetic criticism.” Renewed interest in this discipline—a current trend sometimes overdramatized as “the archival turn”—builds on a long tradition situated at the intersection of modern book history, bibliography, textual criticism, and scholarly editing. Handwritten documents are called “modern” manuscripts to distinguish them from medieval or even older manuscripts. Whereas most extant medieval manuscripts are scribal copies and fit into a context of textual circulation and dissemination, modern manuscripts are usually autographs for private use. Traditionally, the watershed between older and “modern” manuscripts is situated around the middle of the 18th century, coinciding with the rise of the so-called Geniezeit, the Sturm und Drang (Storm and Stress) period in which the notion of “genius” became fashionable. Authors such as Goethe carefully preserved their manuscripts. This new interest in authors’ manuscripts can be seen as part of the “genius” ideology: since a draft was regarded as the trace of a thought process, a manuscript was the tangible evidence of capital-G “Genius” at work. But this division between modern and older manuscripts needs to be nuanced, for there are of course autograph manuscripts with cancellations and revisions from earlier periods, which are equally interesting for manuscript research. Genetic criticism studies the dynamics of creative processes, discerning a difference between the part of the genesis that takes place in the author’s private environment and the continuation of that genesis after the work has become public. But the genesis is often not a linear development from “before” to “after” publication; rather, it can be conceptualized by means of a triangular model.
The three corners of that model are endogenesis (the “inside” of a writing process, the writing of drafts), exogenesis (the relation to external sources of inspiration), and epigenesis (the continuation of the genesis and revision after publication). At any point in the genesis there is the possibility that exogenetic material may color the endo- or the epigenesis. In the digital age, archival literary documents are no longer coterminous with a material object. But that does not mean the end of genetic criticism. On the contrary, an exciting future lies ahead. Born-digital works require new methods of analysis, including digital forensics, computer-assisted collation, and new forms of distant reading. The challenge is to connect to methods of digital text analysis by finding ways to enable macroanalysis across versions.

Article

Textual Studies  

Mark Byron

Textual studies describes a range of fields and methodologies that evaluate how texts are constituted both physically and conceptually, document how they are preserved, copied, and circulated, and propose ways in which they might be edited to minimize error and maximize the text’s integrity. The vast temporal reach of the history of textuality—from oral traditions spanning thousands of years and written forms dating from the 4th millennium BCE to printed and digital text forms—is matched by its geographical range covering every linguistic community around the globe. Methods of evaluating material text-bearing documents and the reliability of their written or printed content stem from antiquity, often paying closest attention to sacred texts as well as to legal documents and literary works that helped form linguistic and social group identity. With the advent of the printing press in the early modern West, the rapid reproduction of text matter in large quantities had the effect of corrupting many texts with printing errors as well as providing the technical means of correcting such errors more cheaply and quickly than in the preceding scribal culture. From the 18th century, techniques of textual criticism were developed to attempt systematic correction of textual error, again with an emphasis on scriptural and classical texts. This “golden age of philology” slowly widened its range to consider such foundational medieval texts as Dante’s Commedia as well as, in time, modern vernacular literature. The technique of stemmatic analysis—the establishment of family relationships between existing documents of a text—provided the means for scholars to choose between copies of a work in the pursuit of accuracy.
In the absence of original documents (manuscripts in the hand of Aristotle or the four Evangelists, for example) the choice between existing versions of a text was often made eclectically—that is, drawing on multiple versions—and thus was subject to such considerations as the historic range and geographical diffusion of documents, the systematic identification of common scribal errors, and matters of translation. As the study of modern languages and literatures consolidated into modern university departments in the later 19th century, new techniques emerged with the aim of providing reliable literary texts free from obvious error. This aim had in common with the preceding philological tradition the belief that what a text means—discovered in the practice of hermeneutics—is contingent on what the text states—established by an accurate textual record that eliminates error by means of textual criticism. The methods of textual criticism took several paths through the 20th century: the anglophone tradition centered on editing Shakespeare’s works by drawing on the earliest available documents—the printed Quartos and Folios—developing into the Greg–Bowers–Tanselle copy-text “tradition,” which was then deployed as a method by which to edit later texts. The status of variants in modern literary works with multiple authorial manuscripts—not to mention the existence of competing versions of several of Shakespeare’s plays—complicated matters sufficiently that editors looked to alternate editorial models. Genetic editorial methods draw in part on German editorial techniques, collating all existing manuscripts and printed texts of a work in order to provide a record of its composition process, including epigenetic processes following publication.
The French methods of critique génétique also place the documentary record at the center, where the dossier is given priority over any one printed edition, and poststructuralist theory is used to examine the process of “textual invention.” The inherently social aspects of textual production—the author’s interaction with agents, censors, publishers, and printers and the way these interactions shape the content and presentation of the text—have reconceived how textual authority and variation are understood in the social and economic contexts of publication. And, finally, the advent of digital publication platforms has given rise to new developments in the presentation of textual editions and manuscript documents, displacing copy-text editing in some fields such as modernism studies in favor of genetic or synoptic models of composition and textual production.