Article
The Chapter
Nicholas Dames
First known as a kephalaion in Greek, capitulum or caput in Latin, the chapter arose in antiquity as a finding device within long, often heterogeneous prose texts, prior even to the advent of the codex. By the 4th century ce, it was no longer unusual for texts to be composed in capitula; but it is with the advent of the fictional prose narratives we call the novel that the chapter, both ubiquitous and innocuous, developed into a compositional practice with a distinct way of thinking about biographical time. A technique of discontinuous reading or “consultative access” that finds a home in a form for continuous, immersive reading, the chapter is a case study in adaptive reuse and slow change. One of the primary ways the chapter became a narrative form rather than just an editorial practice is through the long history of the chaptering of the Bible, particularly the various systems for chaptering the New Testament, which culminated in the early 13th-century formation of the biblical chaptering system still in use across the West. Biblical chapters formed a template for how to segment ongoing plots or actions, a template taken up by writers, printers, and editors from the late medieval period onward; pivotal examples include William Caxton’s chaptering of Thomas Malory’s Morte d’Arthur in his 1485 printing of the text, or the several mises en prose of Chrétien de Troyes’s poems carried out in the Burgundian court circle of the 15th century. By the 18th century, a vibrant set of discussions, controversies, and experiments with chapters was characteristic of the novel form, which increasingly used chapter titles and chapter breaks to meditate upon how different temporal units understand human agency in different ways. With the eventual dominance of the novel in 19th-century literary culture, the chapter had been honed into a way of thinking about the segmented nature of biographical memory, as well as the temporal frames—the day, the year, the episode or epoch—in which that segmenting occurs; chapters in this period were of an increasingly standard size, although still lacking any formal rules or definition. Modernist prose narratives often played with the chapter form, expanding it or drastically shortening it, but these experiments usually tended to reaffirm the unit of the chapter as a significant measure by which we make sense of human experience.
Article
Close Reading
Mark Byron
Close reading describes a set of procedures and methods that distinguishes the scholarly apprehension of textual material from the more prosaic reading practices of everyday life. Its origins and ancestry are rooted in the exegetical traditions of sacred texts (principally from the Hindu, Jewish, Buddhist, Christian, Zoroastrian, and Islamic traditions) as well as the philological strategies applied to classical works such as the Homeric epics in the Greco-Roman tradition, or the Chinese 詩經 (Shijing) or Classic of Poetry. Cognate traditions of exegesis and commentary formed around Roman law and the canon law of the Christian Church, and they also find expression in the long tradition of Chinese historical commentaries and exegeses on the Five Classics and Four Books. As these practices developed in the West, they were adapted to medieval and early modern literary texts, from which the early manifestations of modern secular literary analysis came into being in European and American universities. Close reading comprises the methodologies at the center of literary scholarship as it developed in the modern academy over the past one hundred years or so, and it came to define the central set of practices that dominated scholarly work in English departments until the turn to literary and critical theory in the late 1960s. This article provides an overview of these dominant forms of close reading in the modern Western academy. The focus rests upon close reading practices and their codification in English departments, although reference is made to non-Western reading practices and philological traditions, as well as to significant nonanglophone alternatives to the common understanding of literary close reading.
Article
Critique
Charlie Blake
From its emergence and early evolution in and through the writings of Immanuel Kant, Ludwig Feuerbach, and Karl Marx, critique established its parameters as both porous and dynamic. Critique has always been, in this sense, mutable, directed, and both multidisciplinary and transdisciplinary, and this very fluidity and flexibility of its processes are possibly among the central reasons for its continued relevance even when it has been dismantled, rebuffed, and attacked for embodying traits, from gender bias to Eurocentrism to neuro-normativity, that seem to indicate the very opposite of that flexibility. Indeed, once it is examined closely as an apparatus, the mechanism of critique will invariably reveal itself as having always contained the tools for its own opposition and even the tools for its own destruction. Critique has in this way always implied both its generality as a form and autocritique as an essential part of its process. For the past two centuries, this general, self-reflective, and self-dismantling quality has led to its constant reinvention and re-adaptation by a wide range of thinkers and writers and across a broad range of disciplines.
In the case of literature and literary theory, the role of critique can often best be grasped as that of a meta-discourse in which the nature and purpose of literary criticism is shadowed, reflected upon, and performed. From this perspective, from critique’s 18th-century gestation in the fields of theology and literary criticism to its formalization by Kant, the literary expression of critique has always been bound up with debates over the function of literary texts, their history, their production, their consumption, and their critical evaluation. In the early 21st century, having evolved from its beginnings through and alongside various forms of anticritique in the 20th century, critique now finds itself in an age that favors some variant or other of postcritique. It remains to be seen whether this tendency, which suggests critique’s obsolescence and supersession, marks the end of critique as some would wish, or merely its latest metamorphosis and diversification in response to the multivalent pressures of digital acceleration and ecological crisis. Whatever path or paths contemporary judgment on this question may follow, critique, as the name of a series of techniques and operations guided by a desire for certain ends, is likely to remain one of the most consistent ways of surveying any particular field of intellectual endeavor and the relations between adjacent or even divergent fields in terms of their commonalities and differences. As Kant and Voltaire understood so well of their own age, modernity is characterized in the first instance by its will to criticism and then by the systematic criticism of the conditions for that criticism. By the same token, in late or post- or neo-modernity, if contemporary conversations about literature and its pleasures, challenges, study, and criticism require an overview, then some version of critique or its legacy will undoubtedly still come into play.
Article
E-text
Niels Ole Finnemann
Electronic text can be defined on two different, though interconnected, levels. On the one hand, electronic text can be defined by taking the notion of “text” or “printed text” as the point of departure. On the other hand, it can be defined by taking the digital format, in which everything is represented in the binary alphabet, as the point of departure. While the notion of text in most cases lends itself to being independent of medium and embodiment, it is also often tacitly assumed that it is in fact modeled on the print medium rather than, for instance, on handwritten text or speech. In the late 20th century, the notion of “text” was subjected to increasing criticism, as can be seen in the question raised in literary text theory about whether “there is a text in this class.” At the same time, the notion was expanded to include extralinguistic sign modalities (images, videos). A basic question, therefore, is whether electronic text should be included in this enlarged notion as a new digital sign modality added to the repertoire of modalities, or as a sign modality that is both an independent modality and a container capable of holding other modalities. In the first case, the notion of electronic text would be paradigmatically formed around the e-book, which was conceived as a digital copy of a printed book but is now a deliberately closed work. Even closed works in digital form need some sort of interface and hypertextual navigation, which together constitute a particular kind of paratext required for accessing any sort of digital material.
In the second case, the electronic text is defined by the representation of content and (some parts of the) processing rules as binary sequences manifested in the binary alphabet. This wider notion would include, for instance, all sorts of scanning results, whether of the outer cosmos or the interior of our bodies, as well as digital traces of other processes in between (machine readings included). Since other alphabets, such as the genetic alphabet, and all sorts of images may also be represented in the binary alphabet, such materials will also belong to the textual universe within this definition. A more intriguing implication is that born-digital materials may also include scripts and interactive features as intrinsic parts of the text.
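The point can be made concrete with a minimal Python sketch (purely illustrative; the sample values are invented): the same binary alphabet represents linguistic content, image data, and a processing rule alike, which is why all three fall within this wider notion of text.

```python
# Minimal illustration: one binary alphabet for text, image data, and scripts.
# All sample values below are invented for demonstration purposes.
samples = {
    "text": "chapter".encode("utf-8"),  # linguistic content as bytes
    "image": bytes([255, 0, 127]),      # a made-up fragment of pixel data
    "script": b"print('hi')",           # a processing rule, itself just bytes
}

for label, blob in samples.items():
    bits = " ".join(f"{byte:08b}" for byte in blob)
    print(f"{label:>6}: {bits}")
```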
The two notions define the text on different levels: one is centered on the Latin alphabet, the other on the binary alphabet, and both definitions include hypertext, interactivity, and multimodality as constituent parameters. In the first case, hypertext is included as a navigational, paratextual device; in the second case, hypertext is also incorporated in the narrative within an otherwise closed work, or as a constituent element of the textual universe of the web, where it serves the ongoing production of (possibly scripted) connections and disconnections between blocks of textual content. Since the early decades of the 21st century still represent only the very early stages of the globally distributed universe of web texts, this is also a history of the gradual unfolding of the dimensions of these three constituents—hypertext, interactivity, and multimodality. The result is a still-expanding repertoire of genres, including some that emerge via path dependency, some via remediation, and some as new genres unique to networked digital media, including “social media texts” and a growing variety of narrative and discursive multiple-source systems.
Article
The Flores Magón Brothers and Magonismo on the Borderlands
Luis A. Marentes
Early critics of the Porfirio Díaz regime and editors of the influential newspaper Regeneración, Ricardo and Enrique Flores Magón escaped to the United States in 1904. There, with Ricardo as the leader and most prolific writer, they founded the Partido Liberal Mexicano (PLM) in 1906 and facilitated oppositional transnational networks of readers, political clubs, and other organizations. From their arrival they were constantly pursued and imprisoned by coordinated Mexican and US law enforcement and private detective agencies, but their cause gained the support of US radicals and workers. With the outbreak of the 1910 Mexican Revolution, the PLM splintered, many members joining Francisco I. Madero’s forces while the Flores Magón brothers and the PLM nucleus refused to compromise. They had moved beyond a liberal critique of a dictatorship to an anarchist opposition to the state and private property. While not called Magonismo at the time, their ideological and organizational principles left a legacy in both Mexico and the United States closely associated with the brothers. During World War I, a time of a growing nativist red scare in the United States, they turned, in the eyes of US authorities, from a relative nuisance into a foreign radical threat. Ricardo died in Leavenworth federal penitentiary in 1922, and Enrique was deported to Mexico, where he promoted the brothers’ legacy within the postrevolutionary order. Although the PLM leadership opposed the new regime, their 1906 Program inspired much of the 1917 Constitution, and several of their comrades played influential roles in the new government. In the United States, many of the networks and mutual aid initiatives that engaged with the Flores Magón brothers continued to bear fruit well into the emergence of the Chicana/o Movement.
Article
Latinx Popular Culture and Social Conflict: Comics, Graphic Novels, and Film
Frederick Luis Aldama
Despite Latinxs being the fastest-growing demographic in the United States, their experiences and identities continue to be underrepresented and misrepresented in the mainstream pop cultural imaginary. However, for all the negative stereotypes and restrictive ways that the mainstream boxes in Latinxs, Latinx musicians, writers, artists, comic book creators, and performers actively metabolize all cultural phenomena to clear positive spaces of empowerment and to make new perceptions, thoughts, and feelings about Latinx identities and experiences. It is important to understand, though, that Latinxs today consume all variety of cultural phenomena; for corporate America, therefore, Latinxs represent a huge consumer market. Viewed through cynical and skeptical eyes, the increased representation of Latinxs in mainstream comic books and film results from this push to capture the Latinx consumer market. Within mainstream comic books and films, Latinx subjects are rarely the protagonists. However, Latinx comic book and film creators are actively creating Latinx protagonists within richly rendered Latinx story worlds. They work in all the storytelling genres and modes (realism, sci-fi, romance, memoir, biography, among many others) to clear new spaces for the expression of Latinx subjectivities and experiences.
Article
Modern Manuscripts
Dirk Van Hulle
The study of modern manuscripts to examine writing processes is termed “genetic criticism.” A current trend that is sometimes overdramatized as “the archival turn” is a result of renewed interest in this discipline, which has a long tradition situated at the intersection between modern book history, bibliography, textual criticism, and scholarly editing. Handwritten documents are called “modern” manuscripts to distinguish them from medieval or even older manuscripts. Whereas most extant medieval manuscripts are scribal copies and fit into a context of textual circulation and dissemination, modern manuscripts are usually autographs for private use. Traditionally, the watershed between older and “modern” manuscripts is situated around the middle of the 18th century, coinciding with the rise of the so-called Geniezeit, the Sturm und Drang (Storm and Stress) period in which the notion of “genius” became fashionable. Authors such as Goethe carefully preserved their manuscripts. This new interest in authors’ manuscripts can be seen as part of the “genius” ideology: since a draft was regarded as the trace of a thought process, a manuscript was the tangible evidence of capital-G “Genius” at work. But this division between modern and older manuscripts needs to be nuanced, for there are of course autograph manuscripts with cancellations and revisions from earlier periods, which are equally interesting for manuscript research. Genetic criticism studies the dynamics of creative processes, discerning a difference between the part of the genesis that takes place in the author’s private environment and the continuation of that genesis after the work has become public. But the genesis is often not a linear development from “before” to “after” publication; rather, it can be conceptualized by means of a triangular model. The three corners of that model are endogenesis (the “inside” of a writing process, the writing of drafts), exogenesis (the relation to external sources of inspiration), and epigenesis (the continuation of the genesis and revision after publication). At any point in the genesis there is the possibility that exogenetic material may color the endo- or the epigenesis. In the digital age, archival literary documents are no longer coterminous with a material object. But that does not mean the end of genetic criticism. On the contrary, an exciting future lies ahead. Born-digital works require new methods of analysis, including digital forensics, computer-assisted collation, and new forms of distant reading. The challenge is to connect to methods of digital text analysis by finding ways to enable macroanalysis across versions.
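As a purely illustrative aside, the basic gesture of computer-assisted collation can be sketched in a few lines of Python; the two draft sentences here are invented, and the standard-library difflib module stands in for the dedicated collation software a genetic edition would actually employ:

```python
import difflib

# Invented example: two "versions" of a sentence, tokenized into words.
draft = "the chapter is a measure of human experience".split()
revise = "the chapter remains a significant measure of experience".split()

# Align the two versions and report only the variant readings.
matcher = difflib.SequenceMatcher(a=draft, b=revise)
for op, a1, a2, b1, b2 in matcher.get_opcodes():
    if op != "equal":
        print(f"{op:7}: {' '.join(draft[a1:a2])!r} -> {' '.join(revise[b1:b2])!r}")
```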
Article
Policing and Publishing in Modernist 20th-Century America
Claire A. Culleton
For almost four decades, from 1936 to 1972, the director of the Federal Bureau of Investigation, J. Edgar Hoover, fueled by intense paranoia and fear, relentlessly hounded and pursued a variety of American writers and publishers in a staunch effort to control the dissemination of literature that he thought threatened the American way of life. In fact, beginning as early as the Red Scare of 1919, he managed to control literary modernism by bullying and harassing writers and artists at a time when the movement was spreading quickly in the hands of an especially young, vibrant collection of international writers, editors, and publishers. He, his special agents in charge, and their field agents worked to manipulate the relationship between state power and modern literature, thereby “federalizing,” to a point, political surveillance. Such brute state force persists, omnipresent and reaching into all aspects of our private lives: we are constantly tracked and monitored when engaged in even the most mundane activities. The only way to counter this surveillance is to monitor the monitors themselves.
Article
Posthumous Editing in the Modern United States
Allison Fagan
Posthumous publication is part of a long-standing literary tradition that crosses centuries and continents, giving us works ranging from The Canterbury Tales to The Diary of Anne Frank, from Northanger Abbey to 2666. Preparing for print work that was incomplete and unpublished at the time of the author’s death, posthumous editing is a type of public and goal-oriented grieving that seeks to establish or preserve the legacy of a writer no longer able to establish it for herself. Surrounding the work of posthumous editing are questions of authorial intent, editorial and publisher imperative, and reader response, each shaping the degree to which a posthumously published edition of a text is considered valuable. The visibility of such editing ranges from the conspicuously absent to the noticeably transformative, suggesting a wide range of possibilities for imagining the editorial role in producing the posthumous text. Examples drawn from 20th- and 21st-century US literature reveal the nature of editorial relationships to the deceased, as well as the subsequent relationships of readers to the posthumously published text.
Article
Reading in the Digital Era
Lutz Koepnick
Digital reading has been an object of fervent scholarly and public debate since the mid-1990s. It has often been associated solely with what may happen between readers and screens, and dominant approaches have seen digital reading devices as producing readers radically different from those produced by printed books.
Rather than reducing digital reading to a mere matter of what e-books might do to the attention spans of individual readers, however, contemporary critiques emphasize how digital computing affects and is affected by neurological, sensory, kinetic, and apparatical processes. The future of reading has too many different aspects to be discussed by scholars of one discipline or field of study alone. Digital reading is as much a matter for neurologists as for literary scholars, for engineers as much as for ergonomists, for psychologists, physiologists, media historians, art critics, critical theorists, and many others. Scholars of literature will need to consult many fields to elaborate a future poetics of digital reading and to examine how literary texts in all their different forms are and will be met by 21st-century readers.
Article
Reception in the Digital Era
DeNel Rehberg Sedo
The digital era offers a plethora of opportunities for readers to exchange opinions, share reading recommendations, and form ties with other readers. This communication often takes place in online environments, presenting reading researchers with new opportunities and challenges when investigating readers’ experiences.
What readers do with what they read is not a new topic of scholarly debate: readers have been scrutinized since at least the 14th century, when scribes questioned how their readers understood their words. Contemporary reading investigations and theory formation began in earnest in the 1920s with I. A. Richards’s argument that the reader should be considered separate from the text. In the 1930s, Louise Rosenblatt furthered the discipline, using literature as an occasion for collective inquiry into both cultural and individual values and introducing a concern with the phenomenological experience of reading and its intersubjectivity. While there is no universal theory of how readers read, more recent scholarly discourse illustrates a cluster of related views that see the reader and the text as complementary to one another in a variety of critical contexts.
With the advent of social media and Web 2.0, readers provide researchers with a host of opportunities not only to identify who they are but also to access, in profound ways, their individual and collective responses to the books they read. Reader responses on the Internet’s early email forums, on contemporary iterations of browser-hosted groups such as Yahoo Groups or Google Groups, and in book talk found on platforms such as Twitter, Facebook, and YouTube present data that can be analyzed through established or newly developed digital methods. Reviews and commentary on these platforms, along with thousands of book blogs, Goodreads.com, LibraryThing.com, and readers’ reviews on bookseller websites, illustrate cultural, economic, and social aspects of reading in ways that were often previously elusive to reading researchers.
Contemporary reading scholars bring to the analytical mix perspectives that enrich the last century’s theories of unidentified readers. Their methods illustrate the fertile ground available to contemporary investigations of readers and their books. Considered together, they allow scholars to contemplate the complexities of reading in the past, highlight the uniqueness of reading in the present, and provide material with which to project into the future.
Article
Reception Theory, Reception History, Reception Studies
Ika Willis
Reception-oriented literary theory, history, and criticism all analyze the processes by which literary texts are received, both in the moment of their first publication and long afterwards: how texts are interpreted, appropriated, adapted, transformed, passed on, canonized, and/or forgotten by various audiences. Reception draws on multiple methodologies and approaches, including semiotics and deconstruction; ethnography, sociology, and history; media theory and archaeology; and feminist, Marxist, black, and postcolonial criticism. Studying reception gives us insights into the texts themselves and their possible range of meanings, uses, and value; into the interpretative regimes of specific historical periods and cultural milieux; and into the nature of linguistic meaning and communication.
Article
Textual Studies
Mark Byron
Textual studies describes a range of fields and methodologies that evaluate how texts are constituted both physically and conceptually, document how they are preserved, copied, and circulated, and propose ways in which they might be edited to minimize error and maximize the text’s integrity. The vast temporal reach of the history of textuality—from oral traditions spanning thousands of years and written forms dating from the 4th millennium bce to printed and digital text forms—is matched by its geographical range, covering every linguistic community around the globe. Methods of evaluating material text-bearing documents and the reliability of their written or printed content stem from antiquity, often paying closest attention to sacred texts as well as to legal documents and literary works that helped form linguistic and social group identity. With the advent of the printing press in the early modern West, the rapid reproduction of text matter in large quantities had the effect of corrupting many texts with printing errors as well as providing the technical means of correcting such errors more cheaply and quickly than in the preceding scribal culture.
From the 18th century, techniques of textual criticism were developed to attempt the systematic correction of textual error, again with an emphasis on scriptural and classical texts. This “golden age of philology” slowly widened its range to consider such foundational medieval texts as Dante’s Commedia as well as, in time, modern vernacular literature. The technique of stemmatic analysis—the establishment of family relationships between existing documents of a text—provided the means for scholars to choose between copies of a work in the pursuit of accuracy. In the absence of original documents (manuscripts in the hand of Aristotle or the four Evangelists, for example), the choice between existing versions of a text was often made eclectically—that is, drawing on multiple versions—and was thus subject to such considerations as the historic range and geographical diffusion of documents, the systematic identification of common scribal errors, and matters of translation.
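The intuition behind stemmatic analysis lends itself to a brief computational sketch. In this purely hypothetical Python example (witness names and error labels are invented), witnesses that share more of the same scribal errors are taken to be more closely related, which is the basic evidence from which a stemma is built:

```python
from itertools import combinations

# Hypothetical witnesses, each recorded as the set of scribal errors it contains.
witnesses = {
    "A": {"err1", "err3"},
    "B": {"err1", "err3", "err4"},
    "C": {"err2"},
}

# Shared errors suggest descent from a common ancestor; score each pair.
for x, y in combinations(sorted(witnesses), 2):
    shared = witnesses[x] & witnesses[y]
    print(f"{x}-{y}: {len(shared)} shared error(s): {sorted(shared)}")
```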
As the study of modern languages and literatures consolidated into modern university departments in the later 19th century, new techniques emerged with the aim of providing reliable literary texts free from obvious error. This aim shared with the preceding philological tradition the belief that what a text means—discovered in the practice of hermeneutics—is contingent on what the text states—established by an accurate textual record that eliminates error by means of textual criticism. The methods of textual criticism took several paths through the 20th century: the Anglophone tradition centered on editing Shakespeare’s works by drawing on the earliest available documents—the printed Quartos and Folios—and developed into the Greg–Bowers–Tanselle copy-text “tradition,” which was then deployed as a method by which to edit later texts. The status of variants in modern literary works with multiple authorial manuscripts—not to mention the existence of competing versions of several of Shakespeare’s plays—complicated matters sufficiently that editors looked to alternative editorial models. Genetic editorial methods draw in part on German editorial techniques, collating all existing manuscripts and printed texts of a work in order to provide a record of its composition process, including epigenetic processes following publication. The French methods of critique génétique also place the documentary record at the center, where the dossier is given priority over any one printed edition, and poststructuralist theory is used to examine the process of “textual invention.” The inherently social aspects of textual production—the author’s interaction with agents, censors, publishers, and printers, and the way these interactions shape the content and presentation of the text—have reconceived how textual authority and variation are understood in the social and economic contexts of publication. Finally, the advent of digital publication platforms has given rise to new developments in the presentation of textual editions and manuscript documents, displacing copy-text editing in some fields, such as modernism studies, in favor of genetic or synoptic models of composition and textual production.