1-11 of 11 Results for:

  • British and Irish Literatures
  • Literary Theory
  • 20th and 21st Century (1900-present)

Article

The Chapter  

Nicholas Dames

First known as a kephalaion in Greek, capitulum or caput in Latin, the chapter arose in antiquity as a finding device within long, often heterogeneous prose texts, prior even to the advent of the codex. By the 4th century ce, it was no longer unusual for texts to be composed in capitula; but it is with the advent of the fictional prose narratives we call the novel that the chapter, both ubiquitous and innocuous, developed into a compositional practice with a distinct way of thinking about biographical time. A technique of discontinuous reading or “consultative access” which finds a home in a form for continuous, immersive reading, the chapter is a case study in adaptive reuse and slow change. One of the primary ways the chapter became a narrative form rather than just an editorial practice is through the long history of the chaptering of the Bible, particularly the various systems for chaptering the New Testament, which culminated in the early 13th-century formation of the biblical chaptering system still in use across the West. Biblical chapters formed a template for segmenting ongoing plots or actions, one taken up by writers, printers, and editors from the late medieval period onward; pivotal examples include William Caxton’s chaptering of Thomas Malory’s Morte d’Arthur in his 1485 printing of the text, or the several mises en prose of Chrétien de Troyes’s poems carried out in the Burgundian court circle of the 15th century. By the 18th century, a vibrant set of discussions, controversies, and experiments with chapters was characteristic of the novel form, which increasingly used chapter titles and chapter breaks to meditate upon how different temporal units understand human agency in different ways. By the time the novel achieved dominance in 19th-century literary culture, the chapter had been honed into a way of thinking about the segmented nature of biographical memory, as well as the temporal frames—the day, the year, the episode or epoch—in which that segmenting occurs; chapters in this period were of an increasingly standard size, although still lacking any formal rules or definition. Modernist prose narratives often played with the chapter form, expanding it or drastically shortening it, but these experiments tended to reaffirm the unit of the chapter as a significant measure by which we make sense of human experience.

Article

Close Reading  

Mark Byron

Close reading describes a set of procedures and methods that distinguishes the scholarly apprehension of textual material from the more prosaic reading practices of everyday life. Its origins and ancestry are rooted in the exegetical traditions of sacred texts (principally from the Hindu, Jewish, Buddhist, Christian, Zoroastrian, and Islamic traditions) as well as the philological strategies applied to classical works such as the Homeric epics in the Greco-Roman tradition, or the Chinese 詩經 (Shijing) or Classic of Poetry. Cognate traditions of exegesis and commentary formed around Roman law and the canon law of the Christian Church, and they also find expression in the long tradition of Chinese historical commentaries and exegeses on the Five Classics and Four Books. As these practices developed in the West, they were adapted to medieval and early modern literary texts from which the early manifestations of modern secular literary analysis came into being in European and American universities. Close reading comprises the methodologies at the center of literary scholarship as it developed in the modern academy over the past one hundred years or so, and has come to define a central set of practices that dominated scholarly work in English departments until the turn to literary and critical theory in the late 1960s. This article provides an overview of these dominant forms of close reading in the modern Western academy. The focus rests upon close reading practices and their codification in English departments, although reference is made to non-Western reading practices and philological traditions, as well as to significant nonanglophone alternatives to the common understanding of literary close reading.

Article

Daemonic  

Angus Nicholls

The term daemonic—often substantivized in German as the daemonic (das Dämonische) since its use by Johann Wolfgang von Goethe in the early 19th century—is a literary topos associated with divine inspiration and the idea of genius, with the nexus between character and fate, and, in more orthodox Christian manifestations, with moral transgression and evil. Although strictly modern literary uses of the term have become prominent only since Goethe, its origins lie in the classical idea of the δαίμων, transliterated into English as daimon or daemon, as an intermediary between the earthly and the divine. This notion can be found in pre-Socratic thinkers such as Empedocles and Heraclitus, in Plato, and in various Stoic and Neo-Platonic sources. One influential aspect of Plato’s presentation of the daemonic is found in Socrates’s daimonion: a divine sign, voice, or hint that dissuades Socrates from taking certain actions at crucial moments in his life. Another is the notion that every soul contains an element of divinity—known as its daimon—that leads it toward heavenly truth. Already in Roman thought, this idea of an external voice or sign begins to be associated with an internal genius that belongs to the individual. In Christian thinking of the European romantic period, the daemonic in general and the Socratic daimonion in particular are associated with notions such as non-rational divine inspiration (for example, in Johann Georg Hamann and Johann Gottfried Herder) and with divine providence (for example, in Joseph Priestley). At the same time, the daemonic is also often interpreted as evil or Satanic—that is, as demonic—by European authors writing in a Christian context. In Russia in particular, from the mid-19th century until the early 20th century, there is a rich vein of novels, including works by Gogol and Dostoevsky, that deal with this more strictly Christian sense of the demonic, especially the notion that the author/narrator may be a heretical figure who supplants the primacy of God’s creation. But the main focus of this article is the more richly ambivalent notion of the daemonic, which explicitly combines both the Greco-Roman and Judeo-Christian heritages of the term. This topos is most prominently mobilized by two literary exponents during the 19th century: Goethe, especially in his autobiography Dichtung und Wahrheit (Poetry and Truth), and Samuel Taylor Coleridge, in his Notebooks and in the Lectures on the History of Philosophy. Both Goethe’s and Coleridge’s treatments of the term, alongside its classical and Judeo-Christian heritages, exerted an influence upon literary theory of the 20th century, leading important theorists such as Georg Lukács, Walter Benjamin, Hans Blumenberg, Angus Fletcher, and Harold Bloom to associate the daemonic with questions concerning the novel, myth, irony, allegory, and literary influence.

Article

Ekphrasis  

Gabriele Rippl

Ekphrasis is a Greek term whose etymological meaning is “to speak out” or “to show in full.” Debates on ekphrasis go back to classical antiquity and Homer’s lines on Hephaestos making Achilles’ shield in Book 18 of the Iliad (8th century bce). Ekphrasis was considered a mode of speaking capable of bringing absent things before the listener’s inner eye by aiming at enargeia, a vivid quality of language producing evidentia (evidence) and rousing emotions through lively, precise, and detailed verbal descriptions. Over the centuries, the term underwent a considerable narrowing of its original meaning and eventually, during the Second Sophistic, came to designate the description of works of art. However, ancient ekphrasis, in the broader sense of detailed and lively description, had a rich afterlife throughout the Middle Ages (e.g., in Geoffrey Chaucer), the Renaissance (e.g., in Shakespeare), Neoclassicism (in Joseph Addison’s essays and Gotthold Ephraim Lessing’s “Laocoön”), and even into the Romantic Age (e.g., in William Wordsworth and George Gordon Byron). In its narrower sense as verbal representation or evocation of, or response to, a work of art or visual object, it is a ubiquitous phenomenon in 19th-, 20th-, and 21st-century literature, be it poetry or narrative fiction. Many modernist, postmodernist, and post-postmodernist literary texts are replete with ekphrases, but these ekphrases very often question any mimetic or illusionist aesthetic and no longer exclusively follow the paragonal model: instead of competing with one another, ekphrastic word-image configurations are more adequately described as intermedial constellations and collaborations. As a pertinent feature of 20th- and 21st-century poetry and narrative fiction—examples are novels by Julian Barnes, Antonia Susan Byatt, Teju Cole, Siri Hustvedt, or Donna Tartt—ekphrasis has also attracted the attention of literary scholars and theoreticians of culture. Given the many attempts to conceptualize and theorize ekphrasis, no simple definition will suffice. In the 1980s and 1990s, scholars such as Murray Krieger, William John Thomas Mitchell, and James Heffernan theorized ekphrasis: while Krieger saw ekphrasis as a symptom of the semiotic desire for the natural sign and Mitchell discussed ekphrasis within a paragonal framework of socio-cultural power relations, Heffernan defined ekphrasis as the verbal representation of visual representation. Included among the seminal concepts and definitions of ekphrasis in the early 21st century are approaches from phenomenology and cognitive poetics or new reception aesthetics, the digital humanities, postcolonial and transcultural studies, and the environmental humanities. By going beyond questions of representation, which dominated ekphrastic criticism for a long time, scholars have brought new attention to the functions of ekphrasis, in particular its socio-cultural and ethical functions.

Article

Grotesque  

Rune Graulund

Defining the grotesque in a concise and objective manner is notoriously difficult. When researching the term for his classic study On the Grotesque: Strategies of Contradiction in Art and Literature (1982), Geoffrey Galt Harpham observed that the grotesque is hard to pin down because it is defined in opposition to something rather than by any defining quality in and of itself. Any attempt to identify specific grotesque characteristics outside of a specific context is therefore challenging for two reasons. First, the grotesque is that which transgresses and challenges what is considered normal, bounded, and stable, meaning that one of the few universal and fundamental qualities of the grotesque is that it is abnormal, unbounded, and unstable. Second, since even the most rigid norms and boundaries shift over time, that which is defined in terms of opposition and transgression will naturally change as well, meaning that the term grotesque has meant very different things in different historical eras. For instance, as Olli Lagerspetz points out in A Philosophy of Dust (2018), while 16th-century aristocrats in France may routinely have received guests while sitting on their night stools, similar behavior exhibited today would surely be interpreted not only as out of the ordinary but as grotesque. Likewise, perceptions of the normal and the abnormal vary widely even within the same time period, depending on one’s class, gender, race, profession, sexual orientation, cultural background, and so on.

Article

Literature and Science  

Michael H. Whitworth

Though “literature and science” has denoted many distinct cultural debates and critical practices, the historicist investigation of literary-scientific relations is of particular interest because of its ambivalence toward theorization. Some accounts have suggested that the work of Bruno Latour supplies a necessary theoretical framework. An examination of the history of critical practice demonstrates that many concepts presently attributed to or associated with Latour have been longer established in the field. Early critical work, exemplified by Marjorie Hope Nicolson, tended to focus one-sidedly on the impact of science on literature. Later work, drawing on Thomas Kuhn’s idea of paradigm shifts, and on Mary Hesse’s and Max Black’s work on metaphor and analogy in science, identified the scope for a cultural influence on science. It was further bolstered by the “strong program” in the sociology of scientific knowledge, especially the work of Barry Barnes and David Bloor. It found ways of reading scientific texts for the traces of the cultural, and literary texts for traces of science; the method is implicitly modeled on psychoanalysis. Bruno Latour’s accounts of literary inscription, black boxing, and the problem of explanation have precedents in the critical practices of critics in the field of literature and science from the 1980s onward.

Article

Lyric Poetry and Poetics  

Daniel Tiffany

Lyric poetry is an ancient genre, enduring to the present day, but it is not continuous in its longevity. What happens to lyric poetry and how it changes during its numerous and sometimes lengthy periods of historical eclipse (such as the 18th century) may be as important to our understanding of lyric as an assessment of its periods of high achievement. For it is during these periods of relative obscurity that lyric enters into complex relations with other genres of poetry and prose, affirming the general thesis that all genres are relational and porous. The question of whether any particular properties of lyric poetry endure throughout its 2,700-year checkered history can be addressed by examining its basic powers: its forms; its figurative and narrative functions; and its styles and diction. The hierarchy of these functions is mutable, as one finds in today’s rift between a scholarly revival of formalist analysis and the increasing emphasis on diction in contemporary poetry. As a way of assessing lyric poetry’s basic operations, the present article surveys the ongoing tension between form and diction by sketching a critique of the tenets of New Formalism in literary studies, especially its presumptions about the relation of poetic form to the external world and its tendency to subject form to close analysis, as if it could yield, like style or diction, detailed knowledge of the world. Long overshadowed by the doctrinal tenets of modernist formalism, the expressive powers of diction occupy a central place in contemporary concerns about identity and social conflict, at the same time that diction (unlike form) is especially susceptible to the vocabularistic methods of “distant reading”—to the computational methods of the digital humanities. The indexical convergence of concreteness and abstraction, expression and rationalism, proximity and distance, in these poetic and scholarly experiments with diction points to precedents in the 18th century, when the emergence of Anglophone poetries in the context of colonialism and the incorporation of vernacular languages into poetic diction (via the ballad revival) intersected with the development of modern lexicography and the establishment of Standard English. The nascent transactions of poetics and positivism through the ontology of diction in the 21st century remind us that poetic diction is always changing but also that the hierarchy of form, figuration, and diction in lyric poetry inevitably shifts over time—a reconfiguration of lyric priorities that helps to shape the premises and methods of literary studies.

Article

Modern Manuscripts  

Dirk Van Hulle

The study of modern manuscripts to examine writing processes is termed “genetic criticism.” A current trend that is sometimes overdramatized as “the archival turn” is a result of renewed interest in this discipline, which has a long tradition situated at the intersection of modern book history, bibliography, textual criticism, and scholarly editing. Handwritten documents are called “modern” manuscripts to distinguish them from medieval or even older manuscripts. Whereas most extant medieval manuscripts are scribal copies and fit into a context of textual circulation and dissemination, modern manuscripts are usually autographs for private use. Traditionally, the watershed between older and “modern” manuscripts is situated around the middle of the 18th century, coinciding with the rise of the so-called Geniezeit, the Sturm und Drang (Storm and Stress) period in which the notion of “genius” became fashionable. Authors such as Goethe carefully preserved their manuscripts. This new interest in authors’ manuscripts can be seen as part of the “genius” ideology: since a draft was regarded as the trace of a thought process, a manuscript was the tangible evidence of capital-G “Genius” at work. But this division between modern and older manuscripts needs to be nuanced, for there are of course autograph manuscripts with cancellations and revisions from earlier periods, which are equally interesting for manuscript research. Genetic criticism studies the dynamics of creative processes, discerning a difference between the part of the genesis that takes place in the author’s private environment and the continuation of that genesis after the work has become public. But the genesis is often not a linear development from “before” to “after” publication; rather, it can be conceptualized by means of a triangular model. The three corners of that model are endogenesis (the “inside” of a writing process, the writing of drafts), exogenesis (the relation to external sources of inspiration), and epigenesis (the continuation of the genesis and revision after publication). At any point in the genesis there is the possibility that exogenetic material may color the endo- or the epigenesis. In the digital age, archival literary documents are no longer coterminous with material objects. But that does not mean the end of genetic criticism. On the contrary, an exciting future lies ahead. Born-digital works require new methods of analysis, including digital forensics, computer-assisted collation, and new forms of distant reading. The challenge is to connect to methods of digital text analysis by finding ways to enable macroanalysis across versions.

Article

Realisms  

Alison Shonkwiler

Realism is a historical phenomenon that is not of the past. Its recurrent rises and falls only attest to its persistence as a measure of representational authority. Even as literary history has produced different moments of “realism wars” over the politics of realist versus antirealist aesthetics, the demand to represent an often strange and changing reality—however contested that term may be—guarantees realism’s ongoing critical future. Undoubtedly, realism has held a privileged position in the history of Western literary representation. Its fortunes are closely linked to the development of capitalist modernity, the rise of the novel, the emergence of the bourgeoisie, and the expansion of middle-class readerships with the literacy and leisure to read—and with an interest in reading about themselves as subjects. While many genealogies of realism are closely tied to the history of the rise of the novel—with Don Quixote as a point of departure—it is from its later, 19th-century forms that critical assumptions have emerged about its capacities and limitations. The 19th-century novel—whether its European or slightly later American version—is taken as the apex of the form and is tied to the rise of industrial capitalism, burgeoning ideas of social class, and the expansion of empire. Although many of the realist writers of the 19th century were self-reflexive about the form, and often articulated theories of realism as distinct from romance and sentimental fiction, it was not until the mid-20th century, following the canonization of modernism in English departments, that a full-fledged critical analysis of realism as a form or mode would take shape. Our fullest articulations of realism therefore owe a great deal to its negative comparison to later forms—or, conversely, to the effort to resuscitate realism’s reputation against perceived critical oversimplifications. In consequence, there is no single definition of realism—nor even agreement on whether it is a mode, form, or genre—but an extraordinarily heterogeneous set of ways of approaching it as a problem of representation. Standard early genealogies of realism are to be found in historical accounts such as Ian Watt’s The Rise of the Novel and György Lukács’ Theory of the Novel and The Historical Novel, with a guide to important critiques and modifications to be found in Michael McKeon’s Theory of the Novel. This article does not retrace those critical histories. Nor does it presume to address the full range of realisms in the modern arts, including painting, photography, film, and video and digital arts. It focuses on the changing status of realism in the literary landscape, uses the fault lines of contemporary critical debates about realism to refer back to some of the recurrent terms of realism/antirealism debates, and concludes with a consideration of the “return” to realism in the 21st century.

Article

Satire  

Emmett Stinson

Although scholars generally agree that satire cannot be defined in a categorical or exhaustive way, there is a consensus regarding its major features: satire is a mode, rather than a genre; it attacks historically specific targets, who are real; it is an intentional and purposeful literary form; its targets deserve ridicule on the basis of their behavior; and satire is both humorous and critical by its nature. The specificity and negativity of satire are what separate it from comedy, which tends to ridicule general types of people in ways that are ultimately redemptive. Satire is also rhetorically complex, and its critiques have a convoluted or indirect relation to the views of the author. Satire’s long history, which is not straightforwardly linear, means that it is impossible to catalogue all of the views on it from antiquity through to modernity. Modern criticism on satire, however, is easier to summarize and has often made use of ancient satirical traditions for its own purposes—especially because many early modern theorists of satire were also satirists. In particular, modern satire has generated an internal dichotomy between a rhetorical tradition of satire associated with Juvenal, and an ethical tradition associated with Horace. Most criticism of satire from the 20th century onward repeats and re-inscribes this binary in various ways. The Yale school of critics applied key insights from the New Critics to offer a rhetorical approach to satire. The Chicago school focused on the historical nature of satirical references but still presented a broadly formalist account of satire. Early 21st-century criticism has moved between a rhetorical approach inflected by poststructural theory and a historicism grounded in archival research, empiricism, and period studies. Both of these approaches, however, have continued to internally reproduce a division between satire’s aesthetic qualities and its ethical or instrumental qualities. Finally, there is also a tradition of Menippean satire that differs markedly in character from traditional satire studies. While criticism of Menippean satire tends to foreground the aesthetic potential of satire over and above ethics, it also often focuses on many works that are arguably not really satirical in nature.

Article

Textual Studies  

Mark Byron

Textual studies describes a range of fields and methodologies that evaluate how texts are constituted both physically and conceptually, document how they are preserved, copied, and circulated, and propose ways in which they might be edited to minimize error and maximize the text’s integrity. The vast temporal reach of the history of textuality—from oral traditions spanning thousands of years and written forms dating from the 4th millennium bce to printed and digital text forms—is matched by its geographical range, covering every linguistic community around the globe. Methods of evaluating material text-bearing documents and the reliability of their written or printed content stem from antiquity, often paying closest attention to sacred texts as well as to legal documents and literary works that helped form linguistic and social group identity. With the invention of the printing press in the early modern West, the rapid reproduction of text matter in large quantities had the effect of corrupting many texts with printing errors as well as providing the technical means of correcting such errors more cheaply and quickly than in the preceding scribal culture. From the 18th century, techniques of textual criticism were developed to attempt systematic correction of textual error, again with an emphasis on scriptural and classical texts. This “golden age of philology” slowly widened its range to consider such foundational medieval texts as Dante’s Commedia as well as, in time, modern vernacular literature. The technique of stemmatic analysis—the establishment of family relationships between existing documents of a text—provided the means for scholars to choose between copies of a work in the pursuit of accuracy. In the absence of original documents (manuscripts in the hand of Aristotle or the four Evangelists, for example), the choice between existing versions of a text was often made eclectically—that is, drawing on multiple versions—and thus was subject to such considerations as the historic range and geographical diffusion of documents, the systematic identification of common scribal errors, and matters of translation. As the study of modern languages and literatures consolidated into modern university departments in the later 19th century, new techniques emerged with the aim of providing reliable literary texts free from obvious error. This aim shared with the preceding philological tradition the belief that what a text means—discovered in the practice of hermeneutics—is contingent on what the text states—established by an accurate textual record that eliminates error by means of textual criticism. The methods of textual criticism took several paths through the 20th century: the Anglophone tradition centered on editing Shakespeare’s works by drawing on the earliest available documents—the printed Quartos and Folios—developing into the Greg–Bowers–Tanselle copy-text “tradition,” which was then deployed as a method by which to edit later texts. The status of variants in modern literary works with multiple authorial manuscripts—not to mention the existence of competing versions of several of Shakespeare’s plays—complicated matters sufficiently that editors looked to alternative editorial models. Genetic editorial methods draw in part on German editorial techniques, collating all existing manuscripts and printed texts of a work in order to provide a record of its composition process, including epigenetic processes following publication. The French methods of critique génétique also place the documentary record at the center, where the dossier is given priority over any one printed edition, and poststructuralist theory is used to examine the process of “textual invention.” The inherently social aspects of textual production—the author’s interaction with agents, censors, publishers, and printers and the way these interactions shape the content and presentation of the text—have reconceived how textual authority and variation are understood in the social and economic contexts of publication. And, finally, the advent of digital publication platforms has given rise to new developments in the presentation of textual editions and manuscript documents, displacing copy-text editing in some fields, such as modernism studies, in favor of genetic or synoptic models of composition and textual production.