
Article

Tate, James  

Arnold E. Sabatelli

James Tate is arguably one of the most influential poets of his generation. In 1967 he won the coveted Yale Younger Poets Award, one of the youngest writers ever to receive that honor. (He was a graduate student at the Iowa Writers' Workshop at the time.) His book The Lost Pilot, published the same year, set the tone for the body of his poetry. Surreal, funny, irreverent, and at times almost wholly inaccessible, Tate's poetry has not strayed far from the approach and tone of his earliest work.

Article

Taylor, Edward  

Adam Scott Miller

During his lifetime of eighty-seven years, Edward Taylor, a Puritan minister and poet, wrote more than forty thousand lines of verse. Much of Taylor's poetry is devotional and was composed during the course of frequent meditative exercises. As a result, he chose to keep his work private but left his manuscripts to his grandson, who eventually deposited them in the Yale University library. They remained there until their discovery in 1937. Subsequent critical attention has declared Taylor's verse to be colonial America's best poetry.

Article

Taylor, Peter  

Robert Wilson

Because he was born in Tennessee and much of his work is set there, Peter Taylor is unquestionably a southern writer. But his fiction differs from that of the other significant writers of the southern literary renaissance of the 1920s through the 1960s in its focus on urban and suburban settings of the Upper South rather than on the rural life of the Deep South. Taylor was younger than William Faulkner, the great master of southern and, indeed, of American fiction, and younger than the members of the South's two preeminent literary groups, the Fugitive poets and the Agrarians. As a result, the shadows of the Civil War and Reconstruction fall less boldly upon his work than on the work of those older writers, who had one foot in the nineteenth century and one in the twentieth. Taylor's sympathetic concern with the circumstances of blacks and women places him firmly in the twentieth century. (His other large theme of class shows up in almost every period and school of American literature.)

Article

Technology  

Eleonora Lima

The history of literature has always been influenced by technological progress, as a transformative cultural power—threatening destruction or promising a luminous future—as a theme inspiring new narrative forms and plots, or as a force influencing the way authors conceive textuality and perform their creative work. The entanglement between literary and technological inventions is even recorded in the etymology of the word technology, which comes from the Greek “techne,” a term referring to arts as well as crafts. The way writers conceive this relationship, however, varies greatly: although some consider the work of technicians to be congenial to artistic creation, as both demonstrate human creativity and ingenuity, others believe technology to be a dehumanizing and unnatural force, not only alien to literature but in competition with its own ethos. Depending on their position, therefore, writers come to embody either the mythical figure of Prometheus, the first technician and defiant creator, or that of Orpheus, who symbolizes the marriage of poetry and nature as opposed to any artificial creation. However, the opposition between nature and technology, with literature positioning itself in one realm or the other, is only one of many possible critical perspectives. Indeed, when moving beyond the idea of technology as merely a kind of artifact, the affinities between texts and machines clearly emerge. A mutual relation connects technology and textuality, and it has to do with the complex nature of material and cultural objects, each shaped by social use, aesthetic norms, and power structures. This bond between discursivity and materiality is impossible to disentangle, as is the contextual relationship between literature and technology: texts prescribe meanings to machines just as much as machines shape the textuality of texts.
To recognize literature and technology as two different systems of meanings and sets of practices which are nevertheless always in conversation with each other is also to understand literature as technology. This stance has nothing to do with the likeness of the poet and the technician as creative minds but rather with the idea of literary texts functioning like technologies and, ultimately, offering a meta-reflexive analysis of their own textuality. According to this critical perspective, literature performatively enacts the changes in textuality brought about by technological progress, from the printing press to digital writing tools.

Article

Tekhne  

Ian James

Tekhne, or techne, is derived from the Greek term technê, meaning art, craft, technique, or skill, and plays an important role in Ancient Greek philosophy (in, for instance, Xenophon, Plato, Aristotle), where it is most often opposed to epistêmê, meaning knowledge. The legacy of the various Greek philosophical negotiations with, and distinctions between, technê and epistêmê leaves a lasting mark on European thought and knowledge from the medieval period through to the early modern period and into modern philosophy from Immanuel Kant onwards, up to and including 20th-century phenomenology (Edmund Husserl, Martin Heidegger) and its subsequent legacy, particularly in French philosophy. So, for instance, in Plato’s Protagoras, the myth of Epimetheus and Prometheus describes the latter’s theft of the technê of fire as a result of the former’s forgetfulness with regard to the bestowal of attributes to human beings. Here technê emerges as skill or technique but also as a more general founding moment of humankind’s technical and technological capacities. In The Republic Plato opposes the knowledge of reality and truth (of ideal forms) to the representational status of dramatic poetry (as a technê poietike or productive technique) and by extension to arts and literature in general. In this context the latter have a degraded status in relation to knowledge or truth, and this sets the stage for attempts that will be made by later philosophy to distance itself from aesthetic form or literary discourse. In Aristotle technê emerges within the distinction between art as productive technique, theoretical knowledge (theoria), and action (praxis). Aristotle’s distinctions have an influential afterlife in the medieval period and into the early modern, in particular in Immanuel Kant’s definition of art as a skill or capacity for the production of things.
The legacy of this long negotiation of Greek technê as art, productive technique, technical skill, or technology finds its way into 20th-century German phenomenology: in Edmund Husserl’s account of the rise of the scientific worldview and instrumental rationality in The Crisis of European Sciences and Transcendental Phenomenology (1936) and in Martin Heidegger’s discourse on technological modernity, art, and the philosophical-poetic saying of being as it is developed from the 1930s onwards. The legacy of German phenomenological thinking relating to tekhne, understood as a fundamental dimension of both artistic and technological production, has a particularly strong afterlife in post–World War II French structuralism, poststructuralism, and contemporary philosophy. The influence of Husserl’s understanding of technicity can be traced directly in various ways into the work of, for instance, Jean-François Lyotard, Michel Foucault, and Jacques Derrida. Similarly, both Husserlian and Heideggerian discourses on tekhne find their way into the thinking of technology, ecotechnicity, and technics of contemporary philosophers such as Jean-Luc Nancy. Nancy’s discourse on the technicity of art yields an affirmation of the irreducible plurality of aesthetic techniques and, in particular, a reorientation of possible ways of understanding the place of literature in the age of digital information technology.

Article

Temporality  

Theodore Martin

Time is not a strictly literary category, yet literature is unthinkable without time. The events of a story unfold over time. The narration of that story imposes a separate order of time (chronological, discontinuous, in medias res). The reading of that narrative may take its own sweet time. Then there is the fact that literature itself exists in time. Transmitted across generations, literary texts cannot help but remind us of how times have changed. In doing so, they also show us how prior historical moments were indelibly shaped by their own specific philosophies and technologies of timekeeping—from the forms of sacred time that informed medieval writing; to the clash between national time and natural history that preoccupied the Romantics; to the technological standardization of time that shaped 19th-century literature; to the theories of psychological time that emerged in tandem with modernism; to the fragmented and foreshortened digital times that underlie postmodern fiction. Time, in short, shapes literature several times over: from reading experience to narrative form to cultural context. In this way, literature can be read as a peculiarly sensitive timepiece of its own, both reflecting and responding to the complex and varied history of shared time. Over the course of the 20th century, literary time has become an increasingly prominent issue for literary critics. Time was first installed at the heart of literary criticism by way of narrative theory and narratology, which sought to explain narrative’s irreducibly temporal structure. Soon, though, formalist and phenomenological approaches to time would give way to more historically and politically attuned methods, which have emphasized modern time’s enmeshment in imperialism, industrial capitalism, and globalization. 
In today’s critical landscape, time is a crucial and contested topic in a wide range of subfields, offering us indispensable insights into the history and ideology of modernity; the temporal politics of nationalism, colonialism, and racial oppression; the alternate timescales of environmental crisis and geological change; and the transformations of life and work that structure postmodern and postindustrial society.

Article

Textuality  

Rossana De Angelis

The concept of “text” is ambiguous: it can identify at the same time a concrete reality and an abstract one. Indeed, text presents itself both as an empirical object subject to analysis and an abstract object constructed by the analysis itself. This duplicity characterizes the development of the concept in the 20th century. According to different theories of language, there are also different understandings of “text”: a restricted use as written text, an extensive use as written and spoken text, and an expanded use as any written, verbal, gestural, or visual manifestation. The concept of “text” also presupposes two other concepts: from a generative point of view, it involves a process by which something becomes a text (textualization); from an interpretative point of view, it involves a process by which something can be interpreted as a text (textuality). In textual linguistics, “text” is considered at the same time as an abstract object, arising from a specific theoretical approach, and a concrete object, a linguistic phenomenon starting the process of analysis. In textual linguistics, textuality presents itself as a global quality of the text, arising from the interlacing of the sentences that compose it. In linguistics, the definition of textuality depends on the definition of text. For instance, M. A. K. Halliday and Ruqaiya Hasan define textuality through the concepts of “cohesion” and “coherence.” Cohesion is a necessary condition of textuality, because it enables the text to be perceived as a whole, but it is not sufficient to explain it. In fact, to be interpreted as a whole, the elements composing the text need to be coherent with one another. But according to Robert-Alain De Beaugrande and Wolfgang Ulrich Dressler, cohesion and coherence are only two of the seven principles of textuality (the other five being intentionality, acceptability, informativity, situationality, and intertextuality).
Textual pragmatics deals with a more complex problem: that of the text conceived as an empirical object. Here the text is presented as a unit captured in a communication process, “a communicative unit.” Considered from a pragmatic point of view, every single unit composing a text constitutes an instruction for meaning. Since the 1970s, textual pragmatics, which analyzes the connections between texts and contexts, has been an important source of inspiration for textual semiotics. In semiotics, in the theory of language proposed by Louis T. Hjelmslev, the concept of “text” is conceived above all as a process and a “relational hierarchy.” Furthermore, according to Hjelmslev, textuality consists in the idea of “mutual dependencies,” composing a whole which makes the text an “absolute totality” to be interpreted by readers and analyzed by linguists. Since texts are composed of a network of connections at both local and global levels, their analyses depend on the possibility of reconstructing the relation between global and local dimensions. For this reason, François Rastier suggests that in order to capture the meaning of a text, the semantic analysis must identify semantic forms at different semantic levels. So textuality comes from the articulation between the semantic and phemic forms (content and expression), and from the semantic and phemic roots from which the forms emerge. Textuality allows the reader to identify the interpretative paths through which to understand the text. This complex dynamic is at the foundation of this idea of textuality. Now that digital texts are available, researchers have developed several methods and tools to exploit such texts and discourse, which at the same time represent different approaches to meaning. Text Mining is based on a simple principle: the identification and processing of textual contents to extract knowledge.
With digital tools, the intra-textual and inter-textual links can be visualized on the screen, as lists or tables of results, which permits the analysis of the occurrences and frequency of certain textual elements composing the digital texts. Thus another idea of the text becomes visible to the linguist: not the classical one belonging to the culture of printed texts, but a new one typical of the culture of digital texts, and their textuality.

Article

Textual Studies  

Mark Byron

Textual studies describes a range of fields and methodologies that evaluate how texts are constituted both physically and conceptually, document how they are preserved, copied, and circulated, and propose ways in which they might be edited to minimize error and maximize the text’s integrity. The vast temporal reach of the history of textuality—from oral traditions spanning thousands of years and written forms dating from the 4th millennium bce to printed and digital text forms—is matched by its geographical range covering every linguistic community around the globe. Methods of evaluating material text-bearing documents and the reliability of their written or printed content stem from antiquity, often paying closest attention to sacred texts as well as to legal documents and literary works that helped form linguistic and social group identity. With the advent of the printing press in the early modern West, the rapid reproduction of text matter in large quantities had the effect of corrupting many texts with printing errors as well as providing the technical means of correcting such errors more cheaply and quickly than in the preceding scribal culture. From the 18th century, techniques of textual criticism were developed to attempt systematic correction of textual error, again with an emphasis on scriptural and classical texts. This “golden age of philology” slowly widened its range to consider such foundational medieval texts as Dante’s Commedia as well as, in time, modern vernacular literature. The technique of stemmatic analysis—the establishment of family relationships between existing documents of a text—provided the means for scholars to choose between copies of a work in the pursuit of accuracy.
In the absence of original documents (manuscripts in the hand of Aristotle or the four Evangelists, for example), the choice between existing versions of a text was often made eclectically—that is, drawing on multiple versions—and thus was subject to such considerations as the historic range and geographical diffusion of documents, the systematic identification of common scribal errors, and matters of translation. As the study of modern languages and literatures consolidated into modern university departments in the later 19th century, new techniques emerged with the aim of providing reliable literary texts free from obvious error. This aim had in common with the preceding philological tradition the belief that what a text means—discovered in the practice of hermeneutics—was contingent on what the text states—established by an accurate textual record that eliminates error by means of textual criticism. The methods of textual criticism took several paths through the 20th century: the Anglophone tradition centered on editing Shakespeare’s works by drawing on the earliest available documents—the printed Quartos and Folios—developing into the Greg–Bowers–Tanselle copy-text “tradition,” which was then deployed as a method by which to edit later texts. The status of variants in modern literary works with multiple authorial manuscripts—not to mention the existence of competing versions of several of Shakespeare’s plays—complicated matters sufficiently that editors looked to alternate editorial models. Genetic editorial methods draw in part on German editorial techniques, collating all existing manuscripts and printed texts of a work in order to provide a record of its composition process, including epigenetic processes following publication.
The French methods of critique génétique also place the documentary record at the center, where the dossier is given priority over any one printed edition, and poststructuralist theory is used to examine the process of “textual invention.” The inherently social aspects of textual production—the author’s interaction with agents, censors, publishers, and printers and the way these interactions shape the content and presentation of the text—have reconceived how textual authority and variation are understood in the social and economic contexts of publication. And, finally, the advent of digital publication platforms has given rise to new developments in the presentation of textual editions and manuscript documents, displacing copy-text editing in some fields such as modernism studies in favor of genetic or synoptic models of composition and textual production.

Article

19th-Century Spirit Photography  

Cheryl Spinner

Spirit photography emerges out of the widespread movement of Spiritualism in the 19th century. In 1848, the Fox sisters of upstate New York claimed that the mysterious knockings emanating from the walls of their farmhouse represented the opening of a spirit telegraph that facilitated communication between the world of the living and the world of the dead. Spiritualism quickly became a techno-religious movement closely aligned with the abolitionist and suffragist movements. The movement utilized burgeoning technologies to apply a scientific rigor to phenomena beyond the five human senses. The photochemical process and the swift advancement of photography as both an art and a science made the medium a particularly powerful means of providing evidence that spirits could manifest in the visible world. Sir John Herschel coined the term “photography” by combining the Greek words photos and graphê, literally “light writing” or “writing by light.” The term itself advances the concept that the camera produced an unmediated reproduction of the natural world, and, with the first spirit photograph emerging in 1862, believers understood that the camera was both capturing spirits of the dead and scientifically proving that the spirits were real. Nineteenth-century debates about the veracity of these images pivoted on the question of what photography was capable of capturing. Scientists knew that photography could capture invisible fluorescence, and Spiritualists argued that if the camera could capture the invisible world, then it could also capture spirits.

Article

1922: The Annus Mirabilis of Literary Modernism  

Michael Levenson

The year 1922 has been known as the annus mirabilis (“miracle year”) of Anglo-American literary modernism, chiefly because of the near-simultaneous publication of T.S. Eliot’s “The Waste Land,” James Joyce’s Ulysses, and Virginia Woolf’s Jacob’s Room. The distinctive historical character of 1922 remains an ongoing concern: the year was at once a time of traumatic memory of World War I and a moment of renewed ambition for the radical experiments of modernism. During the war, Eliot, Joyce, and Woolf had enjoyed an unusual opportunity to revise and extend their aesthetic ambitions. Each of their works registers the more defiant provocation of postwar literature, but each confronts the powerful resistance of cultural and political authorities who saw the efforts, especially of Eliot and Joyce, as both meaningless and dangerous. The postwar period also saw the rapid expansion of new technologies (especially in transport and telecommunications) and a consumer society keen to enjoy the availability of freshly circulating material goods. D. H. Lawrence described the end of war as both a relief and a menace. This double valence captures the contrast between searing memories of battlefield death and anticipation of pleasure and plenitude in the Jazz Age. The central figures in this entry are at once newly confident in the adversarial mission of modernism and fully aware of the social complacency and cultural conservatism arrayed against them. The immediate felt disturbance of these works came through their formal challenge, in particular through the intersecting uses of many-voiced and multi-perspectival montage, an assemblage of fragmentary views, and a diversity of speaking tones. This conspicuous technique appears in closely related terms within the early films of Dziga Vertov and the postwar philosophy of logical atoms developed by Bertrand Russell and Ludwig Wittgenstein. 
But the formal inventiveness exhibited during the year is no more prominent than the social concern. Especially in the 21st century, historical studies of the period have recovered the depth of interest in questions of race, empire, sexual debility, and social failure.

Article

The Arabic Novel: New Roots, New Routes  

Elizabeth Holt

In the mid-19th century, the Arabic novel emerged as a genre in Ottoman Syria and khedival Egypt. While this emergence has often been narrated as a story of the rise of nation-states and the diffusion of the European novel, the genre’s history and ongoing topography cannot be recovered without indexing the importance of Arabic storytelling and Islamic empire, ethics, and aesthetics to its roots. As the Arabic periodicals of Beirut and the Nile Valley, and soon Tunis and Baghdad, serialized and debated the rise of the novel form from the 19th century onward, historical, romantic, and translated novels found an avid readership throughout the Arab world and its diaspora. Metaphors of the garden confronted the maritime span of European empire in the 19th-century rise of the novel form in Arabic, and the novel’s path would continue to oscillate between the local and the global. British, French, Spanish, and Italian empire and direct colonial rule left a lasting imprint on the landscape of the region, and so too the investment of Cold War powers in its pipelines, oil wells, and cultural battlefields. Whether embracing socialist realism or avant-garde experimentation, the Arabic novel serves as an ongoing register of the stories that can be told in cities, villages, and nations throughout the region—from the committed novels interrogating the years of anticolonial national struggles and Arab nationalism in the 1950s and 1960s, through the ongoing history of war, surveillance, exile, occupation, and resource extraction that dictates the subsequent terrain of narration. The Arabic novel bears, too, an indelible mark left by translators of Arabic tales—from 1001 Nights to Girls of Riyadh—on the stories the region’s novelists tell.

Article

Theater in America  

Brenda Murphy

It is generally agreed that the post–World War II period produced the most significant American drama and theater. This included Tennessee Williams’s The Glass Menagerie (1945), A Streetcar Named Desire (1947), and Cat on a Hot Tin Roof (1955); Arthur Miller’s Death of a Salesman (1949) and The Crucible (1953); and Eugene O’Neill’s The Iceman Cometh (1946) and Long Day’s Journey into Night (written 1941, produced 1956). It was also the time when American theatrical production, characterized by a hybrid blend of realistic and modernist techniques known as “the American style,” was most influential. This period of extraordinary accomplishment would not have occurred without the particular theatrical developments that preceded it. American theater had gotten off to a slow start during the 18th and early 19th centuries, partly because of an anti-theatrical prejudice in the puritan roots of the Northeast, where most US cities were located, and the copyright situation, which made it much more profitable for theatrical managers to pirate English plays than to produce new American ones. During the mid-19th century, some native melodramas achieved popular success, but none entered the permanent repertoire except as curiosities. Toward the end of the 19th century, the realism of Henrik Ibsen and George Bernard Shaw began to have an impact, and by the 1920s, realism was the dominant dramatic and theatrical idiom of the American stage. At the same time, the impact of modernist techniques such as expressionism was being felt, and Eugene O’Neill and Susan Glaspell were writing avant-garde modernist plays such as O’Neill’s The Emperor Jones (1920) and The Hairy Ape (1922) and Glaspell’s The Verge (1921), which paved the way for O’Neill’s great experiments of the 1920s and 1930s, such as Strange Interlude (1928) and Mourning Becomes Electra (1931). 
For playwrights like Williams and Miller, it was a natural development to create a drama that united both of these strains, anchoring their plays in a realistic idiom but suffusing them with expressionist techniques that made it possible to dramatize a character’s consciousness on stage in juxtaposition with the external reality they must negotiate. The final decades of the 20th century may be characterized not so much by individual playwrights as by dramatic and theatrical developments. Escaping the intense commercial pressure of Broadway, the off-Broadway and off-off-Broadway theaters fostered the development of feminist and other experimental drama as well as the careers of playwrights such as Sam Shepard, Adrienne Kennedy, Maria Irene Fornés, and Lynn Nottage. Edward Albee, August Wilson, Ntozake Shange, and David Mamet came from alternative or regional theaters to achieve popular success on Broadway as well as critical acclaim. At the turn of the 21st century, American drama and theater reflected the heightened awareness of gender identity and ethnicity in the 1990s and the broadly eclectic aesthetics that would be evident in the next decades, a drama that is epitomized in Tony Kushner’s Angels in America (1992), which combines realistic characters, sociopolitical commentary, humor, and sentiment with fantasy, myth, and epic.

Article

The Booker Prize and Post-Imperial British Literature  

Chris Holmes

In the particular and peculiar case of the Booker Prize, regarded as the most prestigious literary award in the United Kingdom (as measured by economic value to the author and publisher, and total audience for the awards announcement), the cultural and economic valences of literary prizes collide with the imperial history of Britain and its after-empire relationships to its former colonies. From its beginnings, the Booker Prize has never been simply a British prize for writers in the United Kingdom. The Booker’s reach into the Commonwealth of Nations, a loose cultural and economic alliance of the United Kingdom and former British colonies, challenges the very constitution of the category of post-imperial British literature. With a history of winners from India, South Africa, New Zealand, and Nigeria, among many other former British colonies, the Booker presents itself as a value-arbitrating mechanism for a majority of the English-speaking world. Indeed, the Booker has maintained a reputation for bringing writers from postcolonial nations to the attention of a British audience increasingly hungry for a global, cosmopolitan literature, especially one easily available via the lingua franca of English. Whether and how the prize winners avoid the twin colonial pitfalls of ownership by and debt to an English patron is the subject of a great deal of criticism on the Booker. To understand the prize as a gatekeeper and tastemaker for the loose, baggy canon of British or even global Anglophone literature, there must be a reckoning with the history of the prize, its multiplication into several prizes under one umbrella category, and the form and substance of the novels that have taken the prize since 1969.

Article

The Early Black Atlantic Conversion Narrative  

Vincent Carretta

Prior to the last decade of the 20th century, literary critics generally, like Thomas Jefferson before them, dismissed the role religion played in the writings by and about the first generation of English-speaking authors of African descent. Christ’s injunction to his followers to bear witness to their faith, however, gave that first generation the sanction, means, motive, and opportunity to speak truth to power during the 18th-century period of the transatlantic Protestant Great Awakening and Evangelical Revival. The early Black evangelical authors, such as Phillis Wheatley in poetry and Olaudah Equiano in prose, used narratives of their religious conversions both as testaments to their own faith and as models of spiritual belief and secular behavior for their primarily white readers to follow. The writings of Wheatley and Equiano also exemplify how early Black authors who adopted Christianity could appropriate its tenets to challenge the institution of slavery.

Article

The Eddas and Sagas of Iceland  

Gísli Sigurðsson

The eddas and sagas are literary works written in Iceland in the 13th and 14th centuries but incorporating memories preserved orally from preliterate times of (a) Norse myths, in prose and verse form, (b) heroic lays with common Germanic roots, (c) raiding and trading voyages of the Viking Age (800–1030 CE), and (d) the settlement of Iceland from Norway, Britain, and Ireland starting from the 870s and of life in the new country up to and beyond the conversion to Christianity in the year 1000. In their writing, these works show the influence of the learning and literature introduced to Iceland from the 11th century on through the educational system of the medieval Church. During these centuries, the Icelanders translated the lives of the principal saints, produced saga biographies of their own bishops, and recorded accounts of events and conflicts contemporary with their authors. They also produced conventional chronicles on European models of the kings of Norway and Denmark and large quantities of works, both translated and original, in the spirit of medieval chivalry. The eddas and sagas, however, reflect a unique and original departure that has no direct analogue in mainland Europe—the creation of new works and genres rooted in the secular tradition of oral learning and storytelling. This tradition encompassed the Icelanders’ worldview in the 12th, 13th and 14th centuries and their understanding of events, people, and chronology going back to the 9th century, and their experience of an environment that extended over the parts of the world known to the Norsemen of the Viking Age, both on earth and in heaven. 
The infrastructure that underlay this system of learning was a knowledge of the regnal years of kings who employed court poets to memorialize their lives, and stories that were told in connection with what people observed in the heavens and on earth, near and far, by linking the stories with individual journeys, dwellings, and the genealogies of the leading protagonists. In this world, people here on earth envisaged the gods as having their halls and dwellings in the sky among the stars and the sun, while beyond the ocean and beneath the furthest horizon lay the world of the giants. In Viking times, this furthest horizon shifted little by little westwards, from the seas around Norway and Britain to the Faroes, Iceland, Greenland, and eventually still farther south and west to previously unknown lands that people in Iceland remembered their ancestors having discovered and explored around the year 1000—Helluland, Markland, and Vínland—where they came into contact with the native inhabitants of the continent known as North America.

Article

The Index in the Premodern and Modern World  

Kyle Conrau-Lewis

In the history of the book, indexes emerged as a result of a number of developments in paratexts and organization. The earliest examples of this device varied significantly in layout, organization, and textual form. While various kinds of tables of contents are attested in the ancient world, the index is a much later innovation. The earliest use of indexes is found in legal and then scholastic and patristic texts in continental Europe; they were particularly useful for university students and preachers. Indexes served as aids to help them navigate the growing corpus of legal and theological compilations and commentaries. However, their format and function were variable: the manuscript evidence shows a great degree of experimentation, combining alphabetic, vocalic, and systematic orders of arrangement. In the early modern period, with increasing anxieties about how to organize and manage information, treatises instructed readers in how to compile an index. In turn, from the 16th century and well into the 18th, writers cautioned against an excessive reliance on these book aids in lieu of reading whole books and lampooned so-called “index learning.” The use of indexes in Greek, Hebrew, and Islamic book culture only began in earnest in the early modern period.

Article

The Institutional Turn  

Jeremy Rosen

Since around the turn of the millennium, Anglo-American literary scholarship has been marked by a remarkable shift in its attention to and its attitude toward institutions. Within this shift or “institutional turn,” two interrelated movements can be detected: 1) a departure from thinking about literature as a social institution, toward a sociological approach that examines the many and varied organizations and institutions in and through which literature and its value are produced, distributed, and consumed; and 2) a tendency to revise earlier critiques of institutions, which were often indebted to the work of Michel Foucault, and which emphasized their regulating and disciplinary power, in favor of a more balanced view of institutions as enabling as well as constraining, and in some cases, an outright advocacy for their value and the need to conserve them. Both of these movements stem from scholars’ recognition of the heterogeneity of actual institutions. Rather than understanding literature as something constituted by monolithic, homogenizing forces, early 21st-century literary scholars tend to emphasize the way it is generated and sustained by a wide range of practices occurring in an equally disparate set of institutional locations. Since the early 2000s, scholars have undertaken to analyze the workings of these institutions as the more immediate context in which literary production occurs and is disseminated—a middle range of actors and organizations situated between broader social and historical currents and literary texts. The more charitable attitude toward institutions also recognizes the crucial roles institutions play in the teaching and study of literature. Scholars have thus begun defending the work of institutions, in response to early 21st-century conditions of neoliberalism, under which governments have withdrawn state support for public institutions, including institutions of higher education. 
A neoliberal ideology that reduces all value to market value presents a threat to institutions that are not primarily dedicated to the generation of economic profit. Thus both of the movements toward institutional study are necessarily bound up with a tradition of scholars who have produced “institutional histories” of literature departments and of the discipline of literary studies. Under neoliberal conditions, such histories have gained urgency, giving rise to a renewed call to account for the value of literary study and of educational institutions in terms that do not reduce this value to service to the economy.

Article

The Matter of Drafts  

Jani Scandura

The presence (or absence) of compositional precursors and leftovers raises for critics and editors methodological, epistemological, ethical, and aesthetic questions: What gets collected and preserved? What does not—and for what reasons? How can these materials be interpreted? And to what ends? A draft may refer to written materials that never attain printed form as well as early manuscript compositions and fair copies, typescripts, digital text, scribbles, doodles, leftovers, or other marginalia and extraneous materials that may or may not find their way into archives. The manuscript draft came of age following the invention of printing, although unfinished or working drafts only began to be self-consciously collected with the emergence of the state archive in the late 18th century. The draft is, therefore, intimately connected to the archival, whether the archive is taken as a material site, a discursive structure, or a depository of feeling. Any interpretation of drafts must take into account the limits and limitations of matter, including the bare fact of a draft’s material existence or its absence. In the 20th and 21st centuries, there has evolved a diverse network of theoretical approaches to interpreting drafts and compositional materials. Scholars of drafts may ask questions about authorship, materiality, production, technology and media, pedagogy, social norms and conventions, ownership and capital, preservation or destruction, even ethics and ontology.
However, these investigations have been most pronounced within four fields: (a) media theory, histories of the book, and historical materialisms that investigate the substance, matter, and means of production of drafts as well as the technological, pedagogical, and social norms that mediate writing, and the cultural/historical specifics of these materials and media; (b) textual editing, which establishes methods that regularize (or complicate) how scholarly editions are produced, and related mid-20th-century New Bibliography approaches, which illuminated some of the limitations of manuscript-and-edition-blind close reading, especially by the New Critics; (c) French genetic criticism in the late 20th and early 21st centuries, which engages with French post-structuralism and psychoanalysis to look at writing as a dynamic and developmental process that has both conscious and unconscious components; and (d) legal scholarship and debates concerning rights to ownership and possession of manuscripts and drafts and their publication, which developed between the 17th and 21st centuries. These discussions, and their elaboration within national and international legislation, resulted in the invention of copyright and moral rights, and a changed understanding of legal rights to privacy and property as well as a division between material and intellectual property, the use and destruction of that property, and the delineation of rights of the dead or the dead’s descendants. The draft manuscript came to be endowed with multiple bodies, both fictive and actual, over which individuals, institutions, corporations, and even nations or the world at large were granted partial ownership or responsibility. From the late 19th century, the catastrophic legacy of modern warfare and its technologies, including censorship, as well as movements in historical preservation, cultural heritage, and ethics have affected policies regarding ownership and the conservancy of drafts.
The emergence of digital and online textual production, dissemination, and preservation in the late 20th and 21st centuries has broadly transformed the ways that drafts may be attended to and even thought. Drafts must finally be seen to have a complex and intimate relationship to the authorial body and to embodiment, materiality, subjectivity, and writing more generally. Drafts—particularly unread, missing, or destroyed drafts—lie at the border between the dead object and the living text. As such, the purposeful destruction of drafts and manuscripts initiates an ontological and ethical crisis that raises questions about the relationship between writing and being, process and product, body and thing.

Article

Theorizing the Subject  

Sidonie Smith

Ever since the Greek philosophers and fabulists pondered the question “What is man?,” inquiries into the concept of the subject have troubled humanists, eventuating in fierce debates and weighty tomes. In the wake of Descartes’s cogito and Enlightenment thought, proposals for an ontology of the idealist subject’s rationality, autonomy, and individualism generated tenacious questions regarding the condition of pre-consciousness, the operation of feelings and intuitions, the subject-object relation, and the origin of moral and ethical principles. Throughout the 19th and 20th centuries, Marx, and the theorists he and Engels influenced, pursued the materialist bases of the subject through analyses of economic determinism, self-alienation, and false consciousness. Through another lineage, Freud and theorists of psychic structures pursued explanations of the incoherence of a split subject, its multipartite psychodynamics, and its relationship to signifying systems. By the latter 20th century, theorizations of becoming a gendered woman by Beauvoir, of disciplining power and ideological interpellation by Foucault and Althusser, and of the structuralist dynamics of the symbolic realm expounded by Lacan energized a succession of poststructuralist, postmodern, feminist, queer, and new materialist theorists to advance one critique after another of the inherited concept of the liberal subject as individualist, disembodied (Western) Man. In doing so, they elaborated the conditions through which subjects are gendered and racialized and offered explanatory frameworks for understanding subjectivity as an effect of positionality within larger formations of patriarchy, slavery, conquest, colonialism, and global neoliberalism.
By the early decades of the 21st century, posthumanist theorists dislodged the subject as the center of agentic action and distributed its processual unfolding across trans-species companionship, trans-corporeality, algorithmic networks, and conjunctions of forcefields. Persistently, theorists of the subject referred to an entangled set of related but distinct terms, such as the human, person, self, ego, interiority, and personal identity. And across diverse humanities disciplines, they struggled to define and refine constitutive features of subject formation, most prominently relationality, agency, identity, and embodiment.

Article

Theory of the Novel  

Jesse Rosenthal

Novel theory sets out to explain a set of literary objects that are already fairly familiar to most modern readers. In fact, it is this assumed familiarity—the sense that there is something in the novel form that aligns with the lived experience of modernity—that animates the tradition of novel theory. Instead of seeking to explain one novel, or to narrate a history that includes all novels, theories of the novel tend to describe a certain set of recognizable, usually formal, features that conform to certain notions of modern subjectivity. The result, nearly across the board, is that theories of the novel operate by excluding far more books in the category of “novel” than they include. Although they assume a descriptive rhetoric, they are instead prescriptive, vastly delimiting the field of possible novels into a much smaller, more manageable group. This is offered not as a critique so much as a definition: exclusion is what separates novel theory from criticism or history. By seeing the tradition of novel theory in terms of its exclusions, we are better able to understand both the larger “novel theory” genre and its blind spots. By focusing on a particular model of European modernity, and centering its formal concerns around realism and the everyday, academic discussions of the novel have often found difficulty in describing non-European experiences, the experiences of historically marginalized populations, and the catastrophic changes brought about by the Anthropocene. Yet this is not so much a shortcoming of the novel form, as some have suggested, as a set of possibilities that lies in the negative space of the novel demarcated by previous novel theory. Reading the history of novel theory in terms of its exclusions, then, offers a sense of the future possibilities of the novel form.