


Daniel Hartley

Modern style emerged from the ruins of the premodern “separation of styles” (high, middle, and low). Whereas, previously, only the nobility could be represented in the high style and commoners in the low, modern style harbors a democratic, generic potential: in principle, anyone can write about anything in any way he or she likes. The history of modern style, as a central critical and compositional principle, is thus deeply imbricated with modern democracy and capitalist modernity. It has a unique relationship to the history of realism, which was itself premised upon the demise of the separation of styles. Many critics (e.g., Erich Auerbach, Roland Barthes, and Fredric Jameson) stress the way in which, as a concept and linguistic practice, style connects the body to a generic, Utopian potential of the everyday. Feminist critics, such as Hélène Cixous and Luce Irigaray, have pursued style’s relationship to the body to delineate a specifically feminine mode of writing [écriture féminine]. Marxist critics, such as Raymond Williams, have argued that style should be understood as a linguistic mode of social relationship. The corollary is that social contradictions are experienced by writers as problems of style (e.g., in Thomas Hardy: how to unite the “educated” style of the urban ruling class with the “customary” style of the rural working class into a single artistic whole). Other critics (e.g., Franco Moretti, Roberto Schwarz) have extended this logic to the scale of “world literature”: they identify stylistic discontinuity as a feature of peripheral world literature that seeks to imitate European realist forms; it is caused by a mismatch between prevailing modes of production and dominant ideologies at the core and the (semi-)periphery of the capitalist world-system.
Free indirect style, which merges narrator and character into a new, third voice, has been identified as a key feature of prose fiction in the world-systemic core—the symbolic embodiment of modern, bourgeois forms of power (an “impersonal intimacy”). Finally, “late style”—a concept associated with Theodor W. Adorno and Edward W. Said—has become an influential way of characterizing works of artistic maturity written as the author approaches old age and death (though it is certainly not limited to biological maturity). It is a style in which form and subjectivity become torn from one another, the latter freeing itself only then to subtract itself (rather than “express” itself). Style thus hovers between the impersonality of the demos and the grave.



Ian Balfour

The sublime as an aesthetic category has an extraordinarily discontinuous history in Western criticism and theory, though the phenomena it points to in art and nature are without historical limit, or virtually so. The sublime as a concept and phenomenon is harder to define than many aesthetic concepts, partly because of its content and partly because of the absence of a definition in the first great surviving text on the subject, Longinus’s On the Sublime. The sublime is inflected differently in the major theorists: in Longinus it produces ecstasy or transport in the reader or listener; in Burke its main ingredient is terror (supplemented by infinity and obscurity); and in Kant’s bifurcated system of the mathematical and dynamic sublime, the former entails a cognitive overload, a breakdown of the imagination and its ability to represent, whereas in the latter, the subject, after first being threatened, virtually, by powerful nature outside her or him, turns inward to discover a power of reason able to think beyond the realm of the senses. Many theorists testify to the effect of transcendence or exaltation of the self on the far side of a disturbing, disorienting experience that at least momentarily suspends or even annihilates the self. A great deal in the theoretical-critical texts turns on the force of singularly impressive examples, which may or may not exceed the designs of the theoretical axioms they are meant to exemplify. Examples of sublimity are by no means limited to nature and art but spill over into numerous domains of cultural and social life. The singular force of individual examples, it is argued, nonetheless tends to work out similarly within certain genres conducive to the sublime (epic, tragedy) but somewhat differently from one genre to another. The heyday of theoretical and critical engagement with the sublime lasts, in Western Europe and a little beyond, from the late 17th century to the early 19th century.
But it does not simply go away, with sublime aesthetic production and critical reflection on the sublime present in the likes of Baudelaire, Nietzsche, and—to Adorno’s mind—in the art of modernism generally, in its critical swerve from the canons of what had counted as beauty. The sublime flourished as a topic in the theory and criticism of the poststructuralist era, in figures such as Lyotard and Paul de Man but also in Fredric Jameson’s analysis of the cultural logic of late capitalism. The then-current drive to critique the principle and some protocols of representation found an almost tailor-made topic in Enlightenment and Romantic theory of the sublime where, within philosophy, representation had been rendered problematic in robust fashion.



Shiamin Kwa

Thinking about surface and its historiography in the early 21st century is a way of thinking about ways of seeing in the world, and about how people define themselves in relation to the things around them. From literary texts to the decorative arts, from graphic narratives to digital stories, and from film to the textile arts, the ways of reading those texts frequently raise questions about interactions with surfaces. Theories of surface have been engaged in many ways since their invocation by French theorists in the final decades of the 20th century. They have a steady but by no means identical presence in the fields of visual studies, history of architecture, and film studies; they have found an application in discussions of race and identity; and they have enjoyed an early 21st-century turn in the spotlight under the auspices of a broadly defined call for “surface reading.” This critical move defines surface as worthy of scrutiny in its own right, rather than as something that needs to be “seen through,” and makes its most profound claims less by reactivating attention to reading surfaces, which arguably has been done all along, than by shifting away from a model of interpretation that makes claims for authoritative symptomatic readings by an all-knowing interpreter.


Sympathy and Empathy  

Rae Greiner

Sympathy and empathy are complex and entwined concepts with philosophical and scientific roots relating to issues in ethics, aesthetics, psychology, biology, and neuroscience. For some, the two concepts are indistinguishable, the two terms interchangeable, but each has a unique history as well as qualities that make both concepts distinct. Although each is associated with feeling, especially the capacity to feel with others or to imaginatively put oneself “in their shoes,” the concepts’ sometimes shared, sometimes divergent histories reveal more complicated origins, as well as vexed and ongoing relations to feeling and emotion and to the ethical value of emotional sharing. Though empathy regularly is considered the more advanced and egalitarian of the two, it shares with sympathy a controversial role in historical debates regarding questions of an inborn or divine moral sense, prosocial behavior and the development of human communities, the relation of sensation to unconscious mental processes, brain matter, and neurons, and animal/human difference. In literary criticism, sympathy and empathy have been key components of aesthetic movements such as sentimentalism, realism, and modernism, and of literary techniques like free indirect discourse (FID), which are thought (by some) to enhance readerly intimacy and closeness to novelistic characters and perspectives. Both concepts have also received their fair share of suspicion, as the capacity to feel, or imagine feeling, the emotions of others remains a controversial basis for ethics.



Eleonora Lima

The history of literature has always been influenced by technological progress, whether as a transformative cultural power—threatening destruction or promising a luminous future—as a theme inspiring new narrative forms and plots, or as a force influencing the way authors conceive textuality and perform their creative work. The entanglement between literary and technological inventions is even recorded in the etymology of the word technology, which comes from the Greek “techne,” a term referring to arts as well as crafts. The way writers conceive this relationship, however, varies greatly: although some consider the work of technicians to be congenial to artistic creation, as both demonstrate human creativity and ingenuity, others believe technology to be a dehumanizing and unnatural force, not only alien to literature but in competition with its very ethos. Depending on their position, then, writers come to embody either the mythical figure of Prometheus, the first technician and defiant creator, or that of Orpheus, symbolizing the marriage between poetry and nature as opposed to any artificial creation. However, the opposition between nature and technology, with literature positioning itself in one realm or the other, is only one of many possible critical perspectives. Indeed, when moving beyond the idea of technology as merely a kind of artifact, the affinities between texts and machines clearly emerge. A mutual relation connects technology and textuality, and this has to do with the complex nature of material and cultural objects, each shaped by social use, aesthetic norms, and power structures. This bond between discursivity and materiality is impossible to disentangle, as is the contextual relationship between literature and technology: texts ascribe meanings to machines just as much as machines shape the textuality of texts.
To recognize literature and technology as two different systems of meanings and sets of practices which are nevertheless always in conversation with each other is also to understand literature as technology. This stance has nothing to do with the likeness of the poet and the technician as creative minds but rather with the idea of literary texts functioning like technologies and, ultimately, offering a meta-reflexive analysis of their own textuality. According to this critical perspective, literature performatively enacts the changes in textuality brought about by technological progress, from the printing press to digital writing tools.



Ian James

Tekhne, or techne, is derived from the Greek term technê, meaning art, craft, technique, or skill, and plays an important role in Ancient Greek philosophy (in, for instance, Xenophon, Plato, Aristotle), where it is most often opposed to epistêmê, meaning knowledge. The legacy of the various Greek philosophical negotiations with, and distinctions between, technê and epistêmê leaves a lasting mark on European thought and knowledge from the medieval period through to the early modern period and into modern philosophy from Immanuel Kant onwards, up to and including 20th-century phenomenology (Edmund Husserl, Martin Heidegger) and its subsequent legacy, particularly in French philosophy. So, for instance, in Plato’s Protagoras, the myth of Epimetheus and Prometheus describes the latter’s theft of the technê of fire as a result of the former’s forgetfulness with regard to the bestowal of attributes to human beings. Here technê emerges as skill or technique but also as a more general founding moment of humankind’s technical and technological capacities. In the Republic, Plato opposes the knowledge of reality and truth (of ideal forms) to the representational status of dramatic poetry (as a technê poietike or productive technique) and by extension to arts and literature in general. In this context the latter have a degraded status in relation to knowledge or truth, and this sets the stage for attempts that will be made by later philosophy to distance itself from aesthetic form or literary discourse. In Aristotle technê emerges within the distinction between art as productive technique and theoretical knowledge on the one hand (theoria) and action on the other (praxis). Aristotle’s distinctions have an influential afterlife in the medieval period and into the early modern, in particular in Immanuel Kant’s definition of art as a skill or capacity for the production of things.
The legacy of this long negotiation of Greek technê as art, productive technique, technical skill, or technology finds its way into 20th-century German phenomenology: in Edmund Husserl’s account of the rise of the scientific worldview and instrumental rationality in The Crisis of European Sciences and Transcendental Phenomenology (1938), and in Martin Heidegger’s discourse on technological modernity, art, and the philosophical-poetic saying of being as it is developed from the 1930s onwards. The legacy of German phenomenological thinking relating to tekhne, understood as a fundamental dimension of both artistic and technological production, has a particularly strong afterlife in post–World War II French structuralism, poststructuralism, and contemporary philosophy. The influence of Husserl’s understanding of technicity can be traced in various ways into the work of, for instance, Jean-François Lyotard, Michel Foucault, and Jacques Derrida. Similarly, both Husserlian and Heideggerian discourses on tekhne find their way into the thinking of technology, ecotechnicity, and technics of contemporary philosophers such as Jean-Luc Nancy. Nancy’s discourse on the technicity of art yields an affirmation of the irreducible plurality of aesthetic techniques and, in particular, a reorientation of possible ways of understanding the place of literature in the age of digital information technology.



Theodore Martin

Time is not a strictly literary category, yet literature is unthinkable without time. The events of a story unfold over time. The narration of that story imposes a separate order of time (chronological, discontinuous, in medias res). The reading of that narrative may take its own sweet time. Then there is the fact that literature itself exists in time. Transmitted across generations, literary texts cannot help but remind us of how times have changed. In doing so, they also show us how prior historical moments were indelibly shaped by their own specific philosophies and technologies of timekeeping—from the forms of sacred time that informed medieval writing; to the clash between national time and natural history that preoccupied the Romantics; to the technological standardization of time that shaped 19th-century literature; to the theories of psychological time that emerged in tandem with modernism; to the fragmented and foreshortened digital times that underlie postmodern fiction. Time, in short, shapes literature several times over: from reading experience to narrative form to cultural context. In this way, literature can be read as a peculiarly sensitive timepiece of its own, both reflecting and responding to the complex and varied history of shared time. Over the course of the 20th century, literary time has become an increasingly prominent issue for literary critics. Time was first installed at the heart of literary criticism by way of narrative theory and narratology, which sought to explain narrative’s irreducibly temporal structure. Soon, though, formalist and phenomenological approaches to time would give way to more historically and politically attuned methods, which have emphasized modern time’s enmeshment in imperialism, industrial capitalism, and globalization. 
In today’s critical landscape, time is a crucial and contested topic in a wide range of subfields, offering us indispensable insights into the history and ideology of modernity; the temporal politics of nationalism, colonialism, and racial oppression; the alternate timescales of environmental crisis and geological change; and the transformations of life and work that structure postmodern and postindustrial society.



Rossana De Angelis

The concept of “text” is ambiguous: it can identify at the same time a concrete reality and an abstract one. Indeed, the text presents itself both as an empirical object subject to analysis and as an abstract object constructed by the analysis itself. This duplicity characterizes the development of the concept in the 20th century. Different theories of language carry different understandings of “text”: a restricted use as written text, an extensive use as written and spoken text, and an expanded use as any written, verbal, gestural, or visual manifestation. The concept of “text” also presupposes two other concepts: from a generative point of view, a process by which something becomes a text (textualization); from an interpretative point of view, a process by which something can be interpreted as a text (textuality). In textual linguistics, “text” is considered at the same time an abstract object, derived from a specific theoretical approach, and a concrete object, the linguistic phenomenon with which the process of analysis begins. In textual linguistics, textuality is a global quality of the text arising from the interlacing of the sentences that compose it. In linguistics, the definition of textuality depends on the definition of text. For instance, M. A. K. Halliday and Ruqaiya Hasan define textuality through the concepts of “cohesion” and “coherence.” Cohesion is a necessary condition of textuality, because it enables the text to be perceived as a whole, but it is not sufficient to explain it. In fact, to be interpreted as a whole, the elements composing the text must also be coherent with one another. According to Robert-Alain de Beaugrande and Wolfgang Ulrich Dressler, however, cohesion and coherence are only two of the seven principles of textuality (the other five being intentionality, acceptability, informativity, situationality, and intertextuality).
Textual pragmatics deals with a more complex problem: that of the text conceived as an empirical object. Here the text is presented as a unit captured in a communication process, “a communicative unit.” Considered from a pragmatic point of view, every single unit composing a text constitutes an instruction for meaning. Since the 1970s, textual pragmatics, which analyzes the connections between texts and contexts, has been an important source of inspiration for textual semiotics. In semiotics, notably in the theory of language proposed by Louis Hjelmslev, the concept of “text” is conceived above all as a process and a “relational hierarchy.” Furthermore, according to Hjelmslev, textuality consists in the idea of “mutual dependencies” composing a whole, which makes the text an “absolute totality” to be interpreted by readers and analyzed by linguists. Since texts are composed of a network of connections at both local and global levels, their analysis depends on the possibility of reconstructing the relation between the global and local dimensions. For this reason, François Rastier suggests that in order to capture the meaning of a text, semantic analysis must identify semantic forms at different semantic levels. Textuality thus arises from the articulation between semantic and phemic forms (content and expression), and from the semantic and phemic roots from which those forms emerge. Textuality allows the reader to identify the interpretative paths through which to understand the text. This complex dynamic is at the foundation of the idea of textuality. Now that digital texts are available, researchers have developed several methods and tools to exploit digital texts and discourse, representing at the same time different approaches to meaning. Text mining is based on a simple principle: the identification and processing of textual content in order to extract knowledge.
Digital tools allow intra-textual and inter-textual links to be visualized on screen, as lists or tables of results, permitting analysis of the occurrences and frequency of certain textual elements composing digital texts. Another idea of the text thus becomes visible to the linguist: not the classical one belonging to the culture of printed texts, but a new one typical of the culture of digital texts and their textuality.


Textual Studies  

Mark Byron

Textual studies describes a range of fields and methodologies that evaluate how texts are constituted both physically and conceptually, document how they are preserved, copied, and circulated, and propose ways in which they might be edited to minimize error and maximize the text’s integrity. The vast temporal reach of the history of textuality—from oral traditions spanning thousands of years and written forms dating from the 4th millennium bce to printed and digital text forms—is matched by its geographical range covering every linguistic community around the globe. Methods of evaluating material text-bearing documents and the reliability of their written or printed content stem from antiquity, often paying closest attention to sacred texts as well as to legal documents and literary works that helped form linguistic and social group identity. With the advent of the printing press in the early modern West, the rapid reproduction of text matter in large quantities had the effect of corrupting many texts with printing errors as well as providing the technical means of correcting such errors more cheaply and quickly than in the preceding scribal culture. From the 18th century, techniques of textual criticism were developed to attempt systematic correction of textual error, again with an emphasis on scriptural and classical texts. This “golden age of philology” slowly widened its range to consider such foundational medieval texts as Dante’s Commedia as well as, in time, modern vernacular literature. The technique of stemmatic analysis—the establishment of family relationships between existing documents of a text—provided the means for scholars to choose between copies of a work in the pursuit of accuracy.
In the absence of original documents (manuscripts in the hand of Aristotle or the four Evangelists, for example), the choice between existing versions of a text was often made eclectically—that is, by drawing on multiple versions—and was thus subject to such considerations as the historic range and geographical diffusion of documents, the systematic identification of common scribal errors, and matters of translation. As the study of modern languages and literatures consolidated into modern university departments in the later 19th century, new techniques emerged with the aim of providing reliable literary texts free from obvious error. This aim had in common with the preceding philological tradition the belief that what a text means—discovered in the practice of hermeneutics—was contingent on what the text states—established by an accurate textual record that eliminates error by means of textual criticism. The methods of textual criticism took several paths through the 20th century: the Anglophone tradition centered on editing Shakespeare’s works by drawing on the earliest available documents—the printed Quartos and Folios—developing into the Greg–Bowers–Tanselle copy-text “tradition,” which was then deployed as a method by which to edit later texts. The status of variants in modern literary works with multiple authorial manuscripts—not to mention the existence of competing versions of several of Shakespeare’s plays—complicated matters sufficiently that editors looked to alternate editorial models. Genetic editorial methods draw in part on German editorial techniques, collating all existing manuscripts and printed texts of a work in order to provide a record of its composition process, including epigenetic processes following publication.
The French methods of critique génétique also place the documentary record at the center, where the dossier is given priority over any one printed edition, and poststructuralist theory is used to examine the process of “textual invention.” The inherently social aspects of textual production—the author’s interaction with agents, censors, publishers, and printers and the way these interactions shape the content and presentation of the text—have reconceived how textual authority and variation are understood in the social and economic contexts of publication. And, finally, the advent of digital publication platforms has given rise to new developments in the presentation of textual editions and manuscript documents, displacing copy-text editing in some fields such as modernism studies in favor of genetic or synoptic models of composition and textual production.


19th-Century Spirit Photography  

Cheryl Spinner

Spirit photography emerges out of the widespread movement of Spiritualism in the 19th century. In 1848, the Fox sisters of upstate New York claimed that the mysterious knockings emanating from the walls of their farmhouse represented the opening of a spirit telegraph that facilitated communication between the world of the living and the world of the dead. Spiritualism quickly became a techno-religious movement closely aligned with the abolitionist and suffragist movements. The movement utilized burgeoning technologies to apply a scientific rigor to phenomena beyond the five human senses. The photochemical process and the swift advancement of photography as both an art and a science were particularly powerful mediums for providing evidence that spirits could manifest in the visible world. Sir John Herschel coined the term “photography” by combining the Greek words photos and graphé, literally “light writing” or “writing by light.” The term itself advances the concept that the camera produced an unmediated reproduction of the natural world, and, with the first spirit photograph emerging in 1862, believers understood that the camera was both capturing spirits of the dead and scientifically proving that the spirits were real. Nineteenth-century debates about the veracity of these images pivoted on the question of what photography was capable of capturing. Scientists knew that photography could capture invisible fluorescence, and Spiritualists argued that if the camera could capture the invisible world, then it could also capture spirits.


The Institutional Turn  

Jeremy Rosen

Since around the turn of the millennium, Anglo-American literary scholarship has been marked by a remarkable shift in its attention to and its attitude toward institutions. Within this shift or “institutional turn,” two interrelated movements can be detected: 1) a departure from thinking about literature as a social institution, toward a sociological approach that examines the many and varied organizations and institutions in and through which literature and its value are produced, distributed, and consumed; and 2) a tendency to revise earlier critiques of institutions, which were often indebted to the work of Michel Foucault, and which emphasized their regulating and disciplinary power, in favor of a more balanced view of institutions as enabling as well as constraining, and in some cases, an outright advocacy for their value and the need to conserve them. Both of these movements stem from scholars’ recognition of the heterogeneity of actual institutions. Rather than understanding literature as something constituted by monolithic, homogenizing forces, early 21st-century literary scholars tend to emphasize the way it is generated and sustained by a wide range of practices occurring in an equally disparate set of institutional locations. Since the early 2000s, scholars have undertaken to analyze the workings of these institutions as the more immediate context in which literary production occurs and is disseminated—a middle range of actors and organizations situated between broader social and historical currents and literary texts. The more charitable attitude toward institutions also recognizes the crucial roles institutions play in the teaching and study of literature. Scholars have thus begun defending the work of institutions, in response to early 21st-century conditions of neoliberalism, under which governments have withdrawn state support for public institutions, including institutions of higher education. 
A neoliberal ideology that reduces all value to market value presents a threat to institutions that are not primarily dedicated to the generation of economic profit. Thus both of the movements toward institutional study are necessarily bound up with a tradition of scholars who have produced “institutional histories” of literature departments and of the discipline of literary studies. Under neoliberal conditions, such histories have gained urgency, giving rise to a renewed call to account for the value of literary study and of educational institutions in terms that do not reduce this value to service to the economy.


The Matter of Drafts  

Jani Scandura

The presence (or absence) of compositional precursors and leftovers raises for critics and editors methodological, epistemological, ethical, and aesthetic questions: What gets collected and preserved? What does not—and for what reasons? How can these materials be interpreted? And to what ends? A draft may refer to written materials that never attain printed form as well as early manuscript compositions and fair copies, typescripts, digital text, scribbles, doodles, leftovers, or other marginalia and extraneous materials that may or may not find their way into archives. The manuscript draft came of age following the invention of printing, although unfinished or working drafts only began to be self-consciously collected with the emergence of the state archive in the late 18th century. The draft is, therefore, intimately connected to the archival, whether the archive is taken as a material site, a discursive structure, or a depository of feeling. Any interpretation of drafts must take into account the limits and limitations of matter, including the bare fact of a draft’s material existence or its absence. Over the 20th and 21st centuries, a diverse network of theoretical approaches to interpreting drafts and compositional materials has evolved. Scholars of drafts may ask questions about authorship, materiality, production, technology and media, pedagogy, social norms and conventions, ownership and capital, preservation or destruction, even ethics and ontology.
However, these investigations have been most pronounced within four fields: (a) media theory, histories of the book, and historical materialisms that investigate the substance, matter, and means of production of drafts as well as the technological, pedagogical, and social norms that mediate writing, and the cultural/historical specifics of these materials and media; (b) textual editing, which establishes methods that regularize (or complicate) how scholarly editions are produced, and the related mid-20th-century New Bibliography approaches, which illuminated some of the limitations of manuscript- and edition-blind close reading, especially by the New Critics; (c) French genetic criticism in the late 20th and early 21st centuries, which engages with French post-structuralism and psychoanalysis to look at writing as a dynamic and developmental process that has both conscious and unconscious components; and (d) legal scholarship and debates concerning rights to ownership and possession of manuscripts and drafts and their publication, which developed between the 17th and 21st centuries. These discussions, and their elaboration within national and international legislation, resulted in the invention of copyright and moral rights, and in a changed understanding of legal rights to privacy and property, as well as a division between material and intellectual property, the use and destruction of that property, and the delineation of rights of the dead or the dead’s descendants. The draft manuscript came to be endowed with multiple bodies, both fictive and actual, for which individuals, institutions, corporations, and even nations or the world at large were granted partial ownership or responsibility. From the late 19th century, the catastrophic legacy of modern warfare and its technologies, including censorship, as well as movements in historical preservation, cultural heritage, and ethics, have affected policies regarding ownership and the conservancy of drafts.
The emergence of digital and online textual production, dissemination, and preservation in the late 20th and 21st centuries has broadly transformed the ways in which drafts may be attended to and even conceived. Drafts must finally be seen to have a complex and intimate relationship to the authorial body and to embodiment, materiality, subjectivity, and writing more generally. Drafts—particularly unread, missing, or destroyed drafts—lie at the border between the dead object and the living text. As such, the purposeful destruction of drafts and manuscripts initiates an ontological and ethical crisis that raises questions about the relationship between writing and being, process and product, body and thing.


Theorizing the Subject  

Sidonie Smith

Ever since the Greek philosophers and fabulists pondered the question “What is man?,” inquiries into the concept of the subject have troubled humanists, eventuating in fierce debates and weighty tomes. In the wake of Descartes’s cogito and Enlightenment thought, proposals for an ontology of the idealist subject’s rationality, autonomy, and individualism generated tenacious questions regarding the condition of pre-consciousness, the operation of feelings and intuitions, the subject-object relation, and the origin of moral and ethical principles. Throughout the 19th and 20th centuries, Marx, and the theorists he and Engels influenced, pursued the materialist bases of the subject through analyses of economic determinism, self-alienation, and false consciousness. Through another lineage, Freud and theorists of psychic structures pursued explanations of the incoherence of a split subject, its multipartite psychodynamics, and its relationship to signifying systems. By the late 20th century, theorizations of becoming a gendered woman by Beauvoir, of disciplining power and ideological interpellation by Foucault and Althusser, and of the structuralist dynamics of the symbolic realm expounded by Lacan energized a succession of poststructuralist, postmodern, feminist, queer, and new materialist theorists to advance one critique after another of the inherited concept of the liberal subject as individualist, disembodied (Western) Man. In doing so, they elaborated the conditions through which subjects are gendered and racialized and offered explanatory frameworks for understanding subjectivity as an effect of positionality within larger formations of patriarchy, slavery, conquest, colonialism, and global neoliberalism. 
By the early decades of the 21st century, posthumanist theorists dislodged the subject as the center of agentic action and distributed its processual unfolding across trans-species companionship, trans-corporeality, algorithmic networks, and conjunctions of forcefields. Persistently, theorists of the subject referred to an entangled set of related but distinct terms, such as the human, person, self, ego, interiority, and personal identity. And across diverse humanities disciplines, they struggled to define and refine constitutive features of subject formation, most prominently relationality, agency, identity, and embodiment.


Theory of the Novel  

Jesse Rosenthal

Novel theory sets out to explain a set of literary objects that are already fairly familiar to most modern readers. In fact, it is this assumed familiarity—the sense that there is something in the novel form that aligns with the lived experience of modernity—that animates the tradition of novel theory. Instead of seeking to explain one novel, or to narrate a history that includes all novels, theories of the novel tend to describe a certain set of recognizable, usually formal, features that conform to certain notions of modern subjectivity. The result, nearly across the board, is that theories of the novel operate by excluding far more books from the category of “novel” than they include. Although they assume a descriptive rhetoric, they are in fact prescriptive, delimiting the field of possible novels to a much smaller, more manageable group. This is offered not as a critique so much as a definition: it is what separates novel theory from critique or history. By seeing the tradition of novel theory in terms of its exclusions, we are better able to understand both the larger “novel theory” genre and its blind spots. By focusing on a particular model of European modernity, and centering its formal concerns on realism and the everyday, academic discussions of the novel have often found difficulty in describing non-European experiences, the experiences of historically marginalized populations, and the catastrophic changes brought about by the Anthropocene. Yet this is not so much a shortcoming of the novel form, as some have suggested, as a set of possibilities that lies in the negative space of the novel demarcated by previous novel theory. Reading the history of novel theory in terms of its exclusions, then, offers a sense of the future possibilities of the novel form.



Thing  

Woosung Kang

Thing is a categorically indeterminate and comprehensive concept that cannot easily be pinned down to any single or specific meaning. It has a long history of heterogeneous significations, from material objects, through legal issues, to supersensible noumena. For modern philosophies of subjectivity, things are reducible to that which is available for human thinking and acting. Things are represented as objects for the subject in the form of presence-at-hand, and this representational relationship forms the basic structure of the world in modernity. Under the capitalist system of commodity exchange, moreover, this anthropocentric relationship with things undergoes what is called reification or fetishism, which turns relations among human beings into relations between objects. The objectification of things makes it possible for humans to dominate the world, but fetishism in turn comes to dominate human beings as mere objects. Heidegger’s attempt to deconstruct this objectification reverberates with the Marxist critique of capitalist commodification and, in literature, with the modernist endeavor to overcome reification. These efforts to secure the thingness of the thing are linked to the early 21st century’s efforts to re-establish non-humanistic relations with things and the world. Recently, under the banner of an “ontological turn,” there has been an explosion of interest in things, motivated in particular by growing concerns about anthropocentrism. Indeed, in the face of unprecedented technological change and hyper-digitalization, a new relation between the human and the nonhuman is desperately required. New ontologies thus try to build a non-hierarchical, object-oriented, monistic universe of hybrids, quasi-objects, and assemblages, such that human beings become only a part of the parliament of things.



Tragedy  

Alberto Toscano

From Plato’s Republic and Aristotle’s Poetics onward, tragedy has loomed large in the genealogy of literary theory. But this prominence is in many regards paradoxical. The original objects of that theory, the Attic tragedies performed at the Dionysian festivals in 5th-century BCE Athens, are, notwithstanding their ubiquitous representation on the modern stage, only a small fraction of the tragedies produced in Athens, and are themselves torn from their context of performance. The Poetics and the plays that served as its objects of analysis would long vanish from the purview of European culture. Yet, when they returned in the Renaissance as cultural monuments to be appropriated and repeated, it was in a context largely incommensurable with their existence in Ancient Greece. While the early moderns created their own poetics (and politics) of tragedy and enlisted their image of the Ancients in the invention of exquisitely modern literary and artistic forms (not least, opera), it was in the crucible of German Idealism and Romanticism, arguably the matrix of modern literary theory, that certain Ancient Greek tragedies were transmuted into models of “the tragic,” an idea that played a formative part in the emergence of philosophical modernity, accompanying a battle of the giants between dialectical (Hegelian) and antidialectical (Nietzschean) currents that continues to shape our theoretical present. The gap between a philosophy of the tragic and the poetics and history of tragedy as a dramatic genre is the site of much rich and provocative debate, in which the definition of literary theory itself is frequently at stake. Tragedy is in this sense usefully defined as a genre in conflict. It is also a genre of conflict, in the sense that ethical conflicts, historical transitions, and political revolutions have all come to define its literary forms, something that is particularly evident in the place of both tragedy and the tragic in the dramas of decolonization.



Trans Theory  

Quinn Eades

Emerging from feminist and queer theory, trans theory asks us to challenge essentialist and heteronormative understandings of gender, sex, and sexuality. Trans theory teaches us to critique essentialist and binary models of embodiment by attending to and centering the body in theory and in the world. In the early 21st century, trans people are more visible than we have ever been. There is an increasing appetite among “mainstream” readers for trans memoir, there are larger numbers of trans characters on screen and in the media, and out trans people now hold high-ranking political positions, teach in schools and universities, and act on stage and screen. Rather than being driven by scopophilia, curiosity, or voyeurism, the demand for trans stories appears to reflect a genuine desire to understand trans lives, bodies, and lived experiences. Visibility comes at a price, though, and we must be wary of tracing a simplistic progress narrative in relation to trans and gender-diverse people and communities. When we appear in public, we gather our own communities, as well as allies and sympathizers, but these appearances also make us vulnerable to those who still fiercely deny our right to exist; the Vatican City’s thirty-one-page statement discussing gender theory in education (2019), which tells us that trans people are “annihilating nature,” is a perfect example of this. While the term “trans” (more often than not) refers to transgender people, it is also a prefix meaning “across”; trans denotes movement, going from one to the other, and change. Because we can find trans people across all times, places, and populations, we can also trace a complex, rich, and ever-expanding archive of trans writing, histories, and stories. 
It is by troubling the idea that trans people are a “modern” invention, the living embodiment of political correctness gone mad, that we can begin to find each other in text, gather together, and work toward making significant social, political, and cultural change.


Transcolonial Studies  

Olivia C. Harrison

Since the beginning of the 21st century, scholars of race and empire have been invested in exploring the horizontal vectors that stretch across and between imperial formations, displacing the vertical axis of North-South relations taken to be characteristic of early postcolonial theory. An analytical framework that seeks to capture the relationality of empire and the transversal modes of resistance against it, transcolonial studies offers a methodology for apprehending the coloniality of the present. The term transcolonial was coined in the 1990s, but the horizontal relationalities it describes are as old as empire itself. Europe’s colonial ventures were relational from the start, driven by competition for hegemony over seas and land and modeled on the likeness of empires past and present. Likewise, resistance to colonial conquest and governance took shape in relation to liberation struggles elsewhere and drew inspiration from previous and ongoing revolts in Haiti, Algeria, Vietnam, and Palestine. The movements for racial justice and decolonization that have followed in the wake of empire are similarly rooted in practices of solidarity that span subject positions without conflating them, from Standing Rock to Gaza and Black Lives Matter. Such unexpected solidarities among heterogeneously racialized and colonized subjects and their majoritarian allies work to undo the reified identities produced in colonial and racial discourse, undermining the competitive identitarian model inaugurated by the divide-and-conquer methods of high colonialism. To describe these alliances as transcolonial is also to acknowledge that Euro-colonial modernity continues to shape the purportedly postcolonial present. The prefix trans is temporal as much as it is geographic and political.


Transgender Studies and Latina/o/x Studies  

Francisco J. Galarte

The field of Latina/o/x studies has long been interested in various forms of gender and sexual deviance and diversity as sites of inquiry. Yet there are many gaps in the field’s literature when it comes to the study of trans subjectivities, politics, and cultural formations. Foundational theoretical works such as Sandy Stone’s “A Posttranssexual Manifesto” (1991) and Gloria Anzaldúa’s Borderlands (1987) share a theoretical approach to understanding autoethnographic texts that propose to write minoritarian subjects into discourse. From these two works emerge the “posttranssexual” and the “new mestiza,” two figures that have come to shape the fields of transgender studies and of Chicana/o/x and Latinx studies, respectively. There are myriad ways in which the fields of transgender studies and Latinx studies overlap with and depart from each other. Most often, transgender studies is characterized as not grappling directly with race, colonialism, and imperialism, while Latina/o/x studies can at times be read as treating transgender subjects as objects, or sites of inquiry. There is therefore much to be gleaned from exploring how the two fields might come into contact with each other as each becomes increasingly institutionalized.


Translation and Translation as a Weapon  

Gisèle Sapiro

Translation is a social activity that fulfills functions other than mere communication: political, economic, and cultural. Translation can thus be used as a political weapon, to export or import texts conveying an ideological message, such as socialist realism. As evidenced by the promotion of world bestsellers, translation may in other cases serve economic interests. Literary translations also serve cultural purposes, such as the building of collective (national, social, gendered) identities, the representation of other cultures, or the subversion of the dominant norms of a literary field (as defined by Pierre Bourdieu), as illustrated by the reception and uses of William Faulkner’s novels in France in the 1930s (notably by Jean-Paul Sartre). The study of translation has become a research field called “Translation Studies,” which underwent a “sociological turn” at the beginning of the 21st century and was renewed at the same time by the rise of “world literature” studies in comparative literature. While translation studies is interested in norms of translation (as defined by Gideon Toury), which may vary across cultures, especially between domesticating and foreignizing strategies, the sociology of translation and of (world) literature asks how literary texts circulate across cultures: Who are the mediators? Why do they select certain texts and not others? What obstacles stand in the way of the transfer process? How are translations used as weapons in cultural struggles? The circulation of texts in translation can be studied through quantitative analysis of translation flows (across languages, countries, and publishing houses) and through qualitative methods: interviews with specialized intermediaries and cultural mediators (publishers, translators, state representatives, literary critics), ethnographic observation (of book fairs and literature festivals), documentary sources (critical reception), archives (of publishers), and text analysis. 
However, internal (text-analytical) and external (sociological) approaches have yet to be fully connected.