Questions of authorship bring into play many of the central questions of literary theory: questions as to what constitutes the unity and coherence of texts, the interpretive relevance of authorial intention, the relation of oral to literate cultures, the regulation of writing by church and state, the legal underpinnings of literary property, the significance of forgery and plagiarism, and so on. At the heart of many of these questions is a distinction between two different orders of phenomena. Writers are not necessarily authors: authorship requires recognition and attribution, and these depend on institutional processes of publication, textual stabilization, criticism, education, and appropriate legal, regulatory, and economic conditions. Those processes and conditions vary from culture to culture, as do the particular historical forms that authorship takes. In the contemporary world authorship tends to be cast as though it were directly expressive of a personality, an inner core of selfhood, that underwrites the coherence of the texts attributed to it; the commercialization of that form gives rise to a cult of the author in both academic and popular culture.
The term daemonic—often substantivized in German as the daemonic (das Dämonische) since its use by Johann Wolfgang von Goethe in the early 19th century—is a literary topos associated with divine inspiration and the idea of genius, with the nexus between character and fate and, in more orthodox Christian manifestations, with moral transgression and evil. Although strictly modern literary uses of the term have become prominent only since Goethe, its origins lie in the classical idea of the δαίμων, transliterated into English as daimon or daemon, as an intermediary between the earthly and the divine. This notion can be found in pre-Socratic thinkers such as Empedocles and Heraclitus, in Plato, and in various Stoic and Neo-Platonic sources. One influential aspect of Plato’s presentation of the daemonic is found in Socrates’s daimonion: a divine sign, voice, or hint that dissuades Socrates from taking certain actions at crucial moments in his life. Another is the notion that every soul contains an element of divinity—known as its daimon—that leads it toward heavenly truth. Already in Roman thought, this idea of an external voice or sign begins to be associated with an internal genius that belongs to the individual. In Christian thinking of the European romantic period, the daemonic in general and the Socratic daimonion in particular are associated with notions such as non-rational divine inspiration (for example, in Johann Georg Hamann and Johann Gottfried Herder) and with divine providence (for example, in Joseph Priestley). At the same time, the daemonic is also often interpreted as evil or Satanic—that is, as demonic—by European authors writing in a Christian context.
In Russia in particular, during a period spanning from the mid-19th century until the early 20th century, there is a rich vein of novels, including works by Gogol and Dostoevsky, that deal with this more strictly Christian sense of the demonic, especially the notion that the author/narrator may be a heretical figure who supplants the primacy of God’s creation. But the main focus of this article is the more richly ambivalent notion of the daemonic, which explicitly combines both the Greco-Roman and Judeo-Christian heritages of the term. This topos is most prominently mobilized by two literary exponents during the 19th century: Goethe, especially in his autobiography Dichtung und Wahrheit (Poetry and Truth), and Samuel Taylor Coleridge, in his Notebooks and in the Lectures on the History of Philosophy. Both Goethe’s and Coleridge’s treatments of the term, alongside its classical and Judeo-Christian heritages, exerted an influence upon literary theory of the 20th century, leading important theorists such as Georg Lukács, Walter Benjamin, Hans Blumenberg, Angus Fletcher, and Harold Bloom to associate the daemonic with questions concerning the novel, myth, irony, allegory, and literary influence.
An enumeration of generic qualities will define epic less helpfully than will an assessment of its behaviors. Among major literary kinds, epic offers the most long-standing and globally distributed evidence of the human habit of thinking by means of narrative. What it cherishes is the common good; what it ponders are the behaviors and values that forward or threaten collective welfare. What it reckons are the stakes of heroic risk that any living culture must hazard in order to prosper, by negotiating core identities with margins and adjusting settled customs to emergent opportunities; and it roots all these in the transmission of a tale that commands perennial attention on their account. Such dialectics underlie epic’s favorite narrative templates, the master plots of strife, quest, and foundation; and they find expression in such conventions as the in medias res opening and suspended closure; the epic invocation, ancestral underworld, superhuman machinery, and encyclopedic simile; the genre’s formal gravitation towards verse artifice and the lexical and syntactic mingling of old with new language. The genre steadfastly highlights the human condition and prospect, defining these along a scale of higher and lower being, along a timeline correlating history with prophecy, and along cultural coordinates where the familiar and the exotic take each other’s measure.
The concept of “heteroglossia” was coined by Mikhail Bakhtin in an essay from the 1930s. Heteroglossia was the name he gave to the “inner stratification of a single national language into social dialects, group mannerisms, professional jargons, generic languages, the languages of generations and age-groups,” and so on, but it was not simply another term for the linguistic variation studied in sociolinguistics and dialectology. It differed in three respects. First, in heteroglossia differences of linguistic form coincided with differences in social significance and ideology: heteroglossia was stratification into “socio-ideological languages,” which were “specific points of view on the world, forms for its verbal interpretation.” Second, heteroglossia embodied the force of what Bakhtin called “historical becoming.” In embodying a point of view or “social horizon,” language acquired an orientation to the future, an unsettled historical intentionality that it otherwise lacked. Third, heteroglossia was a subaltern practice, concentrated in a number of cultural forms, all of which took a parodic, ironizing stance in relation to the official literary language that dominated them. Throughout his discussion, however, Bakhtin wavers between claiming that this heteroglossia exists as such in the social world, from which the novel picks it up, and arguing that heteroglossia is something created and institutionalized by novels, which take the raw material of variation and rework it into “images of a language.” Interestingly, from roughly 2000 on, work in sociolinguistics has suggested that ordinary speakers do the kind of stylizing and imaging work Bakhtin assigned to the novel alone. One could argue, however, that heteroglossia only acquires its full significance and force when it is freed from any social function and allowed to flourish in novels.
According to Bakhtin, that means that heteroglossia is only possible in modernity, because it is in modernity that society becomes truly historical, and languages only acquire their orientation to the future in those circumstances.
While obscenity is notoriously difficult to define and the test for determining obscenity has shifted over time, typically the term has referred to the crime of publishing prohibited, sexually explicit material. Obscenity has always been a criminal offense in the United States. Citing English common law, judges in the early republic and antebellum periods maintained that obscenity threatened to degrade the nation’s character. Nevertheless, obscenity law was not strongly or consistently enforced throughout the United States until the Comstock Act in 1873. Anthony Comstock, founder of the New York Society for the Suppression of Vice, targeted Walt Whitman’s Leaves of Grass along with publications by advocates for feminism, free love, and birth control. American courts adopted the test put forth by Lord Chief Justice Sir Alexander Cockburn in Regina v. Hicklin (1868), which held that obscenity was defined by “the tendency . . . to deprave and corrupt those whose minds are open to such immoral influences, and into whose hands a publication of this sort may fall.” Obscenity became a battleground not only for debates about gender and sexual politics but also about the nature of the public sphere. During the 20th century, American literary presses and magazines became increasingly willing to challenge bans on sexually explicit speech, publishing controversial works including The Well of Loneliness by Radclyffe Hall and Ulysses by James Joyce. Modernist authors transgressed the legal bounds of propriety to explore the unconscious, fight for erotic pleasure free from heteronormative restraints, or claim aesthetic autonomy from moral and legal restrictions. United States v. One Book Called “Ulysses” (1933) struck a blow against the Hicklin test. Affirming Judge John M. Woolsey’s decision, Judge Augustus Hand proposed a new test for obscenity that anticipated many of the themes that would emerge when the Supreme Court took up the question in Roth v. United States (1957), which defined obscenity as “whether to the average person, applying contemporary community standards, the dominant theme of the material taken as a whole appeals to the prurient [i.e., sexual] interest.” The Court liberalized obscenity law even as it maintained restrictions on pornographic literature, setting off a wave of censorship cases including trials of Howl and Other Poems by Allen Ginsberg, Lady Chatterley’s Lover by D. H. Lawrence, Tropic of Cancer by Henry Miller, and Naked Lunch by William S. Burroughs. After Roth, lawyers defending publishers of borderline obscene works pushed for courts to hold that a work could not be obscene if it possessed any redeeming literary or social value. Free speech libertarians succeeded with Memoirs v. Massachusetts (1966) and Redrup v. New York (1967). Although Miller v. California (1973) clawed back this standard by stipulating that a work must possess “serious literary, artistic, political, or scientific value” to be cleared of obscenity, in the 21st century obscenity convictions for publishing textual media have been limited to a handful of cases concerning pornographic depictions of child sexual abuse. Obscenity law remains on the books but is largely unenforced against literature.
The history of Brazilian print culture is closely connected to the establishment of national literature in the 19th century. Indeed, after three centuries of prohibition of printing activity in the colony by the Portuguese Crown, the Impressão Régia, the first legal printing establishment in Brazil, was created in 1808 following the arrival of the Portuguese royal family during the Napoleonic wars. From the belated implementation of the Impressão Régia, which became the Typographia Nacional after the independence of Brazil in 1822, to the consolidation of the publishing world in the second half of the century, marked by the controversial French presence, the discourses on literature and print production modes tend to reflect the different circulation spheres. In fact, following the long period of colonization under Portuguese rule, print production modes were implemented simultaneously with the consolidation of a broad print culture, characterized by the growth of newspapers, the circulation of images, and the impactful arrival of the novel. Undeniably, the sudden and concurrent arrival of the two worlds—technical and cultural—in addition to the paradoxical development of the print world, marked by its two technical systems—artisanal and industrial—strongly influenced the material aspects of 19th-century Brazilian publishing production. In this context, under the argument of an alleged precariousness of local print production, writers, critics, typographers, engravers, and bookbinders created literary and editorial polemics in newspapers, magazines, and books that contributed to the very construction of a “literary system.” Despite the intrinsic relationships established between literature and publishing, the multidisciplinary field of the history of the book insists on separating approaches dedicated to the technical production processes and the material analysis of objects of written culture from the approaches dedicated to print circulation and uses.
Understanding the contradictions imposed by the simultaneous implementation of two technical systems, which are found when analyzing the traces left by the print equipment supply trade and the conditions to build a printing workshop, contributes to understanding the historical conditions of print production. In this sense, the historiographical perspective dialogues with heritage studies in the notion of printing heritage, understood in its tangible and intangible dimensions, considering the machines and tools of the past, together with the techniques then in use. In fact, while bringing together a set of material, technical, and mechanical elements of different production modes, printing heritage also contains the memories of the human actions that set them in motion.
Queer theory describes a network of critiques emerging from a legacy of activism and looking ahead to utopian futures. The analytical tools queer theory provides as a mode of close reading and critique make it a relevant contemporary approach to literary theory. Beyond reading for queer characters and desires in texts, queer theory is a tool for seeing below the superficial and for supporting unconventional readings that deconstruct normative assumptions. The activist roots of queer theory in the 1969 Stonewall Riots place drag, trans issues, class, race, violence, gender, and sexuality at the heart of queer theorizing. Subsequent work engages topics such as temporality, ecology, geography, and diaspora through the analysis not only of culture and politics but also of literature, film, music, and other media. Queer theory attends to both the rhetorical power of language and the broader structures of knowledge formulation. Where feminist epistemology asks whose knowledge matters and who creates knowledge, queer theory asks whether knowledge matters and whether naturalized knowledge is constructed. The textual or discursive construction of knowledge is a key theoretical concern of queer theory, with important implications for literature. Queer theory embraces a multidisciplinary and diverse set of influences, methodologies, questions, and formats. Its critiques can be applied to help deconstruct naturalized epistemic frameworks around topics including language, gender, sexuality, history, the subject, universality, the environment, animals, borders, space, time, norms, ideals, reproduction, utopia, love, the home, the nation, and power. Queer theory empowers novel readings of the world, and worldly readings of the novel, opening up new ways of viewing life and text.
The topic of rhythm in literary theory draws both on discussions of poetry and prose and on much broader currents of thought in the natural sciences and philosophy. In Western thought, rhythm was a central focus of attention in ancient Greece and again in the 19th and early 20th centuries, when theorists and practitioners of literature and the other arts often referred back to classical models. This is also the case in more recent theorizing of rhythm in the context of everyday life in advanced modern or, as some would say, postmodern societies. Nietzsche, who constantly circled around the term, often with direct and metaphorical references to dance, is in many ways the central figure in these discussions. He was massively influential after his death in 1900, both in Germany and more widely, for example, in Britain and North America, and he was taken up again, along with Heidegger, in much French thought after World War II. Contemporary debates around rhythm and its relation to meter continue to refer to classical Greece, and in Chinese and Indian thought there is a similar continuity of attention to issues of rhythm.
Speech Act Theory is the application to spoken and written language of the philosophy of action developed by John L. Austin. Austin was particularly interested in conventionalized actions, which have a special significance thanks to their social or institutional context. Although he emphasized that such actions could also be carried out through non-verbal means, Austin is mostly remembered for his analysis of the ways in which they can be carried out through the utterance of words—hence the term “Speech Act Theory,” and the title under which his lecture series on the topic was posthumously published (i.e., How to Do Things with Words). He described utterances that perform such actions as “performative utterances.” But he also effectively argued that all utterances are performative—or rather, that all utterances have a performative or “illocutionary” aspect. Austin’s analysis of speech as action provides scholars with a way of looking at verbal behavior that relates spoken and written utterances to the circumstances of their production and deployment without reducing their meanings to authorial intentions conceived as mental states. As such, it has intrinsic appeal to scholars of literature, who have since the 1970s often distanced themselves both from psychological and from purely formal conceptions of literature. However, engagements with Speech Act Theory by literary and cultural theorists have often been superficial (for example, in the commonplace but spurious association of Austin’s account of performative utterances with the unrelated idea that gender is performative). Indeed, the fundamental concepts of Speech Act Theory have usually been misunderstood and misrepresented within literary studies because its core concerns are quite alien to that discipline’s central preoccupation: that is, the critical interpretation of literary texts.
Textual studies describes a range of fields and methodologies that evaluate how texts are constituted both physically and conceptually, document how they are preserved, copied, and circulated, and propose ways in which they might be edited to minimize error and maximize the text’s integrity. The vast temporal reach of the history of textuality—from oral traditions spanning thousands of years and written forms dating from the 4th millennium BCE to printed and digital text forms—is matched by its geographical range covering every linguistic community around the globe. Methods of evaluating material text-bearing documents and the reliability of their written or printed content stem from antiquity, often paying closest attention to sacred texts as well as to legal documents and literary works that helped form linguistic and social group identity. With the advent of the printing press in the early modern West, the rapid reproduction of text matter in large quantities had the effect of corrupting many texts with printing errors as well as providing the technical means of correcting such errors more cheaply and quickly than in the preceding scribal culture. From the 18th century, techniques of textual criticism were developed to attempt systematic correction of textual error, again with an emphasis on scriptural and classical texts. This “golden age of philology” slowly widened its range to consider such foundational medieval texts as Dante’s Commedia as well as, in time, modern vernacular literature. The technique of stemmatic analysis—the establishment of family relationships between existing documents of a text—provided the means for scholars to choose between copies of a work in the pursuit of accuracy.
In the absence of original documents (manuscripts in the hand of Aristotle or the four Evangelists, for example), the choice between existing versions of a text was often made eclectically—that is, drawing on multiple versions—and thus was subject to such considerations as the historic range and geographical diffusion of documents, the systematic identification of common scribal errors, and matters of translation. As the study of modern languages and literatures consolidated into modern university departments in the later 19th century, new techniques emerged with the aim of providing reliable literary texts free from obvious error. This aim had in common with the preceding philological tradition the belief that what a text means—discovered in the practice of hermeneutics—was contingent on what the text states—established by an accurate textual record that eliminates error by means of textual criticism. The methods of textual criticism took several paths through the 20th century: the Anglophone tradition centered on editing Shakespeare’s works by drawing on the earliest available documents—the printed Quartos and Folios—developing into the Greg–Bowers–Tanselle copy-text “tradition,” which was then deployed as a method by which to edit later texts. The status of variants in modern literary works with multiple authorial manuscripts—not to mention the existence of competing versions of several of Shakespeare’s plays—complicated matters sufficiently that editors looked to alternate editorial models. Genetic editorial methods draw in part on German editorial techniques, collating all existing manuscripts and printed texts of a work in order to provide a record of its composition process, including epigenetic processes following publication.
The French methods of critique génétique also place the documentary record at the center, where the dossier is given priority over any one printed edition, and poststructuralist theory is used to examine the process of “textual invention.” The inherently social aspects of textual production—the author’s interaction with agents, censors, publishers, and printers and the way these interactions shape the content and presentation of the text—have reconceived how textual authority and variation are understood in the social and economic contexts of publication. And, finally, the advent of digital publication platforms has given rise to new developments in the presentation of textual editions and manuscript documents, displacing copy-text editing in some fields such as modernism studies in favor of genetic or synoptic models of composition and textual production.