Dynamic and creative exchanges among different religions, including indigenous traditions, Protestant and Catholic Christianity, and Islam, all with developing theologies and institutions, fostered substantial collective religious and cultural identities within African American communities in the United States. The New World enslavement of diverse African peoples and the cultural encounter with Europeans and Native Americans produced distinctive religious perspectives that aided individuals and communities in persevering under the dehumanization of slavery and oppression. As African Americans embraced Christianity beginning in the 18th century, especially after 1770, they gathered in independent church communities and created larger denominational structures such as the African Methodist Episcopal Church, the African Methodist Episcopal Zion Church, and the National Baptist Convention. These churches and denominations became significant arenas for spiritual support, educational opportunity, economic development, and political activism. Black religious institutions served as contexts in which African Americans made meaning of the experience of enslavement, interpreted their relationship to Africa, and charted a vision for a collective future. The early 20th century saw the emergence of new religious opportunities as increasing numbers of African Americans turned to Holiness and Pentecostal churches, drawn by the focus on baptism in the Holy Spirit and enthusiastic worship that sometimes involved speaking in tongues. The Great Migration of southern blacks to southern and northern cities fostered the development of a variety of religious options outside of Christianity. Groups such as the Moorish Science Temple and the Nation of Islam, whose leaders taught that Islam was the true religion of people of African descent, and congregations of Ethiopian Hebrews promoting Judaism as the heritage of black people, were founded in this period. Early-20th-century African American religion was also marked by significant cultural developments as ministers, musicians, actors, and other performers turned to new media, such as radio, records, and film, to contribute to religious life. In the post–World War II era, religious contexts supported the emergence of the modern Civil Rights movement. Some black religious leaders emerged as prominent spokespeople for the cause, while others became vocal critics of the goal of racial integration, as in the case of the Nation of Islam and religious advocates of Black Power. The second half of the 20th century and the early 21st century saw new religious diversity as a result of immigration and cultural transformations within African American Christianity, with the rise of megachurches and televangelism.
Jimmy Carter’s “Crisis of Confidence” speech of July 1979 was a critical juncture in post-1945 U.S. politics, but it also marks an exemplary pivot in post-1945 religion. Five dimensions of faith shaped the president’s sermon. The first concerned the shattered consensus of American religion. When Carter encouraged Americans to recapture a spirit of unity, he spoke in a heartfelt but spent language more suitable to Dwight Eisenhower’s presidency than his own. By 1979, the Protestant-Catholic-Jewish consensus of Eisenhower’s time had fractured into a dynamic pluralism, remaking American religion in profound ways. Carter’s speech revealed a second revolution of post-1945 religion when it decried religion’s polarization and politicization. Carter sought to heal ruptures that were dividing the nation between what observers, two decades hence, would label “red” (conservative Republican) and “blue” (liberal Democratic) constituencies. Yet his endeavors failed, as would be evidenced in the religious politics of Ronald Reagan’s era, which followed. Carter championed community values as the answer to his society’s problems, aware of yet a third dawning reality: globalization. The virtues of localism that Carter espoused were in fact implicated in (and complicated by) transnational forces of change that saw immigration, missionary enterprises, and state and non-state actors internationalizing the American religious experience. A fourth illuminating dimension of Carter’s speech was its critique of America’s gospel of wealth. Although this “born-again” southerner was a product of the evangelical South’s revitalized free-market capitalism, he lamented how laissez-faire Christianity had become America’s lingua franca. Finally, Carter wrestled with secularization, revealing a fifth feature of post-1945 America. Even though faith commitments were increasingly cordoned off from formal state functions during this time, the nation’s political discourse acquired a pronounced religiosity. Carter contributed by framing mundane issues (such as energy) in moral contexts that drew no hard-and-fast boundaries between matters of the soul and governance. Drawn from the political and economic crises of his moment, Carter’s speech thus also reveals the all-enveloping tide of religion in America’s post-1945 age.
Kyle B. Roberts
From Cahokia to Newport, from Santa Fe to Chicago, cities have long exerted an important influence over the development of American religion; in turn, religion has shaped the life of America’s cities. Early visions of a New Jerusalem quickly gave way to a crowded spiritual marketplace full of faiths competing for the attention of a heterogeneous mass of urban consumers, although the dream of an idealized spiritual city never completely disappeared. Pluralism fostered toleration and freedom of religious choice, but it also catalyzed competition and antagonism, sometimes resulting in violence. Struggles over political authority between established and dissenting churches gave way after the American Revolution to a contest over the right to exert moral authority through reform. Secularization, the companion of modernization and urbanization, did not sound the death knell for urban religion but instead provided the materials with which the religious engaged the city. Negative discursive constructions of the city proffered by a handful of religious reformers have long cast a shadow over the actual urban experience of most men and women. Historians continue to uncover the rich and innovative ways in which urban religion enabled individuals to understand, navigate, and contribute to the city around them.
Christopher D. Cantwell
Home to more than half the U.S. population by 1920, cities played an important role in the development of American religion throughout the 20th century. At the same time, the beliefs and practices of religious communities also shaped the contours of America’s urban landscape. Much as in the preceding three centuries, the economic development of America’s cities and the social diversity of urban populations animated this interplay. But the explosive, unregulated expansion that defined urban growth after the Civil War was met with an equally dramatic disinvestment from urban spaces throughout the second half of the 20th century. The domestic and European migrations that previously fueled urban growth also changed throughout the century, shifting from Europe and the rural Midwest to the Deep South, Africa, Asia, and Latin America after World War II. These newcomers not only brought new faiths to America’s cities but also contributed to the innovation of several new, distinctly urban religious movements. Urban development and diversity on one level promoted toleration and cooperation as religious leaders forged numerous ecumenical and, eventually, interfaith bonds to combat urban problems. But it also led to tension and conflict as religious communities busied themselves with carving out spaces of their own through tight-knit urban enclaves or new suburban locales. Contemporary American cities are some of the most religiously diverse communities in the world. Historians continue to uncover how religious communities not only have lived in but also have shaped the modern city.
Cara L. Burnidge
Since 2001, there has been a noticeable increase in the number of scholarly monographs dedicated to religion and foreign relations. More scholars and policymakers agree that religion is an important feature of foreign affairs, regardless of whether one thinks it ought to be. While policymakers and scholars often discuss “religion” as a single “lens” for understanding the world, religious traditions do not exist in isolation from the political, economic, or social and cultural aspects of life. Tracing religious influences on U.S. foreign policy, then, can lead scholars in a variety of directions. Scholars researching religious influences in foreign policy could consider theologies and creeds of religious organizations and figures, the rhetoric and rituals of national norms and civic values, the intersection of “sacred” and “secular” ideas and institutions, the service of individual policymakers and diplomats, international legal or military defenses for or against specific religious groups, or public discourse about religion, to name but a few options.
Advances in the study of religion and foreign policy will require collaboration and dialogue across the traditional boundaries of disciplines, fields, and subfields. For many scholars, this means broadening research approaches and methods. Instead of prioritizing “first-” and “second-order” causes, for instance, historians and social scientists could move beyond cause-effect relationships alone, complicating U.S. foreign relations by considering intersectional experiences and interstitial explanations. Rather than looking for “the” univocal religious influence, scholars might pay greater attention to the multiplicity of “religious” influences on a given topic. This will likely occur by reading and researching beyond one specific area of expertise. It will also require attention to differentiating between institutional and “popular” or “lived” religion; recognizing the disparities between the official dogma of a religious affiliation and ethnographic and empirical data on religious practice; and giving attention to the underlying assumptions at work when international organizations, national governments, and scholars choose to pay attention to certain forms of “religious” thought, behavior, and organizations and not others.
Jane H. Hong
Laws barring Asians from legal immigration and naturalization in the United States began with the Chinese Exclusion Act of 1882 and expanded to include all other Asian groups by 1924. Beginning in World War II, U.S. lawmakers began to dismantle the Asian exclusion regime in response to growing international pressure and scrutiny of America’s racial policies and practices. The Japanese government sought to use the U.S. Asian exclusion laws to disrupt the Sino-American alliance of World War II, causing Washington officials to recognize these laws as a growing impediment to international diplomacy and the war effort. Later, the Soviet Union and other communist powers cited U.S. exclusion policies as evidence of American racial hypocrisy during the Cold War.
A diverse group of actors championed the repeal of Asian exclusion laws over the 1940s and early 1950s. They included former American missionaries to Asia, U.S. and Asian state officials, and Asian and Asian American activists. The movement argued for repeal legislation as an inexpensive way for the United States to demonstrate goodwill, counter foreign criticism, and rehabilitate America’s international image as a liberal democracy. Drawing upon the timely language and logic of geopolitics, advocates lobbied Congressional lawmakers to pass legislation ending the racial exclusion of Asians from immigration and naturalization eligibility, in support of U.S. diplomatic and security interests abroad.
David L. Prentice
The history of the Republican Party’s foreign policy reminds historians that national politics often entails efforts to hold together a diverse coalition. The party’s regional alignments, ideas, and positions were seldom static. Rarely has it enjoyed unity on foreign relations. Intra-party differences mattered as wings, factions, and insurgents feuded over both domestic policy and America’s aims, interests, and engagement with the world. Mugwumps, jingoes, insurgents, Irreconcilables, the Republican Right, and neoconservatives, among others, interpreted events differently. These differences modulated the party’s swings from isolationism to interventionism, pulling it closer to the center of American politics.
Regarding foreign relations, Republicans have generally united around five themes. First, there existed a common understanding that US interests were paramount in defining foreign policy. A shared “America first” ethos made Republicans wary of liberal internationalism and reluctant to concede any autonomy on foreign or economic affairs. While different wings of the Republican Party may have backed divergent policies, each agreed the United States should preserve its flexibility and engage in unilateral action when necessary. Second, Republicans have supported preparedness for national defense and military superiority even when members opposed US intervention in a foreign conflict. As for diplomacy, they maintained that sound negotiations would come from victory or positions of strength. In a world of dangers, the strong survive. Third, the nature of the foreign foe mattered. Republicans opposed revolutionary regimes abroad, whereas anti-fascist or anti-authoritarian causes drew weak or belated interest. The common Republican perception that the Soviet Union posed a greater threat to the international order than Nazi Germany accounted for much of the party’s isolationism before World War II. And during the Cold War, Republicans frequently turned a blind eye to the human rights and political abuses of America’s allies while condemning communist nations for the same. Fourth, the Republican preference for limited government influenced how the party approached armed conflict. Republicans resisted large peacetime armies and land wars while, in recent eras, placing inordinate faith in modern firepower, properly used, to deter enemies and accomplish swift victory. They feared long wars encouraged the growth of the federal government. Finally, opposition to Democratic alternatives, especially in an election year, could bridge some of the party’s greatest chasms.
The story of the pre-Columbian Mississippi Period (1000
Rock and roll, a popular music craze of the mid-1950s, turned a loud, fast, and sexy set of sounds rooted in urban, black, working-class, and southern America into the pop preference of suburban, white, young, and northern America as well. By the late 1960s, those fans and their British counterparts made their own version, more politicized and experimental and just called rock—the summoning sound of the counterculture. Rock’s aura soon faded: it became as much entertainment staple as dissident form, with subcategories as disparate as singer-songwriter, heavy metal, alternative, and “classic rock.” Where rock and roll was integrated and heterogeneous, rock was largely white and homogeneous, policing its borders. Notoriously, rock fans detonated disco records in 1979. By the 1990s, rock and roll style was hip-hop, with its youth appeal and rebelliousness; post‒baby boomer bands gave rock some last vanguard status; and suburbanites found classic rock in New Country. This century’s notions of rock and roll have blended thoroughly, from genre “mash-ups” to superstar performers almost categories unto themselves and new sounds such as EDM beats. Still, crossover moments evoke rock and roll; assertions of authenticity evoke rock. Because rock and roll, and rock, epitomize cultural ideals and group identities, their definitions have been constantly debated. Initial argument focused on challenging genteel, professional notions of musicianship and behavior. Later discourse took up cultural incorporation and social empowerment, with issues of gender and commercialism as prominent as race and artistry. Rock and roll promised one kind of revolution to the post-1945 United States; rock another. The resulting hope and confusion have never been fully sorted out, with mixed consequences for American music and cultural history.
Decided by the Supreme Court in 1973, Roe v. Wade legalized abortion across the United States. The 7-2 decision came at the end of a decades-long struggle to reform—and later repeal—abortion laws. Although all of the justices understood that Roe addressed a profoundly important question, none of them imagined that it would later become a flashpoint of American politics or shape those politics for decades to come.
Holding that the right to privacy covered a woman’s choice to terminate her pregnancy, Roe and its companion case, Doe v. Bolton, struck down many of the abortion regulations on the books. The lead-up to and aftermath of Roe tell a story not only of a single Supreme Court decision but also of the historical shifts that the decision shaped and reflected: the emergence of a movement for women’s liberation, the rise of grassroots conservatism, political party realignment, controversy about the welfare state, changes to the family structure, and the politicization of science. It is a messy and complicated story that evolved parallel to different ideas about the decision itself. In later decades, Roe arguably became the best-known opinion issued by the Supreme Court, a symbol of an ever-changing set of beliefs about family, health care, and the role of the judiciary in American democracy.
Kristin L. Ahlberg
In the 20th century, US policymakers often attempted to solve domestic agricultural oversupply problems by extending food aid to foreign recipients. In some instances, the United States donated food in times of natural disasters. In other instances, the United States offered commodities to induce foreign governments to support US foreign policy aims or to spur agricultural modernization. These efforts coalesced during the 1950s with the enactment of Public Law 480, commonly known as the Food for Peace program, which provided a formal bureaucratic mechanism for the disbursement of commodities. Throughout the second half of the 20th century, successive presidential administrations continued to deploy commodities in furtherance of their often disparate foreign policy objectives.
Clay Silver Katsky
While presidents have historically been the driving force behind foreign policy decision-making, Congress has used its constitutional authority to influence the process. The nation’s founders designed a system of checks and balances aimed at establishing a degree of equilibrium in foreign affairs powers. Though the president is the commander-in-chief of the armed forces and the country’s chief diplomat, Congress holds responsibility for declaring war and can also exert influence over foreign relations through its powers over taxation and appropriation, while the Senate possesses authority to approve or reject international agreements. This separation of powers compels the executive branch to work with Congress to achieve foreign policy goals, but it also sets up conflict over what policies best serve national interests and the appropriate balance between executive and legislative authority. Since the founding of the Republic, presidential power over foreign relations has accreted in fits and starts at the legislature’s expense. When core American interests have come under threat, legislators have undermined or surrendered their power by accepting presidents’ claims that defense of national interests required strong executive action. This trend peaked during the Cold War, when invocations of national security enabled the executive to amass unprecedented control over America’s foreign affairs.
In 1835, Alexis de Tocqueville argued in Democracy in America that there were “two great nations in the world.” They had started from different historical points but seemed to be heading in the same direction. As expanding empires, they faced the challenges of defeating nature and constructing a civilization for the modern era. Although they adhered to different governmental systems, “each of them,” de Tocqueville declared, “seems marked out by the will of Heaven to sway the destinies of half the globe.”
De Tocqueville’s words were prophetic. In the 19th century, Russian and American intellectuals and diplomats struggled to understand the roles that their countries should play in the new era of globalization and industrialization. Despite their differing understandings of how development should happen, both sides believed in their nation’s vital role in guiding the rest of the world. American adherents of liberal developmentalism often argued that a free flow of enterprise, trade, investment, information, and culture was the key to future growth. They held that the primary obligation of American foreign policy was to defend that freedom by pursuing an “open door” policy and free access to markets. They believed that the American model would work for everyone and that the United States had an obligation to share its system with the old and underdeveloped nations around it.
A similar sense of mission developed in Russia. Russian diplomats had for centuries struggled to establish defensive buffers around the periphery of their empire. They had linked economic development to national security, and they had argued that their geographic expansion represented a “unification” of peoples as opposed to a conquering of them. In the 19th century, after the Napoleonic Wars and the failed Decembrist Revolution, tsarist policymakers fought to defend autocracy, orthodoxy, and nationalism from domestic and international critics. As in the United States, Imperial and later Soviet leaders envisioned themselves as the emissaries of the Enlightenment to the backward East and as protectors of tradition and order for the chaotic and revolutionary West.
These visions of order clashed in the 20th century as the Soviet Union and the United States became superpowers. Conflicts began early, with the American intervention in the 1918–1921 Russian civil war. Tensions that had previously been based on differing geographic and strategic interests then assumed an ideological valence, as the fight between East and West became a struggle between the political economies of communism and capitalism. Foreign relations between the two countries experienced boom and bust cycles that took the world to the brink of nuclear holocaust and yet maintained a strategic balance that precluded the outbreak of global war for fifty years. This article will examine how that relationship evolved and how it shaped the modern world.
Robert O. Self
Few decades in American history reverberate with as much historical reach or glow as brightly in living mythology as the 1960s. During those years Americans reanimated and reinvented the core political principles of equality and liberty but, in a primal clash that resonates more than half a century later, fiercely contested what those principles meant, and for whom. For years afterward, the decade’s appreciators considered the era to have its own “spirit,” defined by greater freedoms and a deeper, more authentic personhood, and given breath by a youthful generation’s agitation for change in nearly every dimension of national life. To its detractors in subsequent decades, the era was marked by immature radical fantasies and dangerous destabilizations of the social order, behind which lay misguided youthful enthusiasms and an overweening, indulgent federal government. We need not share either conviction to appreciate the long historical shadow cast by the decade’s clashing of left, right, and center and its profound influence over the political debates, cultural logics, and social practices of the many years that followed.
The decade’s political and ideological clashes registered with such force because post–World War II American life was characterized by a society-wide embrace of antiradicalism and a prescribed normalcy. Having emerged from the war as the lone undamaged capitalist industrial power, the United States exerted enormous influence throughout the globe after 1945—so much that some historians have called the postwar years a “pax Americana.” In its own interest and in the interest of its Western allies, the United States engaged in a Cold War standoff with the Soviet Union over the fate of Europe and no less over the fate of developing countries on every continent. Fiercely anticommunist abroad and at home, U.S. elites stoked fears of the damage communism could do, whether in Eastern Europe or in a public school textbook. Americans of all sorts in the postwar years embraced potent ideologies justifying the prevailing order, whether that order was capitalist, patriarchal, racial, or heterosexual. They pursued a postwar “normalcy” defined by nuclear family domesticity and consumer capitalism in the shadow cast by the threat of communism and, after 1949, global thermonuclear war with the Soviet Union. This prevailing order was stultifying and its rupture in the 1960s is the origin point of the decade’s great dramas.
The social movements of that decade drew Americans from the margins of citizenship—African Americans, Latina/os, Native Americans, women, and gay men and lesbians, among others—into epochal struggles over the withheld promise of equality. For the first time since 1861, an American war deeply split the nation, nearly destroying a major political party and intensifying a generational revolt already under way. Violence, including political assassinations at the highest level, bombings and assassinations of African Americans, bombings by left-wing groups like the Weathermen, and major urban uprisings by African Americans against police and property bathed the country in more blood. The New Deal liberalism of Presidents Franklin D. Roosevelt and Harry S. Truman reached its postwar peak in 1965 under President Lyndon Johnson’s Great Society and then retreated amid acrimony and backlash, as a new conservative politics gained traction. All this took place in the context of a “global 1960s,” in which societies in Western and Eastern Europe, Latin America, Africa, and elsewhere experienced similar generational rebellions, quests for meaningful democracy, and disillusionment with American global hegemony. From the first year of the decade to the last, the 1960s were a watershed era that marked the definitive end of a “postwar America” defined by easy Cold War dualities, presumptions of national innocence, and political calcification.
To explain the foregoing, this essay is organized in five sections. First comes a broad overview of the decade, highlighting some of its indelible moments and seminal political events. The next four sections correspond to the four signature historical developments of the 1960s. Discussed first is the collapse of the political consensus that predominated in national life following World War II. We can call this consensus “Vital Center liberalism,” after the title of a 1949 book by Arthur Schlesinger Jr., or “Cold War liberalism.” The assault on this consensus from both the New Left and the New Right is one of the defining stories of the 1960s. Second is the resurgence, after a decades-long interregnum dating to Reconstruction, of African American political agency. The black freedom struggle of the 1960s was far more than a social movement for civil rights. To shape the conditions of national life and the content of public debate in ways impossible under Jim Crow, black Americans called for nothing less than a spiritual and political renewal of the country. Third, and following from the second, is the emergence within the American liberal tradition of a new emphasis on expanding individual rights and ending invidious discrimination. Forged in conjunction with the black freedom movement by women, Latino/as, Asian Americans, Native Americans, and homophiles (as early gay rights activists were called) and gay liberationists, this new emphasis profoundly changed American law and set the terms of political debate for the next half century. Fourth and lastly, the 1960s witnessed the flourishing of a broad and diverse culture of anti-authoritarianism. In art, politics, and social behavior, this anti-authoritarianism took many forms, but at its heart lay two distinct historical phenomena: an ecstatic celebration of youth, manifest in the tension between the World War II generation and the baby boom generation, and an intensification of the long-standing conflict in American life between individualism and hierarchical order.
Despite the disruptions, rebellions, and challenges to authority in the decade, the political and economic elite proved remarkably resilient and preserved much of the prevailing order. This is not to discount the foregoing account of challenges to that order or to suggest that social change in the 1960s made little difference in American life. However, in grappling with this fascinating decade we are confronted with the paradox of outsized events and enormous transformations in law, ideology, and politics alongside a continuation, even an entrenchment, of traditional economic and political structures and practices.
The decade of the 1980s represented a turning point in American history—a crucial era, marked by political conservatism and an individualistic ethos. The 1980s also witnessed a dramatic series of developments in U.S. foreign relations, first an intensification of the Cold War with the Soviet Union and then a sudden relaxation of tensions and the effective end of the Cold War with an American victory. All of these developments were advanced and symbolized in the presidential administration of Ronald Reagan (1981–1989), a polarizing figure but a highly successful political leader. Reagan dominates our memories of the 1980s as few other American leaders dominate other eras. Reagan and the political movement he led—Reaganism—are central to the history of the 1980s. Both their successes and their failures, which became widely acknowledged in the later years of the decade, should be noted. Reaganite conservatives won political victories by rolling back state power in many realms, most of all in terms of taxation and regulation. They also succeeded in putting America at the unquestioned pinnacle of the world order through victory over the Soviet Union in the Cold War, although this outcome was unforeseen by America’s Cold Warriors when the 1980s began. The failures of Reaganite conservatism include its handling of rising poverty levels, the HIV/AIDS crisis, and worsening racial tensions, all problems that Reaganites either did little to stem or actively worsened. In foreign affairs, Reaganites pursued a “war on terror” of questionable success, and their approach to Third World arenas of conflict, including Central America, exacted a terrible human toll.
The 1950s have typically been seen as a complacent, conservative time between the end of World War II and the radical 1960s, when anticommunism and the Cold War subverted reform and undermined civil liberties. But the era can also be seen as a very liberal time in which meeting the Communist threat led to Keynesian economic policies, the expansion of New Deal programs, and advances in civil rights. Politically, it was “the Eisenhower Era,” dominated by a moderate Republican president, a high level of bipartisan cooperation, and a foreign policy committed to containing communism. Culturally, it was an era of middle-class conformity, which also gave us abstract expressionism, rock and roll, Beat poetry, and a grassroots challenge to Jim Crow.
Emerson W. Baker
The Salem Witch Trials are one of the best-known, most studied, and most important events in early American history. The afflictions started in Salem Village (present-day Danvers), Massachusetts, in January 1692, and by the end of the year the outbreak had spread throughout Essex County and threatened to bring down the newly formed Massachusetts Bay government of Sir William Phips. It may even have helped trigger a witchcraft crisis in Connecticut that same year. The trials are known for their heavy reliance on spectral evidence and numerous confessions, which helped the accusations grow. A total of 172 people are known to have been formally charged or informally cried out upon for witchcraft in 1692. The victims of witchcraft accusations were usually poor and marginalized members of society, but in 1692 many of the leading members of the colony were accused. George Burroughs, a former minister of Salem Village, was one of the nineteen people convicted and executed. In addition to these victims, one man, Giles Corey, was pressed to death, and five died in prison. The last executions took place in September 1692, but it was not until May 1693 that the last trial was held and the last of the accused was freed from prison.
The trials would have lasting repercussions in Massachusetts and signaled the beginning of the end of the Puritan City upon a Hill, an image of American exceptionalism still regularly invoked. The publication ban issued by Governor Phips to prevent criticism of the government would last three years, but ultimately this effort only ensured that the failure of the government to protect innocent lives would never be forgotten. Pardons and reparations for some of the victims and their families were granted by the government in the early 18th century, and the legislature would regularly take up petitions and discuss further reparations until 1749, more than fifty years after the trials. The last victims were formally pardoned by the governor and legislature of Massachusetts in 2001.
Rachel Hope Cleves
The task of recovering the history of same-sex love among early American women faces daunting challenges of definition and sources. Modern conceptions of same-sex sexuality did not exist in early America, but alternative frameworks did. Many indigenous nations had social roles for female-bodied individuals who lived as men, performed male work, and acquired wives. Early Christian settlers viewed sexual encounters between women as sodomy, but also valued loving dyadic bonds between religious women. Primary sources indicate that same-sex sexual practices existed within western and southern African societies exploited by the slave trade, but little more is known. The word “lesbian” has been used to signify erotics between women since roughly the 10th century, but historians must look to women who led lesbian-like lives in early America rather than to women who self-identified as lesbians. Stories of female husbands who passed as men and married other women were popular in the 18th century. Tales of passing women who served in the military, in the navy, and as pirates also amused audiences and raised the spectre of same-sex sexuality. Some female religious leaders trespassed conventional gender roles and challenged the marital sexual order. Other women conformed to female gender roles, but constructed loving female households. 18th-century pornography depicting lesbian sexual encounters indicates that early Americans were familiar with the concept of sex between women. A few court records exist from prosecutions of early American women for engaging in lewd acts together. Far more common, by the end of the 18th century, were female-authored letters and diaries describing the culture of romantic friendship, which sometimes extended to sexual intimacy. Later in the 19th century, romantic friendship became an important ingredient in the development of lesbian culture and identity.
The United States was extremely reluctant to get drawn into the wars that erupted in Asia in 1937 and Europe in 1939. Deeply disillusioned with the experience of World War I, when the large number of trench warfare casualties had resulted in a peace that many Americans believed betrayed the aims they had fought for, the United States sought to avoid all forms of entangling alliances. Deeply embittered by the Depression, which was widely blamed on international bankers and businessmen, Congress enacted legislation that sought to prevent these actors from drawing the country into another war. The American aim was neutrality, but the underlying strength of the United States made it too big to be impartial—a problem that Roosevelt had to grapple with as Germany, Italy, and Japan began to challenge the international order in the second half of the 1930s.
Ansley T. Erickson
“Urban infrastructure” calls to mind railways, highways, and sewer systems. Yet the school buildings—red brick, limestone, or concrete, low-slung, turreted, or glass-fronted—that hold and seek to shape the city’s children are ubiquitous forms of infrastructure as well. Schools account for one of the largest line items in a municipal budget, and as many as a fifth of a city’s residents spend the majority of their waking hours in school classrooms, hallways, and gymnasiums. In the 19th and 20th centuries urban educational infrastructure grew, supported by a developing consensus for publicly funded and publicly governed schools (if rarely fully accessible to all members of the public). Even before state commitment to other forms of social welfare, from pensions to public health, and to infrastructure, from transit to fire protection, schooling was a government function.
This commitment to public education ultimately was national, but schools in cities had their own story. Schooling in the United States is chiefly a local affair: Constitutional responsibility for education lies with the states; power is then further decentralized as states entrust decisions about school function and funding to school districts. School districts can be as small as a single town or a part of a city. Such localism is one reason that it is possible to speak about schools in U.S. cities as having a particular history, determined as much by the specificities of urban life as by national questions of citizenship, economy, religion, and culture.
While city schools have been distinct, they have also been nationally influential. Urban scale both allowed for and demanded the most extensive educational system-building. Urban growth and diversity galvanized innovation, via exploration in teaching methods, curriculum, and understanding of children and communities. And it generated intense conflict. Throughout U.S. history, urban residents from myriad social, political, religious, and economic positions have struggled to define how schools would operate, for whom, and who would decide.
During the 19th and 20th centuries, U.S. residents struggled over the purposes, funding, and governance of schools in cities shaped by capitalism, nativism, and white supremacy. They built a commitment to schooling as a public function of their cities, with many compromises and exclusions. In the 21st century, old struggles re-emerged in new form, perhaps raising the question of whether schools will continue as public, urban infrastructure.