The US Catholic Church was for most of its history—and, in many places, still is—a working-class church. Chosen for worship by successive waves of immigrants, from the Irish to the Polish to the Mexican, the Church, once it had created an institutional presence, welcomed “these strangers in a strange land.” These immigrants played a major role in creating and sustaining parishes that served both as a soul-sustaining refuge and, in many cases, a way station to the outside world. James Cardinal Gibbons, having learned from the central role that Irish workers played in the Knights of Labor and from the protests against the excommunication of the radical New York priest Edward McGlynn, persuaded the Vatican to take a relatively liberal stance toward the “social question” in the United States. Rerum Novarum, the 1891 papal encyclical, condemned both socialism and competitive capitalism, but more significantly asserted the “natural” right of workers to form unions and to earn a living wage. It was within this religious legitimation of unionism that Irish Catholics came to prominence in the American Federation of Labor, that Monsignor John A. Ryan created a US Catholic social justice intellectual tradition, and that US bishops adopted the 1919 Program for Social Reconstruction. The Catholic labor moment came when the Church, led by the National Catholic Welfare Conference’s Social Action Department, midwestern bishops, and labor priests, not only supported the Congress of Industrial Organizations (CIO) but consistently pushed the New Deal to implement the 1919 program. Philip Murray, the CIO’s Catholic president, led the expulsion of the Communist-led unions when the Communist Party, in the Wallace campaign, threatened both the country and everything the CIO had built. On the one hand, this Catholic labor moment dissolved in an overdetermined mixture of complacency, capitalist growth, and anti-Communism.
On the other, a direct line can be traced from California’s labor priests to the Spanish Mission Band to Cesar Chavez and the formation of the United Farm Workers. It took time for the official Church to support the farm workers, but once that happened, it was all in: the support the Church, at all levels, gave them far exceeded anything it had done previously to implement Rerum Novarum.
Religion is at the heart of the Latina/o experience in the United States. It is a deeply personal matter that often shapes political orientations, how people vote, where they live, and the type of family choices they make. Latina/o religious politics—defined as the religious beliefs, ethics, and cultures that motivate social and political action in society—represent the historic interaction between popular and institutional religion. The evolution of Protestantism, Pentecostalism, and Catholic Social Action throughout the late 19th and 20th centuries illuminates the ways in which Latina/o religious communities interacted with movements for social justice.
Perla M. Guerrero
Latinas/os were present in the American South long before the founding of the United States of America, yet knowledge about their southern communities in different places and time periods is deeply uneven. In fact, regional themes important throughout the South clarify the dynamics that shaped Latinas/os’ lives, especially race, ethnicity, and the color line; work and labor; and migration and immigration. Ideas about racial difference, in particular, reflected the specifics of place and the intersections of local, regional, and international endeavors and movements of people and resources. Accordingly, Latinas/os’ position and treatment varied across the South. They first worked in agricultural fields, picking cotton and oranges and harvesting tobacco, and then in a variety of industries, especially poultry and swine processing and packing. The late 20th century saw the rapid growth of Latinas/os in southern states due to changing migration and immigration patterns that shifted from traditional states of reception to new destinations in rural, suburban, and urban locales with limited histories of Latina/o settlement or of substantial immigrant populations in general.
Emily K. Hobson
Since World War II, the United States has witnessed major changes in lesbian, gay, bisexual, transgender, and queer (LGBTQ) politics. Indeed, because the history of LGBTQ activism is almost entirely concentrated in the postwar years, the LGBTQ movement is typically said to have achieved rapid change in a short period of time. But if popular accounts characterize LGBTQ history as a straightforward narrative of progress, the reality is more complex. Postwar LGBTQ politics has been both diverse and divided, marked by differences of identity and ideology. At the same time, LGBTQ politics has been embedded in the contexts of state-building and the Cold War, the New Left and the New Right, the growth of neoliberalism, and the HIV/AIDS epidemic. As the field of LGBTQ history has grown, scholars have increasingly been able to place analyses of state regulation into conversation with community-based histories. Moving between such outside and inside perspectives helps to reveal how multiple modes of LGBTQ politics have shaped one another and how they have been interwoven with broader social change. Looking from the outside, it is apparent that LGBTQ politics has been catalyzed by exclusions from citizenship; from the inside, we can see that activists have responded to such exclusions in different ways, including both by seeking social inclusion and by rejecting assimilationist terms. Court rulings and the administration of law have run alongside the debates inside activist communities. Competing visions for LGBTQ politics have centered around both leftist and liberal agendas, as well as viewpoints shaped by race, gender, gender expression, and class.
Landon R. Y. Storrs
The second Red Scare refers to the fear of communism that permeated American politics, culture, and society from the late 1940s through the 1950s, during the opening phases of the Cold War with the Soviet Union. This episode of political repression lasted longer and was more pervasive than the Red Scare that followed the Bolshevik Revolution and World War I. Popularly known as “McCarthyism” after Senator Joseph McCarthy (R-Wisconsin), who made himself famous in 1950 by claiming that large numbers of Communists had infiltrated the U.S. State Department, the second Red Scare predated and outlasted McCarthy, and its machinery far exceeded the reach of a single maverick politician. Nonetheless, “McCarthyism” became the label for the tactic of undermining political opponents by making unsubstantiated attacks on their loyalty to the United States.
The initial infrastructure for waging war on domestic communism was built during the first Red Scare, with the creation of an antiradicalism division within the Federal Bureau of Investigation (FBI) and the emergence of a network of private “patriotic” organizations. With capitalism’s crisis during the Great Depression, the Communist Party grew in numbers and influence, and President Franklin D. Roosevelt’s New Deal program expanded the federal government’s role in providing economic security. The anticommunist network expanded as well, most notably with the 1938 formation of the Special House Committee to Investigate Un-American Activities, which in 1945 became the permanent House Un-American Activities Committee (HUAC). Other key congressional investigation committees were the Senate Internal Security Subcommittee and McCarthy’s Permanent Subcommittee on Investigations. Members of these committees and their staff cooperated with the FBI to identify and pursue alleged subversives. The federal employee loyalty program, formalized in 1947 by President Harry Truman in response to right-wing allegations that his administration harbored Communist spies, soon was imitated by local and state governments as well as private employers. As the Soviets’ development of nuclear capability, a series of espionage cases, and the Korean War enhanced the credibility of anticommunists, the Red Scare metastasized from the arena of government employment into labor unions, higher education, the professions, the media, and party politics at all levels. The second Red Scare did not involve pogroms or gulags, but the fear of unemployment was a powerful tool for stifling criticism of the status quo, whether in economic policy or social relations. Ostensibly seeking to protect democracy by eliminating communism from American life, anticommunist crusaders ironically undermined democracy by suppressing the expression of dissent. 
Debates over the second Red Scare remain lively because they resonate with ongoing struggles to reconcile Americans’ desires for security and liberty.
The history of Muslims in America dates back to the transatlantic mercantile interactions between Europe, Africa, and the Americas. Upon its arrival, Islam became entrenched in American discourses on race and civilization because literate and noble African Muslims, brought to America as slaves, had problematized popular stereotypes of Muslims and black Africans. Furthermore, these enslaved Muslims had to re-evaluate and reconfigure their beliefs and practices to form new communal relations and to make sense of their lives in America.
At the turn of the 20th century, as Muslim immigrants began arriving in the United States from the Middle East, Eastern Europe, and South Asia, they had to establish themselves in an America in which the white race, Protestantism, and progress were conflated to define a triumphalist American national identity, one that allowed varying levels of inclusion for Muslims based on their ethnic, racial, and national backgrounds.
The enormous bloodshed and destruction experienced during World War I ushered in a crisis of confidence in the ideals of the European Enlightenment, as well as in white, Protestant nationalism. It opened up avenues for alternative expressions of progress, which allowed Muslims, along with other nonwhite, non-Christian communities, to engage in political and social organization. Among these organizations were a number of black religious movements that used Islamic beliefs, rites, and symbols to define a black Muslim national identity.
World War II further shifted America away from the religious competition that had earlier defined the nation’s identity and toward a “civil religion” of American democratic values and political institutions. Although this inclusive rhetoric was received differently along racial and ethnic lines, it opened the way for greater visibility for Muslims in America. After World War II, increased commercial and diplomatic relations between the United States and Muslim-majority countries put American Muslims in a position not only to relate Islam and America in their own lives but also to mediate between the varying interests of Muslim-majority countries and the United States.
Following the civil rights legislation of the 1950s and 1960s and the passage of the Immigration Act of 1965, Muslim activists, many of whom had been politicized by anticolonial movements abroad, established new Islamic institutions. Eventually, a window was opened between the US government and American Muslim activists, who found a common enemy in communism following the Soviet occupation of Afghanistan in the 1980s.
Since the late 1960s, the number of Muslims in the United States has grown significantly. Today, Muslims are estimated to constitute a little more than 1 percent of the US population. However, with the fall of the Soviet Union and the rise of the United States as the sole superpower in the world, the United States has come into military conflict with Muslim-majority countries and has been the target of attacks by militant Muslim organizations. This has led to the cultivation of the binaries of “Islam and the West” and of “good” Islam and “bad” Islam, which have contributed to the racialization of American Muslims. It has also interpellated them into a reality external to their history and lived experiences as Muslims and Americans.
Luke A. Nichter
Assessments of President Richard Nixon’s foreign policy continue to evolve as scholars tap new possibilities for research. Due to the long wait before national security records are declassified by the National Archives and made available to researchers and the public, only in recent decades has the excavation of the Nixon administration’s engagement with the world started to become well documented. As more records are released by the National Archives (including potentially 700 hours of Nixon’s secret White House tapes that remain closed), scholarly understanding of the Nixon presidency is likely to continue changing. Thus far, historians have pointed to four major legacies of Nixon’s foreign policy: tendencies to use American muscle abroad on a more realistic scale, to reorient the focus of American foreign policy to the Pacific, to reduce the chance that the Cold War could turn hot, and, inadvertently, to contribute to the later rise of Ronald Reagan and the Republican right wing—many of whom had been part of Nixon’s “silent majority.” While earlier works focused primarily on subjects like Vietnam, China, and the Soviet Union, the historiography today is much more diverse; there is now at least one work covering most major aspects of Nixon’s foreign policy.
The relationship between organized labor and the civil rights movement proceeded along two tracks. At work, the two groups were adversaries, as civil rights groups criticized employment discrimination by the unions. But in politics, they allied. Unions and civil rights organizations partnered to support liberal legislation and to oppose conservative southern Democrats, who were as militant in opposing unions as they were fervent in supporting white supremacy.
At work, unions dithered in their efforts to root out employment discrimination. Their initial enthusiasm for Title VII of the 1964 Civil Rights Act, which outlawed employment discrimination, waned the more the new law violated foundational union practices by infringing on the principle of seniority, emphasizing the rights of the individual over the group, and inserting the courts into the workplace. The two souls of postwar liberalism—labor solidarity represented by unions and racial justice represented by the civil rights movement—were in conflict at work.
Although the unions and civil rights activists were adversaries over employment discrimination, they united in trying to register southern blacks to vote. Black enfranchisement would end the South’s exceptionalism and the veto it exercised over liberal legislation in Congress. But the two souls of liberalism that were at odds over the meaning of fairness at work would also diverge at the ballot box. As white workers began to defect from the Democratic Party, the political coalition of black and white workers that union leaders had hoped to build was undermined from below. The divergence between the two souls of liberalism in the 1960s—economic justice represented by unions and racial justice represented by civil rights—helps explain the resurgence of conservatism that followed.
Jessica M. Chapman
The origins of the Vietnam War can be traced to France’s colonization of Indochina in the late 1880s. The Viet Minh, led by Ho Chi Minh, emerged as the dominant anti-colonial movement by the end of World War II, though Viet Minh leaders encountered difficulties as they tried to consolidate their power on the eve of the First Indochina War against France. While that war was, initially, a war of decolonization, it became a central battleground of the Cold War by 1950. The lines of future conflict were drawn that year when the People’s Republic of China and the Soviet Union recognized and provided aid to the Democratic Republic of Vietnam in Hanoi, followed almost immediately by Washington’s recognition of the State of Vietnam in Saigon. From that point on, American involvement in Vietnam was most often explained in terms of the Domino Theory, articulated by President Dwight D. Eisenhower on the eve of the Geneva Conference of 1954. The Franco-Viet Minh ceasefire reached at Geneva divided Vietnam in two at the 17th parallel, with countrywide reunification elections slated for the summer of 1956. However, the United States and its client, Ngo Dinh Diem, refused to participate in talks preparatory to those elections, preferring instead to build South Vietnam as a non-communist bastion. While the Vietnamese communist party, known as the Vietnam Workers’ Party in Hanoi, initially hoped to reunify the country by peaceful means, it reached the conclusion by 1959 that violent revolution would be necessary to bring down the “American imperialists and their lackeys.” In 1960, the party formed the National Liberation Front for Vietnam and, following Diem’s assassination in 1963, passed a resolution to wage all-out war in the south in an effort to claim victory before the United States committed combat troops. After President John F. Kennedy took office in 1961, he responded to deteriorating conditions in South Vietnam by militarizing the American commitment, though he stopped short of introducing dedicated ground troops. After Diem and Kennedy were assassinated in quick succession in November 1963, Lyndon Baines Johnson took office determined to avoid defeat in Vietnam, but hoping to prevent the issue from interfering with his domestic political agenda. As the situation in South Vietnam became more dire, LBJ found himself unable to maintain the middle-of-the-road approach that Kennedy had pursued. Forced to choose between escalation and withdrawal, he chose the former in March 1965 by launching a sustained campaign of aerial bombardment, coupled with the introduction of the first officially designated U.S. combat forces to Vietnam.
Peace activism in the United States between 1945 and the 2010s focused mostly on opposition to U.S. foreign policy, efforts to strengthen and foster international cooperation, and support for nuclear nonproliferation and arms control. The onset of the Cold War between the United States and the Soviet Union marginalized a reviving postwar American peace movement emerging from concerns about atomic and nuclear power and worldwide nationalist politics that everywhere seemed to foster conflict, not peace. Still, peace activism continued to evolve in dynamic ways and to influence domestic politics and international relations.
Most significantly, peace activists pioneered the use of Gandhian nonviolence in the United States and provided critical assistance to the African American civil rights movement, led the postwar antinuclear campaign, played a major role in the movement against the war in Vietnam, helped to move the liberal establishment (briefly) toward a more dovish foreign policy in the early 1970s, and helped to shape the political culture of American radicalism. Despite these achievements, the peace movement never regained the political legitimacy and prestige it held in the years before World War II, and it struggled with internal divisions about ideology, priorities, and tactics.
Histories of peace activism written in the 20th century tended to emphasize organizational or biographical approaches that sometimes carried hagiographic overtones. More recently, historians have applied the methods of cultural history, examining the role of religion, gender, and race in structuring peace activism. The transnational and global turn in the historical discipline has also begun to make inroads in peace scholarship. These are promising new directions because they situate peace activism within larger historical and cultural developments and relate peace history to broader historiographical debates and trends.
The reproductive experiences of women and girls in the 20th-century United States followed historical patterns shaped by the politics of race and class. Laws and policies governing reproduction generally regarded white women as legitimate reproducers and potentially fit mothers and defined women of color as unfit for reproduction and motherhood; regulations provided for rewards and punishments accordingly. In addition, public policy and public rhetoric defined “population control” as the solution to a variety of social and political problems in the United States, including poverty, immigration, the “quality” of the population, environmental degradation, and “overpopulation.” Throughout the century, nonetheless, women, communities of color, and impoverished persons challenged official efforts, at times reducing or even eliminating barriers to reproductive freedom and community survival.
Between 1900 and 1930, decades marked by increasing urbanization, industrialization, and immigration, eugenic fears of “race suicide” (concerns that white women were not having enough babies) fueled a reproductive control regime that pressured middle-class white women to reproduce robustly. At the same time, the state enacted anti-immigrant laws, undermined the integrity of Native families, and protected various forms of racial segregation and white supremacy, all of which attacked the reproductive dignity of millions of women. Also in these decades, many African American women escaped the brutal and sexually predatory Jim Crow culture of the South, and middle-class white women gained greater sexual freedom and access to reproductive health care, including contraceptive services.
During the Great Depression, the government devised the Aid to Dependent Children program to provide destitute “worthy” white mothers with government aid while often denying such supports to women of color forced to subordinate their motherhood to agricultural and domestic labor. Following World War II, as the Civil Rights movement gathered form, focus, and adherents, and as African American and other women of color claimed their rights to motherhood and social provision, white policymakers railed against “welfare queens” and defined motherhood as a class privilege, suitable only for those who could afford to give their children “advantages.” The state, invoking the “population bomb,” fought to reduce the birth rates of poor women and women of color through sterilization and mandatory contraception, among other strategies. Between 1960 and 1980, white feminists employed the consumerist language of “choice” as part of the campaign for legalized abortion, even as Native, black, Latina, immigrant, and poor women struggled to secure the right to give birth to and raise their children with dignity and safety. The last decades of the 20th century saw severe cuts in social programs designed to aid low-income mothers and their children, cuts to funding for public education and housing, court decisions that dramatically reduced poor women’s access to reproductive health care including abortion, and the emergence of a powerful, often violent, anti-abortion movement. 
In response, in 1994 a group of women of color activists articulated the theory of reproductive justice, splicing together “social justice” and “reproductive rights.” The resulting Reproductive Justice movement, which would become increasingly influential in the 21st century, defined reproductive health, rights, and justice as human rights due to all persons and articulated what each individual requires to achieve these rights: the right not to have children, the right to have children, and the right to the social, economic, and environmental conditions necessary to raise children in healthy, peaceful, and sustainable households and communities.
Rosina A. Lozano
Language rights are an integral part of civil rights. They provide the tools that permit individuals to engage with and participate in society. The broad use of the Spanish language in the United States by both citizens and immigrants—it is the second-most-spoken language in the country by far—has a long history. Spanish was the first European governing language in parts of the future United States that included the Southwest, portions of the Louisiana Purchase, and Florida. The use of the language did not disappear when these regions became part of the United States, but rather persisted in some locales as a politically important language. In the 20th century, Spanish-speaking immigrants entered not just the Southwest and Florida, but also Chicago, New York, the South, Michigan, and other locales across the country in large numbers. Throughout the 20th century and into the 21st century, Spanish speakers and their advocates have reasserted their cultural preference by fighting for monolingual speakers’ right to use Spanish in legal settings, in public, as voters, as elected officials, at work, and in education. The politics of the Spanish language have only grown in importance as the largest influx of Spanish-speaking immigrants ever has entered the United States. This demographic shift makes the longer history of Spanish a crucial backstory for future language-policy decisions.
Maureen A. Flanagan
The decades from the 1890s into the 1920s produced reform movements in the United States that resulted in significant changes to the country’s social, political, cultural, and economic institutions. The impulse for reform emanated from a pervasive sense that the country’s democratic promise was failing. Political corruption seemed endemic at all levels of government. An unregulated capitalist industrial economy exploited workers and threatened to create a serious class divide, especially as the legal system protected the rights of business over labor. Mass urbanization was shifting the country from a rural, agricultural society to an urban, industrial one characterized by poverty, disease, crime, and cultural clash. Rapid technological advancements brought new, and often frightening, changes into daily life that left many people feeling that they had little control over their lives. Movements for socialism, woman suffrage, and rights for African Americans, immigrants, and workers belied the rhetoric of the United States as a just and equal democratic society for all its members.
Responding to the challenges presented by these problems, and fearful that without substantial change the country might experience class upheaval, groups of Americans proposed undertaking significant reforms. Underlying all proposed reforms was a desire to bring more justice and equality into a society that seemed increasingly to lack these ideals. Yet there was no agreement among these groups about the exact threat that confronted the nation, the means to resolve problems, or how to implement reforms. Despite this lack of agreement, all so-called Progressive reformers were modernizers. They sought to make the country’s democratic promise a reality by confronting its flaws and seeking solutions. All Progressivisms were seeking a via media, a middle way between relying on older ideas of 19th-century liberal capitalism and the more radical proposals to reform society through either social democracy or socialism. Despite differences among Progressives, the types of Progressivisms put forth, and the successes and failures of Progressivism, this reform era raised into national discourse debates over the nature and meaning of democracy, how and for whom a democratic society should work, and what it meant to be a forward-looking society. It also led to the implementation of an activist state.
Laura A. Belmonte
From the revolutionary era to the post-9/11 years, public and private actors have attempted to shape U.S. foreign relations by persuading mass audiences to embrace particular policies, people, and ways of life. Although the U.S. government conducted wartime propaganda activities prior to the 20th century, it had no official propaganda agency until the Committee on Public Information (CPI) was formed in 1917. For the next two years, CPI aimed to generate popular support for the United States and its allies in World War I. In 1938, as part of its Good Neighbor Policy, the Franklin Roosevelt administration launched official informational and cultural exchanges with Latin America. Following American entry into World War II, the U.S. government created a new propaganda agency, the Office of War Information (OWI). Like CPI, OWI was disbanded once hostilities ended. But in the fall of 1945, to combat the threats of anti-Americanism and communism, President Harry S. Truman broke with precedent and ordered the continuation of U.S. propaganda activities in peacetime. After several reorganizations within the Department of State, all U.S. cultural and information activities came under the purview of the newly created U.S. Information Agency (USIA) in 1953. Following the dissolution of USIA in 1999, the State Department reassumed authority over America’s international information and cultural programs through its Office of International Information Programs.
Public authorities are agencies created by governments to engage directly in the economy for public purposes. They differ from standard agencies in that they operate outside the administrative framework of democratically accountable government. Since they generate their own operating income by charging users for goods and services and borrow for capital expenses based on projections of future revenues, they can avoid both input from voters and the regulations that control public agencies funded by tax revenues.
Institutions built on the public authority model exist at all levels of government and in every state. A few of these enterprises, such as the Tennessee Valley Authority and the Port Authority of New York and New Jersey, are well known. Thousands more toil in relative obscurity, operating toll roads and bridges, airports, transit systems, cargo ports, entertainment venues, sewer and water systems, and even parking garages. Despite their ubiquity, these agencies are not well understood. Many release little information about their internal operations. It is not even possible to say conclusively how many exist, since experts disagree about how to define them, and states do not systematically track them.
One thing we do know about public authorities is that, over the course of the 20th century, these institutions became a major component of American governance. Immediately following the Second World War, they played a minor role in public finance. But by the early 21st century, borrowing by authorities constituted well over half of all public borrowing at the sub-federal level. This change means that, increasingly, the leaders of these entities, rather than elected officials, make key decisions about where and how to build public infrastructure and steer economic development in the United States.
Joseph E. Hower
Government employees are an essential part of the early-21st-century labor movement in the United States. Teachers, firefighters, and police officers are among the most heavily unionized occupations in America, but public-sector union members also include street cleaners and nurses, janitors and librarians, zookeepers and engineers. Despite cultural stereotypes that continue to associate unions with steel or auto workers, public employees are five times more likely to be members of unions than workers in private industry. Today, nearly half of all union members work for federal, state, or local governments.
It was not always so. Despite a long, rich history of workplace and ballot box activism, government workers were marginal to the broader labor movement until the second half of the 20th century. Excluded from the legal breakthroughs that reshaped American industry in the 1930s, government workers lacked the basic organizing and bargaining rights extended to their private-sector counterparts. A complicated, and sometimes convoluted, combination of discourse and doctrine held that government employees were, as union leader Jerry Wurf later put it, a “servant to a master” rather than “a worker with a boss.” Inspired by the material success of workers in mass industry and moved by the moral clarity of the Black Freedom struggle, government workers demanded an end to their second-class status through one of the most consequential, and least recognized, social movements of the late 20th century. Yet their success at improving the pay, benefits, and conditions of government work also increased the cost of government services, imposing new obligations at a time of dramatic change in the global economy. In the resulting crunch, unionized public workers came under political pressure, particularly from fiscal conservatives who charged that their bargaining rights and political power were incompatible with a new age of austerity and limits.
Radicalism in the United States since 1945 has been varied, complex, and often fragmented, making it difficult to analyze as a coherent movement. Communist and pro-Soviet organizations remained active after World War II, but a proliferation of noncommunist groups in the 1940s and 1950s, formed by those disillusioned with Marxist theory or the Soviet Union, began to chart a new course for the American Left. Eschewing much of the previous focus on labor, the proletariat, and Marxist doctrine, American postwar radical organizations realigned around humanist values, moral action, democracy, and even religion, with tenuous connections to Marxism, if any. The parameters of postwar radical moral theory were not always clearly defined, and questions of strategy and vision caused frequent divisions among activists. Nonetheless, claims of individual dignity and freedom continued to frame left radicalism into the late 20th century, emphasizing identity politics, community-building initiatives, and cultural expression in the streets of U.S. cities and the halls of academia. The presidential campaign of Bernie Sanders in 2016 helped revitalize leftist rhetoric on the national stage with its calls for racial and economic equality on moral terms.
From the founding of the American republic through the 19th century, the nation’s environmental policy mostly centered on promoting American settlers’ conquest of the frontier. Early federal interventions, whether railroad and canal subsidies or land grant acts, led to rapid transformations of the natural environment that inspired a conservation movement by the end of the 19th century. Led by activists and policymakers, this movement sought to protect America’s resources, now jeopardized by expansive industrial infrastructure. During the Gilded Age, the federal government established the world’s first national parks, and in the Progressive Era, politicians such as President Theodore Roosevelt called for the federal government to play a central role in ensuring the efficient utilization of the nation’s ecological bounty. By the early 1900s, conservationists had established new government agencies, such as the U.S. Forest Service and the Bureau of Reclamation, to regulate the consumption of trees, water, and other valuable natural assets. Wise use was the watchword of the day, with environmental managers in Washington’s bureaucracy focused mainly on protecting the economic value latent in America’s ecosystems. However, other groups, such as the Wilderness Society, proved successful at redirecting policy prescriptions toward preserving beautiful and wild spaces, not just conserving resources central to capitalist enterprise. In the 1960s and 1970s, suburban and urban environmental activists drew federal regulators’ attention to the contaminated soil and water under their feet. The era of ecology had arrived, and the federal government now had broad powers through the Environmental Protection Agency (EPA) to manage ecosystems that stretched across the continent. But from the 1980s to the 2010s, the federal government’s authority to regulate the environment waxed and waned as economic crises, often exacerbated by oil shortages, brought environmental agencies under fire.
The Rooseveltian logic of the Progressive Era, which held that America’s economic growth depended on federal oversight of the environment, came under assault from neoliberal disciples of Ronald Reagan, who argued that environmental regulations were in fact the root cause of economic stagnation in America, not a prescription against it. What the country needed, according to the reformers of the New Right, was unregulated expansion into new frontiers. By the 2010s, the contours of these new frontiers were clear: deep-water oil drilling, Bakken shale exploration, and tar-sand excavation in Alberta, Canada. In many ways, the frontier conquest doctrine of colonial Americans found new life in the deregulatory U.S. environmental policy pitched by conservatives in the wake of the Reagan Revolution. Never wholly dominant, this ethos carried on into the era of Donald Trump’s presidency.
In 1835, Alexis de Tocqueville argued in Democracy in America that there were “two great nations in the world.” They had started from different historical points but seemed to be heading in the same direction. As expanding empires, they faced the challenges of defeating nature and constructing a civilization for the modern era. Although they adhered to different governmental systems, “each of them,” de Tocqueville declared, “seems marked out by the will of Heaven to sway the destinies of half the globe.”
De Tocqueville’s words were prophetic. In the 19th century, Russian and American intellectuals and diplomats struggled to understand the roles that their countries should play in the new era of globalization and industrialization. Despite their differing understandings of how development should happen, both sides believed in their nation’s vital role in guiding the rest of the world. American adherents of liberal developmentalism often argued that a free flow of enterprise, trade, investment, information, and culture was the key to future growth. They held that the primary obligation of American foreign policy was to defend that freedom by pursuing an “open door” policy and free access to markets. They believed that the American model would work for everyone and that the United States had an obligation to share its system with the old and underdeveloped nations around it.
A similar sense of mission developed in Russia. Russian diplomats had for centuries struggled to establish defensive buffers around the periphery of their empire. They had linked economic development to national security, and they had argued that their geographic expansion represented a “unification” of peoples as opposed to a conquering of them. In the 19th century, after the Napoleonic Wars and the failed Decembrist Revolution, tsarist policymakers fought to defend autocracy, orthodoxy, and nationalism from domestic and international critics. As in the United States, Imperial and later Soviet leaders envisioned themselves as the emissaries of the Enlightenment to the backward East and as protectors of tradition and order for the chaotic and revolutionary West.
These visions of order clashed in the 20th century as the Soviet Union and the United States became superpowers. Conflicts began early, with the American intervention in the Russian civil war of 1918–1921. Tensions that had previously rested on differing geographic and strategic interests then assumed an ideological valence, as the fight between East and West became a struggle between the political economies of communism and capitalism. Foreign relations between the two countries experienced boom-and-bust cycles that took the world to the brink of nuclear holocaust and yet maintained a strategic balance that precluded the outbreak of global war for fifty years. This article examines how that relationship evolved and how it shaped the modern world.
Steven K. Green
Separation of church and state has long been viewed as a cornerstone of American democracy. At the same time, the concept has remained highly controversial in popular culture and law. Much of the debate over the application and meaning of the phrase focuses on its historical antecedents. This article briefly examines the historical origins of the concept and its subsequent evolution in the nineteenth century.