61-80 of 111 Results for: Political History

Article

Susan Colbourn

On April 4, 1949, twelve nations signed the North Atlantic Treaty: the United States, Canada, Iceland, the United Kingdom, Belgium, the Netherlands, Luxembourg, France, Portugal, Italy, Norway, and Denmark. For the United States, the North Atlantic Treaty signaled a major shift in foreign policy. Gone was the traditional aversion to “entangling alliances,” dating back to George Washington’s farewell address. The United States had entered into a collective security arrangement designed to preserve peace in Europe. With the creation of the North Atlantic Treaty Organization (NATO), the United States took on a clear leadership role on the European continent. Allied defense depended on US military power, most notably the nuclear umbrella. Reliance on the United States unsurprisingly created problems. Doubts about the strength of the transatlantic partnership and rumors of a NATO in shambles were (and are) commonplace, as were anxieties about the West’s strength in comparison to NATO’s Eastern counterpart, the Warsaw Pact. NATO, it turned out, was more than a Cold War institution. After the fall of the Berlin Wall and the collapse of the Soviet Union, the Alliance remained vital to US foreign policy objectives. The only invocation of Article V, the North Atlantic Treaty’s collective defense clause, came in the wake of the September 11, 2001, terrorist attacks. Over the last seven decades, NATO has symbolized both US power and its challenges.

Article

Assessments of President Richard Nixon’s foreign policy continue to evolve as scholars tap new possibilities for research. Because national security records are declassified by the National Archives and made available to researchers and the public only after a long wait, the Nixon administration’s engagement with the world has become well documented only in recent decades. As more records are released by the National Archives (including potentially 700 hours of Nixon’s secret White House tapes that remain closed), scholarly understanding of the Nixon presidency is likely to continue changing. Thus far, historians have pointed to four major legacies of Nixon’s foreign policy: tendencies to use American muscle abroad on a more realistic scale, to reorient the focus of American foreign policy to the Pacific, to reduce the chance that the Cold War could turn hot, and, inadvertently, to contribute to the later rise of Ronald Reagan and the Republican right wing—many of whom had been part of Nixon’s “silent majority.” While earlier works focused primarily on subjects like Vietnam, China, and the Soviet Union, the historiography today is much more diverse; there is now at least one work covering most major aspects of Nixon’s foreign policy.

Article

The relationship between organized labor and the civil rights movement proceeded along two tracks. At work, the two groups were adversaries, as civil rights groups criticized employment discrimination by the unions. But in politics, they allied. Unions and civil rights organizations partnered to support liberal legislation and to oppose conservative southern Democrats, who were as militant in opposing unions as they were fervent in supporting white supremacy. At work, unions dithered in their efforts to root out employment discrimination. Their initial enthusiasm for Title VII of the 1964 Civil Rights Act, which outlawed employment discrimination, waned the more the new law violated foundational union practices by infringing on the principle of seniority, emphasizing the rights of the individual over the group, and inserting the courts into the workplace. The two souls of postwar liberalism—labor solidarity represented by unions and racial justice represented by the civil rights movement—were in conflict at work. Although the unions and civil rights activists were adversaries over employment discrimination, they united in trying to register southern blacks to vote. Black enfranchisement would end the South’s exceptionalism and the veto it exercised over liberal legislation in Congress. But the two souls of liberalism that were at odds over the meaning of fairness at work would also diverge at the ballot box. As white workers began to defect from the Democratic Party, the political coalition of black and white workers that union leaders had hoped to build was undermined from below. The divergence between the two souls of liberalism in the 1960s—economic justice represented by unions and racial justice represented by civil rights—helps explain the resurgence of conservatism that followed.

Article

Jessica M. Chapman

The origins of the Vietnam War can be traced to France’s colonization of Indochina in the late 1880s. The Viet Minh, led by Ho Chi Minh, emerged as the dominant anti-colonial movement by the end of World War II, though Viet Minh leaders encountered difficulties as they tried to consolidate their power on the eve of the First Indochina War against France. While that war was, initially, a war of decolonization, it became a central battleground of the Cold War by 1950. The lines of future conflict were drawn that year when the People’s Republic of China and the Soviet Union recognized and provided aid to the Democratic Republic of Vietnam in Hanoi, followed almost immediately by Washington’s recognition of the State of Vietnam in Saigon. From that point on, American involvement in Vietnam was most often explained in terms of the Domino Theory, articulated by President Dwight D. Eisenhower on the eve of the Geneva Conference of 1954. The Franco-Viet Minh ceasefire reached at Geneva divided Vietnam in two at the 17th parallel, with countrywide reunification elections slated for the summer of 1956. However, the United States and its client, Ngo Dinh Diem, refused to participate in talks preparatory to those elections, preferring instead to build South Vietnam as a non-communist bastion. While the Vietnamese communist party, known as the Vietnam Workers’ Party in Hanoi, initially hoped to reunify the country by peaceful means, it concluded by 1959 that violent revolution would be necessary to bring down the “American imperialists and their lackeys.” In 1960, the party formed the National Liberation Front of South Vietnam and, following Diem’s assassination in 1963, passed a resolution to wage all-out war in the south in an effort to claim victory before the United States committed combat troops. After President John F. Kennedy took office in 1961, he responded to deteriorating conditions in South Vietnam by militarizing the American commitment, though he stopped short of introducing dedicated ground troops. After Diem and Kennedy were assassinated in quick succession in November 1963, Lyndon Baines Johnson took office determined to avoid defeat in Vietnam, but hoping to prevent the issue from interfering with his domestic political agenda. As the situation in South Vietnam became more dire, LBJ found himself unable to maintain the middle-of-the-road approach that Kennedy had pursued. Forced to choose between escalation and withdrawal, he chose the former in March 1965 by launching a sustained campaign of aerial bombardment, coupled with the introduction of the first officially designated U.S. combat forces to Vietnam.

Article

Leilah Danielson

Peace activism in the United States between 1945 and the 2010s focused mostly on opposition to U.S. foreign policy, efforts to strengthen and foster international cooperation, and support for nuclear nonproliferation and arms control. The onset of the Cold War between the United States and the Soviet Union marginalized a reviving postwar American peace movement, one that had emerged from concerns about atomic and nuclear power and about worldwide nationalist politics that everywhere seemed to foster conflict, not peace. Still, peace activism continued to evolve in dynamic ways and to influence domestic politics and international relations. Most significantly, peace activists pioneered the use of Gandhian nonviolence in the United States and provided critical assistance to the African American civil rights movement, led the postwar antinuclear campaign, played a major role in the movement against the war in Vietnam, helped to move the liberal establishment (briefly) toward a more dovish foreign policy in the early 1970s, and helped to shape the political culture of American radicalism. Despite these achievements, the peace movement never regained the political legitimacy and prestige it held in the years before World War II, and it struggled with internal divisions over ideology, priorities, and tactics. Histories of peace activism written in the 20th century tended to emphasize organizational or biographical approaches that sometimes carried hagiographic overtones. More recently, historians have applied the methods of cultural history, examining the role of religion, gender, and race in structuring peace activism. The transnational and global turn in the historical discipline has also begun to make inroads in peace scholarship. These are promising new directions because they situate peace activism within larger historical and cultural developments and relate peace history to broader historiographical debates and trends.

Article

Patricio N. Abinales

An enduring resilience characterizes the Philippine–American relationship for several reasons. For one, there was an unusual colonial relationship wherein the United States took control of the Philippines from the Spanish and then shared power with an emergent Filipino elite, introduced suffrage, implemented public education, and promised eventual national independence. A shared experience fighting the Japanese in World War II and defeating a postwar communist rebellion further cemented the “special relationship” between the two countries. The United States took advantage of this partnership to compel the Philippines to sign economic and military treaties that favored American businesses and the military, respectively. Filipino leaders not only accepted the realities of this strategic game and exploited every opening to assert national interests but also benefitted from American largesse. Under the dictatorship of President Ferdinand Marcos, this mutual cadging was at its most brazen. As a result, the military alliance suffered when the Philippines terminated the agreement, and the United States considerably reduced its support to the country. But the estrangement did not last long, and both countries rekindled the “special relationship” in response to the U.S. “Global War on Terror” and, of late, Chinese military aggression in the West Philippine Sea.

Article

Historians of colonial British North America have largely relegated piracy to the marginalia of the broad historical narrative from settlement to revolution. However, piracy and unregulated privateering played a pivotal role in the development of every English community along the eastern seaboard from the Carolinas to New England. Although many pirates originated in the British North American colonies and represented a diverse social spectrum, they were supported and protected in these port communities not by some underclass or proto-proletariat but by the highest echelons of colonial society, especially colonial governors, merchants, and even ministers. Sea marauding in its multiple forms helped shape the economic, legal, political, religious, and cultural worlds of colonial America. The illicit market that brought longed-for bullion, slaves, and luxury goods integrated British North American communities with the Caribbean, West Africa, and the Pacific and Indian Oceans throughout the 17th century. Attempts to curb the support of sea marauding at the turn of the 18th century exposed sometimes violent divisions between local merchant interests and royal officials currying favor back in England, leading to debates over the protection of English liberties across the Atlantic. When the North American colonies finally closed their ports to English pirates in the years following the Treaty of Utrecht (1713), they sparked a brief yet dramatic turn of events in which English marauders preyed upon the shipping of their former “nests.” During the 18th century, colonial communities began to actively support a more regulated form of privateering against agreed-upon enemies, a practice that would become a hallmark of patriot maritime warfare during the American Revolution.

Article

The reproductive experiences of women and girls in the 20th-century United States followed historical patterns shaped by the politics of race and class. Laws and policies governing reproduction generally regarded white women as legitimate reproducers and potentially fit mothers and defined women of color as unfit for reproduction and motherhood; regulations provided for rewards and punishments accordingly. In addition, public policy and public rhetoric defined “population control” as the solution to a variety of social and political problems in the United States, including poverty, immigration, the “quality” of the population, environmental degradation, and “overpopulation.” Throughout the century, nonetheless, women, communities of color, and impoverished persons challenged official efforts, at times reducing or even eliminating barriers to reproductive freedom and community survival. Between 1900 and 1930, decades marked by increasing urbanization, industrialization, and immigration, eugenic fears of “race suicide” (concerns that white women were not having enough babies) fueled a reproductive control regime that pressured middle-class white women to reproduce robustly. At the same time, the state enacted anti-immigrant laws, undermined the integrity of Native families, and protected various forms of racial segregation and white supremacy, all of which attacked the reproductive dignity of millions of women. Also in these decades, many African American women escaped the brutal and sexually predatory Jim Crow culture of the South, and middle-class white women gained greater sexual freedom and access to reproductive health care, including contraceptive services. During the Great Depression, the government devised the Aid to Dependent Children program to provide destitute “worthy” white mothers with government aid while often denying such supports to women of color forced to subordinate their motherhood to agricultural and domestic labor. Following World War II, as the Civil Rights movement gathered form, focus, and adherents, and as African American and other women of color claimed their rights to motherhood and social provision, white policymakers railed against “welfare queens” and defined motherhood as a class privilege, suitable only for those who could afford to give their children “advantages.” The state, invoking the “population bomb,” fought to reduce the birth rates of poor women and women of color through sterilization and mandatory contraception, among other strategies. Between 1960 and 1980, white feminists employed the consumerist language of “choice” as part of the campaign for legalized abortion, even as Native, black, Latina, immigrant, and poor women struggled to secure the right to give birth to and raise their children with dignity and safety. The last decades of the 20th century saw severe cuts in social programs designed to aid low-income mothers and their children, cuts to funding for public education and housing, court decisions that dramatically reduced poor women’s access to reproductive health care including abortion, and the emergence of a powerful, often violent, anti-abortion movement. 
In response, in 1994 a group of women of color activists articulated the theory of reproductive justice, splicing together “social justice” and “reproductive rights.” The resulting Reproductive Justice movement, which would become increasingly influential in the 21st century, defined reproductive health, rights, and justice as human rights due to all persons and articulated what each individual requires to achieve these rights: the right not to have children, the right to have children, and the right to the social, economic, and environmental conditions necessary to raise children in healthy, peaceful, and sustainable households and communities.

Article

Language rights are an integral part of civil rights. They provide the tools that permit individuals to engage with and participate in society. The broad use of the Spanish language in the United States by both citizens and immigrants—it is by far the second-most-spoken language in the country—has a long history. Spanish was the first European governing language in parts of the future United States, including the Southwest, portions of the Louisiana Purchase, and Florida. The use of the language did not disappear when these regions became part of the United States, but rather persisted in some locales as a politically important language. In the 20th century, Spanish-speaking immigrants arrived in large numbers not just in the Southwest and Florida but also in Chicago, New York, the South, Michigan, and other locales across the country. Throughout the 20th century and into the 21st, Spanish speakers and their advocates have reasserted their cultural preference for Spanish by fighting for monolingual speakers’ right to use the language in legal settings, in public, as voters, as elected officials, at work, and in education. The politics of the Spanish language have only grown in importance as the largest influx of Spanish-speaking immigrants ever has entered the United States. This demographic shift makes the longer history of Spanish a crucial backstory for future language-policy decisions.

Article

The People’s (or Populist) Party represented the last major third-party effort to prevent the emergence of large-scale corporate capitalism in the United States. Founded in 1891, the party sought to unite the producers of wealth—farmers and workers—into a political coalition dedicated to breaking the hold of private bankers over the nation’s monetary system, controlling monopolies through government ownership, and opening up unused land to actual settlers. Industrial workers and their unions were initially wary of the new party, but things changed after the traumatic labor unrest of 1894: Coxey’s March, the nationwide coal strike, and the Pullman boycott. At that time, the American Federation of Labor (AFL) debated some form of alliance with the Populists. Although the Federation rejected such an alliance in both 1894 and 1895 by the slimmest of margins, it did elect a labor Populist—John McBride of the United Mine Workers of America (UMWA)—to the presidency in 1894. This Populist insurgency represents the closest that the main body of the nation’s labor movement ever came to forming a labor party resembling those that arose in industrialized Europe, and its failure helps explain why American workers were unable to mobilize politically to challenge the emerging economic order dominated by large corporate enterprises. While the agrarian leaders of the People’s Party at first sought the backing of industrial workers, especially those associated with the AFL, they shunned labor’s support after the trauma of 1894. Party officials like Herman Taubeneck, James Weaver, and Tom Watson feared that labor’s support would taint the party with radicalism and violence, warned that trade unionists sought to control the party, and took steps designed to alienate industrial workers. They even justified their retreat from the broad-based Omaha Platform (1892) on the grounds that it would drive the trade unionists they called “socialists” from the party.

Article

The decades from the 1890s into the 1920s produced reform movements in the United States that resulted in significant changes to the country’s social, political, cultural, and economic institutions. The impulse for reform emanated from a pervasive sense that the country’s democratic promise was failing. Political corruption seemed endemic at all levels of government. An unregulated capitalist industrial economy exploited workers and threatened to create a serious class divide, especially as the legal system protected the rights of business over labor. Mass urbanization was shifting the country from a rural, agricultural society to an urban, industrial one characterized by poverty, disease, crime, and cultural clash. Rapid technological advancements brought new, and often frightening, changes into daily life that left many people feeling that they had little control over their lives. Movements for socialism, woman suffrage, and rights for African Americans, immigrants, and workers belied the rhetoric of the United States as a just and equal democratic society for all its members. Responding to the challenges presented by these problems, and fearful that without substantial change the country might experience class upheaval, groups of Americans proposed undertaking significant reforms. Underlying all proposed reforms was a desire to bring more justice and equality into a society that seemed increasingly to lack these ideals. Yet there was no agreement among these groups about the exact threat confronting the nation, the means of resolving problems, or how to implement reforms. Despite this lack of agreement, all so-called Progressive reformers were modernizers. They sought to make the country’s democratic promise a reality by confronting its flaws and seeking solutions. All Progressivisms sought a via media, a middle way between the older ideas of 19th-century liberal capitalism and more radical proposals to reform society through either social democracy or socialism. Whatever the differences among Progressives, the varieties of Progressivism put forth, and the movement’s successes and failures, this reform era raised into national discourse debates over the nature and meaning of democracy, how and for whom a democratic society should work, and what it meant to be a forward-looking society. It also led to the creation of an activist state.

Article

From the revolutionary era to the post-9/11 years, public and private actors have attempted to shape U.S. foreign relations by persuading mass audiences to embrace particular policies, people, and ways of life. Although the U.S. government conducted wartime propaganda activities prior to the 20th century, it had no official propaganda agency until the Committee on Public Information (CPI) was formed in 1917. For the next two years, CPI aimed to generate popular support for the United States and its allies in World War I. In 1938, as part of its Good Neighbor Policy, the Franklin Roosevelt administration launched official informational and cultural exchanges with Latin America. Following American entry into World War II, the U.S. government created a new propaganda agency, the Office of War Information (OWI). Like CPI, OWI was disbanded once hostilities ended. But in the fall of 1945, to combat the threats of anti-Americanism and communism, President Harry S. Truman broke with precedent and ordered the continuation of U.S. propaganda activities in peacetime. After several reorganizations within the Department of State, all U.S. cultural and information activities came under the purview of the newly created U.S. Information Agency (USIA) in 1953. Following the dissolution of USIA in 1999, the State Department reassumed authority over America’s international information and cultural programs through its Office of International Information Programs.

Article

Gail Radford

Public authorities are agencies created by governments to engage directly in the economy for public purposes. They differ from standard agencies in that they operate outside the administrative framework of democratically accountable government. Since they generate their own operating income by charging users for goods and services and borrow for capital expenses based on projections of future revenues, they can avoid input from voters and the regulations that control public agencies funded by tax revenues. Institutions built on the public authority model exist at all levels of government and in every state. A few of these enterprises, such as the Tennessee Valley Authority and the Port Authority of New York and New Jersey, are well known. Thousands more toil in relative obscurity, operating toll roads and bridges, airports, transit systems, cargo ports, entertainment venues, sewer and water systems, and even parking garages. Despite their ubiquity, these agencies are not well understood. Many release little information about their internal operations. It is not even possible to say conclusively how many exist, since experts disagree about how to define them, and states do not systematically track them. One thing we do know about public authorities is that, over the course of the 20th century, these institutions became a major component of American governance. Immediately following the Second World War, they played a minor role in public finance. But by the early 21st century, borrowing by authorities constituted well over half of all public borrowing at the sub-federal level. This change means that, increasingly, the leaders of these entities, rather than elected officials, make key decisions about where and how to build public infrastructure and steer economic development in the United States.

Article

Joseph E. Hower

Government employees are an essential part of the early-21st-century labor movement in the United States. Teachers, firefighters, and police officers are among the most heavily unionized occupations in America, but public-sector union members also include street cleaners and nurses, janitors and librarians, zookeepers and engineers. Despite cultural stereotypes that continue to associate unions with steel or auto workers, public employees are five times more likely to be union members than workers in private industry. Today, nearly half of all union members work for federal, state, or local governments. It was not always so. Despite a long, rich history of workplace and ballot-box activism, government workers were marginal to the broader labor movement until the second half of the 20th century. Excluded from the legal breakthroughs that reshaped American industry in the 1930s, government workers lacked the basic organizing and bargaining rights extended to their private-sector counterparts. A complicated, and sometimes convoluted, combination of discourse and doctrine held that government employees were, as union leader Jerry Wurf later put it, a “servant to a master” rather than “a worker with a boss.” Inspired by the material success of workers in mass industry and moved by the moral clarity of the Black Freedom struggle, government workers demanded an end to their second-class status through one of the most consequential, and least recognized, social movements of the late 20th century. Yet their success at improving the pay, benefits, and conditions of government work also increased the cost of government services, imposing new obligations at a time of dramatic change in the global economy. In the resulting crunch, unionized public workers came under political pressure, particularly from fiscal conservatives who charged that their bargaining rights and political power were incompatible with a new age of austerity and limits.

Article

Adrian Chastain Weimer

Founded in the late 1640s, Quakerism reached America in the 1650s and quickly took root due to the determined work of itinerant missionaries over the next several decades. Quakers, or members of the Society of Friends, faced different legal and social challenges in each colony. Many English men and women viewed Friends with hostility because they refused to bear arms in a colony’s defense or take loyalty oaths. Others were drawn to Quakers’ egalitarian message of universal access to the light of Christ in each human being. After George Fox’s visit to the West Indies and the mainland colonies in 1671–1672, Quaker missionaries followed his lead in trying to include enslaved Africans and Native Americans in their meetings. Itinerant Friends were drawn to colonies with the most severe laws, seeking a public platform from which to display, through suffering, a joyful witness to the truth of the Quaker message. English Quakers then quickly ushered accounts of their sufferings into print. Organized and supported by English Quakers such as Margaret Fell, the Quaker “invasion” of itinerant missionaries put pressure on colonial judicial systems to define the acceptable boundaries for dissent. Nascent communities of Friends from Barbados to New England struggled with the tension between Quaker ideals and the economic and social hierarchies of colonial societies.

Article

Radicalism in the United States since 1945 has been varied, complex, and often fragmented, making it difficult to analyze as a coherent movement. Communist and pro-Soviet organizations remained active after World War II, but a proliferation of noncommunist groups in the 1940s and 1950s, formed by those disillusioned by Marxist theory or the Soviet Union, began to chart a new course for the American Left. Eschewing much of the previous focus on labor, the proletariat, and Marxist doctrine, American postwar radical organizations realigned around humanist values, moral action, democracy, and even religion, with tenuous connections to Marxism, if any. The parameters of postwar radical moral theory were not always clearly defined, and questions of strategy and vision caused frequent divisions among activists. Nonetheless, claims of individual dignity and freedom continued to frame left radicalism into the late 20th century, emphasizing identity politics, community-building initiatives, and cultural expression in the streets of U.S. cities and the halls of academia. The presidential campaign of Bernie Sanders in 2016 helped revitalize leftist rhetoric on the national stage with its calls for racial and economic equality on moral terms.

Article

Ronald Reagan’s foreign policy legacy remains hotly contested, and as new archival sources come to light, those debates are more likely to intensify than to recede into the background. In dealings with the Soviet Union, the Reagan administration set the superpowers on a course for the (largely) peaceful end of the Cold War. Reagan began his outreach to Soviet leaders almost immediately after taking office and enjoyed some success, even if public fears of Reagan as a “button-pusher” remain the dominant image of the period. Mikhail Gorbachev’s election to the post of General Secretary proved the turning point. Reagan, now confident in US strength, and Gorbachev, keen to reduce the financial burden of the arms race, ushered in a new, cooperative phase of the Cold War. Elsewhere, in particular Latin America, the administration’s focus on fighting communism led it to support human rights–abusing regimes at the same time as it lambasted Moscow’s transgressions in that regard. But even so, over the course of the 1980s, the United States began pushing for democratization around the world, even where Reagan and his advisors had initially resisted it, fearing a communist takeover. In part this was a result of public pressure, but the White House recognized and came to support the rising tide of democratization. When Reagan left office, a great many countries that had been authoritarian were no longer, often at least in part because of US policy. US–Soviet relations had improved to such an extent that Reagan’s vice president and successor, George H. W. Bush, worried that the United States had gone too far in working with Gorbachev and been hoodwinked.

Article

From the founding of the American republic through the 19th century, the nation’s environmental policy mostly centered on promoting American settlers’ conquest of the frontier. Early federal interventions, whether railroad and canal subsidies or land grant acts, led to rapid transformations of the natural environment that inspired a conservation movement by the end of the 19th century. Led by activists and policymakers, this movement sought to protect America’s resources now jeopardized by expansive industrial infrastructure. During the Gilded Age, the federal government established the world’s first national parks, and in the Progressive Era, politicians such as President Theodore Roosevelt called for the federal government to play a central role in ensuring the efficient utilization of the nation’s ecological bounty. By the early 1900s, conservationists established new government agencies, such as the U.S. Forest Service and the Bureau of Reclamation, to regulate the consumption of trees, water, and other valuable natural assets. Wise use was the watchword of the day, with environmental managers in DC’s bureaucracy focused mainly on protecting the economic value latent in America’s ecosystems. However, other groups, such as the Wilderness Society, proved successful at redirecting policy prescriptions toward preserving beautiful and wild spaces, not just conserving resources central to capitalist enterprise. In the 1960s and 1970s, suburban and urban environmental activists attracted federal regulators’ attention to contaminated soil and water under their feet. The era of ecology had arrived, and the federal government now had broad powers through the Environmental Protection Agency (EPA) to manage ecosystems that stretched across the continent. But from the 1980s to the 2010s, the federal government’s authority to regulate the environment waxed and waned as economic crises, often exacerbated by oil shortages, brought environmental agencies under fire. The Rooseveltian logic of the Progressive Era, which said that America’s economic growth depended on federal oversight of the environment, came under assault from neoliberal disciples of Ronald Reagan, who argued that environmental regulations were in fact the root cause of economic stagnation in America, not a powerful prescription against it. What the country needed, according to the reformers of the New Right, was unregulated expansion into new frontiers. By the 2010s, the contours of these new frontiers were clear: deep-water oil drilling, Bakken shale exploration, and tar-sand excavation in Alberta, Canada. In many ways, the frontier conquest doctrine of colonial Americans found new life in deregulatory U.S. environmental policy pitched by conservatives in the wake of the Reagan Revolution. Never wholly dominant, this ethos carried on into the era of Donald Trump’s presidency.

Article

While presidents have historically been the driving force behind foreign policy decision-making, Congress has used its constitutional authority to influence the process. The nation’s founders designed a system of checks and balances aimed at establishing a degree of equilibrium in foreign affairs powers. Though the president is the commander-in-chief of the armed forces and the country’s chief diplomat, Congress holds responsibility for declaring war and can also exert influence over foreign relations through its powers over taxation and appropriation, while the Senate possesses authority to approve or reject international agreements. This separation of powers compels the executive branch to work with Congress to achieve foreign policy goals, but it also sets up conflict over what policies best serve national interests and the appropriate balance between executive and legislative authority. Since the founding of the Republic, presidential power over foreign relations has accreted in fits and starts at the legislature’s expense. When core American interests have come under threat, legislators have undermined or surrendered their power by accepting presidents’ claims that defense of national interests required strong executive action. This trend peaked during the Cold War, when invocations of national security enabled the executive to amass unprecedented control over America’s foreign affairs.

Article

In 1835, Alexis de Tocqueville argued in Democracy in America that there were “two great nations in the world.” They had started from different historical points but seemed to be heading in the same direction. As expanding empires, they faced the challenges of defeating nature and constructing a civilization for the modern era. Although they adhered to different governmental systems, “each of them,” de Tocqueville declared, “seems marked out by the will of Heaven to sway the destinies of half the globe.” De Tocqueville’s words were prophetic. In the 19th century, Russian and American intellectuals and diplomats struggled to understand the roles that their countries should play in the new era of globalization and industrialization. Despite their differing understandings of how development should happen, both sides believed in their nation’s vital role in guiding the rest of the world. American adherents of liberal developmentalism often argued that a free flow of enterprise, trade, investment, information, and culture was the key to future growth. They held that the primary obligation of American foreign policy was to defend that freedom by pursuing an “open door” policy and free access to markets. They believed that the American model would work for everyone and that the United States had an obligation to share its system with the old and underdeveloped nations around it. A similar sense of mission developed in Russia. Russian diplomats had for centuries struggled to establish defensive buffers around the periphery of their empire. They had linked economic development to national security, and they had argued that their geographic expansion represented a “unification” of peoples as opposed to a conquering of them. In the 19th century, after the Napoleonic Wars and the failed Decembrist uprising, tsarist policymakers fought to defend autocracy, orthodoxy, and nationalism from domestic and international critics. As in the United States, Imperial and later Soviet leaders envisioned themselves as the emissaries of the Enlightenment to the backward East and as protectors of tradition and order for the chaotic and revolutionary West. These visions of order clashed in the 20th century as the Soviet Union and the United States became superpowers. Conflicts began early, with the American intervention in the 1918–1921 Russian Civil War. Tensions that had previously been based on differing geographic and strategic interests then assumed an ideological valence, as the fight between East and West became a struggle between the political economies of communism and capitalism. Foreign relations between the two countries experienced boom-and-bust cycles that took the world to the brink of nuclear holocaust and yet maintained a strategic balance that precluded the outbreak of global war for fifty years. This article will examine how that relationship evolved and how it shaped the modern world.