Benjamin H. Johnson
When rebels captured the border city of Juárez, Mexico, in May 1911 and forced the abdication of President Porfirio Díaz shortly thereafter, they not only overthrew the western hemisphere’s oldest regime but also inaugurated the first social revolution of the 20th century. Driven by disenchantment with an authoritarian regime that catered to foreign investment and fostered labor exploitation and landlessness, revolutionaries dislodged Díaz’s regime, crushed an effort to resurrect it, and then spent the rest of the decade fighting one another for control of the nation. This struggle, recognized ever since as foundational for Mexican politics and identity, also had enormous consequences for the ethnic makeup, border policing, and foreign policy of the United States. Over a million Mexicans fled north during the 1910s, perhaps tripling the country’s Mexican-descent population, most visibly in places such as Los Angeles that had become overwhelmingly Anglo-American. US forces occupied Mexican territory twice, nearly bringing the two nations to outright warfare for the first time since the US–Mexican War of 1846–1848. Moreover, revolutionary violence and radicalism transformed the ways that much of the American population and its government perceived their border with Mexico, providing a rationale for a much more highly policed border and for the increasingly brutal treatment of Mexican-descent people in the United States. The Mexican Revolution was a turning point for Mexico, the United States, and their shared border, and for all who crossed it.
The military history of the American Revolution is more than the history of the War of Independence. The Revolution itself had important military causes. The experience of the Seven Years’ War (which started in 1754 in North America) conditioned British attitudes to the colonies after that conflict was over. From 1764, the British Parliament tried to raise taxes in America to pay for a new permanent military garrison. British politicians resisted colonial objections to parliamentary taxation at least partly because they feared that if the Americans established their right not to be taxed by Westminster, Parliament’s right to regulate colonial overseas trade would then be challenged. If the Americans broke out of the system of trade regulation, British ministers, MPs, and peers worried, then the Royal Navy would be seriously weakened.
The War of Independence, which began in 1775, was not the great American triumph that most accounts suggest. The British army faced a difficult task in suppressing a rebellion three thousand miles from Britain itself. French intervention on the American side in 1778 (followed by the Spanish in 1779, and the Dutch in 1780) made the task still more difficult. In the end, the war in America was won by the French as much as by the Americans. But in the wider imperial conflict, affecting the Caribbean, Central America, Europe, West Africa, and South Asia, the British fared much better. Even in its American dimension, the outcome was less clear cut than we usually imagine. The British, the nominal losers, retained great influence in the independent United States, which in economic terms remained in an essentially dependent relationship with the former mother country.
The relationship between the Church of Jesus Christ of Latter-day Saints—commonly called “Mormonism”—and the politics and culture of the United States is both contentious and intertwined. Historians have commonly observed that Mormonism is in many ways quintessentially American, bearing the marks of the Jacksonian period in which it was born. Its rejection of the denominational leadership of its day, its institution of a lay priesthood, and Joseph Smith’s insistence that revelation trumped scholarship and study all marked it as very much of its time and place, an America in which the authority of common people was exalted and traditional authority was suspect. And yet at the same time, Mormonism was suspect almost immediately upon its birth for those things that made it appear distinctly un-American: the divine power of its prophetic leaders, its rejection of the sole authority of the Bible, its clannishness and separatism, and its defiance of 19th-century sexual morality.
The history of Mormonism in America is in many ways a tug of war between these two impulses. At times the Mormons have embraced what makes them American, have proudly claimed elements of national identity, and have claimed that their faith most truly embodies the American creed. At other times, however, either because of hostility from other Americans or because of their own separatism, Mormons have distanced themselves from the national community and sought a separate community and peoplehood. Through the 19th century, because of the practice of polygamy and the theocratic government of the Utah territory, both Mormons and other Americans perceived a gap between their two communities, but that gap closed by the end of the century, when the federal government used force to eliminate those things Americans most objected to about the faith and Mormons began aggressively pursuing assimilation into American life. By the end of the 20th century, however, Mormonism’s cultural conservatism led both Mormons and other Americans to see that gap opening once more.
The Japanese American Redress Movement refers to the various efforts of Japanese Americans from the 1940s to the 1980s to obtain restitution for their removal and confinement during World War II. This included judicial and legislative campaigns at local, state, and federal levels for recognition of government wrongdoing and compensation for losses, both material and immaterial. The push for redress originated in the late 1940s as the Cold War opened up opportunities for Japanese Americans to demand concessions from the government. During the 1960s and 1970s, Japanese Americans began to connect the struggle for redress with anti-racist and anti-imperialist movements of the time. Despite their growing political divisions, Japanese Americans came together to launch several successful campaigns that laid the groundwork for redress. During the early 1980s, the government increased its involvement in redress by forming a congressional commission to conduct an official review of the World War II incarceration. The commission’s recommendations of monetary payments and an official apology paved the way for the passage of the Civil Liberties Act of 1988 and other redress actions. Beyond its legislative and judicial victories, the redress movement also created a space for collective healing and generated new forms of activism that continue into the present.
Housing in America has long stood as a symbol of the nation’s political values and a measure of its economic health. In the 18th century, a farmhouse represented Thomas Jefferson’s ideal of a nation of independent property owners; in the mid-20th century, the suburban house was seen as an emblem of an expanding middle class. Alongside those well-known symbols were a host of other housing forms—tenements, slave quarters, row houses, French apartments, loft condos, and public housing towers—that revealed much about American social order and the material conditions of life for many people.
Since the 19th century, housing markets have been fundamental forces driving the nation’s economy and a major focus of government policies. Home construction has provided jobs for skilled and unskilled laborers. Land speculation, housing development, and the home mortgage industry have generated billions of dollars in investment capital, while ups and downs in housing markets have been considered signals of major changes in the economy. Since the New Deal of the 1930s, the federal government has buttressed the home construction industry and offered economic incentives for home buyers, giving the United States the highest home ownership rate in the world. The housing market crash of 2008 slashed property values and sparked a rapid increase in home foreclosures, especially in places like Southern California and the suburbs of the Northeast, where housing prices had ballooned over the previous two decades. The real estate crisis led to government efforts to prop up the mortgage banking industry and to assist struggling homeowners. The crisis led, as well, to a drop in rates of home ownership, an increase in rental housing, and a growth in homelessness.
Home ownership remains a goal for many Americans and an ideal long associated with the American dream. The owner-occupied home—whether single-family or multifamily dwelling—is typically the largest investment made by an American family. Through much of the 18th and 19th centuries, housing designs varied from region to region. In the mid-20th century, mass production techniques and national building codes tended to standardize design, especially in new suburban housing. In the 18th century, the family home was a site of waged and unwaged work; it was the center of a farm, plantation, or craftsman’s workshop. Two and a half centuries later, a house was a consumer good: its size, location, and decor marked the family’s status and wealth.
The history of Muslims in America dates back to the transatlantic mercantile interactions between Europe, Africa, and the Americas. Upon its arrival, Islam became entrenched in American discourses on race and civilization because literate and noble African Muslims, brought to America as slaves, had problematized popular stereotypes of Muslims and black Africans. Furthermore, these enslaved Muslims had to re-evaluate and reconfigure their beliefs and practices to form new communal relations and to make sense of their lives in America.
At the turn of the 20th century, as Muslim immigrants began arriving in the United States from the Middle East, Eastern Europe, and South Asia, they had to establish themselves in an America in which the white race, Protestantism, and progress were conflated to define a triumphalist American national identity, one that allowed varying levels of inclusion for Muslims based on their ethnic, racial, and national backgrounds.
The enormous bloodshed and destruction experienced during World War I ushered in a crisis of confidence in the ideals of the European Enlightenment, as well as in white, Protestant nationalism. It opened up avenues for alternative expressions of progress, which allowed Muslims, along with other nonwhite, non-Christian communities, to engage in political and social organization. Among these organizations were a number of black religious movements that used Islamic beliefs, rites, and symbols to define a black Muslim national identity.
World War II further shifted America, away from the religious competition that had earlier defined the nation’s identity and toward a “civil religion” of American democratic values and political institutions. Although this inclusive rhetoric was received differently along racial and ethnic lines, there was an overall appeal for greater visibility for Muslims in America. After World War II, increased commercial and diplomatic relations between the United States and Muslim-majority countries put American Muslims in a position, not only to relate Islam and America in their own lives but also to mediate between the varying interests of Muslim-majority countries and the United States.
Following the civil rights legislation of the 1950s and 1960s and the passage of the Immigration Act of 1965, Muslim activists, many of whom had been politicized by anticolonial movements abroad, established new Islamic institutions. Eventually, a window was opened between the US government and American Muslim activists, who found a common enemy in communism following the Soviet occupation of Afghanistan in the 1980s.
Since the late 1960s, the number of Muslims in the United States has grown significantly. Today, Muslims are estimated to constitute a little more than 1 percent of the US population. However, with the fall of the Soviet Union and the rise of the United States as the sole superpower in the world, the United States has come into military conflict with Muslim-majority countries and has been the target of attacks by militant Muslim organizations. This has led to the cultivation of the binaries of “Islam and the West” and of “good” Islam and “bad” Islam, which have contributed to the racialization of American Muslims. It has also interpellated them into a reality external to their history and lived experiences as Muslims and Americans.
The national parks of the United States have been one of the country’s most popular federal initiatives, and popular not only within the nation but across the globe. The first park was Yellowstone, established in 1872, and since then almost sixty national parks have been added, along with hundreds of monuments, protected rivers and seashores, and important historical sites as well as natural preserves. In 1916 the parks were put under the National Park Service, which has managed them primarily as scenic treasures for growing numbers of tourists. Ecologically minded scientists, however, have challenged that stewardship and called for restoration of parks to their natural conditions, defined as their ecological integrity before white Europeans intervened. The most influential voice in the history of park philosophy remains John Muir, the California naturalist and Yosemite enthusiast and himself a proto-ecologist, who saw the parks as sacred places for a modern nation, where reverence for nature and respect for science might coexist and where tourists could be educated in environmental values. As other nations have created their own park systems, similar debates have occurred. While parks may seem like a great modern idea, this idea has always been embedded in cultural and social change—and subject to struggles over what that “idea” should be.
Urban renewal refers to an interlocking set of national and local policies, programs, and projects, implemented in the vast majority of American cities between 1949 and 1973. These typically entailed major redevelopment of existing urban areas with a view to the modernization of housing, highway infrastructure, commercial and business districts, as well as other large-scale constructions. Reformers from the Progressive Era through the Great Society strove to ameliorate the conditions of poverty and inequality in American cities by focusing primarily on physical transformation of the urban built environment. Citing antecedents such as the reconstruction of Second Empire Paris, imported via the City Beautiful movement, and then updated with midcentury modernism, US urban planners envisioned a radical reorganization of city life. In practice, federal programs and local public authorities targeted the eradication of areas deemed slums or blighted—often as much to socially sanitize neighborhoods inhabited by racial minorities and other marginalized groups as to address deteriorating physical conditions. And while federal funding became available for public works projects in declining central cities under the auspices of improving living conditions for the poor—including providing public housing—urban renewal programs consistently destroyed more affordable housing than they created, over more than three decades. By the end of the 1960s, urban residents and policymakers across the political spectrum concluded that such programs were usually doing more harm than good, and most ended during the Nixon administration. Yet large-scale reminders of urban renewal can still be found in most large US communities, whether in the form of mid-20th-century public housing blocks, transportation projects, stadiums, convention centers, university and hospital expansions, or a variety of public-private redevelopment initiatives. 
But perhaps the most fundamental legacies of all were the institutionalization of the comprehensive zoning and master planning process in cities nationwide, on the one hand, and the countervailing mobilization of defensively oriented (NIMBY) neighborhood politics, on the other.
Nicolas G. Rosenthal
An important relationship has existed between Native Americans and cities from pre-Columbian times to the early 21st century. Long before Europeans arrived in the Americas, indigenous peoples developed societies characterized by dense populations, large-scale agriculture, monumental architecture, and complex social hierarchies. Following European and American conquest and colonization, Native Americans played a crucial role in the development of towns and cities throughout North America, often on the site of former indigenous settlements.
Beginning in the early 20th century, Native Americans began migrating from reservations to U.S. cities in large numbers and formed new intertribal communities. By 1970, the majority of the Native American population lived in cities and the numbers of urban American Indians have been growing ever since. Indian Country in the early 21st century continues to be influenced by the complex and evolving ties between Native Americans and cities.
Wendy L. Wall
The New Deal generally refers to a set of domestic policies implemented by the administration of Franklin Delano Roosevelt in response to the crisis of the Great Depression. Propelled by that economic cataclysm, Roosevelt and his New Dealers pushed through legislation that regulated the banking and securities industries, provided relief for the unemployed, aided farmers, electrified rural areas, promoted conservation, built national infrastructure, regulated wages and hours, and bolstered the power of unions. The Tennessee Valley Authority prevented floods and brought electricity and economic progress to seven states in one of the most impoverished parts of the nation. The Works Progress Administration offered jobs to millions of unemployed Americans and launched an unprecedented federal venture into the arena of culture. By providing social insurance to the elderly and unemployed, the Social Security Act laid the foundation for the U.S. welfare state.
The benefits of the New Deal were not equitably distributed. Many New Deal programs—farm subsidies, work relief projects, social insurance, and labor protection programs—discriminated against racial minorities and women, while benefiting white men disproportionately. Nevertheless, women achieved symbolic breakthroughs, and African Americans benefited more from Roosevelt’s policies than they had from any administration since Abraham Lincoln’s. The New Deal did not end the Depression—only World War II did that—but it did spur economic recovery. It also helped to make American capitalism less volatile by extending federal regulation into new areas of the economy.
Although the New Deal most often refers to policies and programs put in place between 1933 and 1938, some scholars have used the term more expansively to encompass later domestic legislation or U.S. actions abroad that seemed animated by the same values and impulses—above all, a desire to make individuals more secure and a belief in institutional solutions to long-standing problems. In order to pass his legislative agenda, Roosevelt drew many Catholic and Jewish immigrants, industrial workers, and African Americans into the Democratic Party. Together with white Southerners, these groups formed what became known as the “New Deal coalition.” This unlikely political alliance endured long after Roosevelt’s death, supporting the Democratic Party and a “liberal” agenda for nearly half a century. When the coalition finally cracked in 1980, historians looked back on this extended epoch as reflecting a “New Deal order.”
In late 19th- and early 20th-century America, a new image of womanhood emerged that began to shape public views and understandings of women’s role in society.
Whether identified by contemporaries as a Gibson Girl, a suffragist, a Progressive reformer, a bohemian feminist, a college girl, a bicyclist, a flapper, a working-class militant, or a Hollywood vamp, these figures all came to epitomize the New Woman, an umbrella term for modern understandings of femininity. Referring both to real, flesh-and-blood women, and also to an abstract idea or a visual archetype, the New Woman represented a generation of women who came of age between 1890 and 1920 and challenged gender norms and structures by asserting a new public presence through work, education, entertainment, and politics, while also denoting a distinctly modern appearance that contrasted with Victorian ideals. The New Woman became associated with the rise of feminism and the campaign for women’s suffrage, as well as with the rise of consumerism, mass culture, and freer expressions of sexuality that defined the first decades of the 20th century. Emphasizing youth, mobility, freedom, and modernity, the image of the New Woman varied by age, class, race, ethnicity, and geographical region, offering a spectrum of behaviors and appearances with which different women could identify. At times controversial, the New Woman image provided women with opportunities to negotiate new social roles and to promote ideas of equality and freedom that would later become mainstream.
Luke A. Nichter
Assessments of President Richard Nixon’s foreign policy continue to evolve as scholars tap new possibilities for research. Due to the long wait before national security records are declassified by the National Archives and made available to researchers and the public, only in recent decades has the excavation of the Nixon administration’s engagement with the world started to become well documented. As more records are released by the National Archives (including potentially 700 hours of Nixon’s secret White House tapes that remain closed), scholarly understanding of the Nixon presidency is likely to continue changing. Thus far, historians have pointed to four major legacies of Nixon’s foreign policy: tendencies to use American muscle abroad on a more realistic scale, to reorient the focus of American foreign policy to the Pacific, to reduce the chance that the Cold War could turn hot, and, inadvertently, to contribute to the later rise of Ronald Reagan and the Republican right wing—many of whom had been part of Nixon’s “silent majority.” While earlier works focused primarily on subjects like Vietnam, China, and the Soviet Union, the historiography today is much more diverse; now there is at least one work covering most major aspects of Nixon’s foreign policy.
Nicole Etcheson and Cortney Cantrell
During the Civil War, the entire North constituted the homefront, an area largely removed from the din and horror of combat. With a few exceptions, such as raids and the battle of Gettysburg, civilians in the North experienced the war indirectly. The people on the homefront mobilized for war, sent their menfolk off to fight, supplied the soldiers and the army, coped without their breadwinners, and suffered the loss or maiming of men they loved. All the while, however, the homefront was crucially important to the course of the war. The mobilization of northern resources—not just men, but the manufacture of the arms and supplies needed to fight a war—enabled the North to conduct what some have called a total war, one on which the Union expended money and manpower at unprecedented levels. Confederate strategists hoped to break the will of the northern homefront to secure southern independence. Despite the hardships endured in the North, this strategy failed.
On the homefront, women struggled to provide for their families as well as to serve soldiers and the army by sending care packages and doing war work. Family letters reveal the impact of the war on children who lost their fathers either temporarily or permanently. Communities rallied to aid soldiers’ families but were riven by dissension over issues such as conscription and emancipation. Immigrants and African Americans sought a new place in U.S. society by exploiting the opportunities the war offered to prove their worth. Service in the Union army certainly advanced the status of some groups, but was not the only means to that end. Nuns who nursed the wounded improved the reputation of the Catholic Church and northern African Americans used the increasingly emancipationist war goals to improve their legal status in the North. The Civil War altered race relations most radically, but change came to everyone on the northern homefront.
The development of military arms harnessing nuclear energy for mass destruction has inspired continual efforts to control them. Since 1945, the United States, the Soviet Union, the United Kingdom, France, the People’s Republic of China (PRC), Israel, India, Pakistan, North Korea, and South Africa have acquired control over these powerful weapons, though Pretoria dismantled its small cache in 1989 and Russia inherited the Soviet arsenal in 1996. Throughout this period, Washington sought to limit its nuclear forces in tandem with those of Moscow, prevent new states from fielding them, discourage their military use, and even permit their eventual abolition.
Scholars disagree about what explains the United States’ distinct approach to nuclear arms control. The history of U.S. nuclear policy treats intellectual theories and cultural attitudes alongside technical advances and strategic implications. The central debate is one of structure versus agency: whether the weapons’ sheer power, or historical actors’ attitudes toward that power, drove nuclear arms control. Among those who emphasize political responsibility, there are two further disagreements: (1) the relative influence of domestic protest, culture, and politics; and (2) whether U.S. nuclear arms control aimed first at securing the peace by regulating global nuclear forces or at bolstering American influence in the world.
The intensity of nuclear arms control efforts tended to rise or fall with the likelihood of nuclear war. Harry Truman’s faith in the country’s monopoly on nuclear weapons caused him to sabotage early initiatives, while Dwight Eisenhower’s belief in nuclear deterrence led in a similar direction. Fears of a U.S.-Soviet thermonuclear exchange mounted in the late 1950s, stoked by atmospheric nuclear testing and widespread radioactive fallout, which stirred protest movements and diplomatic initiatives. The spread of nuclear weapons to new states motivated U.S. presidents (John Kennedy in the vanguard) to mount a concerted campaign against “proliferation,” climaxing with the 1968 Treaty on the Non-Proliferation of Nuclear Weapons (NPT). Richard Nixon was exceptional. His reasons for signing the Strategic Arms Limitation Treaty (SALT I) and Anti-Ballistic Missile Treaty (ABM) with Moscow in 1972 were strategic: to buttress the country’s geopolitical position as U.S. armed forces withdrew from Southeast Asia. The rise of protest movements and Soviet economic difficulties after Ronald Reagan entered the Oval Office brought about two more landmark U.S.-Soviet accords—the 1987 Intermediate-Range Nuclear Forces Treaty (INF) and the 1991 Strategic Arms Reduction Treaty (START)—the first occasions on which the superpowers eliminated nuclear weapons through treaty. The country’s attention swung to proliferation after the Soviet collapse in December 1991, as failed states, regional disputes, and non-state actors grew more prominent. Although controversies over Iraq, North Korea, and Iran’s nuclear programs have since erupted, Washington and Moscow continued to reduce their arsenals and refine their nuclear doctrines even as President Barack Obama proclaimed his support for a nuclear-free world.
Christopher J. Castañeda
The modern oil industry began in 1859 with Edwin Drake’s discovery of oil at Titusville, Pennsylvania. Since then, this dynamic industry has experienced dramatic episodes of growth, aggressive competition for market share, various forms of corporate organization and cartel-like agreements, and governmental efforts at regulation and control, as well as monopoly, mergers, and consolidation. The history of the oil industry reflects its capital-intensive nature. Immense sums of money are spent on oil discovery, production, and refining projects. Marketing, transportation, and distribution systems likewise require enormous amounts of financing and logistical planning. Although oil is often produced in conjunction with, or in wells pressurized by, natural gas, the oil industry is distinct from the related natural gas industry. Since its origins in the mid-19th century, the oil industry has developed an industrial structure that emphasizes scale and scope to maximize profits. Profits can be huge, which attracts entrepreneurial efforts on individual, corporate, and national scales. By the late 20th through early 21st century, the oil industry had begun confronting questions about long-term viability, combined with an increasingly influential environmental movement that seeks to reduce fossil fuel consumption and prevent its toxic wastes and by-products from polluting human, animal, and natural habitats.
The relationship between organized labor and the civil rights movement proceeded along two tracks. At work, the two groups were adversaries, as civil rights groups criticized employment discrimination by the unions. But in politics, they allied. Unions and civil rights organizations partnered to support liberal legislation and to oppose conservative southern Democrats, who were as militant in opposing unions as they were fervent in supporting white supremacy.
At work, unions dithered in their efforts to root out employment discrimination. Their initial enthusiasm for Title VII of the 1964 Civil Rights Act, which outlawed employment discrimination, waned the more the new law violated foundational union practices by infringing on the principle of seniority, emphasizing the rights of the individual over the group, and inserting the courts into the workplace. The two souls of postwar liberalism—labor solidarity represented by unions and racial justice represented by the civil rights movement—were in conflict at work.
Although the unions and civil rights activists were adversaries over employment discrimination, they united in trying to register southern blacks to vote. Black enfranchisement would end the South’s exceptionalism and the veto it exercised over liberal legislation in Congress. But the two souls of liberalism that were at odds over the meaning of fairness at work would also diverge at the ballot box. As white workers began to defect from the Democratic Party, the political coalition of black and white workers that union leaders had hoped to build was undermined from below. The divergence between the two souls of liberalism in the 1960s—economic justice represented by unions and racial justice represented by civil rights—helps explain the resurgence of conservatism that followed.
Jessica M. Chapman
The origins of the Vietnam War can be traced to France’s colonization of Indochina in the late 1880s. The Viet Minh, led by Ho Chi Minh, emerged as the dominant anti-colonial movement by the end of World War II, though Viet Minh leaders encountered difficulties as they tried to consolidate their power on the eve of the First Indochina War against France. While that war was, initially, a war of decolonization, it became a central battleground of the Cold War by 1950. The lines of future conflict were drawn that year when the People’s Republic of China and the Soviet Union recognized and provided aid to the Democratic Republic of Vietnam in Hanoi, followed almost immediately by Washington’s recognition of the State of Vietnam in Saigon. From that point on, American involvement in Vietnam was most often explained in terms of the Domino Theory, articulated by President Dwight D. Eisenhower on the eve of the Geneva Conference of 1954. The Franco-Viet Minh ceasefire reached at Geneva divided Vietnam in two at the 17th parallel, with countrywide reunification elections slated for the summer of 1956. However, the United States and its client, Ngo Dinh Diem, refused to participate in talks preparatory to those elections, preferring instead to build South Vietnam as a non-communist bastion. While the Vietnamese communist party, known as the Vietnam Workers’ Party in Hanoi, initially hoped to reunify the country by peaceful means, it reached the conclusion by 1959 that violent revolution would be necessary to bring down the “American imperialists and their lackeys.” In 1960, the party formed the National Liberation Front for South Vietnam and, following Diem’s assassination in 1963, passed a resolution to wage all-out war in the south in an effort to claim victory before the United States committed combat troops. After President John F. Kennedy took office in 1961, he responded to deteriorating conditions in South Vietnam by militarizing the American commitment, though he stopped short of introducing dedicated ground troops. After Diem and Kennedy were assassinated in quick succession in November 1963, Lyndon Baines Johnson took office determined to avoid defeat in Vietnam, but hoping to prevent the issue from interfering with his domestic political agenda. As the situation in South Vietnam became more dire, LBJ found himself unable to maintain the middle-of-the-road approach that Kennedy had pursued. Forced to choose between escalation and withdrawal, he chose the former in March 1965 by launching a sustained campaign of aerial bombardment, coupled with the introduction of the first officially designated U.S. combat forces to Vietnam.
Michael E. Donoghue
The United States’ construction and operation of the Panama Canal began as an idea and developed into a reality after prolonged diplomatic machinations to acquire the rights to build the waterway. Once the canal was excavated, a century-long struggle ensued to hold it in the face of Panamanian nationalism. Washington used considerable negotiation and finally gunboat diplomacy to achieve its acquisition of the Canal. The construction of the channel proved a titanic effort with large regional, global, and cultural ramifications. The importance of the Canal as a geostrategic and economic asset was magnified during the two world wars. But rising Panamanian frustration over the U.S. creation of a state-within-a-state via the Canal Zone, one with a discriminatory racial structure, fomented a local movement to wrest control of the Canal from the Americans. The explosion of the 1964 anti-American uprising drove this process forward toward the 1977 Carter-Torrijos treaties that established a blueprint for eventual U.S. retreat and transfer of the channel to Panama at the century’s end. But before that historic handover, the Noriega crisis and the 1989 U.S. invasion nearly derailed the projected transfer of the Canal’s management and control to Panama.
Early historians emphasized high politics, economics, and military considerations in the U.S. acquisition of the Canal. They concentrated on high-status actors, economic indices, and major political contingencies in establishing the U.S. colonial order on the isthmus. Panamanian scholars brought a legalistic and nationalist critique, stressing that Washington did not create Panama and that local voices have largely been ignored in the grand narrative of the Canal as a great act of progressive civilization. More recent U.S. scholarship has focused on American imperialism in Panama and on the role of race, culture, labor, and gender as major factors that shaped the U.S. presence, the structure of the Canal Zone, and Panamanian resistance to its occupation. The roles of historical memory, globalization, and representation, and the question of how the Canal fits into notions of U.S. empire, have also figured more prominently in recent scholarly examinations of this relationship. Contemporary research on the Panama Canal has been supported by numerous archives in the United States and Panama, as well as a variety of newspapers, magazines, novels, and films.
The creation and evolution of urban parks is in some ways a familiar story, especially given the attention that Frederick Law Olmsted’s work has commanded since the early 1970s. Following the success of Central Park, cities across the United States began building parks to meet the recreational needs of residents, and during the second half of the 19th century, Olmsted and his partners designed major parks or park systems in thirty cities. Yet, even that story is incomplete. To be sure, Olmsted believed that every city should have a large rural park as an alternative to the density of building and crowding of the modern metropolis, a place to provide for an “unbending of the faculties,” a process of recuperation from the stresses and strains of urban life. But, even in the mid-1860s he sought to create alternative spaces for other types of recreation. Olmsted and his partner Calvert Vaux successfully persuaded the Prospect Park commission, in Brooklyn, New York, to acquire land for a parade ground south of the park as a place for military musters and athletics; moreover, in 1868 they prepared a plan for a park system in Buffalo, New York, that consisted of three parks, linked by parkways, that served different functions and provided for different forms of recreation. As the decades progressed, Olmsted became a champion of parks designed for active recreation; gymnasiums for women as well as men, especially in working-class areas of cities; and playgrounds for small children. He did so in part to relieve pressure on the large landscape parks to accommodate uses he believed would be inappropriate, but also because he recognized the legitimate demands for new forms of recreation. In later years, other park designers and administrators would similarly add facilities for active recreation, though sometimes in ways that compromised what Olmsted considered the primary purpose of a public park. Urban parks are, in important ways, a microcosm of the nation’s cities. 
Battles over location, financing, political patronage, and use have been a constant. Through it all, parks have evolved to meet the changing recreational needs of residents. And, as dominant a figure as Olmsted has been, this is a story that antedates his professional career and that includes the many voices that have shaped public parks in U.S. cities in the 20th century.
Peace activism in the United States between 1945 and the 2010s focused mostly on opposition to U.S. foreign policy, efforts to strengthen and foster international cooperation, and support for nuclear nonproliferation and arms control. The onset of the Cold War between the United States and the Soviet Union marginalized a reviving postwar American peace movement, which had emerged from concerns about atomic and nuclear power and about worldwide nationalist politics that seemed everywhere to foster conflict, not peace. Still, peace activism continued to evolve in dynamic ways and to influence domestic politics and international relations.
Most significantly, peace activists pioneered the use of Gandhian nonviolence in the United States and provided critical assistance to the African American civil rights movement, led the postwar antinuclear campaign, played a major role in the movement against the war in Vietnam, helped to move the liberal establishment (briefly) toward a more dovish foreign policy in the early 1970s, and helped to shape the political culture of American radicalism. Despite these achievements, the peace movement never regained the political legitimacy and prestige it held in the years before World War II, and it struggled with internal divisions about ideology, priorities, and tactics.
Peace activist histories in the 20th century tended to emphasize organizational or biographical approaches that sometimes carried hagiographic overtones. More recently, historians have applied the methods of cultural history, examining the role of religion, gender, and race in structuring peace activism. The transnational and global turn in the historical discipline has also begun to make inroads in peace scholarship. These are promising new directions because they situate peace activism within larger historical and cultural developments and relate peace history to broader historiographical debates and trends.