1-20 of 165 Results for:

  • 20th Century: Post-1945

Article

During the Holocene, the present geological epoch, an increasing portion of humans began to manipulate the reproduction of plants and animals in a series of environmental practices known as agriculture. No other ecological relationship sustains as many humans as farming; no other has transformed the landscape to the same extent. The domestication of plants by American Indians followed the end of the last glacial maximum (the Ice Age). About eight thousand years ago, the first domesticated maize and squash arrived from central Mexico, spreading to every region and as far north as the subarctic boreal forest. The incursion of Europeans into North America set off widespread deforestation, soil depletion, and the spread of settlement, followed by the introduction of industrial machines and chemicals. A series of institutions sponsored publicly funded research into fertilizers and insecticides. By the late 19th century, writers and activists criticized the technological transformation of farming as destructive to the environment and rural society. During the 20th century, wind erosion contributed to the depopulation of much of the Great Plains. Vast projects in environmental engineering transformed deserts into highly productive regions of intensive fruit and vegetable production. Throughout much of the 19th and 20th centuries, access to land remained limited to whites, with American Indians, African Americans, Latinas/os, Chinese, and peoples of other ethnicities attempting to gain farms or hold on to the land they owned. Two broad periods describe the history of agriculture and the environment in that portion of North America that became the United States. In the first, the environment dominated, forcing humans to adapt during the end of thousands of years of extreme climate variability. In the second, institutional and technological change became more significant, though the environment remained a constant factor against which American agriculture took shape. A related historical pattern within this shift was the capitalist transformation of the United States. For thousands of years, households sustained themselves and exchanged some of what they produced for money. But during the 19th century, among a majority of American farmers, commodity production took over the entire purpose of agriculture, transforming environments to reflect commercial opportunity.

Article

Spanning countries across the globe, the antinuclear movement was the combined effort of millions of people to challenge the superpowers’ reliance on nuclear weapons during the Cold War. Encompassing an array of tactics, from radical dissent to public protest to opposition within the government, this movement succeeded in constraining the arms race and helping to make the use of nuclear weapons politically unacceptable. Antinuclear activists were critical to the establishment of arms control treaties, although they failed to achieve the abolition of nuclear weapons, as anticommunists, national security officials, and proponents of nuclear deterrence within the United States and Soviet Union actively opposed the movement. Opposition to nuclear weapons evolved in tandem with the Cold War and the arms race, leading to a rapid decline in antinuclear activism after the Cold War ended.

Article

Richard N. L. Andrews

Between 1964 and 2017, the United States adopted the concept of environmental policy as a new focus for a broad range of previously disparate policy issues affecting human interactions with the natural environment. These policies ranged from environmental health, pollution, and toxic exposure to management of ecosystems, resources, and use of the public lands, environmental aspects of urbanization, agricultural practices, and energy use, and negotiation of international agreements to address global environmental problems. In doing so, it nationalized many responsibilities that had previously been considered primarily state or local matters. It changed the United States’ approach to federalism by authorizing new powers for the federal government to set national minimum environmental standards and regulatory frameworks with the states mandated to participate in their implementation and compliance. Finally, it explicitly formalized administrative procedures for federal environmental decision-making with stricter requirements for scientific and economic justification rather than merely administrative discretion. In addition, it greatly increased public access to information and opportunities for input, as well as for judicial review, thus allowing citizen advocates for environmental protection and appreciative uses equal legitimacy with commodity producers to voice their preferences for use of public environmental resources. These policies initially reflected widespread public demand and broad bipartisan support. Over several decades, however, they became flashpoints, first, between business interests and environmental advocacy groups and, subsequently, between increasingly ideological and partisan agendas concerning the role of the federal government. Beginning in the 1980s, the long-standing Progressive ideal of the “public interest” was increasingly supplanted by a narrative of “government overreach,” and the 1990s witnessed campaigns to delegitimize the underlying evidence justifying environmental policies by labeling it “junk science” or a “hoax.” From the 1980s forward, the stated priorities of environmental policy vacillated repeatedly between presidential administrations and Congresses supporting continuation and expansion of environmental protection and preservation policies versus those seeking to weaken or even reverse protections in favor of private-property rights and more damaging uses of resources. Yet despite these apparent shifts, the basic environmental laws and policies enacted during the 1970s remained largely in place: political gridlock, in effect, maintained the status quo, with the addition of a very few innovations such as “cap and trade” policies. One reason was that environmental policies retained considerable latent public support: in electoral campaigns, they were often overshadowed by economic and other issues, but they still aroused widespread support in their defense when threatened. Another reason was that decisions by the courts also continued to reaffirm many existing policies and to reject attempts to dismantle them. With the election of Donald Trump in 2016, along with conservative majorities in both houses of Congress, US environmental policy came under the most hostile and wide-ranging attack since its origins. More than almost any other issue, the incoming president targeted environmental policy for rhetorical attacks and budget cuts, and sought to eradicate the executive policies of his predecessor, weaken or rescind protective regulations, and undermine the regulatory and even the scientific capacity of the federal environmental agencies. In the early 21st century, it is as yet unclear how much of his agenda will actually be accomplished, or whether, as in past attempts, much of it will ultimately be blocked by Congress, the courts, public backlash, and business and state government interests seeking stable policy expectations rather than disruptive deregulation.

Article

Joshua Gleich

Over the past seventy years, the American film industry has transformed from mass-producing movies to producing a limited number of massive blockbuster movies on a global scale. Hollywood film studios have moved from independent companies to divisions of media conglomerates. Theatrical attendance for American audiences has plummeted since the mid-1940s; nonetheless, American films have never been more profitable. In 1945, American films could only be viewed in theaters; now they are available in myriad forms of home viewing. Throughout, Hollywood has continued to dominate global cinema, although film and now video production reaches Americans in many other forms, from home videos to educational films. Amid declining attendance, the Supreme Court in 1948 forced the major studios to sell off their theaters. Hollywood studios instead focused their power on distribution, limiting the supply of films and focusing on expensive productions to sell on an individual basis to theaters. Growing production costs and changing audiences caused wild fluctuations in profits, leading to an industry-wide recession in the late 1960s. The studios emerged under new corporate ownership and honed their blockbuster strategy, releasing “high concept” films widely on the heels of television marketing campaigns. New technologies such as cable and VCRs offered new windows for Hollywood movies beyond theatrical release, reducing the risks of blockbuster production. Deregulation through the 1980s and 1990s allowed for the “Big Six” media conglomerates to join film, theaters, networks, publishing, and other related media outlets under one corporate umbrella. This has expanded the scale and stability of Hollywood revenue while reducing the number and diversity of Hollywood films, as conglomerates focus on film franchises that can thrive on various digital media. Technological change has also lowered the cost of non-Hollywood films and thus encouraged a range of alternative forms of filmmaking, distribution, and exhibition.

Article

American Indian activism after 1945 was as much a part of the larger, global decolonization movement rooted in centuries of imperialism as it was a direct response to the ethos of civic nationalism and integration that had gained momentum in the United States following World War II. This ethos manifested itself in the disastrous federal policies of termination and relocation, which sought to end federal services to recognized Indian tribes and encourage Native people to leave reservations for cities. In response, tribal leaders from throughout Indian Country formed the National Congress of American Indians (NCAI) in 1944 to litigate and lobby for the collective well-being of Native peoples. The NCAI was the first intertribal organization to embrace the concepts of sovereignty, treaty rights, and cultural preservation—principles that continue to guide Native activists today. As American Indian activism grew increasingly militant in the late 1960s and 1970s, civil disobedience, demonstrations, and takeovers became the preferred tactics of “Red Power” organizations such as the National Indian Youth Council (NIYC), the Indians of All Tribes, and the American Indian Movement (AIM). At the same time, others established more focused efforts that employed less confrontational methods. For example, the Native American Rights Fund (NARF) served as a legal apparatus that represented Native nations, using the courts to protect treaty rights and expand sovereignty; the Council of Energy Resource Tribes (CERT) sought to secure greater returns on the mineral wealth found on tribal lands; and the American Indian Higher Education Consortium (AIHEC) brought Native educators together to work for greater self-determination and culturally rooted curricula in Indian schools. While the more militant of these organizations and efforts have withered, those that have exploited established channels have grown and flourished. Such efforts will no doubt continue into the foreseeable future so long as the state of Native nations remains uncertain.

Article

American activists who challenged South African apartheid during the Cold War era extended their opposition to racial discrimination in the United States into world politics. US antiapartheid organizations worked in solidarity with forces struggling against the racist regime in South Africa and played a significant role in the global antiapartheid movement. More than four decades of organizing preceded the legislative showdown of 1986, when a bipartisan coalition in Congress overrode President Ronald Reagan’s veto to enact economic sanctions against the apartheid regime in South Africa. Adoption of sanctions by the United States, along with transnational solidarity with the resistance to apartheid by South Africans, helped prompt the apartheid regime to relinquish power and allow the democratic elections that brought Nelson Mandela and the African National Congress to power in 1994. Drawing on the tactics, strategies, and moral authority of the civil rights movement, antiapartheid campaigners mobilized public opinion while increasing African American influence in the formulation of US foreign policy. Long-lasting organizations such as the American Committee on Africa and TransAfrica called for boycotts and divestment while lobbying for economic sanctions. Utilizing tactics such as rallies, demonstrations, and nonviolent civil disobedience actions, antiapartheid activists made their voices heard on college campuses, in corporate boardrooms, in municipal and state governments, and in the halls of Congress. Cultural expressions of criticism and resistance served to reinforce public sentiment against apartheid. Novels, plays, movies, and music provided a way for Americans to connect to the struggles of those suffering under apartheid. By extending the moral logic of the movement for African American civil rights, American antiapartheid activists created a multicultural coalition that brought about institutional and governmental divestment from apartheid, prompted Congress to impose economic sanctions on South Africa, and increased the influence of African Americans regarding issues of race and American foreign policy.

Article

As places of dense habitation, cities have always required coordination and planning. City planning has involved the design and construction of large-scale infrastructure projects to provide basic necessities such as a water supply and drainage. By the 1850s, immigration and industrialization were fueling the rise of big cities, creating immense, collective problems of epidemics, slums, pollution, gridlock, and crime. From the 1850s to the 1900s, both local governments and utility companies responded to this explosive physical and demographic growth by constructing a “networked city” of modern technologies such as gaslight, telephones, and electricity. Building the urban environment also became a wellspring of innovation in science, medicine, and administration. In 1909–1910, a revolutionary idea—comprehensive city planning—opened a new era of professionalization and institutionalization in the planning departments of city halls and universities. Over the next thirty-five years, however, wars and depression limited their influence. The period from 1945 to 1965, in contrast, represents the golden age of formal planning. During this unprecedented period of peace and prosperity, academically trained experts played central roles in the modernization of the inner cities and the sprawl of the suburbs. But the planners’ clean-sweep approach to urban renewal and the massive destruction caused by highway construction provoked a revolt of the grassroots. Beginning in the Watts district of Los Angeles in 1965, mass uprisings escalated over the next three years into a national crisis of social disorder, racial and ethnic inequality, and environmental injustice. The postwar consensus of theory and practice was shattered, replaced by a fragmented profession ranging from defenders of top-down systems of computer-generated simulations to proponents of advocacy planning from the bottom up. Since the late 1980s, the ascendency of public-private partnerships in building the urban environment has favored the planners promoting systems approaches, who promise a future of high-tech “smart cities” under their complete control.

Article

On January 5, 2014—the fiftieth anniversary of President Lyndon Johnson’s launch of the War on Poverty—the New York Times asked a panel of opinion leaders a simple question: “Does the U.S. Need Another War on Poverty?” While the answers varied, all the invited debaters accepted the martial premise of the question—that a war on poverty had been fought and that eliminating poverty was, without a doubt, a “fight,” or a “battle.” Yet the debate over the manner—martial or not—by which the federal government and public policy has dealt with the issue of poverty in the United States is still very much an open-ended one. The evolution and development of the postwar American welfare state is a story not only of a number of “wars,” or individual political initiatives, against poverty, but also about the growth of institutions within and outside government that seek to address, alleviate, and eliminate poverty and its concomitant social ills. It is a complex and at times messy story, interwoven with the wider historical trajectory of this period: civil rights, the rise and fall of a “Cold War consensus,” the emergence of a counterculture, the Vietnam War, the credibility gap, the rise of conservatism, the end of “welfare,” and the emergence of compassionate conservatism. Mirroring the broader organization of the American political system, with a relatively weak center of power and delegated authority and decision-making in fifty states, the welfare model has developed and grown over decades. Policies viewed in one era as unmitigated failures have instead over time evolved and become part of the fabric of the welfare state.

Article

Antimonopoly, meaning opposition to the exclusive or near-exclusive control of an industry or business by one or a very few businesses, played a relatively muted role in the history of the post-1945 era, certainly compared to some earlier periods in American history. However, the subject of antimonopoly is important because it sheds light on changing attitudes toward concentrated power, corporations, and the federal government in the United States after World War II. Paradoxically, as antimonopoly declined as a grass-roots force in American politics, the technical, expert-driven field of antitrust enjoyed a golden age. From the 1940s to the 1960s, antitrust operated on principles that were broadly in line with those that inspired its creation in the late 19th and early 20th century, acknowledging the special contribution small-business owners made to US democratic culture. In these years, antimonopoly remained sufficiently potent as a political force to sustain the careers of national-level politicians such as congressmen Wright Patman and Estes Kefauver and to inform the opinions of Supreme Court justices such as Hugo Black and William O. Douglas. Antimonopoly and consumer politics overlapped in this period. From the mid-1960s onward, Ralph Nader repeatedly tapped antimonopoly ideas in his writings and consumer activism, skillfully exploiting popular anxieties about concentrated economic power. At the same time, as part of the United States’ rise to global hegemony, officials in the federal government’s Antitrust Division exported antitrust overseas, building it into the political, economic, and legal architecture of the postwar world. Beginning in the 1940s, conservative lawyers and economists launched a counterattack against the conception of antitrust elaborated in the progressive era. By making consumer welfare—understood in terms of low prices and market efficiency—the determining factor in antitrust cases, they made a major intellectual and political contribution to the rightward thrust of US politics in the 1970s and 1980s. Robert Bork’s The Antitrust Paradox, published in 1978, popularized and signaled the ascendency of this new approach. In the 1980s and 1990s antimonopoly drifted to the margin of political debate. Fear of big government now loomed larger in US politics than the specter of monopoly or of corporate domination. In the late 20th century, Americans, more often than not, directed their antipathy toward concentrated power in its public, rather than its private, forms. This fundamental shift in the political landscape accounts in large part for the overall decline of antimonopoly—a venerable American political tradition—in the period 1945 to 2000.

Article

In 1964, President Lyndon B. Johnson announced an unconditional “war on poverty.” On one of his first publicity tours promoting his antipoverty legislation, he traveled to cities and towns in Appalachia, which would become crucial areas for promoting and implementing the legislation. Johnson soon signed the Economic Opportunity Act, a piece of legislation that provided a structure for communities to institute antipoverty programs, from vocational services to early childhood education programs, and encouraged the creation of new initiatives. In 1965, Johnson signed the Appalachian Regional Development Act, making Appalachia the only region targeted by federal antipoverty legislation, through the creation of the Appalachian Regional Commission. The Appalachian War on Poverty can be described as a set of policies created by governmental agencies, but also crucial to it was a series of community movements and campaigns, led by working-class people, that responded to antipoverty policies. When the War on Poverty began, the language of policymakers suggested that people living below the poverty line would be served by the programs. But as the antipoverty programs expanded and more local people became involved, they spoke openly and in political terms about poverty as a working-class issue. They drew attention to the politics of class in the region, where elites and absentee landowners became wealthy on the backs of working people. They demanded meaningful participation in shaping the War on Poverty in their communities, and, increasingly, when they used the term “poor people,” they did so as a collective class identity—working people who were poor due to a rigged economy. While many public officials focused on economic development policies, men and women living in the region began organizing around issues ranging from surface mining to labor rights and responding to poor living and working conditions. Taking advantage of federal antipoverty resources and the spirit of change that animated the 1960s, working-class Appalachians would help to shape the antipoverty programs at the local and regional level, creating a movement in the process. They did so as they organized around issues—including the environment, occupational safety, health, and welfare rights—and as they used antipoverty programs as a platform to address the systemic inequalities that plagued many of their communities.

Article

American policy toward the Arab-Israeli conflict has reflected dueling impulses at the heart of US-Middle East relations since World War II: growing support for Zionism and Israeli statehood on the one hand, the need for cheap oil resources and strong alliances with Arab states on the other, unfolding alongside the ebb and flow of concerns over Soviet influence in the region during the Cold War. These tensions have tracked with successive Arab–Israeli conflagrations, from the 1948 war through the international conflicts of 1967 and 1973, as well as shifting modes of intervention in Lebanon, and more recently, the Palestinian uprisings in the occupied territories and several wars on the Gaza Strip. US policy has been shaped by diverging priorities in domestic and foreign policy, a halting recognition of the need to tackle Palestinian national aspirations, and a burgeoning peace process which has drawn American diplomats into the position of mediating between the parties. Against the backdrop of regional upheaval, this long history of involvement continues into the 21st century as the unresolved conflict between Israel and the Arab world faces a host of new challenges.

Article

Jennifer Hoyt

Relations between the United States and Argentina can be best described as a cautious embrace punctuated by moments of intense frustration. Although never the center of U.S.–Latin American relations, Argentina has attempted to create a position of influence in the region. As a result, the United States has worked with Argentina and other nations of the Southern Cone—the region of South America that comprises Uruguay, Paraguay, Argentina, Chile, and southern Brazil—on matters of trade and economic development as well as hemispheric security and leadership. While Argentina has attempted to assert its position as one of Latin America’s most developed nations and therefore a regional leader, the equal partnership sought from the United States never materialized for the Southern Cone nation. Instead, competition for markets and U.S. interventionist and unilateral tendencies kept Argentina from attaining the influence and wealth it so desired. At the same time, the United States saw Argentina as an unreliable ally too sensitive to the pull of its volatile domestic politics. The two nations enjoyed moments of cooperation in World War I, the Cold War, and the 1990s, when Argentine leaders could balance this particular external partnership with internal demands. Yet at these times Argentine leaders found themselves walking a fine line as detractors back home saw cooperation with the United States as a violation of their nation’s sovereignty and autonomy. There has always been potential for a productive partnership, but each side’s intransigence and unique concerns limited this relationship’s accomplishments and led to a historical imbalance of power.

Article

The global political divides of the Cold War propelled the dismantling of Asian exclusion in ways that provided greater, if conditional, integration for Asian Americans, in a central aspect of the reworking of racial inequality in the United States after World War II. The forging of strategic alliances with Asian nations and peoples in that conflict mandated at least token gestures of greater acceptance and equity, in the form of changes to immigration and citizenship laws that had previously barred Asians as “aliens ineligible to citizenship.” During the Cold War, shared politics and economic considerations continued to trump racial difference as the United States sought leadership of the “free” capitalist world and competed with Soviet-led communism for the affiliation and cooperation of emerging, postcolonial Third World nations. U.S. courtship of once-scorned peoples required the end of Jim Crow systems of segregation through the repeal of discriminatory laws, although actual practices and institutions proved far more resistant to change. Politically and ideologically, culture and values came to dominate explanations for categories and inequalities once attributed to differences in biological race. Mainstream media and cultural productions celebrated America’s newfound embrace of its ethnic populations, even as the liberatory aspirations inflamed by World War II set in motion the civil rights movement and increasingly confrontational mobilizations for greater access and equality. These contestations transformed the character of America as a multiracial democracy, with Asian Americans advancing more than any other racial group to become widely perceived as a “model minority” by the 1980s with the popularization of a racial trope first articulated during the 1960s. Asian American gains were attained in part through the diminishing of barriers in immigration, employment, residence, education, and miscegenation, but also because their successes affirmed U.S. claims regarding its multiracial democracy and because reforms of immigration law admitted growing numbers of Asians who had been screened for family connections, refugee status, and especially their capacity to contribute economically. The 1965 Immigration Act cemented these preferences for educated and skilled Asian workers, with employers assuming great powers as routes to immigration and permanent status. The United States became the chief beneficiary of “brain drain” from Asian countries. Geometric rates of Asian American population growth since 1965, disproportionately screened through this economic preference system, have sharply reduced the ranks of Asian Americans linked to the exclusion era and set them apart from Latino, black, and Native Americans who remain much more entrenched in the systems of inequality rooted in the era of sanctioned racial segregation.

Article

Ana Elizabeth Rosas

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History. Please check back later for the full article. On August 4, 1942, the Mexican and U.S. governments launched the bi-national guest worker program, most commonly known as the Bracero Program. An estimated five million Mexican men between the ages of 19 and 45 separated from their families for three-to-nine-month contract cycles at a time, in anticipation of earning the prevailing U.S. wage that the program had promised them. They labored in U.S. agriculture, railroad construction, and forestry, with hardly any employment protections or rights in place to support themselves and the families they had left behind in Mexico. The inhumane configuration and implementation of this program prevented most of these men and their families from meeting this goal. Instead, the labor exploitation and alienation that characterized the program and their participation in it paved the way for, at best, fragile family relationships. The program lasted twenty-two years and grew in scope; despite its negative consequences, Mexican men and their families could not afford to settle for being unemployed in Mexico, nor could they pass up U.S. employment opportunities of any sort. The Mexican and U.S. governments’ persistently negligent management of the Bracero Program, coupled with their conveniently selective acknowledgement of the severity of the plight of Mexican women and men, consistently cornered Mexican men and their families into shouldering the full extent of the Bracero Program’s exploitative conditions and terms.

Article

David Blanke

The relationship between the car and the city remains complex and involves numerous private and public forces, innovations in technology, global economic fluctuations, and shifting cultural attitudes that only rarely consider the efficiency of the automobile as a long-term solution to urban transit. The advantages of privacy, speed, ease of access, and personal enjoyment that led many to first embrace the automobile were soon shared and accentuated by transit planners as the surest means to realize the long-held ideals of urban beautification, efficiency, and accessible suburbanization. The remarkable gains in productivity provided by industrial capitalism brought these dreams within reach and individual car ownership became the norm for most American families by the middle of the 20th century. Ironically, the success in creating such a “car country” produced the conditions that again congested traffic, raised questions about the quality of urban (and now suburban) living, and further distanced the nation from alternative transit options. The “hidden costs” of postwar automotive dependency in the United States became more apparent in the late 1960s, leading to federal legislation compelling manufacturers and transit professionals to address the long-standing inefficiencies of the car. This most recent phase coincides with a broader reappraisal of life in the city and a growing recognition of the material limits to mass automobility.

Article

Tyson Reeder

The United States has shared an intricate and turbulent history with Caribbean islands and nations since its inception. In its relations with the Caribbean, the United States has displayed the dueling tendencies of imperialism and anticolonialism that characterized its foreign policy with South America and the rest of the world. For nearly two and a half centuries, the Caribbean has stood at the epicenter of some of the US government’s most controversial and divisive foreign policies. After the American Revolution severed political ties between the United States and the British West Indies, US officials and traders hoped to expand their political and economic influence in the Caribbean. US trade in the Caribbean played an influential role in the events that led to the War of 1812. The Monroe Doctrine provided a blueprint for reconciling imperial ambitions in the Caribbean with anti-imperial sentiment. During the mid-19th century, Americans debated the propriety of annexing Caribbean islands, especially Cuba. After the Spanish-American War of 1898, the US government took an increasingly imperialist approach to its relations with the Caribbean, acquiring some islands as federal territories and augmenting its political, military, and economic influence in others. Contingents of the US population and government disapproved of such imperialistic measures, and beginning in the 1930s the US government softened, but did not relinquish, its influence in the Caribbean. Between the 1950s and the end of the Cold War, US officials wrestled with how to exert influence in the Caribbean in a postcolonial world. Since the end of the Cold War, the United States has intervened in Caribbean domestic politics to enhance democracy, continuing its oscillation between democratic and imperial impulses.

Article

The NAACP, established in 1909, was formed as an integrated organization to confront racism in the United States rather than seeing the issue as simply a southern problem. It is the longest-running civil rights organization and continues to operate today. The original name of the organization was the National Negro Committee, but this was changed to the NAACP on May 30, 1910. Organized to promote racial equality and integration, the NAACP pursued this goal via legal cases, political lobbying, and public campaigns. Early campaigns involved lobbying for national anti-lynching legislation, pursuing desegregation in areas such as housing and higher education through the US Supreme Court, and securing voting rights. The NAACP is renowned for the US Supreme Court case of Brown v. Board of Education (1954), which declared segregation in primary and secondary schools unconstitutional and is seen as a catalyst for the civil rights movement (1955–1968). It also engaged in public education, promoting African American achievements in education and the arts to counteract racial stereotypes. The organization published a monthly journal, The Crisis, and promoted African American art forms and culture as another means to advance equality. NAACP branches were established all across the United States and became a network of information, campaigning, and finance that underpinned activism. Youth groups and university branches mobilized younger members of the community. Women were also invaluable to the NAACP in local, regional, and national decision-making processes and campaigning. The organization sought to integrate African Americans and other minorities into the American social, political, and economic model as codified by the US Constitution.

Article

In September 1962, the National Farm Workers Association (NFWA) held its first convention in Fresno, California, initiating a multiracial movement that would result in the creation of the United Farm Workers (UFW) and the first contracts for farm workers in the state of California. Led by Cesar Chavez, the union contributed a number of innovations to the art of social protest, including the most successful consumer boycott in the history of the United States. Chavez welcomed contributions from numerous ethnic and racial groups, men and women, young and old. For a time, the UFW was the realization of Martin Luther King Jr.’s beloved community—people from different backgrounds coming together to create a socially just world. During the 1970s, Chavez struggled to maintain the momentum created by the boycott as the state of California became more involved in adjudicating labor disputes under the California Agricultural Labor Relations Act (ALRA). Although Chavez and the UFW ultimately failed to establish a permanent, national union, their successes and strategies continue to influence movements for farm worker justice today.

Article

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons are different in terms of their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions in the use of chemical and biological weapons through World War II, but increased research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies significantly influence U.S. policies in the Middle East and the fight against terrorism.

Article

Patrick William Kelly

The relationship between Chile and the United States pivoted on the intertwined questions of how much political and economic influence Americans would exert over Chile and the degree to which Chileans could chart their own path. Given Chile’s tradition of constitutional government and relative economic development, it established itself as a regional power player in Latin America. Unencumbered by direct US military interventions that marked the history of the Caribbean, Central America, and Mexico, Chile was a leader in movements to promote Pan-Americanism, inter-American solidarity, and anti-imperialism. But the advent of the Cold War in the 1940s, and especially after the 1959 Cuban Revolution, brought an increase in bilateral tensions. The United States turned Chile into a “model democracy” for the Alliance for Progress, but frustration over its failures to enact meaningful social and economic reform polarized Chilean society, resulting in the election of Marxist Salvador Allende in 1970. The most contentious period in US-Chilean relations came during the Nixon administration, which worked, alongside anti-Allende Chileans, to destabilize Allende’s government; the Chilean military overthrew it on September 11, 1973. The Pinochet dictatorship (1973–1990), while anti-Communist, clashed with the United States over Pinochet’s radicalization of the Cold War and the issue of Chilean human rights abuses. The Reagan administration—which came to power on a platform that rejected the Carter administration’s critique of Chile—later reversed course and began to support the return of democracy to Chile, which took place in 1990. Since then, Pinochet’s legacy of neoliberal restructuring of the Chilean economy looms large, overshadowed perhaps only by his unexpected role in fomenting a global culture of human rights that has ended the era of impunity for Latin American dictators.