
Article

Thomas A. Reinstein

The United States has a rich history of intelligence in the conduct of foreign relations. Since the Revolutionary War, intelligence has been most relevant to U.S. foreign policy in two ways. Intelligence analysis helps to inform policy. Intelligence agencies also have carried out covert action—secret operations—to influence political, military, or economic conditions in foreign states. The American intelligence community has developed over a long period, and major changes to that community have often occurred because of contingent events rather than long-range planning. Throughout their history, American intelligence agencies have used intelligence gained from both human and technological sources to great effect. Often, U.S. intelligence agencies have been forced to rely on technological means of intelligence gathering for lack of human sources. Recent advances in cyberwarfare have made technology even more important to the American intelligence community. At the same time, the relationship between intelligence and national-security–related policymaking has often been dysfunctional. Indeed, though some American policymakers have used intelligence avidly, many others have used it haphazardly or not at all. Bureaucratic fights also have crippled the American intelligence community. Several high-profile intelligence failures tend to dominate the recent history of intelligence and U.S. foreign relations. Some of these failures were due to lack of intelligence or poor analytic tradecraft. Others came because policymakers failed to use the intelligence they had. In some cases, policymakers have also pressured intelligence officers to change their findings to better suit those policymakers’ goals. And presidents have often preferred to use covert action to carry out their preferred policies without paying attention to intelligence analysis. The result has been constant debate about the appropriate role of intelligence in U.S. foreign relations.

Article

Since the late 19th century, the relationship between journalists and the makers of US foreign policy has been both cooperative and contentious. Reporters depend on government officials for information about policy decisions and their implementation. The White House, the State Department, and the Pentagon use the news media to build support for their policies and, at times, to communicate directly with allies and adversaries. Since World War I, presidential administrations have developed increasingly sophisticated methods to manage the news and influence public understanding of international affairs. Wartime censorship has been one tool of news management. Self-censorship, however, has also affected coverage of international affairs, as journalists have voluntarily refrained from publishing information for fear of impairing national security or undermining support for US wartime or Cold War policies. Allegations of bias and sensationalism became acrimonious during the Vietnam War and have continued to shape the debate about accurate, critical, and legitimate reporting. Arguments over “fake news,” which became commonplace during the presidency of Donald J. Trump, have many precursors, as both journalists and government officials have been responsible for misleading or distorted news coverage of international affairs since the Spanish–American War.

Article

American history is replete with instances of counterinsurgency, an unsurprising reality considering that the United States has always engaged in empire building and has thus needed to pacify resistance to expansion. For much of its existence, the U.S. has relied on its Army to pacify insurgents. While the U.S. Army used traditional military formations and technology to battle peer enemies, the same strategy did not succeed against opponents who relied on speed and surprise. Indeed, in several instances, insurgents sought to fight the U.S. Army on terms that rendered superior manpower and technology irrelevant. By adopting counterinsurgency as a strategy, the U.S. Army attempted to identify and neutralize insurgents and the infrastructure that supported them. Discussions of counterinsurgency involve complex terms, so readers are provided with simplified yet accurate definitions and explanations; understanding these terms also provides continuity between conflicts. While certain counterinsurgency measures worked during the American Civil War, the Indian Wars, and the conflict in the Philippines, the concept failed during the Vietnam War. The complexities of counterinsurgency require readers to familiarize themselves with its history, relevant scholarship, and terminology—in particular, counterinsurgency, pacification, and infrastructure.

Article

Canada has sometimes been called the United States’ attic: a useful feature, but one easily forgotten. Of all countries, it has historically resembled the United States the most closely, in terms of culture, geography, economy, society, politics, ideology, and, especially, history. A shared culture—literary, social, legal, and political—is a crucial factor in Canadian-American relations. Geography is at least as important. It provides the United States with strategic insulation to the north and enhances geographic isolation to the east and west. North-south economic links are inevitable and very large. Canada has been a major recipient of American investment, and for most of the time since 1920 has been the United States’ principal trading partner. Prosperous and self-sufficient, it has seldom required American aid. There have been no overtly hostile official encounters since the end of the War of 1812, partly because many Americans tended to believe that Canadians would join the republic; when that did not occur, the United States accepted an independent but friendly Canada as a permanent, useful, and desirable neighbor—North America’s attic. The insulation the attic provided consisted of a common belief in the rule of law, both domestic and international; liberal democracy; a federal constitution; liberal capitalism; and liberal international trade regimes. In turn, the United States, with its large population, huge economy, and military power, insulates Canada from hostile external forces. An attack on Canada from outside the continent is hard to imagine without a simultaneous attack on the United States. Successive American and Canadian governments have reaffirmed the political status quo while favoring mutually beneficial economic and military linkages—bilateral and multilateral. Relations have traditionally been grounded in a negotiating style that is evidence-based, proceeding issue by issue.
A sober diplomatic and political context sometimes frames irritations and exclamations, but even these have usually been defined and limited by familiarity. For example, there has always been anti-Americanism in Canada. Most often it consists of sentiments derived from the United States itself, channeled by cultural similarities. No American idea, good or bad, from liberalism to populism, fails to find an echo in Canada. How loud or how soft the echo makes the difference.

Article

Relations between the United States and Mexico have rarely been easy. Ever since the United States invaded its southern neighbor and seized half of its national territory in the 19th century, the two countries have struggled to establish a relationship based on mutual trust and respect. Over the two centuries since Mexico’s independence, the governments and citizens of both countries have played central roles in shaping each other’s political, economic, social, and cultural development. Although this process has involved—even required—a great deal of cooperation, relations between the United States and Mexico have more often been characterized by antagonism, exploitation, and unilateralism. This long history of tensions has contributed to the three greatest challenges that these countries face together today: economic development, immigration, and drug-related violence.

Article

Racism and xenophobia, but also resilience and community building, characterize the return of thousands of Japanese Americans, or Nikkei, to the West Coast after World War II. Although the specific histories of different regions shaped the resettlement experiences for Japanese Americans, Los Angeles provides an instructive case study. For generations, the City of Angels has been home to one of the nation’s largest and most diverse Nikkei communities, and the ways in which Japanese Americans rebuilt their lives and institutions resonate with the resettlement experience elsewhere. Before World War II, greater Los Angeles was home to a vibrant Japanese American population. First-generation immigrants, or Issei, and their American-born children, the Nisei, forged dynamic social, economic, cultural, and spiritual institutions in the face of various racial exclusions. World War II uprooted the community as Japanese Americans left behind their farms, businesses, and homes. In the best instances, they were able to entrust their property to neighbors or other sympathetic individuals. More often, the uncertainty of their future led Japanese Americans to sell their property at far below market price. Upon the war’s end, thousands of Japanese Americans returned to Los Angeles, often to financial ruin. Once back in the Los Angeles area, Japanese Americans continued to face deep-seated prejudice, all the more accentuated by an overall dearth of housing. Without a place to live, they sought refuge in communal hostels set up in prewar institutions that survived the war, such as Christian and Buddhist churches. Meanwhile, others found housing in temporary trailer camps set up by the War Relocation Authority (WRA), and later administered by the Federal Public Housing Authority (FPHA), in areas such as Burbank, Sun Valley, Hawthorne, Santa Monica, and Long Beach.
Although some local religious groups and others welcomed the returnees, white homeowners, who viewed the settlement of Japanese Americans as a threat to their property values, often mobilized to protest the construction of these camps. The last of these camps closed in 1956, demonstrating the hardship some Japanese Americans still faced in integrating back into society. Even when the returnees were able to leave the camps, they still faced racially restrictive housing covenants and, when those practices were ruled unconstitutional, exclusionary lending. Although new suburban enclaves of Japanese Americans eventually developed in areas such as Gardena, West Los Angeles, and Pacoima by the 1960s, the pathway to those destinations was far from easy. Ultimately, the resettlement of Japanese Americans in Los Angeles after their mass incarceration during World War II took place within the intertwined contexts of lingering anti-Japanese racism, Cold War politics, and the suburbanization of Southern California.

Article

From 1775 to 1815, empire served as the most pressing foreign relationship problem for the United States. Would the new nation successfully break free from the British Empire? What would an American empire look like? How would it be treated by other empires? And could Americans hold their own against European superpowers? These questions dominated the United States’ first few decades of existence and shaped its interactions with American Indian, Haitian, Spanish, British, and French peoples. The US government—first the Continental Congress, then the Confederation Congress, and finally the federal administration under the new Constitution—grappled with five key issues. First, they sought international recognition of their independence and negotiated trade deals during the Revolutionary War to support the war effort. Second, they obtained access to the Mississippi River and Port of New Orleans from Spain and France to facilitate trade and western settlement. Third, they grappled with ongoing conflict with Indian nations over white settlement on Indian lands and demands from white communities for border security. Fourth, they defined and protected American neutrality, negotiated a trade policy that required European recognition of American independence, and denied recognition to Haiti. Lastly, they fought a quasi-war with France and a real war with Great Britain in 1812.

Article

For nearly a decade, American combat soldiers fought in South Vietnam to help sustain an independent, noncommunist nation in Southeast Asia. After U.S. troops departed in 1973, the collapse of South Vietnam in 1975 prompted a lasting search to explain the United States’ first lost war. Historians of the conflict and participants alike have since critiqued the ways in which civilian policymakers and uniformed leaders applied—some argued misapplied—military power that led to such an undesirable political outcome. While some claimed U.S. politicians failed to commit their nation’s full military might to a limited war, others contended that most officers fundamentally misunderstood the nature of the war they were fighting. Still others argued “winning” was essentially impossible given the true nature of a struggle over Vietnamese national identity in the postcolonial era. On their own, none of these arguments fully satisfy. Contemporary policymakers clearly understood the difficulties of waging a war in Southeast Asia against an enemy committed to national liberation. Yet the faith of these Americans in their power to resolve deep-seated local and regional sociopolitical problems eclipsed the possibility there might be limits to that power. By asking military strategists to simultaneously fight a war and build a nation, senior U.S. policymakers had asked too much of those crafting military strategy to deliver on overly ambitious political objectives. In the end, the Vietnam War exposed the limits of what American military power could achieve in the Cold War era.

Article

In December 1979, Soviet troops entered the small, poor, landlocked, Islamic nation of Afghanistan, assassinated the communist president, Hafizullah Amin, and installed a more compliant Afghan leader. For almost ten years, Soviet troops remained entrenched in Afghanistan before finally withdrawing in February 1989. During this period, the United States undertook a covert program to assist the anti-communist Afghan insurgents—the mujahideen—to resist the Soviet occupation. Beginning with President Jimmy Carter’s small-scale authorization in July 1979, the secret war became the largest in history under President Ronald Reagan, running up to $700 million per year. The Central Intelligence Agency (CIA) acted as the war’s quartermaster, arranging supplies of weapons for the mujahideen, which were funneled through Pakistan’s Inter-Services Intelligence directorate (ISI), in coordination with Saudi Arabia, China, Egypt, and others. No Americans were directly involved in the fighting, and the overall cost to the American taxpayer was in the region of $2 billion. The Afghan cost was much higher. Over a million Afghans were killed, a further two million wounded, and over six million refugees fled to neighboring Pakistan and Iran. For the Soviet Union, the ten-year war constituted its largest military action in the postwar era, and the long and protracted nature of the conflict and the failure of the Red Army to subdue the Afghans is partially responsible for the internal turmoil that contributed to the eventual breakup of the Soviet empire at the end of the 1980s. The defeat of the Soviet 40th Army in Afghanistan proved to be the final major superpower battle of the Cold War, but it also marked the beginning of a new era. The devastation and radicalization of Afghan society resulted in the subsequent decades of continued conflict and warfare and the rise of militant Islamic fundamentalism that has shaped the post-Cold War world.

Article

Throughout US history, Americans have used ideas about gender to understand power, international relations, military behavior, and the conduct of war. Since Joan Wallach Scott called on scholars in 1986 to consider gender a “useful category of analysis,” historians have looked beyond traditional diplomatic and military sources and approaches to examine cultural sources, the media, and other evidence to try to understand the ideas that Americans have relied on to make sense of US involvement in the world. From casting weak nations as female to assuming that all soldiers are heterosexual males, Americans have deployed mainstream assumptions about men’s and women’s proper behavior to justify US diplomatic and military interventions in the world. State Department pamphlets describing newly independent countries in the 1950s and 1960s featured gendered imagery like the picture of a young Vietnamese woman on a bicycle that was meant to symbolize South Vietnam, a young nation in need of American guidance. Language in news reports and government cables, as well as film representations of international affairs and war, expressed gendered dichotomies such as protector and protected, home front and battlefront, strong and weak leadership, and stable and rogue states. These and other episodes illustrate how thoroughly gender shaped important dimensions about the character and the making of US foreign policy and historians’ examinations of diplomatic and military history.

Article

The military history of the American Revolution is more than the history of the War of Independence. The Revolution itself had important military causes. The experience of the Seven Years’ War (which started in 1754 in North America) conditioned British attitudes to the colonies after that conflict was over. From 1764, the British Parliament tried to raise taxes in America to pay for a new permanent military garrison. British politicians resisted colonial objections to parliamentary taxation at least partly because they feared that if the Americans established their right not to be taxed by Westminster, Parliament’s right to regulate colonial overseas trade would then be challenged. If the Americans broke out of the system of trade regulation, British ministers, MPs, and peers worried, then the Royal Navy would be seriously weakened. The War of Independence, which began in 1775, was not the great American triumph that most accounts suggest. The British army faced a difficult task in suppressing a rebellion three thousand miles from Britain itself. French intervention on the American side in 1778 (followed by the Spanish in 1779, and the Dutch in 1780) made the task still more difficult. In the end, the war in America was won by the French as much as by the Americans. But in the wider imperial conflict, affecting the Caribbean, Central America, Europe, West Africa, and South Asia, the British fared much better. Even in its American dimension, the outcome was less clear cut than we usually imagine. The British, the nominal losers, retained great influence in the independent United States, which in economic terms remained in an essentially dependent relationship with the former mother country.

Article

The American War for Independence lasted eight years. It was one of the longest and bloodiest wars in America’s history, and yet it was not such a protracted conflict merely because the might of the British armed forces was brought to bear on the hapless colonials. The many divisions among Americans themselves over whether to fight, what to fight for, and who would do the fighting often had tragic and violent consequences. The Revolutionary War was by any measure the first American civil war. Yet national narratives of the Revolution and even much of the scholarship on the era focus more on simple stories of a contest between the Patriots and the British. Loyalists and other opponents of the Patriots are routinely left out of these narratives, or given short shrift. So, too, are the tens of thousands of ordinary colonists—perhaps a majority of the population—who were disaffected or alienated from either side or who tried to tack between the two main antagonists to make the best of a bad situation. Historians now estimate that as many as three-fifths of the colonial population were neither active Loyalists nor Patriots. When we take the war seriously and begin to think about narratives that capture the experience of the many, rather than the few, an illuminating picture emerges. The remarkably wide scope of the activities of the disaffected during the war—ranging from nonpayment of taxes to draft dodging and even to armed resistance to protect their neutrality—has to be integrated with older stories of militant Patriots and timid Loyalists. Only then can we understand the profound consequences of disaffection—particularly in creating divisions within the states, increasing levels of violence, prolonging the war, and changing the nature of the political settlements in each state. Indeed, the very divisions among diverse Americans that made the War for Independence so long, bitter, and bloody also explain much of the Revolutionary energy of the period.
Though it is not as seamless as traditional narratives of the Revolution would suggest, a more complicated story also helps better explain the many problems the new states and eventually the new nation would face. In making this argument, we may finally suggest ways we can overcome what John Shy long ago noted as the tendency of scholars to separate the “destructive” War for Independence from the “constructive” political Revolution.

Article

Best known as Abraham Lincoln’s secretary of state during the Civil War, William Henry Seward conducted full careers as a statesman, politician, and visionary of America’s future, both before and after that traumatic conflict. His greatest legacy, however, lay in his service as secretary of state, leading the diplomatic effort to prevent European intervention in the conflict. His success in that effort marked the margin between the salvation and the destruction of the Union. Beyond his role as diplomat, Seward’s signature qualities of energy, optimism, ambition, and opportunism enabled him to assume a role in the Lincoln administration extending well beyond his diplomatic portfolio. Those same qualities secured a close working relationship with the president as Seward overcame a rocky first few weeks in office to become Lincoln’s confidant and sounding board. Seward’s career in politics stretched from the 1830s until 1869. Throughout that time, he maintained a vision of a United States of America built on opportunity and free labor, powered by government’s active role in internal improvement and education. He foresaw a nation fated to expand across the continent and overseas, with expansion occurring peacefully as a result of American industrial and economic strength and its model of government. During his second term as secretary of state, under the Johnson administration, Seward attempted a series of territorial acquisitions in the Caribbean, the Pacific, and on the North American continent. The state of the postwar nation and its fractious politics precluded success in most of these attempts, but Seward was successful in negotiating and securing congressional ratification of the purchase of Alaska in 1867. In addition, Seward pursued a series of policies establishing paths followed later by US diplomats, including the open door in China and the acquisition of Hawaii and US naval bases in the Caribbean.

Article

Sworn in as the 33rd President of the United States following Franklin D. Roosevelt’s death in April 1945, Harry S. Truman faced the daunting tasks of winning the war and ensuring future peace and stability. Chided by critics for his lack of foreign policy experience but championed by supporters for his straightforward decision-making, Truman guided the United States from World War to Cold War. The Truman presidency marked a new era in American foreign relations, with the United States emerging from World War II unmatched in economic strength and military power. The country assumed a leadership position in a postwar world primarily shaped by growing antagonism with the Soviet Union. Truman pursued an interventionist foreign policy that took measures to contain Soviet influence in Europe and stem the spread of communism in Asia. Under his leadership, the United States witnessed the dawn of the atomic age, approved billions of dollars in economic aid to rebuild Europe, supported the creation of multilateral organizations such as the United Nations and North Atlantic Treaty Organization, recognized the state of Israel, and intervened in the Korean peninsula. The challenges Truman confronted and the policies he implemented laid the foundation for 20th-century US foreign relations throughout the Cold War and beyond.

Article

On the eve of World War II many Americans were reluctant to see the United States embark on overseas involvements. Yet the Japanese attack on the U.S. Pacific fleet at Pearl Harbor on December 7, 1941, seemingly united the nation in determination to achieve total victory in Asia and Europe. Underutilized industrial plants expanded to full capacity producing war materials for the United States and its allies. Unemployment was absorbed by the armed services and war work. Many Americans’ standard of living improved, and the United States became the wealthiest nation in world history. Over time, this proud record became magnified into the “Good War” myth that has distorted America’s very real achievement. As the era of total victories receded and the United States went from leading creditor to debtor nation, the 1940s appeared as a golden age when everything worked better, people were united, and the United States saved the world for democracy (an exaggeration that ignored the huge contributions of America’s allies, including the British Empire, the Soviet Union, and China). In fact, during World War II the United States experienced marked class, sex and gender, and racial tensions. Groups such as gays made some social progress, but the poor, especially many African Americans, were left behind. After being welcomed into the work force, women were pressured to go home when veterans returned looking for jobs in late 1945–1946, losing many of the gains they had made during the conflict. Wartime prosperity stunted the development of a welfare state; universal medical care and social security were cast as unnecessary. Combat had been a horrific experience, leaving many casualties with major physical or emotional wounds that took years to heal. Like all major global events, World War II was complex and nuanced, and it requires careful interpretation.

Article

Franklin D. Roosevelt was US president in extraordinarily challenging times. The impact of both the Great Depression and World War II makes discussion of his approach to foreign relations by historians highly contested and controversial. He was one of the most experienced people to hold office, having served in the Wilson administration as Assistant Secretary of the Navy, completed two terms as Governor of New York, and held a raft of political offices. At heart, he was an internationalist who believed in an engaged and active role for the United States in the world. During his first two terms as president, Roosevelt had to temper his international engagement in response to public opinion and to politicians who wanted to focus on domestic problems and were wary of the risks of involvement in conflict. As the world crisis deepened in the 1930s, his engagement revived. He adopted a gradualist approach to educating the American people in the dangers facing their country and led them to eventual participation in war and a greater role in world affairs. There were clearly mistakes in his diplomacy along the way and his leadership often appeared flawed, with an ambiguous legacy founded on political expediency, expanded executive power, vague idealism, and a chronic lack of clarity to prepare Americans for postwar challenges. Nevertheless, his policies to prepare the United States for the coming war saw his country emerge from years of depression to become an economic superpower. Likewise, his mobilization of his country’s enormous resources, support of key allies, and the holding together of a “Grand Alliance” in World War II not only brought victory but saw the United States become a dominant force in the world. Ultimately, Roosevelt’s idealistic vision, tempered with a sound appreciation of national power, would transform the global position of the United States and inaugurate what Henry Luce described as “the American Century.”

Article

The meaning of the Vietnam War has enduringly divided Americans in the postwar period. In part because the political splits opened up by the war made it an awkward topic for conversation, Vietnam veterans felt a barrier of silence separating them from their fellow citizens. The situation of returning veterans in the war’s waning years serves as a baseline against which to measure subsequent attempts at their social reintegration. Veterans, as embodiments of the experience of the war, became vehicles through which American society could assimilate its troubled and troubling memories. By the 1980s, greater public understanding of the difficulties of veterans’ homecoming experiences—particularly after the 1980 recognition of post-traumatic stress disorder (PTSD) as a psychiatric condition—helped accelerate the efforts to recognize the service and sacrifices of Americans who fought in Vietnam through the creation of memorials. Because the homecoming experience was seen as crucial to the difficulties which a substantial minority suffered, the concept emerged that the nation needed to embrace its veterans in order to help restore their well-being. Characteristic ways of talking about the veterans’ experiences coalesced into truisms and parables: the nation and its veterans needed to “reconcile” and “heal”; America must “never again” send young men to fight a war unless the government goes all-out for victory; protesters spat on the veterans and called them “baby killers” when they returned from Vietnam. Strategists debated what the proper “lessons” of the Vietnam War were and how they should be applied to other military interventions. After the prevalent “overwhelming force” doctrine was discarded in the 2003 invasion of Iraq, new “lessons” emerged from the Vietnam War: first came the concept of “rapid decisive operations,” and then counterinsurgency came back into vogue.
In these interrelated dimensions, American society and politics shaped the memory of the Vietnam War.

Article

The impact of LGBTQ (lesbian, gay, bisexual, transgender, and queer) issues on U.S. foreign relations is an understudied area, and only a handful of historians have addressed these issues in articles and books. Encounters with unexpected and condemnable (to European eyes) sexual behaviors and gender comportment arose from the first European forays into North America. As such, subduing heterodox sexual and gender expression has always been part of the colonizing endeavor in the so-called New World, tied in with the mission of civilizing and Christianizing the indigenous peoples that was so central to the forging of the United States and pressing its territorial expansion across the continent. These same impulses accompanied the further U.S. accumulation of territory across the Pacific and the Caribbean in the late 19th century, and they persisted even longer and further afield in its citizens’ missionary endeavors across the globe. During the 20th century, as the state’s foreign policy apparatus grew in size and scope, so too did the notions of homosexuality and transgender identity solidify as widely recognizable identity categories in the United States. Thus, it is during the 20th and 21st centuries, with ever greater intensity as the decades progressed, that one finds important influences of homosexuality and gender diversity on U.S. foreign policy: in immigration policies dating back to the late 19th century, in the Lavender Scare that plagued the State Department during the Truman and Eisenhower presidencies, in more contemporary battles between religious conservatives and queer rights activists that have at times been exported to other countries, and in the increasing intersections of LGBTQ rights issues and the War on Terror that has been waged primarily in the Middle East since September 11, 2001.

Article

Jennifer Hoyt

Relations between the United States and Argentina can be best described as a cautious embrace punctuated by moments of intense frustration. Although never the center of U.S.–Latin American relations, Argentina has attempted to create a position of influence in the region. As a result, the United States has worked with Argentina and other nations of the Southern Cone—the region of South America that comprises Uruguay, Paraguay, Argentina, Chile, and southern Brazil—on matters of trade and economic development as well as hemispheric security and leadership. While Argentina has attempted to assert its position as one of Latin America’s most developed nations and therefore a regional leader, the equal partnership sought from the United States never materialized for the Southern Cone nation. Instead, competition for markets and U.S. interventionist and unilateral tendencies kept Argentina from attaining the influence and wealth it so desired. At the same time, the United States saw Argentina as an unreliable ally too sensitive to the pull of its volatile domestic politics. The two nations enjoyed moments of cooperation in World War I, the Cold War, and the 1990s, when Argentine leaders could balance this particular external partnership with internal demands. Yet at these times Argentine leaders found themselves walking a fine line as detractors back home saw cooperation with the United States as a violation of their nation’s sovereignty and autonomy. There has always been potential for a productive partnership, but each side’s intransigence and unique concerns limited this relationship’s accomplishments and led to a historical imbalance of power.

Article

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons are different in terms of their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions in the use of chemical and biological weapons through World War II, but increased research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies significantly influence U.S. policies in the Middle East and the fight against terrorism.