H. Paul Thompson Jr.
The temperance and prohibition movement—a social reform movement that pursued many approaches to limit or prohibit the use and/or sale of alcoholic beverages—is arguably the longest-running reform movement in US history, extending from the 1780s through the repeal of national prohibition in 1933. During this 150-year period the movement experienced many ideological, organizational, and methodological changes. Probably the most widely embraced antebellum reform, the movement rested in its earliest years on explicitly evangelical assumptions and literature, but over time it assumed an increasingly secular image while retaining strong ties to organized religion. During the movement’s first fifty years, its definition of temperance evolved successively from avoiding drunkenness, to abstaining from all distilled beverages, to abstaining from all intoxicating beverages (i.e., “teetotalism”). During these years, reformers sought merely to persuade others of their views—what was called “moral suasion.” But by the 1840s many reformers began seeking the coercive power of local and state governments to prohibit the “liquor traffic.” These efforts were called “legal suasion,” and in the early 20th century, when local and state laws were deemed insufficient, movement leaders turned to the federal government. Throughout its history, movement leaders produced an extensive and well-preserved serial and monographic literature to chronicle their efforts, which makes the movement relatively easy to study.
No fewer than five national temperance organizations rose and fell across the movement’s history, aided by many other organizations that also promoted the message with great effect. Grassroots reformers organized innumerable state and local temperance societies and fraternal lodges committed to abstinence. Temperance reformers, hailing from nearly every conceivable demographic, networked through a series of national and international temperance conventions, and at any given time were pursuing a diverse and often conflicting array of priorities and methodologies.
Finally, during the Progressive Era, reformers focused their hatred for alcohol almost exclusively on saloons and the liquor traffic. Through groundbreaking lobbying efforts and a fortuitous convergence of social and political forces, reformers witnessed the ratification of the Eighteenth Amendment in January 1919 that established national prohibition. Despite such a long history of reform, the success seemed sudden and caught many in the movement off guard. The rise of liquor-related violence, a transformation in federal-state relations, increasingly organized and outspoken opposition, the Great Depression, and a re-alignment of political party coalitions all culminated in the sweeping repudiation of prohibition and its Republican supporters in the 1932 presidential election. On December 5, 1933, the Twenty-first Amendment to the Constitution repealed the Eighteenth Amendment, returning liquor regulation to the states, which have since maintained a wide variety of ever changing laws controlling the sale of alcoholic beverages. But national prohibition permanently altered the federal government’s role in law enforcement, and its legacy remains.
Brian J. McCammack
Urban areas have been the main source of pollution for centuries. The United States is no exception to this more general rule. Pollution of air, water, and soil only multiplied as cities grew in size and complexity; people generated ever more domestic waste and industry continually generated new unwanted byproducts. Periods of pollution intensification—most notably those spurts that came with late 19th-century urban industrialization and the rapid technological innovation and consumer culture of the post-World War II era—spurred social movements and scientific research on the problem, mostly as it pertained to adverse impacts on human health. Technological innovations aimed to eliminate unwanted wastes and more stringent regulations followed. Those technological and political solutions largely failed to keep pace with the increasing volume and diversity of pollutants industrial capitalism introduced into the environment, however, and rarely stopped pollution at its root cause. Instead, they often merely moved pollutants from one “sink”—a repository of pollution—to another (from water to land, for instance) and/or from one place to another (to a city downstream, for instance, or from one urban neighborhood to another).
This “end of pipe” approach remained overwhelmingly predominant even as most pollution mitigation policies became nationalized in the 1970s. Prior to that, municipalities and states were primarily responsible for addressing air, water, and land pollution. During this post-World War II period, policy—driven by ecological science—began to exhibit an understanding of urban pollution’s detrimental effects beyond human health. More broadly, evolving scientific understanding of human health and ecosystemic impacts of pollution, new technology, and changing social relations within growing metropolitan areas shifted the public perception of pollution’s harmful impacts. Scientific understanding of how urban and suburban residents risked ill health when exposed to polluted water, air, and soil grew, as did the social understanding of who was most vulnerable to these hazards. From the nation’s founding, the cumulative impact of both urban exposure to pollutants and attempts to curb that exposure has been unequal along lines of race and ethnicity, class, and gender. Despite those consistent inequalities, the 21st-century American city looks little like the 18th-century American city, whether in terms of population size, geographical footprint, demographics, economic activity, or the policies that governed them: all of these factors influenced the very definitions of ideas such as pollution and the urban.
Ross A. Kennedy
World War I profoundly affected the United States. It led to an expansion of America’s permanent military establishment, a foreign policy focused on reforming world politics, and American preeminence in international finance. In domestic affairs, America’s involvement in the war exacerbated class, racial, and ethnic conflict. It also heightened both the ethos of voluntarism in progressive ideology and the progressive desire to step up state intervention in the economy and society. These dual impulses had a coercive thrust that sometimes advanced progressive goals of a more equal, democratic society and sometimes repressed any perceived threat to a unified war effort. Ultimately the combination of progressive and repressive coercion undermined support for the Democratic Party, shifting the nation’s politics in a conservative direction as it entered the 1920s.
In the decade after 1965, radicals responded to the alienating features of America’s technocratic society by developing alternative cultures that emphasized authenticity, individualism, and community. The counterculture emerged from a handful of 1950s bohemian enclaves, most notably the Beat subcultures in the Bay Area and Greenwich Village. But new influences shaped an eclectic and decentralized counterculture after 1965, first in San Francisco’s Haight-Ashbury district, then in urban areas and college towns, and, by the 1970s, on communes and in myriad counter-institutions. The psychedelic drug cultures around Timothy Leary and Ken Kesey gave rise to a mystical bent in some branches of the counterculture and influenced counterculture style in countless ways: acid rock redefined popular music; tie dye, long hair, repurposed clothes, and hip argot established a new style; and sexual mores loosened. Yet the counterculture’s reactionary elements were strong. In many counterculture communities, gender roles mirrored those of mainstream society, and aggressive male sexuality inhibited feminist spins on the sexual revolution. Entrepreneurs and corporate America refashioned the counterculture aesthetic into a marketable commodity, ignoring the counterculture’s incisive critique of capitalism. Yet the counterculture became the basis of authentic “right livelihoods” for others. Meanwhile, the politics of the counterculture defy ready categorization. The popular imagination often conflates hippies with radical peace activists. But New Leftists frequently excoriated the counterculture for rejecting political engagement in favor of hedonistic escapism or libertarian individualism. Both views miss the most important political aspects of the counterculture, which centered on the embodiment of a decentralized anarchist bent, expressed in the formation of counter-institutions like underground newspapers, urban and rural communes, head shops, and food co-ops. 
As the counterculture faded after 1975, its legacies became apparent in the redefinition of the American family, the advent of the personal computer, an increasing ecological and culinary consciousness, and the marijuana legalization movement.
During the Cold War, the United States and the Soviet Union each sought to portray their way of organizing society—liberal democracy or Communism, respectively—as materially and morally superior. In their bids for global leadership, each sponsored “front” groups that defended their priorities and values to audiences around the world. These campaigns frequently enrolled artists and intellectuals, whose lives, works, and prestige could be built up, torn down, exploited, or enhanced through their participation in these groups. Alongside overt diplomatic efforts, the United States funded a number of organizations secretly through the Central Intelligence Agency (CIA). These efforts are often described as belonging to the “Cultural Cold War,” although the programs in fact supported overlapping networks that did anti-Communist work among labor unions, students, and others in addition to artists and intellectuals. The major CIA-sponsored group of intellectuals was the Congress for Cultural Freedom, established in 1950, and the “freedom” in its name was the major concept deployed by United States–aligned propagandists, to emphasize their differences from totalitarianism. The Cultural Cold War, as a program of psychological warfare conducted by the US government, grew out of the intersecting experiences of the left in the 1930s and the security apparatus of the United States at the dawn of the Cold War. The covert nature of the programs allowed them to evade scrutiny from the US Congress, and therefore to engage in activities that might otherwise have been stopped: working with people with radical political biographies or who still identified as “socialists,” or sponsoring avant-garde art, such as abstract expressionist painting. The programs spanned the globe, and grew in scope and ambition until their exposure in 1967. 
Subsequently, the United States has developed other mechanisms, such as the National Endowment for Democracy, to promote organizations within civil society that support its interests.
Founded in Philadelphia in 1869, the Noble and Holy Order of the Knights of Labor became the largest and most powerful labor organization that had ever existed in the United States by the mid-1880s. Recruiting men and women of nearly all occupations and all races (except Chinese), the Knights tried to reform American capitalism and politics in ways that would curb the growing economic and political abuses and excesses of the Gilded Age. Leaders of the organization viewed strikes as harmful to workers and employers alike, especially after the Great Railroad Strike of 1877, but a series of railroad strikes in 1884 and 1885 caused the Knights’ membership rolls to reach a peak of at least 700,000 in 1886.
The heyday of the Knights of Labor proved brief, though. Two major events in May 1886, the Haymarket riot in Chicago and the failure of a strike against Jay Gould’s Southwestern Railway system, began a series of setbacks that caused the organization to decline about as rapidly as it had arisen. By 1893, membership had dropped below 100,000, and the Knights’ leaders aligned the organization with the farmers’ movement and the Populist Party. The Knights increasingly became a rural organization, as urban skilled and semi-skilled workers joined trade unions affiliated with the American Federation of Labor (AFL). The AFL, however, proved less inclusive and egalitarian than the Knights of Labor, although some of the latter’s ideals would be carried on by later organizations such as the Industrial Workers of the World and the Congress of Industrial Organizations.
Anne L. Foster
The beginning of the modern war on drugs in the United States is commonly credited to President Richard Nixon, who evoked fears of crime, degenerate youth, and foreign drugs to garner support for an antidrug effort that was massive by early 1970s standards. Scholars now agree, however, that the essential characteristics of the “war on drugs” stretched back to the early 20th century. The first federal law to prohibit a narcotic in the United States passed in 1909 and banned the import of “smoking opium.” Although opium itself remained legal, opium prepared for smoking—a form believed to be consumed predominantly by ethnic Chinese and imported into the United States—was not. All future anti-narcotics policies drew on these foundational notions: narcotics were of foreign origin and invaded the United States. Thus, interdiction efforts at U.S. borders, and increasingly in producer countries, were an appropriate response. Narcotics consumers were presented as equally threatening, viewed as foreigners or at the margins of American society, and U.S. lawmakers therefore criminalized both drug use and drug trafficking. With drugs as well as drug users defined as foreign threats, militarization of the efforts to prohibit drugs followed. In U.S. drug policy, there is no distinction between foreign and domestic policy. They are intertwined at all levels, including the definition of the problem, the origin of many drugs, and the sites of enforcement.
Theodore Roosevelt played a seminal role in the rise of the United States to Great Power status at the turn of the 20th century and in debates about World War I and the League of Nations. Prior to entering the White House, TR was a leading proponent of a more ambitious foreign policy. As the 26th president he promoted US predominance in the Western Hemisphere, engaged in Great Power diplomacy, and oversaw expansion of the navy. He also laid the foundations for modern presidential statecraft with forceful advocacy of specific policy goals, a close relationship with the press, and an intense engagement with public opinion. After leaving Washington, he was among the most ardent critics of president Woodrow Wilson’s policies and helped to build support for the Allies and for preparing to enter what would become the “Great War,” or World War I. At the time of his death, he was a leading contender for the Republican presidential nomination.
Scholarly and public surveys frequently rank Roosevelt among the most successful presidents, especially in the realm of foreign policy. His influence can be observed in successors as diverse as Wilson, Franklin D. Roosevelt, Ronald Reagan, and Barack Obama. Yet historians have also scrutinized his views on race, gender, imperialism, and violence, many of which appear outdated or problematic from an early-21st-century perspective. Also troubling was Roosevelt’s demonization of antiwar activists during World War I and his sometimes heavy-handed attempts to promote loyalty among citizens of German or Irish descent.
One of the pervasive myths about the United States is that it has never had a socialist movement comparable to those of other industrialized nations. Yet in the early 20th century a vibrant Socialist Party and socialist movement flourished in the United States. Created in 1901, the Socialist Party of America unsurprisingly declared its primary goal to be the collectivization of the means of production. Yet the party’s highly decentralized and democratic structure enabled it to adapt to the needs and cultures of diverse constituencies in different regions of the country. Among those attracted to the movement in its heyday were immigrant and native-born workers and their families, tenant farmers, middle-class intellectuals, socially conscious millionaires, urban reformers, and feminists. Party platforms regularly included the reform interests of these groups as well as the long-term goal of eradicating capitalism. By 1912, the Socialist Party boasted an impressive record of electoral successes at the local, state, and national levels. U.S. Socialists could also point with pride to over three hundred English and foreign-language Socialist periodicals, some with subscription rates that rivaled those of the major urban daily newspapers.
Yet Socialists faced numerous challenges in their efforts to build a viable third-party movement in the United States. On the one hand, progressive reformers in the Democratic and Republican parties sought to coopt Socialists. On the other hand, the Socialist Party encountered challenges on the left from anarchists, syndicalists, communists, and Farmer-Labor Party activists. The Socialist Party was particularly weakened by government repression during World War I, by the postwar Red Scare, and by a communist insurgency within its ranks in the aftermath of the war. By the onset of the Great Depression, the Communist Party would displace the Socialist Party as the leading voice of radical change in the United States.
Ted R. Bromund
The Special Relationship is a term used to describe the close relations between the United States and the United Kingdom. It applies particularly to the governmental realms of foreign, defense, security, and intelligence policy, but it also captures a broader sense that both public and private relations between the United States and Britain are particularly deep and close. The Special Relationship is thus a term for a reality that came into being over time as the result of political leadership as well as ideas and events outside the formal arena of politics.
After the political break of the American Revolution and in spite of sporadic cooperation in the 19th century, it was not until the Great Rapprochement of the 1890s that the idea that Britain and the United States had a special kind of relationship took hold. This decade, in turn, created the basis for the Special Relationship, a term first used by Winston Churchill in 1944. Churchill did the most to build the relationship, convinced as he was that close friendship between Britain and the United States was the cornerstone of world peace and prosperity. During and after the Second World War, many others on both sides of the Atlantic came to agree with Churchill.
The post-1945 era witnessed a flowering of the relationship, which was cemented—not without many controversies and crises—by the emerging Cold War against the Soviet Union. After the end of the Cold War in 1989, the relationship remained close, though it was severely tested by further security crises, Britain’s declining defense spending, the evolving implications of Britain’s membership in the European Union, the relative decline of Europe, and an increasing U.S. interest in Asia. Yet on many public and private levels, relations between the United States and Britain continue to be particularly deep, and thus the Special Relationship endures.
Charles M. Payne
The only youth-led national civil rights organization in the 1960s in the United States, the Student Nonviolent Coordinating Committee (SNCC), grew out of sit-ins, with the base of its early membership coming from Black colleges. It became one of the most militant civil rights groups, pushing older organizations to become more aggressive. Under the tutelage of the experienced activist Ella Baker, it emphasized developing leadership in “ordinary” people. Its early years were dominated by direct action campaigns against White supremacy in the urban and Upper South, while internally, SNCC strove to actualize the Beloved Community. Later it specialized in grassroots community organizing and voter registration in dangerous areas of the Deep South. Its Freedom Summer campaign played a significant role in radicalizing young activists. SNCC, in general, acted as a training ground and model for other forms of youth activism. Notwithstanding its own issues with chauvinism, SNCC was open to leadership from women in a way that few social change organizations of the time were.
Mary S. Barton and David M. Wight
The US government’s perception of and response to international terrorism has undergone momentous shifts since first focusing on the issue in the early 20th century. The global rise of anarchist and communist violence provided the impetus for the first major US government programs aimed at combating international terrorism: restrictive immigration policies targeting perceived radicals. By the 1920s, the State Department emerged as the primary government agency crafting US responses to international terrorism, generally combating communist terrorism through diplomacy and information-sharing partnerships with foreign governments. The 1979 Iranian hostage crisis marked the beginning of two key shifts in US antiterrorism policy: a heightened focus on combating Islamist terrorism and a willingness to deploy military force to this end. The terrorist attacks of September 11, 2001, led US officials to conceptualize international terrorism as a high-level national security problem, leading to US military invasions and occupations of Afghanistan and Iraq, a broader use of special forces, and unprecedented intelligence-gathering operations.
Don H. Doyle
America’s Civil War became part of a much larger international crisis as European powers, happy to see the experiment in self-government fail in America’s “Great Republic,” took advantage of the situation to reclaim former colonies in the Caribbean and establish a European monarchy in Mexico. Overseas, in addition to their formal diplomatic appeals to European governments, both sides also experimented with public diplomacy campaigns to influence public opinion. Confederate foreign policy sought to win recognition and aid from Europe by offering free trade in cotton and aligning their cause with that of the aristocratic anti-democratic governing classes of Europe. The Union, instead, appealed to liberal, republican sentiment abroad by depicting the war as a trial of democratic government and embracing emancipation of the slaves. The Union victory led to the withdrawal of European empires from the New World: Spain from Santo Domingo, France from Mexico, Russia from Alaska, and Britain from Canada, and the destruction of slavery in the United States hastened its end in Puerto Rico, Cuba, and Brazil.
Francis D. Cogliano
Thomas Jefferson was a key architect of early American foreign policy. He had a clear vision of the place of the new republic in the world, which he articulated in a number of writings and state papers. The key elements to his strategic vision were geographic expansion and free trade. Throughout his long public career Jefferson sought to realize these ends, particularly during his time as US minister to France, secretary of state, vice president, and president. He believed that the United States should expand westward and that its citizens should be free to trade globally. He sought to maintain the right of the United States to trade freely during the wars arising from the French Revolution and its aftermath. This led to his greatest achievement, the Louisiana Purchase, but also to conflicts with the Barbary States and, ultimately, Great Britain. He believed that the United States should usher in a new world of republican diplomacy and that it would be in the vanguard of the global republican movement. In the literature on US foreign policy, historians have tended to identify two main schools of practice dividing practitioners into idealists and realists. Jefferson is often regarded as the founder of the idealist tradition. This somewhat misreads him. While he pursued clear idealistic ends—a world dominated by republics freely trading with each other—he did so using a variety of methods including diplomacy, war, and economic coercion.
Blake C. Scott
Tourism is so deep-seated in the history of U.S. foreign relations that we seem to have taken its presence for granted. Millions of American tourists have traveled abroad, yet one can count with just two hands the number of scholarly monographs analyzing the relationship between U.S. foreign relations and tourism. What explains this lack of historical reflection about one of the most quotidian forms of U.S. influence abroad?
In an influential essay about wilderness and the American frontier, the environmental historian William Cronon argues, “one of the most striking proofs of the cultural invention of wilderness is its thoroughgoing erasure of the history from which it sprang.” Historians and the American public, perhaps in modern fashion, have overlooked tourism’s role in the nation’s international affairs. Only a culture and a people so intimately familiar with tourism’s practices could naturalize them out of history.
The history of international tourism is profoundly entangled with the history of U.S. foreign policy. This entanglement has involved, among other things, science and technology, military intervention, diplomacy, and the promotion of consumer spending abroad. U.S. expansion created the structure (the social stability, medical safety, and transportation infrastructure) for globetrotting travel in the 20th century. As this essay shows, U.S. foreign policy was crucial in transforming foreign travel into a middle-class consumer experience.
David M. Robinson
New England transcendentalism is the first significant literary movement in American history, notable principally for the influential works of Ralph Waldo Emerson, Margaret Fuller, and Henry David Thoreau. The movement emerged in the 1830s as a religious challenge to New England Unitarianism. Building on the writings of the Unitarian leader William Ellery Channing, Emerson and others such as Frederic Henry Hedge, George Ripley, James Freeman Clarke, and Theodore Parker developed a theology based on interior, intuitive experience rather than the historical truth of the Bible. By 1836 transcendentalist books from several important religious thinkers began to appear, including Emerson’s Nature, which employed idealist philosophy and Romantic symbolism to examine human interaction with the natural world. Emerson’s Harvard addresses, “The American Scholar” (1837) and the controversial “Divinity School Address” (1838), gave transcendental ideas a wider prominence, and also generated strong resistance that added an element of experiment and danger to the movement’s reputation. In 1840 the transcendentalists founded a journal for their work, and Fuller became the Dial’s first editor, a position that gave her an important role in the movement and a crucial outlet for her own work in literary criticism and women’s rights.
Though it had begun as a religious movement, by the middle 1840s transcendentalism could be better described as a literary movement with growing political engagements on several fronts. Emerson proclaimed it as an era of reform and aligned the transcendentalists with those who resisted the social and political status quo. In her feminist manifesto Woman in the Nineteenth Century (1845), Fuller called for the removal of both legal and social barriers to women’s full potential. In 1845 Henry David Thoreau went to live in the woods by Walden Pond; his memoir of his experience, Walden (1854), became a founding text of modern environmental thinking. Antislavery also became a key concern for many of the transcendentalists, who condemned the Fugitive Slave Act of 1850 and actively resisted the execution of the law after its passage. The transcendentalists, a nineteenth-century cultural avant-garde, continue to exert cultural influence through the durability of their writings, works that shaped many aspects of American national development.
Paul D. Miller
Afghanistan has twice been thrust front and center of US national security concerns in the past half-century: first, during the Soviet-Afghan War, when Afghanistan served as a proxy for American efforts to combat Soviet influence; and second, as the frontline state and host for America’s global response to al-Qaida’s terrorist attacks of 2001. In both instances, American involvement swung from intensive investment and engagement to withdrawal and neglect. In both cases, American involvement reflected US concerns more than Afghan realities. And both episodes resulted in short-term successes for American security with long-term consequences for Afghanistan and its people. The signing of a strategic partnership agreement between the two countries in 2012 and a bilateral security agreement in 2013 created the possibility of a steadier and more forward-looking relationship—albeit one that the American and Afghan people may be less inclined to pursue as America’s longest war continues to grind on.
Gregg A. Brazinsky
Throughout the 19th and 20th centuries, America’s relationship with China ran the gamut from friendship and alliance to enmity and competition. Americans have long believed in China’s potential to become an important global actor, primarily in ways that would benefit the United States. The Chinese have at times embraced, at times rejected, and at times adapted to the US agenda. While there have been some consistent themes in this relationship, Sino-American interactions unquestionably broadened in scope in the 20th century. Trade with China grew from its modest beginnings in the 19th and early 20th centuries into a critical part of the global economy by the 21st century. While Americans have often perceived China as a country that offered significant opportunities for mutual benefit, China has also been seen as a threat and rival. During the Cold War, the two competed vigorously for influence in Asia and Africa. Today we see echoes of this same competition as China continues to grow economically while expanding its influence abroad. The history of Sino-American relations illustrates a complex dichotomy of cooperation and competition; this defines the relationship today and has widespread ramifications for global politics.
Although the League of Nations was the first permanent organization established with the purpose of maintaining international peace, it built on the work of a series of 19th-century intergovernmental institutions. The destructiveness of World War I led American and British statesmen to champion a league as a means of maintaining postwar global order. In the United States, Woodrow Wilson followed his predecessors, Theodore Roosevelt and William Howard Taft, in advocating American membership of an international peace league, although Wilson’s vision for reforming global affairs was more radical. In Britain, public opinion had begun to coalesce in favor of a league from the outset of the war, though David Lloyd George and many of his Cabinet colleagues were initially skeptical of its benefits. However, Lloyd George was determined to establish an alliance with the United States and warmed to the league idea when Jan Christian Smuts presented a blueprint for an organization that served that end.
The creation of the League was a predominantly British and American affair. Yet Wilson was unable to convince Americans to commit themselves to membership in the new organization. The Franco-British-dominated League enjoyed some early successes. Its high point was reached when Europe was infused with the “Spirit of Locarno” in the mid-1920s and the United States played an economically crucial, if politically constrained, role in advancing Continental peace. This tenuous basis for international order collapsed as a result of the economic chaos of the early 1930s, as the League proved incapable of containing the ambitions of revisionist powers in Europe and Asia. Despite its ultimate limitations as a peacekeeping body, recent scholarship has emphasized the League’s relative successes in stabilizing new states, safeguarding minorities, managing the evolution of colonies into notionally sovereign states, and policing transnational trafficking; in doing so, it paved the way for the creation of the United Nations.
For almost a century and a half, successive American governments adopted a general policy of neutrality on the world stage, eschewing involvement in European conflicts and, after the Quasi War with France, alliances with European powers. Neutrality, enshrined as a core principle of American foreign relations by the outgoing President George Washington in 1796, remained such for more than a century.
Finally, in the 20th century, the United States emerged as a world power and a belligerent in the two world wars and the Cold War. This article explores the modern conflict between traditional American attitudes toward neutrality and the global agenda embraced by successive U.S. governments, beginning with entry in the First World War. With the United States immersed in these titanic struggles, the traditional U.S. support for neutrality eroded considerably. During the First World War, the United States showed some sympathy for the predicaments of the remaining neutral powers. In the Second World War it applied considerable pressure to those states still trading with Germany. During the Cold War, the United States was sometimes impatient with the choices of states to remain uncommitted in the global struggle, while at times it showed understanding for neutrality and pursued constructive relations with neutral states. The wide varieties of neutrality in each of these conflicts complicated the choices of U.S. policy makers. Americans remained torn between memory of their own long history of neutrality and a capacity to understand its potential value, on one hand, and a predilection to approach conflicts as moral struggles, on the other.