From the 1890s to World War I, progressive reformers in the United States called upon their local, state, and federal governments to revitalize American democracy and address the most harmful social consequences of industrialization. The emergence of an increasingly powerful administrative state, which intervened on behalf of the public welfare in the economy and society, generated significant levels of conflict. Some of the opposition came from conservative business interests, who denounced state labor laws and other market regulations as meddlesome interferences with liberty of contract. But the historical record of the Progressive Era also reveals a broad undercurrent of resistance from ordinary Americans, who fought for personal liberty against the growth of police power in such areas as public health administration and the regulation of radical speech. Their struggles in the streets, statehouses, and courtrooms of the United States in the early 20th century shaped the legal culture of the period and revealed the contested meaning of individual liberty in a new social age.
Ann Durkin Keating
Since the beginning of the 19th century, outlying areas of American cities have been home to a variety of settlements and enterprises with close links to urban centers. Beginning in the early 19th century, the increasing scale of business and industrial enterprises separated workplaces from residences. This allowed some urban dwellers to live at a distance from their place of employment and commute to work. Others lived in the shadow of factories located at some distance from the city center. Still others provided food or raw materials for urban residents and businesses. The availability of employment led to further suburban growth. Changing intracity transportation, including railroads, interurbans, streetcars, and cable cars, enabled people and businesses to locate beyond the limits of a walking city.
By the late 19th century, metropolitan areas across the United States included outlying farm centers, industrial towns, residential rail (or streetcar) suburbs, and recreational/institutional centers. With suburbs generally located along rail or ferry lines into the early 20th century, the physical development of metropolitan areas often resembled a hub and spokes. However, across metropolitan regions, suburbs had a great range of functions and diversity of populations. With the advent of automobile commutation and the growing use of trucks to haul freight, suburban development took place between railroad lines, filling in the earlier hub-and-spoke pattern to create a more contiguous built-up area.
Although suburban settlements were integrally connected to their neighbors and within a metropolitan economy and society, independent suburban governments emerged to serve these outlying settlements and keep them separate. Developers often took the lead in providing differential services (and regulations). Suburban governments emerged as hybrid forms, serving relatively homogeneous populations by providing only some urban functions. Well before 1945, suburbs were home to a wide range of work and residents.
Becky Nicolaides and Andrew Wiese
Mass migration to suburban areas was a defining feature of American life after 1945. Before World War II, just 13% of Americans lived in suburbs. By 2010, however, suburbia was home to more than half of the U.S. population. The nation’s economy, politics, and society suburbanized in important ways. Suburbia shaped habits of car dependency and commuting, patterns of spending and saving, and experiences with issues as diverse as race and taxes, energy and nature, privacy and community. The owner-occupied, single-family home, surrounded by a yard and set in a neighborhood outside the urban core, came to define everyday experience for most American households, and in the world of popular culture and the imagination, suburbia was the setting for the American dream. The nation’s suburbs were an equally critical economic landscape, home to vital high-tech industries, retailing, “logistics,” and office employment. In addition, American politics rested on a suburban majority, and over several decades, suburbia incubated political movements across the partisan spectrum, from grass-roots conservatism to centrist meritocratic individualism, environmentalism, feminism, and social justice. In short, suburbia was a key setting for postwar American life.
Even as suburbia grew in magnitude and influence, it also grew more diverse, coming to reflect a much broader cross-section of America itself. This encompassing shift marked two key chronological stages in suburban history since 1945: the expansive, racialized, mass suburbanization of the postwar years (1945–1970) and an era of intensive social diversification and metropolitan complexity (since 1970). In the first period, suburbia witnessed the expansion of segregated white privilege, bolstered by government policies and exclusionary practices, and reinforced by grassroots political movements. In the second period, suburbia came to house a broader cross section of Americans, who brought with them a wide range of outlooks, lifeways, values, and politics. Suburbia became home to large numbers of immigrants, ethnic groups, African Americans, the poor, the elderly, and diverse family types. In the face of stubborn exclusionism by affluent suburbs, inequality persisted across metropolitan areas and manifested anew in proliferating poorer, distressed suburbs. Reform efforts sought to alleviate metro-wide inequality and promote sustainable development, using coordinated regional approaches. In recent years, the twin discourses of suburban crisis and suburban rejuvenation captured the continued complexity of America’s suburbs.
Since the turn of the 20th century, teachers have tried to find a balance between bettering their own career prospects as workers and educating their students as public servants. To reach a workable combination, teachers have drawn on methods from both union movements, the militant and labor-conscious approach favored by the American Federation of Teachers (AFT), and professional organizations, the tradition from which the National Education Association (NEA) arose. Because teachers lacked the federally guaranteed labor rights that private-sector workers enjoyed after Congress passed the National Labor Relations Act in 1935, teachers’ fortunes—in terms of collective bargaining rights, control over classroom conditions, pay, and benefits—often remained tied to the broader public-sector labor movement and to state rather than federal law.
Opponents of teacher unionization consistently charged that as public servants paid by tax revenues, teachers and other public employees should not be allowed to form unions. Further, because women constituted the vast majority of teachers and union organizing often represented a “manly” domain, the opposition’s approach worked quite well, successfully preventing teachers from gaining widespread union recognition. But by the late 1960s and early 1970s, thanks to an improved economic climate and invigoration from the women’s movement, civil rights struggles, and the New Left, both AFT and NEA teacher unionism surged forward, infused with a powerful militancy devoted to strikes and other political action, and appeared poised to capture federal collective bargaining rights. Their newfound assertiveness proved ill-timed, however.
After the economic problems of the mid-1970s, opponents of teacher unions once again seized the opportunity to portray teacher unions and other public-sector unions as greedy and privileged interest groups functioning at the public’s expense. President Ronald Reagan accentuated this point when he fired all of the more than 10,000 striking air traffic controllers during the 1981 Professional Air Traffic Controllers Organization (PATCO) strike. Facing such opposition, teacher unions—and public-sector unions in general—shifted their efforts away from strikes and toward endorsing political candidates and lobbying governments to pass favorable legislation.
Given these constraints, public-sector unions enjoyed a large degree of success in the 1990s through the early 2000s, even as private-sector union membership plunged to less than 10 percent of the workforce. After the Great Recession of 2008, however, austerity politics targeted teachers and other public-sector workers and renewed political confrontations surrounding the legitimacy of teacher unions.
Timothy James LeCain
Technology and environmental history are both relatively young disciplines among Americanists, and during their early years they developed as distinctly different and even antithetical fields, at least in topical terms. Historians of technology initially focused on human-made and presumably “unnatural” technologies, whereas environmental historians focused on nonhuman and presumably “natural” environments. However, in more recent decades, both disciplines have moved beyond this oppositional framing. Historians of technology increasingly came to view anthropogenic artifacts such as cities, domesticated animals, and machines as extensions of the natural world rather than its antithesis. Even the British and American Industrial Revolutions constituted not a distancing of humans from nature, as some scholars have suggested, but rather a deepening entanglement with the material environment. At the same time, many environmental historians were moving beyond the field’s initial emphasis on the ideal of an American and often Western “wilderness” to embrace a concept of the environment as including humans and productive work. Nonetheless, many environmental historians continued to emphasize the independent agency of the nonhuman environment of organisms and things. This insistence that not everything could be reduced to human culture remained the field’s most distinctive feature.
Since the turn of millennium, the two fields have increasingly come together in a variety of synthetic approaches, including Actor Network Theory, envirotechnical analysis, and neomaterialist theory. As the influence of the cultural turn has waned, the environmental historians’ emphasis on the independent agency of the nonhuman has come to the fore, gaining wider influence as it is applied to the dynamic “nature” or “wildness” that some scholars argue exists within both the technological and natural environment. The foundational distinctions between the history of technology and environmental history may now be giving way to more materially rooted attempts to understand how a dynamic hybrid environment helps to create human history in all of its dimensions—cultural, social, and biological.
Described as a “chief among chiefs” by the British, and by his arch-rival, William Henry Harrison, as “one of those uncommon geniuses which spring up occasionally to produce revolutions and overturn the established order of things,” Tecumseh impressed all who knew him. Lauded for his oratory, military and diplomatic skills, and, ultimately, his humanity, Tecumseh presided over the greatest Indian resistance movement that had ever been assembled in the eastern half of North America. His genius lay in his ability to fully articulate religious, racial, and cultural ideals borne out of his people’s existence on fault lines between competing empires and Indian confederacies. Known as “southerners” by their Algonquian relatives, the Shawnees had a history of migrating between worlds. Tecumseh and his brother Tenskwatawa converted this inheritance into a widespread social movement in the first decade and a half of the 19th century, when more than a thousand warriors, from many different tribes, heeded their call to halt American expansion along the border of what is now Ohio and Indiana. Tecumseh articulated a vision of intertribal, pan-Indian unity based on revitalization and reform, and his ambitions very nearly rewrote early American history.
H. Paul Thompson Jr.
The temperance and prohibition movement—a social reform movement that pursued many approaches to limit or prohibit the use and/or sale of alcoholic beverages—is arguably the longest-running reform movement in US history, extending from the 1780s through the repeal of national prohibition in 1933. During this 150-year period the movement experienced many ideological, organizational, and methodological changes. Probably the most widely embraced antebellum reform, the movement was explicitly evangelical in many of its earliest assumptions and much of its earliest literature, but over time it assumed an increasingly secular image while retaining strong ties to organized religion. During the movement’s first fifty years, its definition of temperance evolved successively from avoiding drunkenness, to abstaining from all distilled beverages, to abstaining from all intoxicating beverages (i.e., “teetotalism”). During these years, reformers sought merely to persuade others of their views—what was called “moral suasion.” But by the 1840s many reformers began seeking the coercive power of local and state governments to prohibit the “liquor traffic.” These efforts were called “legal suasion,” and in the early 20th century, when local and state laws were deemed insufficient, movement leaders turned to the federal government. Throughout its history, movement leaders produced an extensive and well-preserved serial and monographic literature to chronicle their efforts, which makes the movement relatively easy to study.
No fewer than five national temperance organizations rose and fell across the movement’s history, aided by many other organizations that also promoted the message with great effect. Grassroots reformers organized innumerable state and local temperance societies and fraternal lodges committed to abstinence. Temperance reformers, hailing from nearly every conceivable demographic, networked through a series of national and international temperance conventions, and at any given time were pursuing a diverse and often conflicting array of priorities and methodologies.
Finally, during the Progressive Era, reformers focused their hatred for alcohol almost exclusively on saloons and the liquor traffic. Through groundbreaking lobbying efforts and a fortuitous convergence of social and political forces, reformers witnessed the ratification, in January 1919, of the Eighteenth Amendment, which established national prohibition. Despite such a long history of reform, the success seemed sudden and caught many in the movement off guard. The rise of liquor-related violence, a transformation in federal-state relations, increasingly organized and outspoken opposition, the Great Depression, and a re-alignment of political party coalitions all culminated in the sweeping repudiation of prohibition and its Republican supporters in the 1932 presidential election. On December 5, 1933, the Twenty-first Amendment to the Constitution repealed the Eighteenth Amendment, returning liquor regulation to the states, which have since maintained a wide variety of ever-changing laws controlling the sale of alcoholic beverages. But national prohibition permanently altered the federal government’s role in law enforcement, and its legacy remains.
Brian J. McCammack
Urban areas have been the main source of pollution for centuries. The United States is no exception to this more general rule. Pollution of air, water, and soil only multiplied as cities grew in size and complexity; people generated ever more domestic waste and industry continually generated new unwanted byproducts. Periods of pollution intensification—most notably those spurts that came with late 19th-century urban industrialization and the rapid technological innovation and consumer culture of the post-World War II era—spurred social movements and scientific research on the problem, mostly as it pertained to adverse impacts on human health. Technological innovations aimed to eliminate unwanted wastes and more stringent regulations followed. Those technological and political solutions largely failed to keep pace with the increasing volume and diversity of pollutants industrial capitalism introduced into the environment, however, and rarely stopped pollution at its root cause. Instead, they often merely moved pollutants from one “sink”—a repository of pollution—to another (from water to land, for instance) and/or from one place to another (to a city downstream, for instance, or from one urban neighborhood to another).
This “end of pipe” approach remained overwhelmingly predominant even as most pollution mitigation policies became nationalized in the 1970s. Prior to that, municipalities and states were primarily responsible for addressing air, water, and land pollution. During the post-World War II period, policy—driven by ecological science—began to exhibit an understanding of urban pollution’s detrimental effects beyond human health. More broadly, evolving scientific understanding of human health and ecosystemic impacts of pollution, new technology, and changing social relations within growing metropolitan areas shifted the public perception of pollution’s harmful impacts. Scientific understanding of how urban and suburban residents risked ill health when exposed to polluted water, air, and soil grew, as did the social understanding of who was most vulnerable to these hazards. From the nation’s founding, the cumulative impact of both urban exposure to pollutants and attempts to curb that exposure has been unequal along lines of race and ethnicity, class, and gender. Despite those consistent inequalities, the 21st-century American city looks little like the 18th-century American city, whether in terms of population size, geographical footprint, demographics, economic activity, or the policies that govern it: all of these factors have influenced the very definitions of ideas such as pollution and the urban.
Ross A. Kennedy
World War I profoundly affected the United States. It led to an expansion of America’s permanent military establishment, a foreign policy focused on reforming world politics, and American preeminence in international finance. In domestic affairs, America’s involvement in the war exacerbated class, racial, and ethnic conflict. It also heightened both the ethos of voluntarism in progressive ideology and the progressive desire to step up state intervention in the economy and society. These dual impulses had a coercive thrust that sometimes advanced progressive goals of a more equal, democratic society and sometimes repressed any perceived threat to a unified war effort. Ultimately the combination of progressive and repressive coercion undermined support for the Democratic Party, shifting the nation’s politics in a conservative direction as it entered the 1920s.
In the decade after 1965, radicals responded to the alienating features of America’s technocratic society by developing alternative cultures that emphasized authenticity, individualism, and community. The counterculture emerged from a handful of 1950s bohemian enclaves, most notably the Beat subcultures in the Bay Area and Greenwich Village. But new influences shaped an eclectic and decentralized counterculture after 1965, first in San Francisco’s Haight-Ashbury district, then in urban areas and college towns, and, by the 1970s, on communes and in myriad counter-institutions. The psychedelic drug cultures around Timothy Leary and Ken Kesey gave rise to a mystical bent in some branches of the counterculture and influenced counterculture style in countless ways: acid rock redefined popular music; tie dye, long hair, repurposed clothes, and hip argot established a new style; and sexual mores loosened. Yet the counterculture’s reactionary elements were strong. In many counterculture communities, gender roles mirrored those of mainstream society, and aggressive male sexuality inhibited feminist spins on the sexual revolution. Entrepreneurs and corporate America refashioned the counterculture aesthetic into a marketable commodity, ignoring the counterculture’s incisive critique of capitalism. Yet the counterculture became the basis of authentic “right livelihoods” for others. Meanwhile, the politics of the counterculture defy ready categorization. The popular imagination often conflates hippies with radical peace activists. But New Leftists frequently excoriated the counterculture for rejecting political engagement in favor of hedonistic escapism or libertarian individualism. Both views miss the most important political aspects of the counterculture, which centered on the embodiment of a decentralized anarchist bent, expressed in the formation of counter-institutions like underground newspapers, urban and rural communes, head shops, and food co-ops. 
As the counterculture faded after 1975, its legacies became apparent in the redefinition of the American family, the advent of the personal computer, an increasing ecological and culinary consciousness, and the marijuana legalization movement.
Founded in Philadelphia in 1869, the Noble and Holy Order of the Knights of Labor became the largest and most powerful labor organization that had ever existed in the United States by the mid-1880s. Recruiting men and women of nearly all occupations and all races (except Chinese), the Knights tried to reform American capitalism and politics in ways that would curb the growing economic and political abuses and excesses of the Gilded Age. Leaders of the organization viewed strikes as harmful to workers and employers alike, especially after the Great Railroad Strike of 1877, but a series of railroad strikes in 1884 and 1885 caused the Knights’ membership rolls to reach a peak of at least 700,000 in 1886.
The heyday of the Knights of Labor proved brief, though. Two major events in May 1886, the Haymarket riot in Chicago and the failure of a strike against Jay Gould’s Southwestern Railway system, began a series of setbacks that caused the organization to decline about as rapidly as it had arisen. By 1893, membership dropped below 100,000, and the Knights’ leaders aligned the organization with the farmers’ movement and the Populist Party. The Knights increasingly became a rural organization, as urban skilled and semi-skilled workers joined trade unions affiliated with the American Federation of Labor (AFL). The AFL, however, proved less inclusive and egalitarian than the Knights of Labor, although some of the latter’s ideals would be carried on by later organizations such as the Industrial Workers of the World and the Congress of Industrial Organizations.
Anne L. Foster
The beginning of the modern war on drugs in the United States is commonly credited to President Richard Nixon, who evoked fears of crime, degenerate youth, and foreign drugs to garner support for his effort, massive by early 1970s standards, to combat drugs in the United States. Scholars now agree, however, that the essential characteristics of the “war on drugs” stretched back to the early 20th century. The first federal law to prohibit a narcotic in the United States passed in 1909 and banned the import of “smoking opium.” Although opium itself remained legal, opium prepared for smoking—a form believed to be consumed predominantly by ethnic Chinese and imported into the United States—was not. All future anti-narcotics policies drew on these foundational notions: narcotics were of foreign origin and invaded the United States. Thus, interdiction efforts at U.S. borders, and increasingly in producer countries, were an appropriate response. Narcotics consumers were presented as equally threatening, viewed as foreigners or at the margins of American society, and U.S. lawmakers therefore criminalized both drug use and drug trafficking. With drugs as well as drug users defined as foreign threats, militarization of the efforts to prohibit drugs followed. In U.S. drug policy, there is no distinction between foreign and domestic policy. They are intertwined at all levels, including the definition of the problem, the origin of many drugs, and the sites of enforcement.
Theodore Roosevelt played a seminal role in the rise of the United States to Great Power status at the turn of the 20th century and in debates about World War I and the League of Nations. Prior to entering the White House, TR was a leading proponent of a more ambitious foreign policy. As the 26th president he promoted US predominance in the Western Hemisphere, engaged in Great Power diplomacy, and oversaw expansion of the navy. He also laid the foundations for modern presidential statecraft with forceful advocacy of specific policy goals, a close relationship with the press, and an intense engagement with public opinion. After leaving Washington, he was among the most ardent critics of President Woodrow Wilson’s policies and helped to build support for the Allies and for preparing to enter what would become the “Great War,” or World War I. At the time of his death, he was a leading contender for the Republican presidential nomination.
Scholarly and public surveys frequently rank Roosevelt among the most successful presidents, especially in the realm of foreign policy. His influence can be observed in successors as diverse as Wilson, Franklin D. Roosevelt, Ronald Reagan, and Barack Obama. Yet historians have also scrutinized his views on race, gender, imperialism, and violence, many of which appear outdated or problematic from an early-21st-century perspective. Also troubling was Roosevelt’s demonization of antiwar activists during World War I and his sometimes heavy-handed attempts to promote loyalty among citizens of German or Irish descent.
The “Chinese 49’ers” who arrived in the United States a decade before the American Civil War constituted the first large wave of Asian migrants to America and transplanted the first Asian cuisine to America. Chinese food was the first ethnic cuisine to be highly commodified at the national level as a type of food primarily to be prepared and consumed away from home. At the end of the 19th century, food from China began to attract a fast-growing non-Chinese clientele of diverse ethnic backgrounds in major cities across the nation, and by 1980 Chinese food had become the most popular ethnic cuisine in the United States, aided by a renewal of Chinese immigration to America. Chinese food also has been a vital economic lifeline for Chinese Americans as one of the two main sources of employment (laundries being the other) for Chinese immigrants and families for decades. Its development, therefore, is an important chapter in American history and a central part of the Chinese American experience.
The multiple and often divergent trends in the U.S. Chinese-food industry show that it is at a crossroads today. Its future hinges on the extent to which Chinese Americans can significantly alter their position in the social and political arena and on China’s ability to transform the economic equation in its relationship with the United States.
One of the pervasive myths about the United States is that it has never had a socialist movement comparable to other industrialized nations. Yet in the early 20th century a vibrant Socialist Party and socialist movement flourished in the United States. Created in 1901, the Socialist Party of America unsurprisingly declared its primary goal to be the collectivization of the means of production. Yet the party’s highly decentralized and democratic structure enabled it to adapt to the needs and cultures of diverse constituencies in different regions of the country. Among those attracted to the movement in its heyday were immigrant and native-born workers and their families, tenant farmers, middle-class intellectuals, socially conscious millionaires, urban reformers, and feminists. Party platforms regularly included the reform interests of these groups as well as the long-term goal of eradicating capitalism. By 1912, the Socialist Party boasted an impressive record of electoral successes at the local, state, and national levels. U.S. Socialists could also point with pride to over three hundred English and foreign-language Socialist periodicals, some with subscription rates that rivaled those of the major urban daily newspapers.
Yet Socialists faced numerous challenges in their efforts to build a viable third-party movement in the United States. On the one hand, progressive reformers in the Democratic and Republican parties sought to coopt Socialists. On the other hand, the Socialist Party encountered challenges on the left from anarchists, syndicalists, communists, and Farmer-Labor Party activists. The Socialist Party was particularly weakened by government repression during World War I, by the postwar Red Scare, and by a communist insurgency within its ranks in the aftermath of the war. By the onset of the Great Depression, the Communist Party would displace the Socialist Party as the leading voice of radical change in the United States.
Ted R. Bromund
The Special Relationship is a term used to describe the close relations between the United States and the United Kingdom. It applies particularly to the governmental realms of foreign, defense, security, and intelligence policy, but it also captures a broader sense that both public and private relations between the United States and Britain are particularly deep and close. The Special Relationship is thus a term for a reality that came into being over time as the result of political leadership as well as ideas and events outside the formal arena of politics.
After the political break of the American Revolution and in spite of sporadic cooperation in the 19th century, it was not until the Great Rapprochement of the 1890s that the idea that Britain and the United States had a special kind of relationship took hold. This decade, in turn, created the basis for the Special Relationship, a term first used by Winston Churchill in 1944. Churchill did the most to build the relationship, convinced as he was that close friendship between Britain and the United States was the cornerstone of world peace and prosperity. During and after the Second World War, many others on both sides of the Atlantic came to agree with Churchill.
The post-1945 era witnessed a flowering of the relationship, which was cemented—not without many controversies and crises—by the emerging Cold War against the Soviet Union. After the end of the Cold War in 1989, the relationship remained close, though it was severely tested by further security crises, Britain’s declining defense spending, the evolving implications of Britain’s membership in the European Union, the relative decline of Europe, and an increasing U.S. interest in Asia. Yet on many public and private levels, relations between the United States and Britain continue to be particularly deep, and thus the Special Relationship endures.
Charles M. Payne
The only youth-led national civil rights organization in the 1960s in the United States, the Student Nonviolent Coordinating Committee (SNCC), grew out of sit-ins, with the base of its early membership coming from Black colleges. It became one of the most militant civil rights groups, pushing older organizations to become more aggressive. Under the tutelage of the experienced activist Ella Baker, it emphasized developing leadership in “ordinary” people. Its early years were dominated by direct action campaigns against White supremacy in the urban and Upper South, while internally, SNCC strove to actualize the Beloved Community. Later it specialized in grassroots community organizing and voter registration in dangerous areas of the Deep South. Its Freedom Summer campaign played a significant role in radicalizing young activists. SNCC, in general, acted as a training ground and model for other forms of youth activism. Notwithstanding its own issues with chauvinism, SNCC was open to leadership from women in a way that few social change organizations of the time were.
Paul D. Miller
Afghanistan has twice been thrust front and center of US national security concerns in the past half-century: first, during the Soviet-Afghan War, when Afghanistan served as a proxy for American efforts to combat Soviet influence; and second, as the frontline state and host for America’s global response to al-Qaida’s terrorist attacks of 2001. In both instances, American involvement swung from intensive investment and engagement to withdrawal and neglect. In both cases, American involvement reflected US concerns more than Afghan realities. And both episodes resulted in short-term successes for American security with long-term consequences for Afghanistan and its people. The signing of a strategic partnership agreement between the two countries in 2012 and a bilateral security agreement in 2013 created the possibility of a steadier and more forward-looking relationship—albeit one that the American and Afghan people may be less inclined to pursue as America’s longest war continues to grind on.
Don H. Doyle
America’s Civil War became part of a much larger international crisis as European powers, happy to see the experiment in self-government fail in America’s “Great Republic,” took advantage of the situation to reclaim former colonies in the Caribbean and establish a European monarchy in Mexico. Overseas, in addition to their formal diplomatic appeals to European governments, both sides also experimented with public diplomacy campaigns to influence public opinion. Confederate foreign policy sought to win recognition and aid from Europe by offering free trade in cotton and aligning the Confederate cause with that of the aristocratic, anti-democratic governing classes of Europe. The Union, instead, appealed to liberal, republican sentiment abroad by depicting the war as a trial of democratic government and embracing emancipation of the slaves. The Union victory led to the withdrawal of European empires from the New World: Spain from Santo Domingo, France from Mexico, Russia from Alaska, and Britain from Canada. The destruction of slavery in the United States, meanwhile, hastened its end in Puerto Rico, Cuba, and Brazil.
Francis D. Cogliano
Thomas Jefferson was a key architect of early American foreign policy. He had a clear vision of the place of the new republic in the world, which he articulated in a number of writings and state papers. The key elements to his strategic vision were geographic expansion and free trade. Throughout his long public career Jefferson sought to realize these ends, particularly during his time as US minister to France, secretary of state, vice president, and president. He believed that the United States should expand westward and that its citizens should be free to trade globally. He sought to maintain the right of the United States to trade freely during the wars arising from the French Revolution and its aftermath. This led to his greatest achievement, the Louisiana Purchase, but also to conflicts with the Barbary States and, ultimately, Great Britain. He believed that the United States should usher in a new world of republican diplomacy and that it would be in the vanguard of the global republican movement. In the literature on US foreign policy, historians have tended to identify two main schools of practice dividing practitioners into idealists and realists. Jefferson is often regarded as the founder of the idealist tradition. This somewhat misreads him. While he pursued clear idealistic ends—a world dominated by republics freely trading with each other—he did so using a variety of methods including diplomacy, war, and economic coercion.