
Article

Company towns can be defined as communities dominated by a single company, typically focused on one industry. Beyond that very basic definition, company towns varied in their essentials. Some were purpose-built by companies, often in remote areas convenient to needed natural resources. There, workers were often required to live in company-owned housing as a condition of employment. Others began as small towns with privately owned housing, usually expanding alongside a growing hometown corporation. Residences were shoddy in some company towns. In others, company-built housing may have been excellent, with indoor plumbing and central heating, and located close to such amenities as schools, libraries, perhaps even theaters. Company towns played a key role in US economic and social development. Such places can be found across the globe, but America’s vast expanse of undeveloped land, generous stock of natural resources, tradition of social experimentation, and laissez-faire attitude toward business provided singular opportunities for the emergence of such towns, large and small, in many regions of the United States. Historians have identified as many as 2,500 such places. A tour of company towns can serve as a survey of the country’s industrial development, from the first large-scale planned industrial community—the textile town of Lowell, Massachusetts—to Appalachian mining villages, Western lumber towns, and steelmaking principalities such as the mammoth development at Gary, Indiana. More recent office-park and high-tech industrial-park complexes probably do not qualify as company towns, although they have some similar attributes. Nor does a planned town such as Disney Corporation’s Celebration, Florida, qualify, despite its close ties to a single corporation, because its residents do not necessarily work for Disney. Company towns have generally tended toward one of two models. First, and perhaps most familiar, are total institutions—communities where one business exerts a Big Brother–ish grip over the population, controlling or even taking the place of government, collecting rent on company-owned housing, dictating buying habits (possibly at the company store), and even directing where people worship and how they may spend their leisure time. A second form consists of model towns—planned, ideal communities backed by companies that promised to share their bounty with workers and families. Several such places were carefully put together by experienced architects and urban planners. Such model company towns were marked by a paternalistic, watchful attitude toward the citizenry on the part of the company overlords.

Article

Lise Namikas

At the dawn of the 20th century, the region that would become the Democratic Republic of Congo fell to the brutal colonialism of Belgium’s King Leopold. Except for a brief moment when anti-imperialists decried the crimes of plantation slavery, the United States paid little attention to Congo before 1960. But after winning its independence from Belgium in June 1960, Congo suddenly became engulfed in a crisis of decolonization and the Cold War, a time when the United States and the Soviet Union competed for resources and influence. The confrontation in Congo was kept limited by a United Nations (UN) peacekeeping force, which ended the secession of the province of Katanga in 1964. At the same time, the CIA (Central Intelligence Agency) intervened to help create a pro-Western government and eliminate the Congo’s first prime minister, Patrice Lumumba. Ironically, the result would be a growing reliance on the dictatorship of Joseph Mobutu throughout the 1980s. In 1997 a rebellion succeeded in toppling Mobutu from power. Since 2001 President Joseph Kabila has ruled Congo. The United States has supported long-term social and economic growth but has kept its distance while watching Kabila fight internal opponents and insurgents in the east. A UN peacekeeping force returned to Congo and helped limit unrest. Although his second full term ended in 2016, Kabila was slow to call elections amid rising turmoil.

Article

Foreign relations under the US Constitution starts with the paradox, also seen in domestic matters, of relatively scant text providing guidance for the exercise of vast power. Founding understandings, structural inference, and ongoing constitutional custom and precedent have filled in much, though hardly all, of the framework over the course of two hundred years. As a result, two basic questions frame the relationship between the Constitution and US foreign policy: (1) which parts of the US government, alone or in combination, properly exercise authority in the making of foreign policy; and (2) once made, what is the status of the nation’s international legal obligations in the US domestic legal system. The making of American foreign policy is framed by the Constitution’s commitment to separation of powers. Congress, the president, and the courts are all allocated discrete yet significant foreign affairs authority. Determining the exact borders and overlaps in areas such as the use of military force, emergency measures, and treaty termination continues to generate controversy. The status of international law in the US legal system in the first instance turns on whether resulting obligations derive from agreements or custom. The United States enters into international agreements in three ways: treaties, congressional-executive agreements, and sole executive agreements. Complex doctrine deals with the domestic applicability of treaties in particular. US courts primarily apply customary international law in two basic ways. They can exercise a version of their common lawmaking authority to fashion rules of decision based on international custom. They also apply customary international law when incorporated into domestic law by statute.

Article

Contagious diseases have long posed a public health challenge for cities, going back to the ancient world. Diseases traveled over trade routes from one city to another. Cities were also crowded and often dirty, ideal conditions for the transmission of infectious disease. The Europeans who settled North America quickly established cities, especially seaports, and contagious diseases soon followed. By the late 17th century, ports like Boston, New York, and Philadelphia experienced occasional epidemics, especially smallpox and yellow fever, usually introduced from incoming ships. Public health officials tried to prevent contagious diseases from entering the ports, most often by establishing a quarantine. These quarantines were occasionally effective, but more often the disease escaped into the cities. By the 18th century, city officials recognized an association between dirty cities and epidemic diseases. The appearance of a contagious disease usually occasioned a concerted effort to clean streets and remove garbage. By the early 19th century, these efforts gave rise to sanitary reform to prevent infectious diseases. Sanitary reform went beyond cleaning streets and removing garbage, to ensuring clean water supplies and effective sewage removal. By the end of the century, sanitary reform had done much to clean the cities and reduce the incidence of contagious disease. In the 20th century, public health gained two new tools: vaccination and antibiotics. Vaccination, first used against smallpox, was developed against numerous other infectious viral diseases and reduced their incidence substantially. Finally, the development of antibiotics against bacterial infections in the mid-20th century enabled physicians to cure infected individuals. Contagious disease remains a problem—witness AIDS—and public health authorities still rely on quarantine, sanitary reform, vaccination, and antibiotics to keep urban populations healthy.

Article

The United States often views itself as a nation of immigrants. This may in part be why since the early 20th century the country has seldom adopted major changes in its immigration policy. Until 1986, only the 1924 National Origins Quota Act, its revision in the 1952 McCarran-Walter Act, and its dismantlement in the 1965 Immigration and Nationality Act, also known as the Hart-Celler Act, involved far-reaching reforms. Another large shift occurred with the passage of the 1986 Immigration Reform and Control Act (IRCA) and its sequel, the 1990 Immigration Act. No major immigration legislation has yet won congressional approval in the 21st century. IRCA emerged from and followed in considerable measure the recommendations of the Select Commission on Immigration and Refugee Policy (1979–1981). That body sought to reconcile two competing political constituencies, one favoring the restriction of immigration, or at least unauthorized immigration, and the other an expansion of family-based and work-related migration. The IRCA legislation contained something for each side: the passage of employer sanctions, or serious penalties on employers for hiring unauthorized workers, for the restriction side; and the provision of a legalization program, which outlined a pathway for certain unauthorized entrants to obtain green cards and eventually citizenship, for the reform side. The complete legislative package also included other provisions: criteria allowing the admission of agricultural workers, a measure providing financial assistance to states for the costs they would incur from migrants legalizing, a requirement that states develop ways to verify that migrants were eligible for welfare benefits, and a substantial boost in funding for border enforcement activities. In the years after the enactment of IRCA, research has revealed that the two major compromise provisions, together with the agricultural workers provision, generated mixed results. Employer sanctions did little to curtail unauthorized migration, in all likelihood because of minimal funding for enforcement, while legalization and the agricultural measures resulted in widespread enrollment, with almost all of the unauthorized migrants who qualified coming forward to take advantage of the opportunity to become U.S. lawful permanent residents (LPRs). But when the agricultural workers provisions allowing entry of temporary workers are juxtaposed with the relatively unenforceable employer-sanctions provisions, IRCA entailed contradictory elements that created frustration for some observers. In sociocultural, political, and historical terms, scholars and others can interpret IRCA’s legalization as reflecting the inclusive, pluralistic, and expansionist tendencies characteristic of much of 18th-century U.S. immigration. But some of IRCA’s other elements led to contradictory effects, with restriction efforts being offset by the allowances for more temporary workers. This helped to spawn subsequent political pressures in favor of new restrictive or exclusive immigration controls that created serious hazards for immigrants.

Article

In May 1861, three enslaved men who were determined not to be separated from their families ran to Fort Monroe, Virginia. Their flight led to the phenomenon of Civil War contraband camps. Contraband camps were refugee camps to which between four hundred thousand and five hundred thousand enslaved men, women, and children in the Union-occupied portions of the Confederacy fled to escape their owners by getting themselves to the Union Army. Army personnel had not envisioned overseeing a massive network of refugee camps. Responding to the interplay between the actions of the former slaves who fled to the camps, Republican legislation and policy, military orders, and real conditions on the ground, the army improvised. In the contraband camps, former slaves endured overcrowding, food and clothing shortages, poor sanitary conditions, and constant danger. They also gained the protection of the Union Army and access to the power of the US government as new, though unsteady, allies in the pursuit of their key interests, including education, employment, and the reconstitution of family, kin, and social life. The camps brought together actors who had previously had little to no contact with each other, exposed everyone involved to massive structural forces that were much larger than the human ability to control them, and led to unexpected outcomes. They produced a refugee crisis on US soil, affected the course and outcome of the Civil War, influenced the progress of wartime emancipation, and altered the relationship between the individual and the national government. Contraband camps were simultaneously humanitarian crises and incubators for a new relationship between African Americans and the US government.

Article

American history is replete with instances of counterinsurgency. This is unsurprising given that the United States has always engaged in empire building and has therefore needed to pacify resistance to expansion. For much of its existence, the U.S. has relied on its Army to pacify insurgents. While the U.S. Army used traditional military formations and technology to battle peer enemies, the same strategy did not succeed against opponents who relied on speed and surprise. Indeed, in several instances, insurgents sought to fight the U.S. Army on terms that rendered superior manpower and technology irrelevant. By adopting counterinsurgency as a strategy, the U.S. Army attempted to identify and neutralize insurgents and the infrastructure that supported them. Discussions of counterinsurgency involve complex terms, so readers are provided with simplified yet accurate definitions and explanations. Moreover, understanding the relevant terms provides continuity between conflicts. While certain counterinsurgency measures worked during the American Civil War, the Indian Wars, and the campaign in the Philippines, the concept failed during the Vietnam War. The complexities of counterinsurgency require readers to familiarize themselves with its history, relevant scholarship, and terminology—in particular, counterinsurgency, pacification, and infrastructure.

Article

Andrew Frank

The Creek Confederacy was a loose coalition of ethnically and linguistically diverse Native American towns that slowly coalesced as a political entity in the 18th and early 19th centuries. Its towns existed in Georgia, Alabama, and northern Florida, and for most of its preremoval history, these towns operated as autonomous entities. Several Creek leaders tried to consolidate power and create a more centralized polity, but these attempts at nation building largely failed. Instead, a fragile and informal confederacy connected the towns together for various cultural rituals as well as for purposes of diplomacy and trade. Disputes over centralization, as well as a host of other connected issues, ultimately led to the Creek War of 1813–1814. In the 1830s, the United States forced most members of the Creek Confederacy to vacate their eastern lands and relocate their nation to Indian Territory. Today, their western descendants are known as the Muskogee (Creek) Nation. Those who remained in the east include members of the federally recognized Seminole Tribe of Florida and the Poarch Band of Creek Indians who live in Alabama.

Article

Michael J. Bustamante

The Cuban Revolution transformed the largest island nation of the Caribbean into a flashpoint of the Cold War. After overthrowing US-backed ruler Fulgencio Batista in early 1959, Fidel Castro established a socialist, anti-imperialist government that defied the island’s history as a dependent and dependable ally of the United States. But the Cuban Revolution is not only significant for its challenge to US interests and foreign policy prerogatives. For Cubans, it fundamentally reordered their lives, inspiring multitudes yet also driving thousands of others to migrate to Miami and other points north. Sixty years later, Fidel Castro may be dead and the Soviet Union may be long gone. Cuban socialism has become more hybrid in economic structure, and in 2014 the Cuban and US governments moved to restore diplomatic ties. But Cuba’s leaders continue to insist that “the Revolution,” far from a terminal political event, is still alive. Today, as the founding generation of Cuban leaders passes from the scene, “the Revolution” faces another important crossroads of uncertainty and reform.

Article

Distinctive patterns of daily life defined the Jim Crow South. Contrary to many observers’ emphasis on de jure segregation—meaning racial separation demanded by law—neither law nor the physical separation of blacks and whites was at the center of the early 20th-century South’s social system. Instead, separation, whether by law or custom, was one of multiple tools whites used to subordinate and exclude blacks and to maintain notions of white racial purity. In turn, these notions themselves varied over time and across jurisdictions, at least in their details, as elites tried repeatedly to establish who was “white,” who was “black,” and how the legal fictions they created would apply to Native Americans and others who fit neither category. Within this complex multiracial world of the South, whites’ fundamental commitment to keeping blacks “in their place” manifested most routinely in day-to-day social dramas, often described in terms of racial “etiquette.” The black “place” in question was socially but not always physically distant from whites, and the increasing number of separate, racially marked spaces and actual Jim Crow laws was a development over time that became most pronounced in urban areas. It was a development that reveals blacks’ determination to resist racial oppression and whites’ perceived need to shore up a supposedly natural order that had, in fact, always been enforced by violence as well as political and economic power. Black resistance took many forms, from individual, covert acts of defiance to organized political movements. Whether in response to African Americans’ continued efforts to vote or their early 20th-century boycotts of segregated streetcars or World War I-era patterns of migration that threatened to deplete the agricultural labor force, whites found ways to counter blacks’ demands for equal citizenship and economic opportunity whenever and wherever they appeared. In the rural South, where the majority of black Southerners remained economically dependent on white landowners, a “culture of personalism” characterized daily life within a paternalistic model of white supremacy that was markedly different from urban—and largely national, not merely southern—racial patterns. Thus, distinctions between rural and urban areas and issues of age and gender are critical to understanding the Jim Crow South. Although schools were rigorously segregated, preadolescent children could be allowed greater interracial intimacy in less official settings. Puberty became a break point after which close contact, especially between black males and white females, was prohibited. All told, Jim Crow was an inconsistent and uneven system of racial distinction and separation whose great reach shaped the South’s landscape and the lives of all Southerners, including those who were neither black nor white.

Article

Frederick Rowe Davis

The history of DDT and pesticides in America is overshadowed by four broad myths. The first myth suggests that DDT was the first insecticide deployed widely by American farmers. The second indicates that DDT was the most toxic pesticide to wildlife and humans alike. The third myth assumes that Rachel Carson’s Silent Spring (1962) was an exposé of the problems of DDT rather than a broad indictment of American dependency on chemical insecticides. The fourth and final myth reassures Americans that the ban on DDT late in 1972 resolved the pesticide paradox in America. Over the course of the 20th century, agricultural chemists developed insecticides from plants with phytotoxic properties (“botanical” insecticides) and a range of chemicals including heavy metals such as lead and arsenic, chlorinated hydrocarbons like DDT, and organophosphates like parathion. All of the synthetic insecticides carried profound unintended consequences for landscapes and wildlife alike. More recently, chemists have returned to nature and developed chemical analogs of the botanical insecticides, first with the synthetic pyrethroids and now with the neonicotinoids. Despite their recent introduction, these “neonics” have become widely used in agriculture, and there are suspicions that they contribute to declines in bees and grassland birds.

Article

Death is universal yet is experienced in culturally specific ways. Because of this, when individuals in colonial North America encountered others from different cultural backgrounds, they were curious about how unfamiliar mortuary practices resembled and differed from their own. This curiosity spawned communication across cultural boundaries. The resulting knowledge sometimes facilitated peaceful relations between groups, while at other times it helped one group dominate another. Colonial North Americans endured disastrously high mortality rates caused by disease, warfare, and labor exploitation. At the same time, death was central to the religions of all residents: Indians, Africans, and Europeans. Deathways thus offer an unmatched way to understand the colonial encounter from the participants’ perspectives.

Article

The decolonization of the European overseas empires had its intellectual roots early in the modern era, but its culmination occurred during the Cold War that loomed large in post-1945 international history. This culmination thus coincided with the American rise to superpower status and presented the United States with a dilemma. While the United States was philosophically sympathetic to the aspirations of anticolonial nationalist movements abroad, its vastly greater postwar global security burdens made it averse to the instability that decolonization might bring and that communists might exploit. This fear, and the need to share those burdens with European allies who were themselves still colonial landlords, led Washington to proceed cautiously. The three “waves” of the decolonization process—medium-sized in the late 1940s, large in the half-decade around 1960, and small in the mid-1970s—prompted the American use of a variety of tools and techniques to influence how it unfolded. Prior to independence, this influence was usually channeled through the metropolitan authority then winding down. After independence, Washington continued and often expanded the use of these tools, in most cases on a bilateral basis. In some theaters, such as Korea, Vietnam, and the Congo, the use of certain of these tools, notably covert espionage or overt military operations, meant that Cold War dynamics enveloped, intensified, and overtook local decolonization struggles. In most theaters, other tools, such as traditional or public diplomacy or economic and technical development aid, kept the Cold War in the background as a local transition unfolded. In all cases, the overriding American imperative was to minimize instability and neutralize actors on the ground who could invite communist gains.

Article

The process of urban deindustrialization has been long and uneven. Even the terms “deindustrial” and “postindustrial” are contested; most cities continue to host manufacturing on some scale. After World War II, however, cities that depended on manufacturing for their lifeblood increasingly diversified their economies in the face of larger global, political, and demographic transformations. Manufacturing centers in New England, the Mid Atlantic, and the Midwest United States were soon identified as belonging to “the American Rust Belt.” Steel manufacturers, automakers, and other industrial behemoths that were once mainstays of city life closed their doors as factories and workers followed economic and social incentives to leave urban cores for the suburbs, the South, or foreign countries. Remaining industrial production became increasingly automated, resulting in significant declines in the number of factory jobs. Metropolitan officials faced with declining populations and tax bases responded by adapting their assets—in terms of workforce, location, or culture—to new economies, including warehousing and distribution, finance, health care, tourism, leisure industries like casinos, and privatized enterprises such as prisons. Faced with declining federal funding for renewal, they focused on leveraging private investment for redevelopment. Deindustrializing cities marketed themselves as destinations with convention centers, stadiums, and festival marketplaces, seeking to lure visitors and a “creative class” of new residents. While some postindustrial cities became success stories of reinvention, others struggled. They entertained options to “rightsize” by shrinking their municipal footprints, adapted vacant lots for urban agriculture, or attracted voyeurs to gaze at their industrial ruins. Whether industrial cities faced a slow transformation or the shock of multiple factory closures within a few years, the impact of these economic shifts and urban planning interventions both amplified old inequalities and created new ones.

Article

Will Rogers understood the confounding nature of the Democratic Party. In noting that “Democrats never agree on anything, that’s why they’re Democrats,” the Oklahoma humorist highlighted a consistent theme in the party’s more than 200-year history: division. The political party of the underdog and ethnic, racial, and social minorities has always lacked the cultural cohesion that the Federalists, Whigs, and Republicans possessed. As a result, the main currents of Democratic Party foreign policy elude simple categorization. Muddying any efforts at classification are the dramatically disparate eras in which Democrats conducted foreign policy over two centuries. Like that of other major American political parties, the Democrats’ foreign policy was animated by a messianic theme balanced against national and constituent interests. Thinking themselves a “chosen people,” the Revolutionary generation believed their experiment foreshadowed a new global order with universal appeal. As representatives of God’s new Israel, the Founders made their new nation’s messianic relationship to the international system essential to its identity. Shunning established foreign policy practices, they founded a style of American diplomacy that combined idealism with pragmatism. Democrats, along with almost every other major political party, have followed the Founders’ example, but in a manner particular to the party’s history, constituents, and circumstances. The foreign policy connective tissue of the Democratic Party has been its particular expression of the Founders’ messianic mission interpreted through its ever-evolving cast of disparate constituent groups. In pursuit of this mission, 19th-century Democratic foreign policy favored territorial and commercial expansion to safeguard the republican experiment. In the 20th and 21st centuries, Democrats globalized these sentiments and sought a world conducive to democracy’s survival. But consistency is scarcely the hallmark of Democratic foreign policy. Driven by its disparate constituent groups and domestic politics, the party has pursued diverse foreign policy strategies across an array of historical circumstances. The sum total of Democratic foreign policy is, at times, a contradictory amalgam of diverse constituencies responding to the issues of the moment in a combination of self-interest and democratic idealism.

Article

Peter Cole

The history of dockworkers in America is as fascinating and important as it is unfamiliar. Those who worked along the shore loading and unloading ships played an invaluable role in an industry central to both the U.S. and global economies as well as the making of the nation. For centuries, their work remained largely the same, involving brute manual labor in gangs; starting in the 1960s, however, their work was entirely remade due to technological transformation. Dockworkers possess a long history of militancy, resulting in dramatic improvements in their economic and workplace conditions. Today, nearly all are unionists, but dockworkers in ports along the Atlantic and Gulf coasts belong to the International Longshoremen’s Association (ILA), while the International Longshore and Warehouse Union (ILWU) represents them in Pacific Coast ports as well as in Hawaii and Alaska (along with British Columbia and Panama). In the mid-1930s, the ILA and ILWU became bitter rivals and remain so. This feud, which has cooled slightly since its outset, can be explained by differences in leadership, ideology, and tactics, with the ILA more craft-based, “patriotic,” and mainstream and the ILWU quite left wing, especially during its first few decades, and committed to fighting for racial equality. The existence of two unions complicates this story; in most countries, dockworkers belong to a single union. Similarly, America’s massive economy and physical size mean that there are literally dozens of ports (again, unlike many other countries), making generalizations harder. Unfortunately, popular culture depictions of dockworkers inculcate unfair and incorrect notions that all dockworkers are involved with organized crime. Nevertheless, due to decades of militancy, strikes, and unionism, dockworkers in 21st-century America are—while far fewer in number—very well paid and still do important work, literally making world trade possible in an era when 90 percent of goods move by ship for at least part of their journey to market.

Article

Domestic work was, until 1940, the largest category of women’s paid labor. Despite the number of women who performed domestic labor for pay, the wages and working conditions were often poor. Workers labored long hours for low pay and were largely left out of state labor regulations. The association of domestic work with women’s traditional household labor, defined as a “labor of love” rather than as real work, and its centrality to southern slavery, have contributed to its low status. As a result, domestic work has long been structured by class, racial, and gendered hierarchies. Nevertheless, domestic workers have time and again done their best to resist these conditions. Although traditional collective bargaining techniques did not always translate to the domestic labor market, workers found various collective and individual methods to insist on higher wages and demand occupational respect, ranging from quitting to “pan-toting” to forming unions.

Article

The use of illicit drugs in US cities led to the development of important subcultures with shared practices, codes, discourses, and values. From the 19th century onward, American city dwellers have indulged in opiates, cocaine, amphetamines, cannabis, lysergic acid diethylamide (LSD), crack, and 3,4-methylenedioxymethamphetamine (also known as MDMA or ecstasy). The population density of metropolitan America contributed to the spread of substance use and the rise of communities that centered their lives on drug consumption. In the history of urban drug use, opiates have outlasted all the other drugs and have naturally attracted the bulk of scholarly attention. The nature and identity of these illicit subcultures usually depended on the pharmacology of the drugs and the setting in which they were used. Addictive substances like heroin and amphetamines certainly led to a rise in crime in certain urban areas, but by the same token many urban Americans managed to integrate their addiction into their everyday lives. The more complex pharmacology of psychedelic drugs like LSD in turn gave birth to rich subcultures that resist easy classification. Most drugs began their careers as medical marvels that were accepted as the product of modernity and often used by the middle class or medical practitioners. Race, age, and class prejudice, together with the association of drugs with visible subcultures perceived to pose a threat to the moral fabric of society, can partly explain their subsequent bans.

Article

Probably no American president was more thoroughly versed in matters of national security and foreign policy before entering office than Dwight David Eisenhower. As a young military officer, Eisenhower served stateside in World War I and then in Panama and the Philippines in the interwar years. On assignments in Washington and Manila, he worked on war plans, gaining an understanding that national security entailed economic and psychological factors in addition to manpower and materiel. In World War II, he commanded Allied forces in the European Theater of Operations and honed his skills in coalition building and diplomacy. After the war, he oversaw the German occupation and then became Army Chief of Staff as the nation hastily demobilized. At the onset of the Cold War, Eisenhower embraced President Harry S. Truman’s containment doctrine and participated in the discussions leading to the 1947 National Security Act establishing the Central Intelligence Agency, the National Security Council, and the Department of Defense. After briefly retiring from the military, Eisenhower twice returned to public service at the behest of President Truman to assume the temporary chairmanship of the Joint Chiefs of Staff and then, following the outbreak of the Korean War, to become the first Supreme Allied Commander, Europe, charged with transforming the North Atlantic Treaty Organization into a viable military force. These experiences colored Eisenhower’s foreign policy views, which in turn led him to seek the presidency. He viewed the Cold War as a long-term proposition and worried that Truman’s military buildup would overtax finite American resources. He sought a coherent strategic concept that would be sustainable over the long haul without adversely affecting the free enterprise system and American democratic institutions. He also worried that Republican Party leaders were dangerously insular. As president, he pursued a New Look policy: a cost-effective strategy of containment that relied increasingly on nuclear forces over more expensive conventional ones, sustained existing regional alliances and developed new ones, sought an orderly process of decolonization under Western guidance, resorted to covert operations to safeguard vital interests, and employed psychological warfare in the battle with communism for world opinion, particularly in the so-called Third World. His foreign policy laid the basis for what would become the overall American strategy for the duration of the Cold War. The legacy of that policy, however, was decidedly mixed. Eisenhower avoided the disaster of global war, but technological innovations did not produce the fiscal savings that he had envisioned. The NATO alliance expanded and mostly stood firm, but other alliances were more problematic. Decolonization rarely proceeded as smoothly as envisioned and caused conflict with European allies. Covert operations had long-term negative consequences. In Southeast Asia and Cuba, the Eisenhower administration’s policies bequeathed a poisoned chalice for succeeding administrations.

Article

From 1775 to 1815, empire served as the most pressing foreign relations problem for the United States. Would the new nation successfully break free from the British Empire? What would an American empire look like? How would it be treated by other empires? And could Americans hold their own against European superpowers? These questions dominated the United States’ first few decades of existence and shaped its interactions with American Indian, Haitian, Spanish, British, and French peoples. The US government—first the Continental Congress, then the Confederation Congress, and finally the federal administration under the new Constitution—grappled with five key issues. First, they sought international recognition of their independence and negotiated trade deals during the Revolutionary War to support the war effort. Second, they obtained access to the Mississippi River and Port of New Orleans from Spain and France to facilitate trade and western settlement. Third, they grappled with ongoing conflict with Indian nations over white settlement on Indian lands and demands from white communities for border security. Fourth, they defined and protected American neutrality, negotiated a trade policy that required European recognition of American independence, and denied recognition to Haiti. Lastly, they fought a quasi-war with France and a real war with Great Britain in 1812.