
Article

Thomas Jefferson was a key architect of early American foreign policy. He had a clear vision of the new republic's place in the world, which he articulated in a number of writings and state papers. The key elements of his strategic vision were geographic expansion and free trade. Throughout his long public career, Jefferson sought to realize these ends, particularly during his time as US minister to France, secretary of state, vice president, and president. He believed that the United States should expand westward and that its citizens should be free to trade globally. He sought to maintain the right of the United States to trade freely during the wars arising from the French Revolution and its aftermath. This led to his greatest achievement, the Louisiana Purchase, but also to conflicts with the Barbary States and, ultimately, Great Britain. He believed that the United States should usher in a new world of republican diplomacy and that it would be in the vanguard of the global republican movement. In the literature on US foreign policy, historians have tended to identify two main schools of practice, dividing practitioners into idealists and realists. Jefferson is often regarded as the founder of the idealist tradition. This somewhat misreads him. While he pursued clearly idealistic ends—a world dominated by republics freely trading with each other—he did so using a variety of methods, including diplomacy, war, and economic coercion.

Article

Tourism is so deep-seated in the history of U.S. foreign relations that we seem to have taken its presence for granted. Millions of American tourists have traveled abroad, yet one can count on two hands the number of scholarly monographs analyzing the relationship between U.S. foreign relations and tourism. What explains this lack of historical reflection about one of the most quotidian forms of U.S. influence abroad? In an influential essay about wilderness and the American frontier, the environmental historian William Cronon argues, “one of the most striking proofs of the cultural invention of wilderness is its thoroughgoing erasure of the history from which it sprang.” Historians and the American public, perhaps in similar fashion, have overlooked tourism’s role in the nation’s international affairs. Only a culture and a people so intimately familiar with tourism’s practices could naturalize them out of history. The history of international tourism is profoundly entangled with the history of U.S. foreign policy. This entanglement has involved, among other things, science and technology, military intervention, diplomacy, and the promotion of consumer spending abroad. U.S. expansion created the structure (the social stability, medical safety, and transportation infrastructure) for globetrotting travel in the 20th century. As this essay shows, U.S. foreign policy was crucial in transforming foreign travel into a middle-class consumer experience.

Article

Barin Kayaoğlu

Since the 1780s, the geographical, historical, cultural, and ideational chasm between Turkey and the United States has remained wide. The presence of US merchants, missionaries, and educators in Ottoman territories, the immigration of hundreds of thousands of Ottoman subjects to the United States, the tens of thousands of US military personnel stationed in Turkey during the Cold War, and the thousands of Turkish citizens who continue to attend US universities every year have not been able to bridge that gap. Beyond culture and geography, much of the disconnect between Turkey and the United States stemmed from Turkish and American leaders’ ignorance of the other side. From the 19th century onward, Ottoman and Turkish leaders hoped to use the United States as a counterweight in Europe’s great-power games, despite Washington’s lack of interest in such a role until 1945. Likewise, most US administrations after 1945 assumed that Turkey’s national interests could be easily reconciled with US global security priorities.

Article

Brian McNeil

The United States and Nigeria have a long history, stretching back to the transatlantic slave trade in the 18th century and continuing today through economic and security partnerships. While the relationship has evolved over time and both countries have helped to shape each other’s histories in important ways, there remains a tension between hope and reality, in which both sides struggle to live up to the expectations set for themselves and for each other. The United States looks to Nigeria to be the model of progress and stability in Africa that the West African state wants to become; Nigeria looks to the United States for support of its development and security needs, despite American assistance continuously coming up short. There have been many strains in the relationship, and relations between the two countries have continued to ebb and flow between cooperation and conflict. Whatever friction there might be, the relationship between the United States and Nigeria is important to analyze because it offers a window onto broad trends and currents in international history, such as decolonization, humanitarianism, energy politics, and terrorism.

Article

The Special Relationship is a term used to describe the close relations between the United States and the United Kingdom. It applies particularly to the governmental realms of foreign, defense, security, and intelligence policy, but it also captures a broader sense that both public and private relations between the United States and Britain are particularly deep and close. The Special Relationship is thus a term for a reality that came into being over time as the result of political leadership as well as ideas and events outside the formal arena of politics. After the political break of the American Revolution and in spite of sporadic cooperation in the 19th century, it was not until the Great Rapprochement of the 1890s that the idea that Britain and the United States had a special kind of relationship took hold. This decade, in turn, created the basis for the Special Relationship, a term first used by Winston Churchill in 1944. Churchill did the most to build the relationship, convinced as he was that close friendship between Britain and the United States was the cornerstone of world peace and prosperity. During and after the Second World War, many others on both sides of the Atlantic came to agree with Churchill. The post-1945 era witnessed a flowering of the relationship, which was cemented—not without many controversies and crises—by the emerging Cold War against the Soviet Union. After the end of the Cold War in 1989, the relationship remained close, though it was severely tested by further security crises, Britain’s declining defense spending, the evolving implications of Britain’s membership in the European Union, the relative decline of Europe, and an increasing U.S. interest in Asia. Yet on many public and private levels, relations between the United States and Britain continue to be particularly deep, and thus the Special Relationship endures.

Article

The United States was heavily involved in creating the United Nations in 1945 and drafting its charter. The United States continued to exert substantial clout in the organization after its founding, though there have been periods during which U.S. officials have met with significant opposition inside the United Nations, in Congress, and in American electoral politics, all of which produced struggles to gain support for America’s international policy goals. U.S. influence in the international organization has thus waxed and waned. The early postwar years witnessed the zenith of American prestige on the global stage. Starting in the mid- to late 1950s, as decolonization and the establishment of newly independent nations quickened, the United States began to lose influence in the United Nations owing to the spreading perception that its alliances with the European colonial powers placed it on the wrong side of history. As U.N. membership skyrocketed, the organization became more responsive to the needs and interests of the decolonizing states. During the 1970s and early 1980s, the American public responded to declining U.S. influence in the United Nations with calls to defund the organization and to pursue a unilateral approach to international challenges. The role of the United States in the United Nations was shaped by the politics of the Cold War competition with the Soviet Union. Throughout the nearly five decades of the Cold War, the United Nations served as a forum for the political and ideological rivalry between the United States and the Soviet Union, which frequently inhibited the organization from fulfilling what most considered to be its primary mission: the maintenance of global security and stability. After the collapse of the Soviet Union and the peaceful end of the Cold War, the United States enjoyed a brief period of unrivaled global hegemony. During this period, U.S. 
officials pursued a closer relationship with the United Nations and sought to use the organization to build support for its international policy agenda and military interventionism.

Article

The U.S. relationship with Southeast Asia has always reflected the state of U.S. interactions with the three major powers that surround the region: Japan, China, and, to a lesser extent, India. Initially, Americans looked at Southeast Asia as an avenue to the rich markets that China and India seemed to offer, while also finding trading opportunities in the region itself. Later, American missionaries sought to save Southeast Asian souls, while U.S. officials often viewed Southeast Asia as a region that could tip the overall balance of power in East Asia if its enormous resources fell under the control of a hostile power. American interest expanded enormously with the annexation of the Philippines in 1899, an outgrowth of the Spanish-American War. That acquisition resulted in a nearly half-century of American colonial rule, while American investors increased their involvement in exploiting the region’s raw materials, notably tin, rubber, and petroleum, and missionaries expanded into areas previously closed to them. American occupation of the Philippines heightened tensions with Japan, which sought the resources of Southeast Asia, particularly in French Indochina, Malaya, and the Dutch East Indies (today’s Indonesia). Eventually, clashing ambitions and perceptions brought the United States into World War II. Peeling those territories away from Japan during the war was a key American objective. Americans resisted the Japanese in the Philippines and in Burma, but after Japan quickly subdued Southeast Asia, there was little contact in the region until the reconquest began in 1944. American forces participated in the liberation of Burma and also fought in the Dutch East Indies and the Philippines before the war ended in 1945. After the war, the United States had to face the independence struggles in several Southeast Asian countries, even as the Grand Alliance fell apart and the Cold War emerged, which for the next several decades overshadowed almost everything.
American efforts to prevent communist expansion in the region inhibited American support for decolonization and led to war in Vietnam and Laos and covert interventions elsewhere. With the end of the Cold War in 1991, relations with most of Southeast Asia have generally been normal, except for Burma/Myanmar, where a brutal military junta ruled. The opposition, led by the charismatic Aung San Suu Kyi, found support in the United States. More recently American concerns with China’s new assertiveness, particularly in the South China Sea, have resulted in even closer U.S. relations with Southeast Asian countries.

Article

The United States never sought to build an empire in Africa in the 19th and 20th centuries, as did European nations from Britain to Portugal. However, economic, ideological, and cultural affinities gradually encouraged the development of relations with the southern third of the continent (the modern Anglophone nations of South Africa, Zimbabwe, Zambia, Namibia, the former Portuguese colonies of Mozambique and Angola, and a number of smaller states). With official ties limited for decades, missionaries and business concerns built a small but influential American presence mostly in the growing European settler states. This state of affairs made the United States an important trading partner during the 20th century, but it also reinforced the idea of a white Christian civilizing mission as justification for the domination of black peoples. The United States served as a comparison point for the construction of legal systems of racial segregation in southern Africa, even as it became more politically involved in the region as part of its ideological competition with the Soviet Union. As Europe’s empires dissolved after World War II, official ties to white settler states such as South Africa, Angola, and Rhodesia (modern Zimbabwe) brought the United States into conflict with mounting demands for decolonization, self-determination, and racial equality—both international and domestic. Southern Africa illustrated the gap between a Cold War strategy predicated on Euro-American preponderance and national traditions of liberty and democracy, eliciting protests from civil and human rights groups that culminated in the successful anti-apartheid movement of the 1980s. Though still a region of low priority at the beginning of the 21st century, American involvement in southern Africa evolved to emphasize the pursuit of social and economic improvement through democracy promotion, emergency relief, and health aid—albeit with mixed results. The history of U.S. 
relations with southern Africa therefore illustrates the transformation of trans-Atlantic racial ideologies and politics over the last 150 years, first in the construction of white supremacist governance and later in the eventual rejection of this model.

Article

In December 1979, Soviet troops entered the small, poor, landlocked, Islamic nation of Afghanistan, assassinated the communist president, Hafizullah Amin, and installed a more compliant Afghan leader. For almost ten years, Soviet troops remained entrenched in Afghanistan before finally withdrawing in February 1989. During this period, the United States undertook a covert program to assist the anti-communist Afghan insurgents—the mujahideen—to resist the Soviet occupation. Beginning with President Jimmy Carter’s small-scale authorization in July 1979, the secret war became the largest in history under President Ronald Reagan, running up to $700 million per year. The Central Intelligence Agency (CIA) acted as the war’s quartermaster, arranging supplies of weapons for the mujahideen, which were funneled through Pakistan’s Inter-Services Intelligence directorate (ISI), in coordination with Saudi Arabia, China, Egypt, and others. No Americans were directly involved in the fighting, and the overall cost to the American taxpayer was in the region of $2 billion. The Afghan cost was much higher. Over a million Afghans were killed, a further two million wounded, and over six million refugees fled to neighboring Pakistan and Iran. For the Soviet Union, the ten-year war constituted its largest military action in the postwar era, and the long and protracted nature of the conflict and the failure of the Red Army to subdue the Afghans is partially responsible for the internal turmoil that contributed to the eventual breakup of the Soviet empire at the end of the 1980s. The defeat of the Soviet 40th Army in Afghanistan proved to be the final major superpower battle of the Cold War, but it also marked the beginning of a new era. The devastation and radicalization of Afghan society resulted in the subsequent decades of continued conflict and warfare and the rise of militant Islamic fundamentalism that has shaped the post-Cold War world.

Article

The relationship between the United States and the island of Ireland combines nostalgic sentimentality and intervention in the sectarian conflict known as the “Troubles.” Irish migration to the United States remains a celebrated and vital part of the American saga, while Irish American interest—and involvement—in the “Troubles” during the second half of the 20th century was a problematic issue in transatlantic relations and for those seeking to establish a peaceful political consensus on the Irish question. Paradoxically, much of the historiography of American–Irish relations addresses the social, economic, and cultural consequences of the Irish in America, yet the major political issue—namely the United States’ approach to the “Troubles”—has only recently become the subject of thorough historiographical inquiry. As much as the Irish have contributed to developments in American history, the American contribution to the Anglo-Irish process, and ultimately the peace process, to end the conflict in Northern Ireland is an example of the peacemaking potential of US foreign policy.

Article

With the outbreak of war in Europe, a growing fear of and ultimately a concerted effort to defeat Adolf Hitler and Nazi Germany defined American involvement. Competing Allied national and strategic interests resulted in serious debates, but the common desire to defeat the enemy proved stronger than any disagreements. President Franklin Roosevelt, understanding the isolationist sentiments of the American public and the dangers of Nazism and Imperial Japan perhaps better than most, carefully led the nation through the difficult period of 1939–1941, overseeing a gradual increase in American military preparedness and support for those standing up to Nazi Germany, as the German military forces achieved victory after victory. Following American entry into the war, strategic discussions in 1942–1943 often involved ambitious American military plans countered by British voices of moderation. The forces and supplies available made a direct invasion of northern France unfeasible. The American desire to launch an immediate invasion across the English Channel gave way to the Allied invasion of North Africa and subsequent assault on Sicily and the Italian peninsula. The Tehran Conference in November 1943 marked a transition, as the buildup of American forces in Europe and the overwhelming contribution of war materials enabled the United States to determine American-British strategy from late 1943 to the end of the war. The final year and a half of the war in Europe saw a major shift in strategic leadership, as the United States along with the Soviet Union assumed greater control over the final steps toward victory over Nazi Germany. By the end of World War II (May 1945 in Europe and September 1945 in Asia), the United States had not only assumed the leadership of the Western Allies, it had achieved superpower status with the greatest air force and navy in the world. It was also the sole possessor of the atomic bomb. 
Even with the tensions with the Soviet Union and beginnings of a Cold War, most Americans felt the United States was the leader as the world entered the post-war era.

Article

The Cold War may have ended on the evening of November 9, 1989, when East German border guards opened up checkpoints and allowed their fellow citizens to stream into West Berlin; it certainly was over by January 28, 1992, when U.S. president George H. W. Bush delivered his annual State of the Union Address one month after President Mikhail Gorbachev had announced his resignation and the end of the Soviet Union. After the Berlin Wall came down, Bush and Gorbachev spoke of the Cold War in the past tense in person and on the telephone. The reunification of Germany and the U.S. military campaign in the Persian Gulf confirmed that reality. In January 1991, polls indicated that, for the first time, a majority of Americans believed that the Cold War was over. However, the poll results obscured the substantial foreign and domestic crises, challenges, and opportunities created by the end of the Cold War that occupied President Bush and his national-security team between November 1989 and Bush’s defeat in the 1992 presidential election and the inauguration of William Jefferson Clinton as America’s first post–Cold War president in January 1993.

Article

The region that today constitutes the United States–Mexico borderland has evolved through various systems of occupation over thousands of years. From time immemorial, the land was used and inhabited by ancient peoples whose cultures we can only understand through the archeological record and the beliefs of their living descendants. Spain, then Mexico and the United States after it, attempted to control the borderlands but failed when confronted with indigenous power, at least until the late 19th century, when American capital and police established firm dominance. Since then, borderland residents have often fiercely contested this supremacy at the local level, but the borderland has also, due to the primacy of business, expressed deep harmonies and cooperation between the U.S. and Mexican federal governments. It is a majority-minority zone in the United States, populated largely by Mexican Americans. The border is both a porous membrane across which tremendous wealth passes and a territory of interdiction in which noncitizens and smugglers are subject to unusually concentrated police attention. All of this exists within a particularly harsh ecosystem characterized by extreme heat and scarce water.

Article

American strategy in the Asia Pacific over the past two centuries has been marked by strong and often contradictory impulses. On the one hand, the western Pacific has served as a fertile ground for Christian missionaries, an alluring destination for American commercial enterprises, and eventually a critical launchpad for U.S. global power projection. Yet on the other hand, American policymakers at times have subordinated Asian strategy to European-based interests, or have found themselves embroiled in area conflicts that have hampered efforts to extend U.S. regional hegemony. Furthermore, leading countries in the Asia-Pacific region at times have challenged U.S. economic and military objectives, and the assertion of “Asian values” in recent years has undermined efforts to expand Western political and cultural norms. The United States’ professed “pivot to Asia” has opened a new chapter in a centuries-long relationship, one that will determine the geopolitical fault lines of the 21st century.

Article

James F. Siekmeier

Throughout the 19th and 20th centuries, U.S. officials often viewed Bolivia as both a potential “test case” for U.S. economic foreign policy and a place where Washington’s broad visions for Latin America might be implemented relatively easily. After World War II, Washington leaders sought to show both Latin America and the nonindustrialized world that a relatively open economy could produce significant economic wealth for Bolivia’s working and middle classes, thus giving the United States a significant victory in the Cold War. Washington sought a Bolivia widely open to U.S. influence, and Bolivia often seemed an especially pliable country. In order to achieve their goals in Bolivia, U.S. leaders dispensed a large amount of economic assistance to Bolivia in the 1950s—a remarkable development in two senses. First, the U.S. government, generally loath to aid Third World nations, gave this assistance to a revolutionary regime. Second, the U.S. aid program for Bolivia proved to be a precursor to the Alliance for Progress, the massive aid program for Latin America in the 1960s that comprised the largest U.S. economic aid program in the Third World. Although U.S. leaders achieved their goal of a relatively stable, noncommunist Bolivia, the decision in the late 1950s to significantly increase U.S. military assistance to Bolivia’s relatively small military emboldened that military, which staged a coup in 1964, snuffing out democracy for nearly two decades. The country’s long history of dependency in both export markets and public- and private-sector capital investment led Washington leaders to think that dependency would translate into leverage over Bolivian policy. However, the historical record is mixed in this regard. Some Bolivian governments have accommodated U.S. demands; others have successfully resisted them.

Article

Following the Spanish-American War of 1898 and the illegal overthrow and annexation of Hawai‘i, the US government transplanted its colonial education program to places in the Caribbean and the Pacific Islands. Specifically, American Sāmoa, Guam, Hawai‘i, Puerto Rico, the Philippines, and the US Virgin Islands would all have some aspect of the native boarding school system implemented. In many ways, the colonial education system in Guam was both emblematic of and an exception to the native boarding schools in the continental United States. Utilizing Guam as a case study reveals how the US military used schools as a site to spread settler colonial policies in an attempt to transform Chamorros into colonial subjects who would support American occupation.

Article

Karim Elkady

Since the 1830s, Egyptian regimes have sought US governmental support to assist Egypt in gaining its independence and to enable it to act freely in the region. Because historically the United States had no territorial interests in Egypt, Egyptian leaders solicited this strategic connection as potential leverage: first against the Ottoman Empire, France, and England from the 1830s to World War I, then against the British military occupation until 1954, and finally against Israel’s occupation of Sinai from 1967 to 1973. Egypt also courted US assistance to support its regional ambitions, to assume leadership of the Arab World, and to stabilize the Middle East. Later, the economic and financial challenges that Egypt has faced in its recent history have led it to request and rely on US military and economic aid. US interests in Egypt have shifted during their relationship. Initially the United States was interested in trade and protection of private US citizens, especially its Protestant missionaries. But after World War II and the rise of the United States to a position of global leadership, US motives changed. Due to US interests in Persian Gulf oil, its commitment to defend Israel, and its interest in protecting Egypt against the control of hostile powers, the United States became more invested in securing Egypt’s strategic location and utilizing its regional political weight. The United States became involved in securing Egypt from Axis invasions during World War II and in containing Soviet attempts to lock Egypt into an alliance with Moscow. After a period of tense relations from the 1950s to the early 1970s, Egypt and the United States reached a rapprochement in 1974. From that time on, the Egyptian–US strategic partnership emerged, especially after the Camp David Accords, to protect the region from the Soviet Union, the Islamic Republic of Iran, and Iraq under Saddam Hussein, and then to contain the rise of terrorism.

Article

Thomas P. Cavanna

In its most general sense, grand strategy can be defined as the overarching vision that shapes a state’s foreign policy and approach to national security. Like any strategy, it requires the coherent articulation of the state’s ends and means, which necessitates prioritizing vital interests, identifying key threats and opportunities, and (within certain limits) adapting to circumstances. What makes it truly “grand” is that it encompasses both wartime and peacetime, harnesses immediate realities to long-term objectives, and requires the coordination of all instruments of power (military, economic, etc.). Although American leaders have practiced grand strategic thinking since the early days of the Republic, the concept of grand strategy itself only started to emerge during World War I due to the expansion and diversification of the state’s resources and prerogatives, the advent of industrial warfare, and the growing role of populations in domestic politics and international conflicts. Moreover, it was only during World War II that it detached itself from military strategy and gained real currency among decision-makers. The contours, desirability, and very feasibility of grand strategy have inspired lively debates. However, many scholars and leaders consider it a worthy (albeit complex) endeavor that can reduce the risk of resource-squandering, signal intentions to both allies and enemies, facilitate adjustments to international upheavals, and establish a baseline for accountability. America’s grand strategy evolved from relative isolationism to full-blown liberal internationalism after 1945. Yet its conceptualization and implementation are inherently contentious processes because of political/bureaucratic infighting and recurrent dilemmas such as the uncertain geographic delimitation of US interests, the clash of ideals and Realpolitik, and the tension between unilateralism and multilateralism. 
The end of the Cold War, the 9/11 attacks, China’s rise, and other challenges have further compounded those lines of fracture.

Article

Robert McGreevey

U.S. imperialism took a variety of forms in the early 20th century, ranging from colonies in Puerto Rico and the Philippines to protectorates in Cuba, Panama, and other countries in Latin America, and open door policies such as that in China. Formal colonies would be ruled with U.S.-appointed colonial governors and supported by U.S. troops. Protectorates and open door policies promoted business expansion overseas through American oversight of foreign governments and, in the case of threats to economic and strategic interests, the deployment of U.S. marines. In all of these imperial forms, U.S. empire-building both reflected and shaped complex social, cultural, and political histories with ramifications for both foreign nations and America itself.

Article

Brandon Wolfe-Hunnicutt

Oil played a central role in shaping US policy toward Iraq over the course of the 20th century. The United States first became involved in Iraq in the 1920s as part of an effort to secure a role for American companies in Iraq’s emerging oil industry. As a result of State Department efforts, American companies gained a 23.75 percent ownership share of the Iraq Petroleum Company in 1928. In the 1940s, US interest in the country increased as a result of the Cold War with the Soviet Union. To defend against a perceived Soviet threat to Middle East oil, the US supported British efforts to “secure” the region. After nationalist officers overthrew Iraq’s British-supported Hashemite monarchy in 1958 and established friendly relations with the Soviet Union, the United States cultivated an alliance with the Iraqi Baath Party as an alternative to the Soviet-backed regime. The effort to cultivate an alliance with the Baath foundered as a result of the Baath’s perceived support for Arab claims against Israel. The breakdown of US–Baath relations led the Baath to forge an alliance with the Soviet Union. With Soviet support, the Baath nationalized the Iraq Petroleum Company in 1972. Rather than resulting in a “supply cutoff,” Soviet economic and technical assistance allowed for a rapid expansion of the Iraqi oil industry and an increase in Iraqi oil flowing to world markets. As Iraq experienced a dramatic oil boom in the 1970s, the United States looked to the country as a lucrative market for US exports and adopted a policy of accommodation with regard to the Baath. This policy of accommodation gave rise to close strategic and military cooperation throughout the 1980s as Iraq waged war against Iran. When Iraq invaded Kuwait and seized control of its oil fields in 1990, the United States shifted to a policy of Iraqi containment. 
The United States organized an international coalition that quickly ejected Iraqi forces from Kuwait, but chose not to pursue regime change for fear of destabilizing the country and wider region. Throughout the 1990s, the United States adhered to a policy of Iraqi containment but came under increasing pressure to overthrow the Baath and dismantle its control over the Iraqi oil industry. In 2003, the United States seized upon the 9/11 terrorist attacks as an opportunity to implement this policy of regime change and oil reprivatization.