Philippe R. Girard
Haiti (known as Saint-Domingue until it gained its independence from France in 1804) had a notable economic and political impact on the United States during the revolutionary era, when it forced U.S. statesmen to confront issues they had generally avoided, most prominently racism and slavery. But the impact of the Haitian Revolution was most tangible in areas like commerce, territorial expansion, and diplomacy. Saint-Domingue served as a staging ground for the French military and navy during the American Revolution and provided troops for the siege of Savannah in 1779. It became the United States’ second-largest commercial partner during the 1780s and 1790s. After Saint-Domingue’s slaves revolted in 1791, many of its inhabitants found refuge in the United States, most notably in Philadelphia, Charleston, and New Orleans. Fears (or hopes) that the slave revolt would spread to the United States were prevalent in public opinion. As Saint-Domingue achieved quasi-autonomous status under the leadership of Toussaint Louverture, it occupied a central place in the diplomacy of John Adams and Thomas Jefferson. The Louisiana Purchase was made possible in part by the failure of a French expedition to Saint-Domingue in 1802–1803. Bilateral trade declined after Saint-Domingue acquired its independence in 1804 and became Haiti, but the new nation continued to loom large in the African-American imagination, and there were several attempts to use Haiti as a haven for U.S. freedmen. The question of U.S. diplomatic recognition of Haiti also served as a reference point for antebellum debates on slavery, the slave trade, and the status of free people of color in the United States.
Sworn in as the 33rd President of the United States following Franklin D. Roosevelt’s death in April 1945, Harry S. Truman faced the daunting tasks of winning the war and ensuring future peace and stability. Chided by critics for his lack of foreign policy experience but championed by supporters for his straightforward decision-making, Truman guided the United States from World War to Cold War. The Truman presidency marked a new era in American foreign relations, with the United States emerging from World War II unmatched in economic strength and military power. The country assumed a leadership position in a postwar world primarily shaped by growing antagonism with the Soviet Union. Truman pursued an interventionist foreign policy that took measures to contain Soviet influence in Europe and stem the spread of communism in Asia. Under his leadership, the United States witnessed the dawn of the atomic age, approved billions of dollars in economic aid to rebuild Europe, supported the creation of multilateral organizations such as the United Nations and North Atlantic Treaty Organization, recognized the state of Israel, and intervened in the Korean peninsula. The challenges Truman confronted and the policies he implemented laid the foundation for 20th-century US foreign relations throughout the Cold War and beyond.
The Haymarket Riot and Conspiracy of 1886 is a landmark in American social and political history. On May 4, 1886, during an open-air meeting near Haymarket Square in Chicago, someone threw a dynamite bomb into a squad of police, sparking a riot that resulted in the deaths of seven police officers and at least four rioters. Eight anarchists were brought to trial. Though the bomb-thrower was never apprehended, the eight radical leaders were charged as accessories before the fact for conspiring to murder the police. After the longest criminal trial in Illinois history up to that time, all eight were convicted: seven were condemned to death and one sentenced to a long prison term. After all appeals were exhausted, four were executed, one cheated the hangman with a jail-cell suicide, and the death sentences of two others were commuted to life imprisonment (all three incarcerated men were later pardoned by Governor John Peter Altgeld in 1893).
The Haymarket bombing and trial marked a pivotal moment in the history of American social movements. It sparked the nation’s first red scare, whose fury disrupted even moderately leftist movements for a generation. It drove the nation’s labor unions onto a more conservative path than the one they had been following before the bombing. The worldwide labor campaign for clemency for the convicted men became the foundation for the institution of International Workers’ Day on May 1, a holiday ironically observed in most countries except the United States. It also began a tradition within the American left of memorializing the Haymarket defendants as the first martyrs to their cause.
Thomas Alan Schwartz
Henry Kissinger was the most famous and most controversial American diplomat of the second half of the 20th century. Escaping Nazi persecution in the 1930s, serving in the American Army of occupation in Germany after 1945, and then pursuing a successful academic career at Harvard University, Kissinger had already achieved national prominence as a foreign policy analyst and defense intellectual when he was appointed national security adviser by President Richard Nixon in January 1969. Kissinger quickly became the president’s closest adviser on foreign affairs and worked with Nixon to change American foreign policy in response to domestic upheaval caused by the Vietnam War in the late 1960s and early 1970s. Nixon and Kissinger’s initiatives, primarily détente with the Soviet Union, the opening to the People’s Republic of China, and ending American involvement in the Vietnam War, received strong domestic support and helped to bring about Nixon’s re-election landslide in 1972. In the wake of the Watergate scandal, Nixon appointed Kissinger secretary of state in August 1973. As Nixon’s capacity to govern deteriorated, Kissinger assumed all-but-presidential powers, even putting American forces on alert during the Yom Kippur War and then engaging in “shuttle diplomacy” in the Middle East, achieving the first-ever agreements between Israel and Egypt and between Israel and Syria. Kissinger retained a dominating influence over foreign affairs during the presidency of Gerald Ford, even as he became a lightning rod for critics on both the left and right of the political spectrum. Although out of public office after 1977, Kissinger remained in the public eye as a foreign policy commentator, wrote three volumes of memoirs as well as other substantial books on diplomacy, and created a successful international business-consulting firm. His only governmental positions were as chair of the Commission on Central America in 1983–1984 and a brief stint on the 9/11 Commission in 2002.
The United States is a nation built on credit, both public and private. This article focuses on private credit: that is, credit extended to businesses and consumers by private entities such as banks, other businesses, and retail stores. Business credit involves short-term lending for inventories, payroll, and the like, as well as long-term lending for the building of factories, offices, and other physical plant. Trade credit, bank loans, bonds, and commercial paper are all forms of business credit. Consumer credit is extended to individuals or households to fund purchases ranging from basic necessities to homes. Informal store credits, installment sales, personal loans from banks and other institutions, credit cards, home mortgages, and student loans are forms of consumer credit.
Until the 20th century, the federal government remained mostly uninvolved in the private credit markets. Then, after World War I and especially during the Great Depression, the government deliberately expanded the credit available to certain targeted groups, such as farmers and home buyers. After World War II the government helped to expand lending even further, this time to small businesses and students. For the most part, the government accomplished its goal not by lending directly but by insuring the loans made by private entities, thereby encouraging them to make more loans. In the case of home mortgages and student loans, the government took the lead in creating a national market for securitized debt—debt that is turned into securities, such as bonds, and offered to investors—through the establishment of government-sponsored enterprises, nicknamed Fannie Mae (1938), Ginnie Mae (1968), Freddie Mac (1970), and Sallie Mae (1972). Innovations such as these by businesses and government made credit increasingly available to ordinary people, whose attitudes toward borrowing changed accordingly.
Since the 1880s, the US government has deported more than 55 million immigrants, the majority of whom came from Latin American countries. But the history of immigrant deportations from the United States dates back further, as both colonial and state governments practiced expulsions. Many expulsions were based not on immigrant status but on integration or membership in a town or state. US citizens, for example, found themselves expelled from Massachusetts between the 1840s and 1870s under laws that targeted the migrant poor. In the 1880s, US federal authorities constructed the nation’s first deportation policy, building off earlier state expulsion policies. Early federal deportation policy reflected the racism and nativism of the era. In an expression of anti-Chinese racism, one of the very first deportation provisions passed by the federal government targeted Chinese immigrants. Other early federal deportation provisions included ones aimed at “idiots,” prostitutes, alcoholics, and public charges. The earliest federal deportation policy was narrow in scope, in part because the laws held that only people who entered the country in violation of an exclusion provision were deportable, and in part because time limits protected most long-term immigrants from deportation.
Beginning in the second decade of the 20th century, lawmakers slowly expanded deportation policy to make actions on US soil, or what have been called “post-entry infractions,” deportable offenses. The newly created post-entry infractions included a small number of crimes and provisions that targeted political radicals. After the 1920s, immigration authorities focused their enforcement actions more on Mexican immigrants than on any other group under an expanding deportation policy, and they did so for racist reasons. The numbers of Mexicans deported increased with each passing decade, eventually reaching as many as a million people a year. Almost all immigrant deportations from the United States—more than 48 million—have taken place since 1965. In that year, the federal government entered the business of mass and constant deportations. As deportations multiplied, the proportion of deportees sent to Latin American countries other than Mexico also grew. Although the majority of deportations in US history have been carried out for entering or remaining in the country in violation of immigration law, major anti-crime campaigns in the last forty years have resulted in a growing number of deportations for post-entry infractions.
Timothy S. Huebner
The Supreme Court of the United States stands at the head of the nation’s judicial system. Created in Article III of the Constitution of 1787 but obscured by the other branches of government during the first few decades of its history, the Court came into its own as a co-equal branch in the early 19th century. Its exercise of judicial review—the power that it claimed to determine the constitutionality of legislative acts—gave the Court a unique status as the final arbiter of the nation’s constitutional conflicts. From the slavery question during the antebellum era to abortion and gay rights in more recent times, the Court has decided cases brought to it by individual litigants, and in doing so has shaped American constitutional and legal development. The Court is composed of unelected justices who serve “during good behavior,” and its rise in stature has not gone uncontested. Throughout the nation’s history, Congress, the president, and organized interest groups have all attempted to influence the Court’s jurisdiction, composition, and decision making. The Court’s prominence reflects Americans’ historically paradoxical attitudes toward the judiciary: they have often been suspicious of the power of unelected judges at the same time that they have relied on independent judicial institutions to resolve their deepest disputes.
Kathryn Cramer Brownell
Hollywood has always been political. Since its early days, it has intersected with national, state, and local politics. As heads of a new entertainment industry attempting to gain a footing in a society on whose outskirts it firmly sat, the industry’s Jewish leaders worked hard to advance its merits to a Christian political establishment. At the local and state level, film producers faced threats of censorship and potential regulation of the more democratic spaces their theaters provided for immigrant and working-class patrons. As Hollywood gained economic and cultural influence, the political establishment took note, attempting to shape silver screen productions and deploy Hollywood’s publicity innovations for its own purposes. Over the course of the 20th century, industry leaders forged political connections with politicians from both parties to promote their economic interests, and politically motivated actors, directors, writers, and producers across the ideological spectrum used their entertainment skills to advance ideas and messages on and off the silver screen. At times this collaboration generated enthusiasm for its ability to bring new citizens into the electoral process. At other times, however, it drew intense criticism, and fears abounded that entertainment would undermine the democratic process with a focus on style over substance. As Hollywood personalities entered the political realm—for personal, professional, and political gain—the industry slowly reshaped American political life, bringing entertainment, glamor, and emotion to the political process and transforming how Americans communicate with their elected officials and, indeed, how they view their political leaders.
Kristin M. Szylvian
Since the administration of President Franklin D. Roosevelt (1933–1945), federal housing policy has been devoted primarily to maintaining the economic stability and profitability of the private-sector real estate, household finance, and home-building and supply industries. Until the 1970s, federal policy encouraged speculative residential development in suburban areas and extended segregation by race and class. The National Association of Home Builders, the National Association of Realtors, and other allied organizations strenuously opposed federal programs seeking to assist low- and middle-income households and the homeless by forcing recalcitrant suburbs to permit the construction of open-access, affordable dwellings and encouraging the rehabilitation of urban housing. During the 1980s, President Ronald Reagan, a Republican from California, argued that it was the government, not the private sector, that was responsible for the gross inequities in social and economic indicators between residents of city, inner-ring, and outlying suburban communities. The civic, religious, consumer, labor, and other community-based organizations that tried to mitigate the adverse effects of the “Reagan Revolution” on the affordable housing market lacked a single coherent view or voice. Since that time, housing has become increasingly unaffordable in many metropolitan areas, and segregation by race, income, and ethnicity is on the rise once again. If the home mortgage crisis that began in 2007 is any indication, housing will continue to be a divisive political, economic, and social issue for the foreseeable future.
The national housing goal of a “decent home in a suitable living environment for every American family” has yet to be realized; indeed, many lawmakers now favor eliminating or further restricting the federal commitment to its realization.
Sarah B. Snyder
In its formulation of foreign policy, the United States takes account of many priorities and factors, including national security concerns, economic interests, and alliance relationships. An additional factor, one whose significance has risen and fallen over time, is human rights, or more specifically violations of human rights. The extent to which the United States should consider such abuses or seek to moderate them has been and continues to be the subject of considerable debate.
Sean P. Harvey
“Race,” as a concept denoting a fundamental division of humanity and usually encompassing cultural as well as physical traits, was crucial in early America. It provided the foundation for the colonization of Native land, the enslavement of American Indians and Africans, and a common identity among socially unequal and ethnically diverse Europeans. Longstanding ideas and prejudices merged with aims to control land and labor, a dynamic reinforced by ongoing observation and theorization of non-European peoples. Although, before colonization, neither American Indians nor Africans nor Europeans considered themselves unified “races,” Europeans endowed racial distinctions with legal force and philosophical and scientific legitimacy, while Natives appropriated categories of “red” and “Indian,” and slaves and freed people embraced those of “African” and “colored,” to imagine more expansive identities and mobilize more successful resistance to Euro-American societies. The origin, scope, and significance of “racial” difference were questions of considerable transatlantic debate in the age of Enlightenment, and they acquired particular political importance in the newly independent United States.
Since the beginning of European exploration in the 15th century, voyagers called attention to the peoples they encountered, but European, American Indian, and African “races” did not exist before colonization of the so-called New World. Categories of “Christian” and “heathen” were initially most prominent, though observations also encompassed appearance, gender roles, strength, material culture, subsistence, and language. As economic interests deepened and colonies grew more powerful, classifications distinguished Europeans from “Negroes” or “Indians,” but at no point in the history of early America was there a consensus that “race” denoted bodily traits only. Rather, it was a heterogeneous compound of physical, intellectual, and moral characteristics passed on from one generation to another. While Europeans assigned blackness and African descent priority in codifying slavery, skin color was secondary to broad dismissals of the value of “savage” societies, beliefs, and behaviors in providing a legal foundation for dispossession.
“Race” originally denoted a lineage, such as a noble family or a domesticated breed, and concerns over purity of blood persisted as 18th-century Europeans applied the term—which dodged the controversial issue of whether different human groups constituted “varieties” or “species”—to describe a roughly continental distribution of peoples. Drawing upon the frameworks of scripture, natural and moral philosophy, and natural history, scholars endlessly debated whether different races shared a common ancestry, whether traits were fixed or susceptible to environmentally produced change, and whether languages or the body provided the best means to trace descent. Racial theorization boomed in the early U.S. republic, as some citizens found dispossession and slavery incompatible with natural-rights ideals, while others reconciled any potential contradictions through assurances that “race” was rooted in nature.
Benjamin C. Montoya
A fear of foreignness shaped the immigration and foreign policies of the United States up to the end of World War II. US leaders perceived nonwhite peoples of Latin America, Asia, and Europe as racially inferior, and feared that contact with them, even annexation of their territories, would invite their foreign mores, customs, and ideologies into US society. This belief in nonwhite peoples’ foreignness also influenced US immigration policy, as Washington codified laws that prohibited the immigration of nonwhite peoples to the United States, even as immigration was deemed a net gain for a US economy that was rapidly industrializing from the late 19th century through the first half of the 20th century.
Ironically, this fear of foreignness fostered an aggressive US foreign policy for many of the years under study, as US leaders feared that European intervention into Latin America, for example, would undermine the United States’ regional hegemony. The fear of foreignness that seemed to oblige the United States to shore up its national security interests vis-à-vis European empires also demanded US intervention into the internal affairs of nonwhite nations. For US leaders, fear of foreignness was a two-sided coin: European aggression was encouraged by the internal instability of nonwhite nations, and nonwhite nations were unstable—and hence ripe for the picking by Europe’s empires—because their citizens were racially inferior. To forestall both of these simultaneous foreign threats, the United States increasingly embedded itself into the political and economic affairs of foreign nations.
This ironic pairing of opportunity, in the form of territorial acquisitions and the immigrants who fed US labor markets, with fear, of European encroachment and of nonwhite peoples deemed racially inferior, lay at the root of the immigration and foreign policies of the United States up to 1945.
Between 1820 and 1924, nearly thirty-six million immigrants entered the United States. Prior to the Civil War, the vast majority of immigrants were northern and western Europeans, though the West Coast received Chinese immigration from the late 1840s onward. In the middle of the 19th century, the United States received an unprecedented influx of Irish and German immigrants, who included a large number of Catholics and the poor. At the turn of the 20th century, the major sources of immigration shifted to southern and eastern Europe, and Asians and Mexicans made up a growing portion of newcomers. Throughout the long 19th century, urban settlement remained a popular option for immigrants, and they contributed to the social, cultural, political, economic, and physical growth of the cities in which they resided. Foreign-born workers also provided much-needed labor for America’s industrial development. At the same time, intense nativism emerged in cities in opposition to the presence of foreigners, who appeared to be unfit for American society, threats to Americans’ jobs, or sources of urban problems such as poverty. Anti-immigrant sentiment resulted in state and federal laws to prevent the immigration of undesirable foreigners, such as the poor, southern and eastern Europeans, and Asians. Cities constituted an integral part of the 19th-century American immigration experience.
The Immigration Act of 1924 was in large part the result of a deep political and cultural divide in America between heavily immigrant cities and far less diverse small towns and rural areas. The 1924 legislation, together with growing residential segregation, midcentury federal urban policy, and postwar suburbanization, undermined scores of ethnic enclaves in American cities between 1925 and the 1960s. The deportation of Mexicans and their American children during the Great Depression, the incarceration of West Coast Japanese Americans during World War II, and the wartime and postwar shift of so many jobs to suburban and Sunbelt areas also reshaped many US cities in these years. The Immigration Act of 1965, which enabled the immigration of large numbers of people from Asia, Latin America, and, eventually, Africa, helped to revitalize many depressed urban areas and inner-ring suburbs. In cities and suburbs across the country, the response to the new immigration since 1965 has ranged from welcoming to hostile. The national debate over immigration in the early 21st century reflects both familiar and newer cultural, linguistic, religious, racial, and regional rifts. However, urban areas with a history of immigrant incorporation remain the most politically supportive of immigrants, just as they were a century ago.
Post-1945 immigration to the United States differed markedly from America’s earlier 20th- and 19th-century immigration patterns, most notably in the dramatic rise in the numbers of immigrants from Asia. Beginning in the late 19th century, the U.S. government took steps to bar immigration from Asia. The establishment of the national origins quota system in the 1924 Immigration Act narrowed the entryway for eastern and central Europeans, making western Europe the dominant source of immigrants. These policies shaped the racial and ethnic profile of the American population before 1945. Signs of change began to occur during and after World War II. The recruitment of temporary agricultural workers from Mexico led to an influx of Mexicans, and the repeal of Asian exclusion laws opened the door for Asian immigrants. Responding to complex international politics during the Cold War, the United States also formulated a series of refugee policies, admitting refugees from Europe, the Western Hemisphere, and later Southeast Asia. The movement of people to the United States increased drastically after 1965, when immigration reform ended the national origins quota system. The intricate and intriguing history of U.S. immigration after 1945 thus demonstrates how the United States related to a fast-changing world: its less restrictive immigration policies increased the fluidity of the American population, with a substantial impact on American identity and domestic policy.
John P. Bowes
As a topic, Indian removal primarily encompasses the relocation of Native American tribes from American-claimed states and territories east of the Mississippi River to lands west of the Mississippi in the first half of the 19th century. The bill passed by Congress in May 1830, referred to as the Indian Removal Act, is the legislative expression of the ideology upon which federal and state governments acted to accomplish the dispossession and relocation of tens of thousands of Native American peoples during that time. Through both treaty negotiations and coercion, federal officials used the authority of removal policies to obtain land cessions and resettle eastern Indians in what is known in the early 21st century as Kansas and Oklahoma. These actions, in conjunction with non-Indian population growth and western migration, made it extremely difficult, if not impossible, for any tribes to remain on their eastern lands. The Cherokee Trail of Tears, which entailed the forced removal of approximately fourteen thousand men, women, and children from Georgia from the summer of 1838 to the spring of 1839, remains the most well-known illustration of this policy and its impact. Yet the comprehensive history of removals encompasses the forced relocations of tens of thousands of indigenous men, women, and children from throughout the Southeast as well as the Old Northwest from the 1810s into the 1850s.
The history of American slavery began long before the first Africans arrived at Jamestown in 1619. Evidence from archaeology and oral tradition indicates that for hundreds, perhaps thousands, of years prior, Native Americans had developed their own forms of bondage. This fact should not be surprising, for most societies throughout history have practiced slavery. In her cross-cultural and historical research on comparative captivity, Catherine Cameron found that bondspeople composed 10 percent to 70 percent of the population of most societies, lending credence to Seymour Drescher’s assertion that “freedom, not slavery, was the peculiar institution.” If slavery is ubiquitous, however, it is also highly variable. Indigenous American slavery, rooted in warfare and diplomacy, was flexible, often offering its victims escape through adoption or intermarriage, and it was divorced from racial ideology, deeming all foreigners—men, women, and children, of whatever color or nation—potential slaves. Thus, Europeans did not introduce slavery to North America. Rather, colonialism brought distinct and evolving notions of bondage into contact with one another. At times, these slaveries clashed, but they also reinforced and influenced one another. Colonists, who had a voracious demand for labor and export commodities, exploited indigenous networks of captive exchange, producing a massive global commerce in Indian slaves. This began with the second voyage of Christopher Columbus in 1495 and extended in some parts of the Americas through the twentieth century. During this period, between 2 and 4 million Indians were enslaved. Elsewhere in the Americas, Indigenous people adapted Euro-American forms of bondage. In the Southeast, an elite class of Indians began to hold African Americans in transgenerational slavery and, by 1800, developed plantations that rivaled those of their white neighbors. The story of Native Americans and slavery is complicated: millions were victims, some were masters, and the nature of slavery changed over time and varied from one place to another. A significant and long overlooked aspect of American history, Indian slavery shaped colonialism, exacerbated Native population losses, figured prominently in warfare and politics, and influenced Native and colonial ideas about race and identity.
Although often attributed to the Odawa ogima, or headman, Pontiac, the conflict that bears his name was the work of a large and complicated network of Native people in the Ohio Valley, Great Lakes, and Mississippi Valley. Together, Native Americans from this wide swath of North America identified their collective dissatisfaction with British Indian policy and, through careful negotiation and discussion, formulated a religious and political ideology that advocated for the Britons’ removal. In 1763, these diverse peoples carried out a successful military campaign that demonstrated Native sovereignty and power in these areas. Although it fell short of its original goal of displacing the British, the coalition compelled the British Empire to change its policies and to show, outwardly at least, respect for Native peoples. Many of the peoples involved in the struggle would wage another such pan-Indian campaign against the United States a generation later.
In many ways, the anti-British campaign of 1761–1766 mirrored another anti-imperial campaign that followed a decade later. Like the American revolutionaries, the anti-British advocates of Pontiac’s War developed an ideology that not only critiqued British policy but often questioned imperialism altogether, built an unstable and delicate coalition of diverse and sometimes antagonistic peoples, and sought to assert their own independence from the British. However, the participants in Pontiac’s War were sovereign and autonomous indigenous nations, only recently and nominally allied to the British Empire, not British colonists, as in the American Revolution. Together these anti-British activists mounted a serious challenge to the British presence in the trans-Appalachian West and forced the British Empire to accede to many of their demands.
The eighty years from 1790 to 1870 were marked by dramatic economic and demographic changes in the United States. Cities in this period grew faster than the country as a whole, drawing migrants from the countryside and immigrants from overseas. This dynamism stemmed from cities’ roles as spearheads of commercial change and sites of new forms of production. Internal improvements such as canals and railroads expanded urban hinterlands in the early republic, while urban institutions such as banks facilitated market exchange. Both of these worked to the advantage of urban manufacturers. By paying low wages to workers performing repetitive tasks, manufacturers enlarged the market for their products but also engendered opposition from a workforce internally divided along lines of sex and race, and at times slavery and freedom. The Civil War affirmed the legitimacy of wage labor and enhanced the power of corporations, setting the stage for the postwar growth of large-scale, mechanized industry.
Mass transit has been part of the urban scene in the United States since the early 19th century. Regular steam ferry service began in New York City in the early 1810s, and horse-drawn omnibuses plied city streets starting in the late 1820s. Expanding networks of horse railways emerged by the mid-19th century. The electric streetcar became the dominant mass transit vehicle a half century later. During this era, mass transit had a significant impact on American urban development. Mass transit’s importance in the lives of most Americans started to decline with the growth of automobile ownership in the 1920s, except for a temporary rise in transit ridership during World War II. In the 1960s, congressional subsidies began to reinvigorate mass transit, and heavy-rail systems opened in several cities, followed by light-rail systems in several others over the next decades. Today, concerns about environmental sustainability and urban revitalization have stimulated renewed interest in the benefits of mass transit.