The issue of genocide and American Indian history has been contentious. Many writers see the massive depopulation of the indigenous population of the Americas after 1492 as a clear-cut case of genocide. Other writers, however, contend that European and U.S. actions toward Indians were deplorable but were rarely if ever genocidal. To a significant extent, disagreements about the pervasiveness of genocide in the history of the post-Columbian Western Hemisphere, in general, and U.S. history, in particular, pivot on definitions of genocide. Conservative definitions emphasize intentional actions and policies of governments that result in very large population losses, usually from direct killing. More liberal definitions call for less stringent criteria for intent, focusing more on outcomes. They do not necessarily require direct sanction by state authorities; rather, they identify societal forces and actors. They also allow for several intersecting forces of destruction, including dispossession and disease. Because debates about genocide easily devolve into quarrels about definitions, an open-ended approach to the question of genocide that explores several phases and events provides the possibility of moving beyond the present stalemate. However one resolves the question of genocide in American Indian history, it is important to recognize that European and U.S. settler colonial projects unleashed massively destructive forces on Native peoples and communities. These include violence resulting directly from settler expansion, intertribal violence (frequently aggravated by colonial intrusions), enslavement, disease, alcohol, loss of land and resources, forced removals, and assaults on tribal religion, culture, and language. The configuration and impact of these forces varied considerably in different times and places according to the goals of particular colonial projects and the capacities of colonial societies and institutions to pursue them. The capacity of Native people and communities to directly resist, blunt, or evade colonial invasions proved equally important.
Gentrification is one of the most controversial issues in American cities today. But it also remains one of the least understood. Few agree on how to define it or whether it is a boon or a curse for cities. Gentrification has changed over time and has a history dating back to the early 20th century. Historically, gentrification has had a smaller demographic impact on American cities than suburbanization or immigration. But since the late 1970s, gentrification has dramatically reshaped cities like Seattle, San Francisco, and Boston. Furthermore, districts such as the French Quarter in New Orleans, New York City’s Greenwich Village, and Georgetown in Washington DC have had an outsized influence on the political, cultural, and architectural history of cities. Gentrification thus must be examined alongside suburbanization as one of the major historical trends shaping the 20th-century American metropolis.
B. Alex Beasley
American cities have been transnational in nature since the first urban spaces emerged during the colonial period. Yet the specific shape of the relationship between American cities and the rest of the world has changed dramatically in the intervening years. In the mid-20th century, the increasing integration of the global economy within the American economy began to reshape US cities. In the Northeast and Midwest, the once robust manufacturing centers and factories that had sustained their residents—and their tax bases—left, first for the South and West, and then for cities and towns outside the United States, as capital grew more mobile and businesses sought lower wages and tax incentives elsewhere. That same global capital, combined with federal subsidies, created boomtowns in the once-rural South and West. Nationwide, city boosters began to pursue alternatives to heavy industry, once understood to be the undisputed guarantor of a healthy urban economy. Increasingly, US cities organized themselves around the service economy, both in high-end, white-collar sectors like finance, consulting, and education, and in low-end pink-collar and no-collar sectors like food service, hospitality, and health care. A new legal infrastructure related to immigration made US cities more racially, ethnically, and linguistically diverse than ever before.
At the same time, some US cities were agents of economic globalization themselves. Dubbed “global cities” by celebrants and critics of the new economy alike, these cities achieved power and prestige in the late 20th century not only because they had survived the ruptures of globalization but because they helped to determine its shape. By the end of the 20th century, cities that were not routinely listed among the “global city” elite jockeyed to claim “world-class” status, investing in high-end art, entertainment, technology, education, and health care amenities to attract and retain the high-income white-collar workers understood to be the last hope for cities hollowed out by deindustrialization and global competition. Today, the extreme differences between “global cities” and the rest of US cities, and the extreme socioeconomic stratification seen in cities of all stripes, are a key concern of urbanists.
Erik Gellman and Margaret Rung
From the late 1920s through the 1930s, countries on every inhabited continent suffered through a dramatic and wrenching economic contraction termed the Great Depression, an economic collapse that has come to represent the nadir of modern economic history. With national unemployment reaching well into double digits for over a decade, productivity levels falling by half, prices severely depressed, and millions of Americans without adequate food, shelter, or clothing, the United States experienced some of the Great Depression’s severest consequences. The crisis left deep physical, psychological, political, social, and cultural impressions on the national landscape. It encouraged political reform and reaction, renewed labor activism, spurred migration, unleashed grass-roots movements, inspired cultural experimentation, and challenged family structures and gender roles.
Christopher R. Reed
The unanticipated and massive migration of half a million African Americans between 1916 and 1918 from the racially oppressive South to the welcoming North surprised the nation. Directly resulting from the advent of the First World War, the movement of these able-bodied workers provided essential labor to maintain wartime production that sustained the Allied war effort. One-tenth of the people who surged north headed to and remained in Chicago, where their presence challenged the status quo in the areas of employment, external race relations, internal race arrangements, politics, housing, and recreation. Once in the Windy City, this migrant-influenced labor pool expanded with the addition of resident blacks to form the city’s first African American industrial proletariat. Wages for both men and women increased compared to what they had been earning in the South, and local businesses were ready and willing to accommodate these new consumers. A small black business sector became viable and was able to support two banks, and by the mid-1920s, there were multiple stores along Chicago’s State Street forming a virtual “Black Wall Street.” An extant political submachine within Republican Party ranks also increased its power and influence in repeated electoral contests. Importantly, upon scrutiny, the purported social conflict between the Old Settler element and the newcomers was shown to be overblown and inconsequential to black progress.
Revisionist scholarship over the past two decades has served to minimize the first phase of northward movement and has positioned it within the context of a half-century phenomenon under the labels of the “Second Great Migration” and the “Great Black Migration.” No matter what the designation, the voluntary movement of five to six million blacks from what had been their traditional home to the uncertainty of the North and West between the First World War and the Vietnam conflict stands as both a condemnation of regional oppression of the human spirit and aspirations of millions, and a demonstration of group courage in taking on new challenges in new settings. Although Chicago would prove to be “no crystal stair,” it was on many occasions a land of hope and promise for migrants throughout the past century.
During the 20th century, the black population of the United States transitioned from largely rural to mostly urban. In the early 1900s the majority of African Americans lived in rural, agricultural areas. Depictions of black people in popular culture often focused on pastoral settings, like the cotton fields of the rural South. But a dramatic shift occurred during the Great Migrations (1914–1930 and 1941–1970) when millions of rural black southerners relocated to US cities.
Motivated by economic opportunities in urban industrial areas during World Wars I and II, African Americans opted to move to southern cities as well as to urban centers in the Northeast, Midwest, and West Coast. New communities emerged that contained black social and cultural institutions, and musical and literary expressions flourished. Black migrants who left the South exercised voting rights, sending the first black representatives to Congress in the 20th century. Migrants often referred to themselves as “New Negroes,” pointing to their social, political, and cultural achievements, as well as their use of armed self-defense during violent racial confrontations, as evidence of their new stance on race.
Philippe R. Girard
Haiti (known as Saint-Domingue until it gained its independence from France in 1804) had a noted economic and political impact on the United States during the era of the American Revolution, when it forced U.S. statesmen to confront issues they had generally avoided, most prominently racism and slavery. But the impact of the Haitian Revolution was most tangible in areas like commerce, territorial expansion, and diplomacy. Saint-Domingue served as a staging ground for the French military and navy during the American Revolution and provided troops to the siege of Savannah in 1779. It became the United States’ second-largest commercial partner during the 1780s and 1790s. After Saint-Domingue’s slaves revolted in 1791, many of its inhabitants found refuge in the United States, most notably in Philadelphia, Charleston, and New Orleans. Fears (or hopes) that the slave revolt would spread to the United States were prevalent in public opinion. As Saint-Domingue achieved quasi-autonomous status under the leadership of Toussaint Louverture, it occupied a central place in the diplomacy of John Adams and Thomas Jefferson. The Louisiana Purchase was made possible in part by the failure of a French expedition to Saint-Domingue in 1802–1803. Bilateral trade declined after Saint-Domingue acquired its independence from France in 1804 and became Haiti, but the new nation continued to loom large in the African-American imagination, and there were several attempts to use Haiti as a haven for U.S. freedmen. The U.S. diplomatic recognition of Haiti also served as a reference point for antebellum debates on slavery, the slave trade, and the status of free people of color in the United States.
Sworn in as the 33rd President of the United States following Franklin D. Roosevelt’s death in April 1945, Harry S. Truman faced the daunting tasks of winning the war and ensuring future peace and stability. Chided by critics for his lack of foreign policy experience but championed by supporters for his straightforward decision-making, Truman guided the United States from World War to Cold War. The Truman presidency marked a new era in American foreign relations, with the United States emerging from World War II unmatched in economic strength and military power. The country assumed a leadership position in a postwar world primarily shaped by growing antagonism with the Soviet Union. Truman pursued an interventionist foreign policy that took measures to contain Soviet influence in Europe and stem the spread of communism in Asia. Under his leadership, the United States witnessed the dawn of the atomic age, approved billions of dollars in economic aid to rebuild Europe, supported the creation of multilateral organizations such as the United Nations and North Atlantic Treaty Organization, recognized the state of Israel, and intervened in the Korean peninsula. The challenges Truman confronted and the policies he implemented laid the foundation for 20th-century US foreign relations throughout the Cold War and beyond.
The Haymarket Riot and Conspiracy of 1886 is a landmark in American social and political history. On May 4, 1886, during an open-air meeting near Haymarket Square in Chicago, someone threw a dynamite bomb into a squad of police, sparking a riot that resulted in the deaths of seven police officers and at least four rioters. Eight anarchists were brought to trial. Though the bomb-thrower was never apprehended, the eight radical leaders were charged as accessories before the fact for conspiring to murder the police. After the longest criminal trial in Illinois history up to that time, all eight men were convicted; seven were condemned to death and one was sentenced to a long prison term. After all appeals were exhausted, four were executed, one cheated the hangman with a jail cell suicide, and the death sentences of two others were commuted to life imprisonment (all three incarcerated men were later pardoned by Governor John Peter Altgeld in 1893).
The Haymarket bombing and trial marked a pivotal moment in the history of American social movements. It sparked the nation’s first red scare, whose fury disrupted even moderately leftist movements for a generation. It drove the nation’s labor unions onto a more conservative path than the one they had been following before the bombing. The worldwide labor campaign for clemency for the convicted men became the foundation for the institution of International Workers’ Day on May 1, a holiday ironically observed in most countries except the United States. It also began a tradition within the American left of memorializing the Haymarket defendants as the first martyrs to their cause.
Thomas Alan Schwartz
Henry Kissinger was the most famous and most controversial American diplomat of the second half of the 20th century. Escaping Nazi persecution in the 1930s, serving in the American Army of occupation in Germany after 1945, and then pursuing a successful academic career at Harvard University, Kissinger had already achieved national prominence as a foreign policy analyst and defense intellectual when he was appointed national security adviser by President Richard Nixon in January 1969. Kissinger quickly became the president’s closest adviser on foreign affairs and worked with Nixon to change American foreign policy in response to domestic upheaval caused by the Vietnam War in the late 1960s and early 1970s. Nixon and Kissinger’s initiatives, primarily détente with the Soviet Union, the opening to the People’s Republic of China, and ending American involvement in the Vietnam War, received strong domestic support and helped to bring about Nixon’s re-election landslide in 1972. In the wake of the Watergate scandal, Nixon appointed Kissinger secretary of state in August 1973. As Nixon’s capacity to govern deteriorated, Kissinger assumed all-but-presidential powers, even putting American forces on alert during the Yom Kippur War and then engaging in “shuttle diplomacy” in the Middle East, achieving the first-ever agreements between Israel and Egypt and Israel and Syria. Kissinger retained a dominating influence over foreign affairs during the presidency of Gerald Ford, even as he became a lightning rod for critics on both the left and right of the political spectrum. Although out of public office after 1977, Kissinger remained in the public eye as a foreign policy commentator, wrote three volumes of memoirs as well as other substantial books on diplomacy, and created a successful international business-consulting firm. His only subsequent governmental positions were as chair of the Commission on Central America in 1983–1984 and a brief tenure on the 9/11 Commission in 2002.
The United States is a nation built on credit, both public and private. This article focuses on private credit: that is, credit extended to businesses and consumers by private entities such as banks, other businesses, and retail stores. Business credit involves short-term lending for items such as inventories, payroll, and the like; and long-term lending for the building of factories, offices, and other physical plant. Trade credit, bank loans, bonds, and commercial paper are all forms of business credit. Consumer credit is extended to individuals or households to fund purchases ranging from basic necessities to homes. Informal store credits, installment sales, personal loans from banks and other institutions, credit cards, home mortgages, and student loans are forms of consumer credit.
Until the 20th century, the federal government remained mostly uninvolved in the private credit markets. Then, after World War I and especially during the Great Depression, the government deliberately expanded the credit available for certain targeted groups, such as farmers and home buyers. After World War II the government helped to expand lending even further, this time to small businesses and students. Mostly the government accomplished its goal not through lending directly but by insuring the loans made by private entities, thereby encouraging them to make more loans. In the case of home mortgages and student loans, the government took the lead in creating a national market for securitized debt—debt that is turned into securities, such as bonds, and offered to investors—through the establishment of government-sponsored enterprises, nicknamed Fannie Mae (1938), Ginnie Mae (1968), Freddie Mac (1970), and Sallie Mae (1972). Innovations such as these by businesses and government made credit increasingly available to ordinary people, whose attitudes toward borrowing changed accordingly.
Timothy S. Huebner
The Supreme Court of the United States stands at the head of the nation’s judicial system. Created in Article III of the Constitution of 1787 but obscured by the other branches of government during the first few decades of its history, the Court came into its own as a co-equal branch in the early 19th century. Its exercise of judicial review—the power that it claimed to determine the constitutionality of legislative acts—gave the Court a unique status as the final arbiter of the nation’s constitutional conflicts. From the slavery question during the antebellum era to abortion and gay rights in more recent times, the Court has decided cases brought to it by individual litigants, and in doing so has shaped American constitutional and legal development. Composed of unelected justices who serve “during good behavior,” the Court’s rise in stature has not gone uncontested. Throughout the nation’s history, Congress, the president, and organized interest groups have all attempted to influence the Court’s jurisdiction, composition, and decision making. The Court’s prominence reflects Americans’ historically paradoxical attitudes toward the judiciary: they have often been suspicious of the power of unelected judges at the same time that they have relied on independent judicial institutions to resolve their deepest disputes.
Kristin M. Szylvian
Federal housing policy has been primarily devoted to maintaining the economic stability and profitability of the private sector real estate, household finance, and home-building and supply industries since the administration of President Franklin D. Roosevelt (1933–1945). Until the 1970s, federal policy encouraged speculative residential development in suburban areas and extended segregation by race and class. The National Association of Home Builders, the National Association of Realtors, and other allied organizations strenuously opposed federal programs seeking to assist low- and middle-income households and the homeless by forcing recalcitrant suburbs to permit the construction of open-access, affordable dwellings and encouraging the rehabilitation of urban housing. During the 1980s, President Ronald Reagan, a Republican from California, argued it was the government, not the private sector, that was responsible for the gross inequities in social and economic indicators between residents of city, inner ring, and outlying suburban communities. The civic, religious, consumer, labor, and other community-based organizations that tried to mitigate the adverse effects of the “Reagan Revolution” on the affordable housing market lacked a single coherent view or voice. Since that time, housing has become increasingly unaffordable in many metropolitan areas, and segregation by race, income, and ethnicity is on the rise once again. If the home mortgage crisis that began in 2007 is any indication, housing will continue to be a divisive political, economic, and social issue in the foreseeable future.
The national housing goal of a “decent home in a suitable living environment for every American family” not only has yet to be realized, but many law makers now favor eliminating or further restricting federal commitment to its realization.
Sarah B. Snyder
In its formulation of foreign policy, the United States takes account of many priorities and factors, including national security concerns, economic interests, and alliance relationships. An additional factor with significance that has risen and fallen over time is human rights, or more specifically violations of human rights. The extent to which the United States should consider such abuses or seek to moderate them has been and continues to be the subject of considerable debate.
Sean P. Harvey
“Race,” as a concept denoting a fundamental division of humanity and usually encompassing cultural as well as physical traits, was crucial in early America. It provided the foundation for the colonization of Native land, the enslavement of American Indians and Africans, and a common identity among socially unequal and ethnically diverse Europeans. Longstanding ideas and prejudices merged with aims to control land and labor, a dynamic reinforced by ongoing observation and theorization of non-European peoples. Although before colonization, neither American Indians, nor Africans, nor Europeans considered themselves unified “races,” Europeans endowed racial distinctions with legal force and philosophical and scientific legitimacy, while Natives appropriated categories of “red” and “Indian,” and slaves and freed people embraced those of “African” and “colored,” to imagine more expansive identities and mobilize more successful resistance to Euro-American societies. The origin, scope, and significance of “racial” difference were questions of considerable transatlantic debate in the age of Enlightenment and they acquired particular political importance in the newly independent United States.
Since the beginning of European exploration in the 15th century, voyagers called attention to the peoples they encountered, but European, American Indian, and African “races” did not exist before colonization of the so-called New World. Categories of “Christian” and “heathen” were initially most prominent, though observations also encompassed appearance, gender roles, strength, material culture, subsistence, and language. As economic interests deepened and colonies grew more powerful, classifications distinguished Europeans from “Negroes” or “Indians,” but at no point in the history of early America was there a consensus that “race” denoted bodily traits only. Rather, it was a heterogeneous compound of physical, intellectual, and moral characteristics passed on from one generation to another. While Europeans assigned blackness and African descent priority in codifying slavery, skin color was secondary to broad dismissals of the value of “savage” societies, beliefs, and behaviors in providing a legal foundation for dispossession.
“Race” originally denoted a lineage, such as a noble family or a domesticated breed, and concerns over purity of blood persisted as 18th-century Europeans applied the term—which dodged the controversial issue of whether different human groups constituted “varieties” or “species”—to describe a roughly continental distribution of peoples. Drawing upon the frameworks of scripture, natural and moral philosophy, and natural history, scholars endlessly debated whether different races shared a common ancestry, whether traits were fixed or susceptible to environmentally produced change, and whether languages or the body provided the best means to trace descent. Racial theorization boomed in the U.S. early republic, as some citizens found dispossession and slavery incompatible with natural-rights ideals, while others reconciled any potential contradictions through assurances that “race” was rooted in nature.
Benjamin C. Montoya
A fear of foreignness shaped the immigration foreign policies of the United States up to the end of World War II. US leaders perceived nonwhite peoples of Latin America, Asia, and Europe as racially inferior, and feared that contact with them, even annexation of their territories, would invite their foreign mores, customs, and ideologies into US society. This belief in nonwhite peoples’ foreignness also influenced US immigration policy, as Washington codified laws that prohibited the immigration of nonwhite peoples to the United States, even as immigration was deemed a net gain for a US economy that was rapidly industrializing from the late 19th century to the first half of the 20th century.
Ironically, this fear of foreignness fostered an aggressive US foreign policy for many of the years under study, as US leaders feared that European intervention into Latin America, for example, would undermine the United States’ regional hegemony. The fear of foreignness that seemed to oblige the United States to shore up its national security interests vis-à-vis European empires also demanded US intervention into the internal affairs of nonwhite nations. For US leaders, fear of foreignness was a two-sided coin: European aggression was encouraged by the internal instability of nonwhite nations, and nonwhite nations were unstable—and hence ripe pickings for Europe’s empires—because their citizens were racially inferior. To forestall both of these simultaneous foreign threats, the United States increasingly embedded itself into the political and economic affairs of foreign nations.
The irony of opportunity, of territorial acquisitions as well as immigrants who fed US labor markets, and fear, of European encroachment and the racial inferiority of nonwhite peoples, lay at the root of the immigration and foreign policies of the United States up to 1945.
Between 1820 and 1924, nearly thirty-six million immigrants entered the United States. Prior to the Civil War, the vast majority of immigrants were northern and western Europeans, though the West Coast received Chinese immigration from the late 1840s onward. In mid-century, the United States received an unprecedented influx of Irish and German immigrants, who included a large number of Catholics and the poor. At the turn of the 20th century, the major senders of immigrants shifted to southern and eastern Europe, and Asians and Mexicans made up a growing portion of newcomers. Throughout the long 19th century, urban settlement remained a popular option for immigrants, and they contributed to the social, cultural, political, economic, and physical growth of the cities they resided in. Foreign-born workers also provided much-needed labor for America’s industrial development. At the same time, intense nativism emerged in cities in opposition to the presence of foreigners, who appeared to be unfit for American society, threats to Americans’ jobs, or sources of urban problems such as poverty. Anti-immigrant sentiment resulted in the introduction of state and federal laws for preventing the immigration of undesirable foreigners, such as the poor, southern and eastern Europeans, and Asians. Cities constituted an integral part of the 19th-century American immigration experience.
The Immigration Act of 1924 was in large part the result of a deep political and cultural divide in America between heavily immigrant cities and far less diverse small towns and rural areas. The 1924 legislation, together with growing residential segregation, midcentury federal urban policy, and postwar suburbanization, undermined scores of ethnic enclaves in American cities between 1925 and the 1960s. The deportation of Mexicans and their American children during the Great Depression, the incarceration of West Coast Japanese Americans during World War II, and the wartime and postwar shift of so many jobs to suburban and Sunbelt areas also reshaped many US cities in these years. The Immigration Act of 1965, which enabled the immigration of large numbers of people from Asia, Latin America, and, eventually, Africa, helped to revitalize many depressed urban areas and inner-ring suburbs. In cities and suburbs across the country, the response to the new immigration since 1965 has ranged from welcoming to hostile. The national debate over immigration in the early 21st century reflects both familiar and newer cultural, linguistic, religious, racial, and regional rifts. However, urban areas with a history of immigrant incorporation remain the most politically supportive of such people, just as they were a century ago.
Post-1945 immigration to the United States differed markedly from America’s earlier 19th- and early 20th-century immigration patterns, most notably in the dramatic rise in the number of immigrants from Asia. Beginning in the late 19th century, the U.S. government took steps to bar immigration from Asia. The establishment of the national origins quota system in the 1924 Immigration Act narrowed the entryway for eastern and central Europeans, making western Europe the dominant source of immigrants. These policies shaped the racial and ethnic profile of the American population before 1945. Signs of change began to occur during and after World War II. The recruitment of temporary agricultural workers from Mexico led to an influx of Mexicans, and the repeal of Asian exclusion laws opened the door for Asian immigrants. Responding to complex international politics during the Cold War, the United States also formulated a series of refugee policies, admitting refugees from Europe, the western hemisphere, and later Southeast Asia. The movement of people to the United States increased drastically after 1965, when immigration reform ended the national origins quota system. The intricate and intriguing history of U.S. immigration after 1945 thus demonstrates how the United States related to a fast-changing world: its less restrictive immigration policies increased the fluidity of the American population, with a substantial impact on American identity and domestic policy.
John P. Bowes
Indian removals as a topic primarily encompasses the relocation of Native American tribes from American-claimed states and territories east of the Mississippi River to lands west of the Mississippi River in the first half of the 19th century. The bill passed by Congress in May 1830 referred to as the Indian Removal Act is the legislative expression of the ideology upon which federal and state governments acted to accomplish the dispossession and relocation of tens of thousands of Native American peoples during that time. Through both treaty negotiations and coercion, federal officials used the authority of removal policies to obtain land cessions and resettle eastern Indians in what is known in the early 21st century as Kansas and Oklahoma. These actions, in conjunction with non-Indian population growth and western migration, made it extremely difficult, if not impossible, for any tribes to remain on their eastern lands. The Cherokee Trail of Tears, which entailed the forced removal of approximately fourteen thousand men, women, and children from Georgia starting in the summer of 1838 until the spring of 1839, remains the most well-known illustration of this policy and its impact. Yet the comprehensive histories of removals encompass the forced relocations of tens of thousands of indigenous men, women, and children from throughout the Southeast as well as the Old Northwest from the 1810s into the 1850s.