Rioting in the United States since 1800 has adhered to three basic traditions: regulating communal morality, defending community from outside threats, and protesting government abuse of power. Typically, crowds have had the shared interests of class, group affiliation, geography, or a common enemy. Since American popular disorder has frequently served as communal policing, the state—especially municipal police—has had an important role in facilitating, constraining, or motivating unrest.
Rioting in the United States retained strong legitimacy and popular resonance from 1800 to the 1960s. In the decades after the founding, Americans adapted English traditions of restrained mobbing to more diverse, urban conditions. During the 19th century, however, rioting became more violent and ambitious as Americans—especially white men—asserted their right to use violence to police heterogeneous public space. In the 1840s and 1850s, whites combined the lynch mob with the disorderly crowd to create a lethal and effective instrument of white settler sovereignty both in the western territories and in the states. From the 1860s to the 1930s, white communities across the country, particularly in the South, used racial killings and pogroms to seize political power and establish and enforce Jim Crow segregation. Between the 1910s and the 1970s, African Americans and Latinos, increasingly living in cities, rioted to defend their communities against civilian and police violence. The frequency of rioting declined after the urban rebellions of the 1960s, partly due to the militarization of local police. Yet the continued use of aggressive police tactics against racial minorities has contributed to a surge in rioting in US cities in the early 21st century.
The transformation of post-industrial American life in the late 20th and early 21st centuries produced several economically robust metropolitan centers that stand as new models of urban and economic life, with well-educated populations engaged in education, medical care, design and legal services, and artistic and cultural production. By the early 21st century, these cities dominated the nation’s economic and cultural consciousness, standing in for the most dynamic and progressive sectors of the economy and driven by concentrations of technical and creative talent. The origins of these academic and knowledge centers lie in the political economy, including investments shaped by federal policy and philanthropic ambition. Communities anchored by education and health care were and remain frequently prosperous, but they are also rife with racial, economic, and social inequality and riddled with the resulting political tensions over development. These knowledge communities incubated and directed the proceeds of the new economy, but they also constrained who had access to its wealth.
From 1800 to 2000, cities grew enormously and expanded their public spaces to serve the varied needs of a diverse population living in ever more crowded urban circumstances. While a wide range of commercial semipublic spaces became common in the late 19th century, parks and streets were the best examples of truly public spaces with full freedom of access. Changes in the design and management of streets, sidewalks, squares, parks, and plazas during this period reflect changing ideas about the purpose of public space and how it should be used.
Streets shifted from hosting a wide range of activities, including vending, playing games, and storing goods, to becoming increasingly specialized spaces of movement, designed and managed by the early 20th century for automobile traffic. Sidewalks, which in the early 19th century were paid for and liberally used by adjacent businesses, were similarly specialized as spaces of pedestrian movement. However, the tradition of using streets and sidewalks as spaces of public celebration and public speech remained strong throughout the period. During parades and protests, streets and sidewalks were temporarily remade as spaces for the performance of the public, and the daily activities of circulation and commerce were set aside.
In 1800, the main open public spaces in cities were public squares or commons, often used for militia training and public celebration. In the second half of the 19th century, these were augmented by large picturesque parks. Designed as an antidote to urbanity, these parks served the public as places for leisure, redefining public space as a polite leisure amenity rather than a place for people to congregate as a public. The addition of playgrounds, recreational spaces, and public plazas in the 20th century served both the physical and mental health of the public. In the late 20th century, responding to neoliberal ideas and urban fiscal crises, the ownership and management of public parks and plazas were increasingly privatized, further eroding public accessibility.
Between 1820 and 1924, nearly thirty-six million immigrants entered the United States. Prior to the Civil War, the vast majority were northern and western Europeans, though the West Coast received Chinese immigrants from the late 1840s onward. At mid-century, the United States received an unprecedented influx of Irish and German immigrants, many of them Catholic and poor. At the turn of the 20th century, the major sources of immigration shifted to southern and eastern Europe, and Asians and Mexicans made up a growing portion of newcomers. Throughout the long 19th century, urban settlement remained a popular option for immigrants, who contributed to the social, cultural, political, economic, and physical growth of the cities in which they resided. Foreign-born workers also provided much-needed labor for America’s industrial development. At the same time, intense nativism emerged in cities in opposition to the presence of foreigners, who appeared unfit for American society, threats to Americans’ jobs, or sources of urban problems such as poverty. Anti-immigrant sentiment led to state and federal laws restricting the immigration of foreigners deemed undesirable, such as the poor, southern and eastern Europeans, and Asians. Cities thus constituted an integral part of the 19th-century American immigration experience.
Since the administration of President Franklin D. Roosevelt (1933–1945), federal housing policy has been devoted primarily to maintaining the economic stability and profitability of the private-sector real estate, household finance, and home-building and supply industries. Until the 1970s, federal policy encouraged speculative residential development in suburban areas and extended segregation by race and class. The National Association of Home Builders, the National Association of Realtors, and other allied organizations strenuously opposed federal programs that sought to assist low- and middle-income households and the homeless by forcing recalcitrant suburbs to permit the construction of open-access, affordable dwellings and by encouraging the rehabilitation of urban housing. During the 1980s, President Ronald Reagan, a Republican from California, argued that it was the government, not the private sector, that was responsible for the gross inequities in social and economic indicators between residents of city, inner-ring, and outlying suburban communities. The civic, religious, consumer, labor, and other community-based organizations that tried to mitigate the adverse effects of the “Reagan Revolution” on the affordable housing market lacked a single coherent view or voice. Since that time, housing has become increasingly unaffordable in many metropolitan areas, and segregation by race, income, and ethnicity is on the rise once again. If the home mortgage crisis that began in 2007 is any indication, housing will remain a divisive political, economic, and social issue for the foreseeable future.
The national housing goal of a “decent home in a suitable living environment for every American family” has yet to be realized; indeed, many lawmakers now favor eliminating or further restricting the federal commitment to it.
Contagious diseases have long posed a public health challenge for cities, going back to the ancient world. Diseases traveled over trade routes from one city to another. Cities were also crowded and often dirty, ideal conditions for the transmission of infectious disease. The Europeans who settled North America quickly established cities, especially seaports, and contagious diseases soon followed. By the late 17th century, ports like Boston, New York, and Philadelphia experienced occasional epidemics, especially of smallpox and yellow fever, usually introduced by incoming ships. Public health officials tried to prevent contagious diseases from entering the ports, most often by establishing quarantines. These quarantines were occasionally effective, but more often the disease escaped into the cities. By the 18th century, city officials recognized an association between dirty cities and epidemic disease, and the appearance of a contagious disease usually occasioned a concerted effort to clean streets and remove garbage. By the early 19th century, these efforts gave rise to sanitary reform, which went beyond cleaning streets and removing garbage to ensuring clean water supplies and effective sewage removal. By the end of the century, sanitary reform had done much to clean the cities and reduce the incidence of contagious disease. In the 20th century, public health gained two new tools: vaccination and antibiotics. Vaccination, first used against smallpox, was developed against numerous other viral diseases and reduced their incidence substantially. Finally, the development of antibiotics in the mid-20th century enabled physicians to cure individuals with bacterial infections. Contagious disease remains a problem, as AIDS demonstrates, and public health authorities still rely on quarantine, sanitary reform, vaccination, and antibiotics to keep urban populations healthy.
The task of recovering the history of same-sex love among early American women faces daunting challenges of definition and sources. Modern conceptions of same-sex sexuality did not exist in early America, but alternative frameworks did. Many indigenous nations had social roles for female-bodied individuals who lived as men, performed male work, and acquired wives. Early Christian settlers viewed sexual encounters between women as sodomy but also valued loving dyadic bonds between religious women. Primary sources indicate that same-sex sexual practices existed within the western and southern African societies exploited by the slave trade, but little more is known. The word “lesbian” has been used to signify erotics between women since roughly the 10th century, but historians must look to women who led lesbian-like lives in early America rather than to women who self-identified as lesbians. Stories of female husbands who passed as men and married other women were popular in the 18th century. Tales of passing women who served in the army or navy or sailed as pirates also amused audiences and raised the specter of same-sex sexuality. Some female religious leaders transgressed conventional gender roles and challenged the marital sexual order. Other women conformed to female gender roles but constructed loving female households. Pornography of the 18th century depicting lesbian sexual encounters indicates that early Americans were familiar with the concept of sex between women. A few court records survive from prosecutions of early American women for engaging in lewd acts together. Far more common, by the end of the 18th century, were female-authored letters and diaries describing the culture of romantic friendship, which sometimes extended to sexual intimacy. Later, in the 19th century, romantic friendship became an important ingredient in the development of lesbian culture and identity.
The United States–Mexico War was the first conflict in which the United States fought a foreign nation for the purpose of conquest, and the first in which professionally trained soldiers (from West Point) played a large role. The war’s end transformed the United States into a continental nation as it acquired a vast portion of Mexico’s northern territories. In addition to shaping U.S.–Mexico relations into the present, the conflict led to the forcible incorporation of Mexicans (who became Mexican Americans) as the nation’s first Latinos. Yet the war has been called the nation’s “forgotten war” because few Americans know its causes and consequences. Within fifteen years of the war’s end, the conflict faded from popular memory, overshadowed by the outbreak of the U.S. Civil War, though it did not disappear entirely. By contrast, the U.S.–Mexico War is prominently remembered in Mexico as the cause of the loss of half of the nation’s territory and as an event that continues to shape Mexico’s relationship with the United States. Official memories (or national histories) of war affect international relations and shape how each nation’s population views citizens of other countries. Not surprisingly, there is a stark difference in the ways that American citizens and Mexican citizens remember and forget the war: Americans refer to the “Mexican American War” or the “U.S.–Mexican War,” for example, while Mexicans identify the conflict as the “War of North American Intervention.”
From the 15th century to the present, the trade in animal fur has been an economic venture with far-reaching consequences for both North Americans and Europeans (including North Americans of European descent). One of the earliest forms of exchange between Europeans and North Americans, the fur trade was about the garment business, global and local politics, social and cultural interaction, hunting, ecology, colonialism, gendered labor, kinship networks, and religion. European fashion, specifically the desire for hats that marked male status, was the primary driver of the global fur-trade economy until the late 19th century, while European demand for marten, fox, and other luxury furs to make and trim clothing formed a secondary part of the trade. Other animal hides, including deer and bison, provided sturdy leather from which belts for the machines of the early industrial era were cut. European cloth, especially cotton and wool, became central to the trade for Indigenous peoples, who sought materials that were lighter and dried faster than skin clothing. The many participants in the fur trade included the European men and Indigenous men and women who actually conducted the trade; Indigenous and European trappers; the European men and women who produced trade goods; Indigenous “middlemen” (men and women) who conducted their own fur trade to benefit from European trade companies; the laborers who hauled furs and trade goods; all those who built, managed, and sustained trading posts along waterways and trails across North America; and the Europeans who manufactured and purchased the products made of fur and the trade goods desired by Indigenous peoples. As early as the 17th century, European empires used fur-trade monopolies to establish colonies in North America, and fur-trading companies later carried imperial trading systems inland, even as Indigenous peoples drew Europeans into their own patterns of trade and power. By the 19th century, the fur trade covered most of the continent, and its networks of business, alliance, and family, together with the founding of new communities, gave rise to new peoples, including the Métis, descended from both European and Indigenous ancestors. Trading territories, monopolies, and alliances with Indigenous peoples shaped how European concepts of statehood played out in the making of European-descended nation-states and in the development of treaties with Indigenous peoples. The fur trade flourished in northern climes well into the 20th century, after which economic development, resource exploitation, changes in fashion, and politics in North America and Europe limited its scope and scale. Many Indigenous people continue to hunt and trap animals today and have fought in the courts for Indigenous rights to resources, land, and sovereignty.
By serving travelers and commerce, roads and streets unite people and foster economic growth. But as they develop, roads and streets also disrupt old patterns, upset balances of power, and isolate some as they serve others. The consequent disagreements leave historical records documenting social struggles that might otherwise be overlooked. For long-distance travel in America before the middle of the 20th century, roads were generally poor alternatives, resorted to when superior means of travel, such as river and coastal vessels, canal boats, or railroads, were unavailable. Most roads were unpaved, unmarked, and vulnerable to the effects of weather. Before the railroads, turnpikes and plank roads, though rare, could be much better for travelers willing to pay the toll. Even in towns, unpaved streets were common until the late 19th century and persisted into the 20th. In the late 19th century, rapid urban growth, rural free delivery of the mails, and finally the proliferation of electric railways and bicycling contributed to growing pressure for better roads and streets. After 1910, the spread of the automobile accelerated the trend, but only with great controversy, especially in cities. Partly in response to the controversy, advocates of the automobile organized to promote state and county motor highways funded substantially by gasoline taxes; such roads were intended primarily for motor vehicles. In the 1950s, massive federal funding accelerated the trend; by then, motor vehicles were the primary transportation mode for both long and short distances. The consequences have been controversial, and alternatives have attracted growing interest.