Thomas J. Sugrue
Racism in the United States has long been a national problem, not a regional phenomenon. The long and well-documented history of slavery, Jim Crow laws, and racial violence in the South overshadows the persistent reality of racial discrimination, systemic segregation, and entrenched inequality north of the Mason-Dixon line. From the mid-19th century forward, African Americans and their allies mounted a series of challenges to racially separate schools, segregated public accommodations, racially divided workplaces, endemic housing segregation, and discriminatory policing. The northern civil rights movement expanded dramatically in the aftermath of the Great Migration of blacks northward and the intensification of segregation in northern hotels, restaurants, theaters, workplaces, housing markets, and schools in the early 20th century. During the Great Depression and World War II, emboldened civil rights organizations engaged in protest, litigation, and lobbying efforts to undermine persistent racial discrimination and segregation. Their efforts resulted in legal and legislative victories against racially separate and unequal institutions, particularly workplaces and stores. But segregated housing and schools remained more impervious to change. By the 1960s, many black activists in the North grew frustrated with the pace of change, even as they succeeded in increasing black representation in elected office, in higher education, and in certain sectors of the economy. In the late 20th century, civil rights activists launched efforts to fight the ongoing problem of police brutality and the rise of the prison-industrial complex. And they pushed, mostly through the courts, for the protection of the fragile gains of the civil rights era. The black freedom struggle in the North remained incomplete in the face of ongoing segregation, persistent racism, and enduring racial inequality in employment, education, income, and wealth.
Claudrena N. Harold
The civil rights movement in the urban South transformed the political, economic, and cultural landscape of post–World War II America. Between 1955 and 1968, African Americans and their white allies relied on nonviolent direct action, political lobbying, litigation, and economic boycotts to dismantle the Jim Crow system. Many, though not all, of the movement’s most decisive political battles occurred in the cities of Montgomery and Birmingham, Alabama; Nashville and Memphis, Tennessee; Greensboro and Durham, North Carolina; and Atlanta, Georgia. In these and other urban centers, civil rights activists launched full-throttle campaigns against white supremacy, economic exploitation, and state-sanctioned violence against African Americans. Their fight for racial justice coincided with monumental changes in the urban South, as the upsurge in federal spending in the region created unprecedented levels of economic prosperity in the newly forged “Sunbelt.”
A dynamic and multifaceted movement that encompassed a wide range of political organizations and perspectives, the black freedom struggle proved successful in dismantling legal segregation. The passage of the Civil Rights Act of 1964 and the Voting Rights Act of 1965 expanded black southerners’ economic, political, and educational opportunities. And yet, many African Americans continued to struggle as they confronted not just the long-term effects of racial discrimination and exclusion but also the new challenges engendered by deindustrialization and urban renewal as well as entrenched patterns of racial segregation in the public-school system.
American cities expanded during the late 19th century as industrial growth was fueled by the arrival of millions of immigrants and migrants. Poverty rates escalated, overwhelming existing networks of private charities. Progressive reformers created relief organizations and raised public awareness of urban poverty. The devastating effects of the Great Depression inspired greater focus on poverty from state and federal agencies. The Social Security Act, the greatest legacy of the New Deal, provided a safety net for millions of Americans. During the postwar era of general prosperity, federal housing policies often reinforced and deepened racial and socioeconomic inequality and segregation. The 1960s War on Poverty created vital aid programs that expanded access to food, housing, and health care. These programs also prompted a rising tide of conservative backlash against perceived excesses. Fueled by such critical sentiments, the Reagan administration implemented dramatic cuts to assistance programs. Later, the Clinton administration further reformed welfare by tying aid to work requirements. Throughout the 20th century, the urban homeless struggled to survive in hostile environments. Skid row areas housed the homeless for decades, providing shelter, food, and social interaction within districts that were rarely visited by the middle and upper classes. The loss of such spaces to urban renewal and gentrification in many cities left many of the homeless unsheltered and dislocated.
D. Bradford Hunt
Public housing emerged during the New Deal as a progressive effort to end the scourge of dilapidated housing in American cities. Reformers argued that the private market had failed to provide decent, safe, and affordable housing, and they convinced Congress to provide deep subsidies to local housing authorities to build and manage modern, low-cost housing projects for the working poor. Well-intentioned but ultimately misguided policy decisions encouraged large-scale developments, concentrated poverty and youth, and starved public housing of needed resources. Further, the antipathy of private interests to public competition and the visceral resistance of white Americans to racial integration saddled public housing with many enemies and few friends. While residents often formed tight communities and fought for improvements, stigmatization and neglect undermined the success of many projects; a sizable fraction became disgraceful and tangible symbols of systemic racism toward the nation’s African American poor. Federal policy had few answers and retreated in the 1960s, eventually making a neoliberal turn to embrace public-private partnerships for delivering affordable housing. Housing vouchers and tax credits effectively displaced the federal public housing program. In the 1990s, the Clinton administration encouraged the demolition and rebuilding of troubled projects using vernacular “New Urbanist” designs to house “mixed-income” populations. Policy problems, political weakness, and an ideology of homeownership in the United States meant that a robust, public-centered program of housing for use rather than profit could not be sustained.
Risa L. Goluboff and Adam Sorensen
The crime of vagrancy has deep historical roots in American law and legal culture. Originating in 16th-century England, vagrancy laws came to the New World with the colonists and soon proliferated throughout the British colonies and, later, the United States. Vagrancy laws took myriad forms, generally making it a crime to be poor, idle, dissolute, immoral, drunk, lewd, or suspicious. Vagrancy laws often included prohibitions on loitering—wandering around without any apparent lawful purpose—though some jurisdictions criminalized loitering separately. Taken together, vaguely worded vagrancy, loitering, and suspicious persons laws targeted objectionable “out of place” people rather than any particular conduct. They served as a ubiquitous tool for maintaining hierarchy and order in American society. Their application changed alongside perceived threats to the social fabric, at different times and places targeting the unemployed, labor activists, radical orators, cultural and sexual nonconformists, racial and religious minorities, civil rights protesters, and the poor. By the mid-20th century, vagrancy laws served as the basis for hundreds of thousands of arrests every year. But over the course of just two decades, the crime of vagrancy, virtually unquestioned for four hundred years, unraveled. Profound social upheaval in the 1960s produced a concerted effort against the vagrancy regime, and in 1972, the US Supreme Court invalidated the laws. Local authorities have spent the years since looking for alternatives to the many functions vagrancy laws once served.
While American gambling has a historical association with the lawlessness of the frontier and with the wasteful leisure practices of Southern planters, it was in large cities that gambling first flourished as a form of mass leisure and as a commercial enterprise of significant scale. In the urban areas of the Mid-Atlantic, the Northeast, and the upper Midwest, the gambling economy was, for the better part of two centuries, deeply intertwined with municipal politics and governance; the practices of betting were a prominent feature of social life; and controversies over the presence of gambling, both legal and illegal, were at the center of public debate. In New York and Chicago in particular, but also in Cleveland, Pittsburgh, Detroit, Baltimore, and Philadelphia, gambling channeled money to municipal police forces and sustained machine politics. In the eyes of reformers, gambling corrupted governance and corroded social and economic interactions. Big-city gambling has changed over time, often in a manner reflecting important historical processes and transformations in economics, politics, and demographics. Yet irrespective of such change, from the onset of Northern urbanization in the 19th century through much of the 20th century, gambling held steady as a central feature of city life and politics. From the poolrooms where recently arrived Irish New Yorkers bet on horse racing after the Civil War, to the corner stores where black and Puerto Rican New Yorkers bet on the numbers game in the 1960s, the gambling activity that covered the urban landscape produced argument and controversy, particularly with respect to drawing the line between crime and leisure and over the question of where, and to what ends, the money of the gambling public should be directed.
Substantial numbers of Asian Americans and Asian immigrants moved into suburbs across the United States after World War II, bringing distinctive everyday lifeways, identities, worldviews, family types, and community norms that remade much of American suburbia. Although Asian Americans had been excluded from suburbs on racial grounds since the late 19th century, American Cold War objectives in Asia and the Pacific and domestic American civil rights struggles afforded Asian Americans, especially Chinese and Japanese Americans, increased access to suburban housing in the 1950s. Following passage of the Immigration Act of 1965 and the Fair Housing Act of 1968, new groups of Asian Americans, particularly Filipino, Vietnamese, Thai, Korean, and South Asian Indian Americans, joined Chinese and Japanese Americans in settling in earnest into all kinds of suburban neighborhoods. At the turn of the 21st century, a majority of Asian Americans resided in the suburbs, which also became the preferred gateway communities for new immigrants, who often bypassed urban cores and moved straight to the suburbs when they arrived.
Entrance into highly racialized postwar suburbs defined by white middle-class norms and segregated white privilege did not, however, mean that Asian Americans gained entry or assimilated into whiteness. While many certainly aspired to and reinforced long-standing white suburban ideals, others revamped, contested, and outright fractured dominant notions of the suburban good life. By the 1980s, Asian Americans of various ethnic and national backgrounds had transformed the sights, sounds, and smells of suburban landscapes throughout the country. They made claims on suburban space and asserted a “right to the suburb” through a range of social and cultural practices, often in physical places, especially shopping plazas, grocery stores, restaurants, religious centers, and schools. Yet as Asian Americans tried to become full-fledged participants in suburban culture and life, their presence, ethnic expressions, and ways of life sparked tensions with other, mostly white suburbanites, leading to heated debates over immigration, race, multiculturalism, and assimilation in American society.
The history of post–World War II Asian American suburban cultures highlights suburbia as a principal setting for Asian American experiences and the making of Asian American identity during the second half of the 20th century. More broadly, the Asian American experience reveals how control over the suburban ideal and the making of suburban space in the United States was and remains a contested, layered process. It also underscores the racial and ethnic diversification of metropolitan America and how pressing social, political, economic, and cultural issues in US society played out increasingly on the suburban stage. Moreover, Asian Americans built communities and social networks precisely at the moment when the authentic “American” community was supposedly in decline, providing a powerful counterpunch to those who blame nonwhite populations, particularly immigrants, for fracturing an otherwise unified American culture or sense of togetherness.
Robert R. Gioielli
By the late 19th century, American cities like Chicago and New York were marvels of the industrializing world. The shock urbanization of the previous quarter century, however, brought on a host of environmental problems. Skies were acrid with coal smoke, and streams ran fetid with raw sewage. Disease outbreaks were common; parks and green space were rare. From the 1890s until the end of the 20th century, particular groups of urban residents responded to these hazards with a series of activist movements to reform public and private policies and practices. Those environmental burdens were never felt equally, with the working class, the poor, immigrants, and minorities bearing an overwhelming share of the city’s toxic load. By the 1930s, many of the Progressive-era reform efforts were finally bearing fruit. Air pollution was regulated, access to clean water improved, and even America’s smallest cities built robust networks of urban parks. But despite this invigoration of the public sphere, after World War II, for many the solution to the challenges of a dense modern city was a private choice: suburbanization. Rather than continue to work to reform and reimagine the city, they chose to leave it, retreating to the verdant (and pollution-free) greenfields at the city’s edge. These moves, encouraged and subsidized by local and federal policies, provided healthier environments for the mostly white, middle-class suburbanites but created a new set of environmental problems for the poor, working-class, and minority residents they left behind. Drained of resources and capital, cities struggled to maintain aging infrastructure and regulate remaining industry, and then exacerbated these problems with destructive urban renewal and highway construction projects. The remaining urban residents responded with a dynamic series of activist movements that emerged out of the social and community activism of the 1960s and presaged the contemporary environmental justice movement.
Ansley T. Erickson
“Urban infrastructure” calls to mind railways, highways, and sewer systems. Yet the school buildings—red brick, limestone, or concrete, low-slung, turreted, or glass-fronted—that hold and seek to shape the city’s children are ubiquitous forms of infrastructure as well. Schools occupy one of the largest line items in a municipal budget, and as many as a fifth of a city’s residents spend the majority of their waking hours in school classrooms, hallways, and gymnasiums. In the 19th and 20th centuries, urban educational infrastructure grew, supported by a developing consensus for publicly funded and publicly governed schools (if rarely fully accessible to all members of the public). Even before states committed to other forms of social welfare, from pensions to public health, and infrastructure, from transit to fire protection, schooling was a government function.
This commitment to public education ultimately was national, but schools in cities had their own story. Schooling in the United States is chiefly a local affair: Constitutional responsibility for education lies with the states; power is then further decentralized as states entrust decisions about school function and funding to school districts. School districts can be as small as a single town or a part of a city. Such localism is one reason that it is possible to speak about schools in U.S. cities as having a particular history, determined as much by the specificities of urban life as by national questions of citizenship, economy, religion, and culture.
While city schools have been distinct, they have also been nationally influential. Urban scale both allowed for and demanded the most extensive educational system-building. Urban growth and diversity galvanized innovation, via exploration in teaching methods, curriculum, and understanding of children and communities. And it generated intense conflict. Throughout U.S. history, urban residents from myriad social, political, religious, and economic positions have struggled to define how schools would operate, for whom, and who would decide.
During the 19th and 20th centuries, U.S. residents struggled over the purposes, funding, and governance of schools in cities shaped by capitalism, nativism, and white supremacy. They built a commitment to schooling as a public function of their cities, with many compromises and exclusions. In the 21st century, old struggles re-emerged in new form, perhaps raising the question of whether schools will continue as public, urban infrastructure.
Betsy A. Beasley
American cities have been transnational in nature since the first urban spaces emerged during the colonial period. Yet the specific shape of the relationship between American cities and the rest of the world has changed dramatically in the intervening years. In the mid-20th century, the deepening integration of the American economy with the global economy began to reshape US cities. In the Northeast and Midwest, the once-robust manufacturing centers and factories that had sustained their residents—and their tax bases—left, first for the South and West, and then for cities and towns outside the United States, as capital grew more mobile and businesses sought lower wages and tax incentives elsewhere. That same global capital, combined with federal subsidies, created boomtowns in the once-rural South and West. Nationwide, city boosters began to pursue alternatives to heavy industry, once understood to be the undisputed guarantor of a healthy urban economy. Increasingly, US cities organized themselves around the service economy, both in high-end, white-collar sectors like finance, consulting, and education, and in low-end pink-collar and no-collar sectors like food service, hospitality, and health care. A new legal infrastructure related to immigration made US cities more racially, ethnically, and linguistically diverse than ever before.
At the same time, some US cities were agents of economic globalization themselves. Dubbed “global cities” by celebrants and critics of the new economy alike, these cities achieved power and prestige in the late 20th century not only because they had survived the ruptures of globalization but also because they helped to determine its shape. By the end of the 20th century, cities that were not routinely listed among the “global city” elite jockeyed to claim “world-class” status, investing in high-end art, entertainment, technology, education, and health care amenities to attract and retain the high-income white-collar workers understood to be the last hope for cities hollowed out by deindustrialization and global competition. Today, the extreme differences between “global cities” and the rest of US cities, and the extreme socioeconomic stratification seen in cities of all stripes, are a key concern of urbanists.