Jessica Ellen Sewell
From 1800 to 2000, cities grew enormously, and saw an expansion of public spaces to serve the varied needs of a diverse population living in ever more crowded urban circumstances. While a wide range of commercial semipublic spaces became common in the late 19th century, parks and streets were the best examples of truly public spaces with full freedom of access. Changes in the design and management of streets, sidewalks, squares, parks, and plazas during this period reflect changing ideas about the purpose of public space and how it should be used.
Streets shifted from being used for a wide range of activities, including vending, playing games, and storing goods, to becoming increasingly specialized spaces of movement, designed and managed by the early twentieth century for automobile traffic. Sidewalks, which in the early nineteenth century were paid for and liberally used by adjacent businesses, were similarly specialized as spaces of pedestrian movement. However, the tradition of using streets and sidewalks as a space of public celebration and public speech remained strong throughout the period. During parades and protests, streets and sidewalks were temporarily remade as spaces of the performance of the public, and the daily activities of circulation and commerce were set aside.
In 1800, the main open public spaces in cities were public squares or commons, often used for militia training and public celebration. In the second half of the 19th century, these were augmented by large picturesque parks. Designed as an antidote to urbanity, these parks served the public as a place for leisure, redefining public space as a polite leisure amenity, rather than a place for people to congregate as a public. The addition of playgrounds, recreational spaces, and public plazas in the 20th century served both the physical and mental health of the public. In the late 20th century, responding to neoliberal ideas and urban fiscal crises, the ownership and management of public parks and plazas was increasingly privatized, further challenging public accessibility.
Puerto Rican migrants have resided in the United States since before the Spanish-Cuban-American War of 1898, when the United States took possession of the island of Puerto Rico as part of the Treaty of Paris. After the war, groups of Puerto Ricans began migrating to the United States as contract laborers, first to sugarcane plantations in Hawaii, and then to other destinations on the mainland. After the Jones Act of 1917 extended U.S. citizenship to islanders, Puerto Ricans migrated to the United States in larger numbers, establishing their largest base in New York City. Over the course of the 1920s and 1930s, a vibrant and heterogeneous colonia developed there, and Puerto Ricans participated actively both in local politics and in the increasingly contentious politics of their homeland, whose status was indeterminate until it became a commonwealth in 1952. The Puerto Rican community in New York changed dramatically after World War II, accommodating up to fifty thousand new migrants per year during the peak of the “great migration” from the island. Newcomers faced intense discrimination and marginalization in this era, defined by both a Cold War ethos and liberal social scientists’ interest in the “Puerto Rican problem.”
Puerto Rican migrant communities in the 1950s and 1960s—now rapidly expanding into the Midwest, especially Chicago, and into New Jersey, Connecticut, and Philadelphia—struggled with inadequate housing and discrimination in the job market. In local schools, Puerto Rican children often faced a lack of accommodation of their need for English language instruction. Most catastrophic for Puerto Rican communities, on the East Coast particularly, was the deindustrialization of the labor market over the course of the 1960s. By the late 1960s, in response to these conditions and spurred by the civil rights, Black Power, and other social movements, young Puerto Ricans began organizing and protesting in large numbers. Their activism combined a radical approach to community organizing with Puerto Rican nationalism and international anti-imperialism. The youth were not the only activists in this era. Parents in New York had initiated, together with their African American neighbors, a “community control” movement that spanned the late 1960s and early 1970s; and many other adult activists pushed the politics of the urban social service sector—the primary institutions in many impoverished Puerto Rican communities—further to the left.
By the mid-1970s, urban fiscal crises and the rising conservative backlash in national politics dealt another blow to many Puerto Rican communities in the United States. The Puerto Rican population as a whole was now widely considered part of a national “underclass,” and much of the political energy of Puerto Rican leaders focused on addressing the paucity of both basic material stability and social equality in their communities. Since the 1980s, however, Puerto Ricans have achieved some economic gains, and a growing college-educated middle class has managed to gain more control over the cultural representations of their communities. More recently, the political salience of Puerto Ricans as a group has begun to shift. For the better part of the 20th century, Puerto Ricans in the United States were considered numerically insignificant or politically impotent (or both); but in the last two presidential elections (2008 and 2012), their growing populations in the South, especially in Florida, have drawn attention to their demographic significance and their political sensibilities.
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History.
Despite its cultivated reputation as the nation’s “white spot” in the early 20th century, Southern California was in fact home to diverse and numerous communities of color, some composed of relatively new immigrants and some long predating the era of Anglo settlement and conquest. In the years following World War II, the region engaged in suburban home construction on a mass scale and became a global symbol of what Dolores Hayden called the economically democratic but racially exclusive “sitcom suburb,” from the tax-lowering mechanism of its “Lakewood plan” to the car-friendly “Googie” architecture of the San Fernando Valley. Existing suburban communities of color, such as the colonias of agricultural laborers, were engulfed by new settlements, while upwardly mobile African Americans, Latinas/Latinos, and Asian Americans sought access to the expanding suburban dream of homeownership, with varying degrees of success. The political responses to suburban diversity in metropolitan Los Angeles ranged from Anglo resistance and flight to multiracial political coalitions and the incorporation of people of color at multiple levels of local government. The ascent of a number of suburbanites of color to positions of local and regional political power from the 1960s through the 1980s sometimes exposed intra-ethnic discord and sometimes the fragility of cross-race coalition, as multiple actors sought to protect property values and to pursue economic security within the competitive constraints of shrinking municipal resources, aging infrastructure, and a receding suburban fringe. As a result, political conflicts over crime, immigration, education, and inequality emerged in many Los Angeles County suburbs by the 1970s and later in the more distant corporate suburbs of Orange, Ventura, Riverside, and San Bernardino Counties.
The suburbanization of poverty, the role of suburbs as immigrant gateways, and the emergence of “majority-minority” suburbs—all national trends by the late 1990s and the first decade of the 21st century—were evident far earlier in the Los Angeles metropolitan region, where diverse suburbanites negotiated social and economic crises and innovated political responses.
Rap, the musical practice of hip hop culture in which vocalists, or MCs, recite lyrics over an instrumental beat, emerged out of the political and economic transformations of New York City after the 1960s. Black and Latinx youth, many of them Caribbean immigrants, created this new cultural form in response to racism, poverty, urban renewal, deindustrialization, and inner-city violence. These new cultural forms eventually spread beyond New York to all regions of the United States as artists from Los Angeles, New Orleans, Miami, and Chicago began releasing rap music with their own distinct sounds. Despite efforts to demonize and censor rap music and hip hop culture, rap music has served as a pathway for social mobility for many black and Latinx youth. Many artists have enjoyed crossover success in acting, advertising, and business. Rap music has also sparked new conversations about various issues such as electoral politics, gender and sexuality, crime, policing, and mass incarceration, as well as technology.
Kyle B. Roberts
From Cahokia to Newport, from Santa Fe to Chicago, cities have long exerted an important influence over the development of American religion; in turn, religion has shaped the life of America’s cities. Early visions of a New Jerusalem quickly gave way to a crowded spiritual marketplace full of faiths competing for the attention of a heterogeneous mass of urban consumers, although the dream of an idealized spiritual city never completely disappeared. Pluralism fostered toleration and freedom of religious choice, but also catalyzed competition and antagonism, sometimes resulting in violence. Struggles over political authority between established and dissenting churches gave way after the American Revolution to a contest over the right to exert moral authority through reform. Secularization, the companion of modernization and urbanization, did not toll the death knell for urban religion, but instead, provided the materials with which the religious engaged the city. Negative discursive constructions of the city proffered by a handful of religious reformers have long cast a shadow over the actual urban experience of most men and women. Historians continue to uncover the rich and innovative ways in which urban religion enabled individuals to understand, navigate, and contribute to the city around them.
Christopher D. Cantwell
Home to more than half the U.S. population by 1920, cities played an important role in the development of American religion throughout the 20th century. At the same time, the beliefs and practices of religious communities also shaped the contours of America’s urban landscape. Much as in the preceding three centuries, the economic development of America’s cities and the social diversity of urban populations animated this interplay. But the explosive, unregulated expansion that defined urban growth after the Civil War was met with an equally dramatic disinvestment from urban spaces throughout the second half of the 20th century. The domestic and European migrations that previously fueled urban growth also changed throughout the century, shifting from Europe and the rural Midwest to the Deep South, Africa, Asia, and Latin America after World War II. These newcomers not only brought new faiths to America’s cities but also contributed to the innovation of several new, distinctly urban religious movements. Urban development and diversity on one level promoted toleration and cooperation as religious leaders forged numerous ecumenical and, eventually, interfaith bonds to combat urban problems. But it also led to tension and conflict as religious communities busied themselves with carving out spaces of their own through tight-knit urban enclaves or new suburban locales. Contemporary American cities are some of the most religiously diverse communities in the world. Historians continue to uncover how religious communities not only have lived in but also have shaped the modern city.
The story of the pre-Columbian Mississippi Period (1000
Rock and roll, a popular music craze of the mid-1950s, turned a loud, fast, and sexy set of sounds rooted in urban, black, working class, and southern America into the pop preference as well of suburban, white, young, and northern America. By the late 1960s, those fans and British counterparts made their own version, more politicized and experimental and just called rock—the summoning sound of the counterculture. Rock’s aura soon faded: it became as much entertainment staple as dissident form, with subcategories as disparate as singer-songwriter, heavy metal, alternative, and “classic rock.” Where rock and roll was integrated and heterogeneous, rock was largely white and homogeneous, policing its borders. Notoriously, rock fans detonated disco records in 1979. By the 1990s, rock and roll style was hip-hop, with its youth appeal and rebelliousness; post‒baby boomer bands gave rock some last vanguard status; and suburbanites found classic rock in New Country. This century’s notions of rock and roll have blended thoroughly, from genre “mash-ups” to superstar performers almost categories unto themselves and new sounds such as EDM beats. Still, crossover moments evoke rock and roll; assertions of authenticity evoke rock. Because rock and roll, and rock, epitomize cultural ideals and group identities, their definitions have been constantly debated. Initial argument focused on challenging genteel, professional notions of musicianship and behavior. Later discourse took up cultural incorporation and social empowerment, with issues of gender and commercialism as prominent as race and artistry. Rock and roll promised one kind of revolution to the post-1945 United States; rock another. The resulting hope and confusion have never been fully sorted, with mixed consequences for American music and cultural history.
Robert O. Self
Few decades in American history reverberate with as much historical reach or glow as brightly in living mythology as the 1960s. During those years Americans reanimated and reinvented the core political principles of equality and liberty but, in a primal clash that resonates more than half a century later, fiercely contested what those principles meant, and for whom. For years afterward, the decade’s appreciators considered the era to have its own “spirit,” defined by greater freedoms and a deeper, more authentic personhood, and given breath by a youthful generation’s agitation for change in nearly every dimension of national life. To its detractors in subsequent decades, the era was marked by immature radical fantasies and dangerous destabilizations of the social order, behind which lay misguided youthful enthusiasms and an overweening, indulgent federal government. We need not share either conviction to appreciate the long historical shadow cast by the decade’s clashing of left, right, and center and its profound influence over the political debates, cultural logics, and social practices of the many years that followed.
The decade’s political and ideological clashes registered with such force because post–World War II American life was characterized by a society-wide embrace of antiradicalism and a prescribed normalcy. Having emerged from the war as the lone undamaged capitalist industrial power, the United States exerted enormous influence throughout the globe after 1945—so much that some historians have called the postwar years a “pax Americana.” In its own interest and in the interest of its Western allies, the United States engaged in a Cold War standoff with the Soviet Union over the fate of Europe and no less over the fate of developing countries on every continent. Fiercely anticommunist abroad and at home, U.S. elites stoked fears of the damage communism could do, whether in Eastern Europe or in a public school textbook. Americans of all sorts in the postwar years embraced potent ideologies justifying the prevailing order, whether that order was capitalist, patriarchal, racial, or heterosexual. They pursued a postwar “normalcy” defined by nuclear family domesticity and consumer capitalism in the shadow cast by the threat of communism and, after 1949, global thermonuclear war with the Soviet Union. This prevailing order was stultifying and its rupture in the 1960s is the origin point of the decade’s great dramas.
The social movements of that decade drew Americans from the margins of citizenship—African Americans, Latinas/os, Native Americans, women, and gay men and lesbians, among others—into epochal struggles over the withheld promise of equality. For the first time since 1861, an American war deeply split the nation, nearly destroying a major political party and intensifying a generational revolt already under way. Violence, including political assassinations at the highest level, bombings and assassinations of African Americans, bombings by left-wing groups like the Weathermen, and major urban uprisings by African Americans against police and property bathed the country in more blood. The New Deal liberalism of Presidents Franklin D. Roosevelt and Harry S. Truman reached its postwar peak in 1965 under President Lyndon Johnson’s Great Society and then retreated amid acrimony and backlash, as a new conservative politics gained traction. All this took place in the context of a “global 1960s,” in which societies in Western and Eastern Europe, Latin America, Africa, and elsewhere experienced similar generational rebellions, quests for meaningful democracy, and disillusionment with American global hegemony. From the first year of the decade to the last, the 1960s were a watershed era that marked the definitive end of a “postwar America” defined by easy Cold War dualities, presumptions of national innocence, and political calcification.
To explain the foregoing, this essay is organized in five sections. First comes a broad overview of the decade, highlighting some of its indelible moments and seminal political events. The next four sections correspond to the four signature historical developments of the 1960s. Discussed first is the collapse of the political consensus that predominated in national life following World War II. We can call this consensus “Vital Center liberalism,” after the title of a 1949 book by Arthur Schlesinger Jr., or “Cold War liberalism.” Its assault from both the New Left and the New Right is one of the defining stories of the 1960s. Second is the resurgence, after a decades-long interregnum dating to Reconstruction, of African American political agency. The black freedom struggle of the 1960s was far more than a social movement for civil rights. To shape the conditions of national life and the content of public debate in ways impossible under Jim Crow, black Americans called for nothing less than a spiritual and political renewal of the country. Third, and following from the latter, is the emergence within the American liberal tradition of a new emphasis on expanding individual rights and ending invidious discrimination. Forged in conjunction with the black freedom movement by women, Latino/as, Asian Americans, Native Americans, and homophiles (as early gay rights activists were called) and gay liberationists, this new emphasis profoundly changed American law and set the terms of political debate for the next half century. Fourth and lastly, the 1960s witnessed the flourishing of a broad and diverse culture of anti-authoritarianism.
In art, politics, and social behavior, this anti-authoritarianism took many forms, but at its heart lay two distinct historical phenomena: an ecstatic celebration of youth, manifest in the tension between the World War II generation and the baby boom generation, and an intensification of the long-standing conflict in American life between individualism and hierarchical order.
Despite the disruptions, rebellions, and challenges to authority in the decade, the political and economic elite proved remarkably resilient and preserved much of the prevailing order. This is not to discount the foregoing account of challenges to that order or to suggest that social change in the 1960s made little difference in American life. However, in grappling with this fascinating decade we are confronted with the paradox of outsized events and enormous transformations in law, ideology, and politics alongside a continuation, even an entrenchment, of traditional economic and political structures and practices.
Ansley T. Erickson
“Urban infrastructure” calls to mind railways, highways, and sewer systems. Yet the school buildings—red brick, limestone, or concrete, low-slung, turreted, or glass-fronted—that hold and seek to shape the city’s children are ubiquitous forms of infrastructure as well. Schools occupy one of the largest line items in a municipal budget, and as many as a fifth of a city’s residents spend the majority of their waking hours in school classrooms, hallways, and gymnasiums. In the 19th and 20th centuries urban educational infrastructure grew, supported by developing consensus for publicly funded and publicly governed schools (if rarely fully accessible to all members of the public). Even before state commitment to other forms of social welfare, from pensions to public health, and infrastructure, from transit to fire protection, schooling was a government function.
This commitment to public education ultimately was national, but schools in cities had their own story. Schooling in the United States is chiefly a local affair: Constitutional responsibility for education lies with the states; power is then further decentralized as states entrust decisions about school function and funding to school districts. School districts can be as small as a single town or a part of a city. Such localism is one reason that it is possible to speak about schools in U.S. cities as having a particular history, determined as much by the specificities of urban life as by national questions of citizenship, economy, religion, and culture.
While city schools have been distinct, they have also been nationally influential. Urban scale both allowed for and demanded the most extensive educational system-building. Urban growth and diversity galvanized innovation, via exploration in teaching methods, curriculum, and understanding of children and communities. And it generated intense conflict. Throughout U.S. history, urban residents from myriad social, political, religious, and economic positions have struggled to define how schools would operate, for whom, and who would decide.
During the 19th and 20th centuries, U.S. residents struggled over the purposes, funding, and governance of schools in cities shaped by capitalism, nativism, and white supremacy. They built a commitment to schooling as a public function of their cities, with many compromises and exclusions. In the 21st century, old struggles re-emerged in new form, perhaps raising the question of whether schools will continue as public, urban infrastructure.
Since many North American indigenous societies also built and inhabited towns, America was not an entirely rural continent before the arrival of Europeans. Nevertheless, when Europeans set out to colonize their “wilderness,” they arrived with a practical and ideological commitment to recreating cities of the sort with which they were familiar on their home continent. The result of their ambitions was the rapid founding and development of European-style cities, the vast majority of which clustered on large bodies of water, either directly on the Atlantic Ocean or on the seas and river estuaries adjacent to it. The pace of city expansion was closely linked to the levels of support for cities among colonists and an economic environment that stimulated urban growth. Some cities grew faster than others, but by the middle of the 18th century even Virginia and Maryland, the most rural colonies, had towns that played a critical cultural, political, and economic role in society. By the revolutionary era, the centrality of North America’s seaports was cemented by their status as crucibles of the conflict. The issue of which seaport was the new United States’ premier city was contested, but the importance of cities to North American society was no longer debated.
In the seventy years since the end of World War II (1939–1945), postindustrialization—the exodus of manufacturing and growth of finance and services—has radically transformed the economy of North American cities. Metropolitan areas are increasingly home to transnational firms that administer dispersed production networks that span the world. A few major global centers host large banks that coordinate flows of finance capital necessary not only for production, but also increasingly for education, infrastructure, municipal government, housing, and nearly every other aspect of life. In cities of the global north, fewer workers produce goods and more produce information, entertainment, and experiences. Women have steadily entered the paid workforce, where they often do the feminized work of caring for children and the ill, cleaning homes, and preparing meals. Like the Gilded Age city, the postindustrial city creates immense social divisions, injustices, and inequalities: penthouses worth millions and rampant homelessness, fifty-dollar burgers and an epidemic of food insecurity, and unparalleled wealth and long-standing structural unemployment all exist side by side. The key features of the postindustrial service economy are the increased concentration of wealth, the development of a privileged and celebrated workforce of professionals, and an economic system reliant on hyperexploited service workers whose availability is conditioned by race, immigration status, and gender.
Christopher W. Schmidt
One of the most significant protest campaigns of the civil rights era, the lunch counter sit-in movement began on February 1, 1960, when four young African American men sat down at the whites-only lunch counter of the Woolworth store in Greensboro, North Carolina. Refused service, the four college students sat quietly until the store closed. They continued their protest on the following days, each day joined by more fellow students. Students in other southern cities learned what was happening and started their own demonstrations, and in just weeks, lunch counter sit-ins were taking place across the South. By the end of the spring, tens of thousands of black college and high school students, joined in some cases by sympathetic white students, had joined the sit-in movement. Several thousand went to jail for their efforts after being arrested on charges of trespass, disorderly conduct, or whatever other laws southern police officers believed they could use against the protesters.
The sit-ins arrived at a critical juncture in the modern black freedom struggle. The preceding years had brought major breakthroughs, such as the Supreme Court’s Brown v. Board of Education school desegregation ruling in 1954 and the successful Montgomery bus boycott of 1955–1956, but by 1960, activists were struggling to develop next steps. The sit-in movement energized and transformed the struggle for racial equality, moving the leading edge of the movement from the courtrooms and legislative halls to the streets and putting a new, younger generation of activists on the front lines. It gave birth to the Student Nonviolent Coordinating Committee, one of the most important activist groups of the 1960s. It directed the nation’s attention to the problem of racial discrimination in private businesses that served the public, pressured business owners in scores of southern cities to open their lunch counters to African American customers, and set in motion a chain of events that would culminate in the Civil Rights Act of 1964, which banned racial discrimination in public accommodations across the nation.
The tall building—the most popular and conspicuous emblem of the modern American city—stands as an index of economic activity, civic aspirations, and urban development. Enmeshed in the history of American business practices and the maturation of corporate capitalism, the skyscraper is also a cultural icon that performs genuine symbolic functions. Whether viewed individually or arrayed in a “skyline,” tall buildings invite a focus on their spectacular or superlative aspects. Their patrons have searched for the architectural symbols that would project a positive public image, yet the height and massing of skyscrapers were determined as much by prosaic financial calculations as by symbolic pretense. Historically, the production of tall buildings was linked to the broader flux of economic cycles, access to capital, land values, and regulatory frameworks that curbed the self-interests of individual builders in favor of public goods such as light and air. The tall building looms large for urban geographers seeking to chart the shifting terrain of the business district and for social historians of the city who examine the skyscraper’s gendered spaces and labor relations. If tall buildings provide one index of the urban and regional economy, they are also economic activities in and of themselves and thus linked to the growth of professions required to plan, finance, design, construct, market, and manage these mammoth collective objects—and all have vied for control over the ultimate result. Practitioners have debated the tall building’s external expression as the design challenge of the façade became more acute with the advent of the curtain wall attached to a steel frame, eventually dematerializing entirely into sheets of reflective glass. The tall building also reflects prevailing paradigms in urban design, from the retail arcades of 19th-century skyscrapers to the blank plazas of postwar corporate modernism.
The patterns of urban slavery in North American and pre-Civil War US cities reveal the ways in which individual men and women, as well as businesses, institutions, and governmental bodies, employed slave labor and readily adapted the system of slavery to their economic needs and desires. Colonial cities east and west of the Mississippi River, founded initially as military forts, trading posts, and maritime ports, relied on African and Native American slave labor from their beginnings. The importance of slave labor increased in Anglo-American East Coast urban settings in the 18th century as the number of enslaved Africans increased in these colonies, particularly in response to the growth of the tobacco, wheat, and rice industries in the southern colonies. The focus on African slavery led most Anglo-American colonies to outlaw the enslavement of Native Americans, and urban slavery on the East Coast became associated almost solely with people of African descent. In addition, these cities became central nodes in the circum-Atlantic transportation and sale of enslaved people, slave-produced goods, and provisions for slave colonies whose economies centered on plantation goods. West of the Mississippi, urban enslavement of Native Americans, Mexicans, and even a few Europeans continued through the 19th century.
As the thirteen British colonies transitioned to the United States during and after the Revolutionary War, three different directions emerged regarding slavery, each of which would affect the status of slavery and of people of African descent in cities. The gradual emancipation of enslaved people in states north of Delaware led to the creation of the so-called free states, with large numbers of free blacks moving into cities to take full advantage of freedom and the possibility of creating family and community. Although antebellum northern cities were located within areas where legalized slavery ended, these cities retained economic and political ties to southern slavery. At the same time, the radical antislavery movement developed in Philadelphia, Boston, and New York. Thus, Northern cities were the site of political conflicts between pro- and antislavery forces. In the Chesapeake, as the tobacco economy declined, slave owners manumitted enslaved blacks for whom they did not have enough work, creating large groups of free blacks in cities. But these states began to participate heavily in the domestic slave trade, with important businesses located in cities. And in the Deep South, the recommitment to slavery following the Louisiana Purchase and the emergence of the cotton economy led to the creation of a string of wealthy port cities critical to the transportation of slaves and goods. These cities were situated in local economic geographies that connected rural plantations to urban settings and in national and international economies of exchange of raw and finished goods that fueled industries throughout the Atlantic world. The vast majority of enslaved people employed in the antebellum South worked on rural farms, but slave labor was a key part of the labor force in southern cities.
Only after the Civil War did slavery and cities become separate in the minds of Americans, as postwar whites north and south created a mythical South in which romanticized antebellum cotton plantations became the primary symbol of American slavery, regardless of the long history of slavery that preceded their existence.
During the 1890s, the word segregation became the preferred term for the practice of coercing different groups of people, especially those designated by race, to live in separate and unequal urban residential neighborhoods. In the southern states of the United States, segregationists imported the word—originally used in the British colonies of Asia—to describe Jim Crow laws, and, in 1910, whites in Baltimore passed a “segregation ordinance” mandating separate black and white urban neighborhoods. Copycat legislation sprang up in cities across the South and the Midwest. But in 1917, a multiracial team of lawyers from the fledgling National Association for the Advancement of Colored People (NAACP) mounted a successful legal challenge to these ordinances in the U.S. Supreme Court—even as urban segregation laws were adopted in other places in the world, most notably in South Africa. The collapse of the movement for legislated racial segregation in the United States occurred just as African Americans began migrating in large numbers into cities in all regions of the United States, resulting in waves of anti-black mob violence. Segregationists were forced to rely on nonstatutory or formally nonracial techniques. In Chicago, an alliance of urban reformers and real estate professionals invented alternatives to explicitly racist segregation laws. The practices they promoted nationwide created one of the most successful forms of urban racial segregation in world history, rivaling and finally outliving South African apartheid. Understanding how this system came into being and how it persists today requires understanding both how the Chicago segregationists were connected to counterparts elsewhere in the world and how they adapted practices of city-splitting to suit the peculiarities of racial politics in the United States.
Peter C. Baldwin
Today the term nightlife typically refers to social activities in urban commercial spaces—particularly drinking, dancing, dining, and listening to live musical performances. This was not always so. Cities in the 18th and early 19th centuries knew relatively limited nightlife, most of it occurring in drinking places for men. Theater attracted mixed-gender audiences but was sometimes seen as disreputable in both its content and the character of the audience. Theater owners worked to shed this negative reputation starting in the mid-19th century, while nightlife continued to be tainted by the profusion of saloons, brothels, and gambling halls. Gradual improvements in street lighting and police protection encouraged people to go out at night, as did growing incomes and decreasing hours of labor. Nightlife attracted more women in the decades around 1900 as it expanded and diversified. Dance halls, vaudeville houses, movie theaters, restaurants, and cabarets thrived in the electrified “bright lights” districts of central cities. Commercial entertainment contracted again in the 1950s and 1960s as Americans spent more of their evening leisure hours watching television and began to regard urban public spaces with suspicion. Still, nightlife is viewed as an important component of urban economic life and is actively promoted by many municipal governments.
Conceptions of what constitutes a street gang or a youth gang have varied since the seminal sociological studies on these entities in the 1920s. Organizations of teenage youths and young adults in their twenties, congregating in public spaces and acting collectively, were fixtures of everyday life in American cities throughout the 20th century. While few studies historicize gangs in their own right, historians in a range of subfields cast gangs as key actors in critical dimensions of the American urban experience: the formation and defense of ethno-racial identities and communities; the creation and maintenance of segregated metropolitan spaces; the shaping of gender norms and forms of sociability in working-class districts; the structuring of contentious political mobilization challenging police practices and municipal policies; the evolution of underground and informal economies and organized crime activities; and the epidemic of gun violence that spread through minority communities in many major cities at the end of the 20th and beginning of the 21st centuries.
Although groups of white youths patrolling the streets of working-class neighborhoods and engaging in acts of defensive localism were commonplace in the urban Northeast, Mid-Atlantic, and Midwest states by the mid-19th century, street gangs exploded onto the urban landscape in the early 20th century as a consequence of massive demographic changes related to the wave of immigration from Europe, Asia, and Latin America and the migration of African Americans from the South. As immigrants and migrants moved into urban working-class neighborhoods and industrial workplaces, street gangs proliferated at the boundaries of ethno-racially defined communities, shaping the context within which immigrant and second-generation youths negotiated Americanization and learned the meanings of race and ethnicity. Although social workers in some cities noted the appearance of some female gangs by the 1930s, the milieu of youth gangs during this era was male dominated, and codes of honor and masculinity were often at stake in increasingly violent clashes over territory and resources like parks and beaches.
The interplay of race, ethnicity, and masculinity continued to shape the world of gangs in the 1940s and 1950s, when white male gangs claiming to defend the whiteness of their communities used terror tactics to reinforce the boundaries of ghettos and barrios in many cities. Such aggressions spurred the formation of fighting gangs in black and Latino neighborhoods, where youths entered into at times deadly combat against their aggressors but also fought for honor, respect, and status with rivals within their communities. In the 1960s and 1970s, with civil rights struggles and ideologies of racial empowerment circulating through minority neighborhoods, some of these same gangs, often with the support of community organizers affiliated with political organizations like the Black Panther Party, turned toward defending the rights of their communities and participating in contentious politics. However, such projects were cut short by the fierce repression of gangs in minority communities by local police forces, working at times in collaboration with the Federal Bureau of Investigation. By the mid-1970s, following the withdrawal of the Black Panthers and other mediating organizations from cities like Chicago and Los Angeles, so-called “super-gangs” claiming the allegiance of thousands of youths began federating into opposing camps—“People” against “Folks” in Chicago, “Crips” against “Bloods” in LA—to wage war for control of emerging drug markets. In the 1980s and 1990s, with minority communities dealing with high unemployment, cutbacks in social services, failing schools, hyperincarceration, drug trafficking, gun violence, and toxic relations with increasingly militarized police forces waging local “wars” against drugs and gangs, gangs proliferated in cities throughout the urban Sun Belt. Their prominence within popular and political discourse nationwide made them symbols of the urban crisis and of the cultural deficiencies that some believed had caused it.
Ann Durkin Keating
Since the beginning of the 19th century, outlying areas of American cities have been home to a variety of settlements and enterprises with close links to urban centers. Beginning in the early 19th century, the increasing scale of business and industrial enterprises separated workplaces from residences. This allowed some urban dwellers to live at a distance from their place of employment and commute to work. Others lived in the shadow of factories located at some distance from the city center. Still others provided food or raw materials for urban residents and businesses. The availability of employment led to further suburban growth. Changing intracity transportation, including railroads, interurbans, streetcars, and cable cars, enabled people and businesses to locate beyond the limits of a walking city.
By the late 19th century, metropolitan areas across the United States included outlying farm centers, industrial towns, residential rail (or streetcar) suburbs, and recreational/institutional centers. With suburbs generally located along rail or ferry lines into the early 20th century, the physical development of metropolitan areas often resembled a hub and spokes. However, across metropolitan regions, suburbs had a great range of function and diversity of populations. With the advent of automobile commuting and the growing use of trucks to haul freight, suburban development took place between railroad lines, filling in the earlier hub-and-spokes pattern to create a more continuous built-up area.
Although suburban settlements were integrally connected to their neighbors and within a metropolitan economy and society, independent suburban governments emerged to serve these outlying settlements and keep them separate. Developers often took the lead in providing differential services (and regulations). Suburban governments emerged as hybrid forms, serving relatively homogeneous populations by providing only some urban functions. Well before 1945, suburbs were home to a wide range of work and residents.
Becky Nicolaides and Andrew Wiese
Mass migration to suburban areas was a defining feature of American life after 1945. Before World War II, just 13% of Americans lived in suburbs. By 2010, however, suburbia was home to more than half of the U.S. population. The nation’s economy, politics, and society suburbanized in important ways. Suburbia shaped habits of car dependency and commuting, patterns of spending and saving, and experiences with issues as diverse as race and taxes, energy and nature, privacy and community. The owner-occupied, single-family home, surrounded by a yard, and set in a neighborhood outside the urban core came to define everyday experience for most American households, and in the world of popular culture and the imagination, suburbia was the setting for the American dream. The nation’s suburbs were an equally critical economic landscape, home to vital high-tech industries, retailing, “logistics,” and office employment. In addition, American politics rested on a suburban majority, and over several decades, suburbia incubated political movements across the partisan spectrum, from grass-roots conservatism to centrist meritocratic individualism, environmentalism, feminism, and social justice. In short, suburbia was a key setting for postwar American life.
Even as suburbia grew in magnitude and influence, it also grew more diverse, coming to reflect a much broader cross-section of America itself. This encompassing shift unfolded across two key chronological stages in suburban history since 1945: the expansive, racialized, mass suburbanization of the postwar years (1945–1970) and an era of intensive social diversification and metropolitan complexity (since 1970). In the first period, suburbia witnessed the expansion of segregated white privilege, bolstered by government policies and exclusionary practices and reinforced by grassroots political movements. By the second period, suburbia came to house a broader cross section of Americans, who brought with them a wide range of outlooks, lifeways, values, and politics. Suburbia became home to large numbers of immigrants, ethnic groups, African Americans, the poor, the elderly, and diverse family types. In the face of stubborn exclusionism by affluent suburbs, inequality persisted across metropolitan areas and manifested anew in proliferating poorer, distressed suburbs. Reform efforts sought to alleviate metro-wide inequality and promote sustainable development, using coordinated regional approaches. In recent years, the twin discourses of suburban crisis and suburban rejuvenation have captured the continued complexity of America’s suburbs.