Christopher R. Reed
The unanticipated and massive migration of half a million African Americans between 1916 and 1918 from the racially oppressive South to the welcoming North surprised the nation. Directly resulting from the advent of the First World War, the movement of these able-bodied workers provided essential labor to maintain wartime production that sustained the Allied war effort. One-tenth of the people who surged north headed to and remained in Chicago, where their presence challenged the status quo in the areas of employment, external race relations, internal race arrangements, politics, housing, and recreation. Once in the Windy City, this migrant-influenced labor pool expanded with the addition of resident blacks to form the city’s first African American industrial proletariat. Wages for both men and women increased compared to what they had been earning in the South, and local businesses were ready and willing to accommodate these new consumers. A small black business sector became viable and was able to support two banks, and by the mid-1920s, there were multiple stores along Chicago’s State Street forming a virtual “Black Wall Street.” An extant political submachine within Republican Party ranks also increased its power and influence in repeated electoral contests. Importantly, scrutiny shows that the purported social conflict between the Old Settler element and the newcomers was overblown and inconsequential to black progress.
Revisionist scholarship over the past two decades has minimized the first phase of northward movement, positioning it within the context of a half-century phenomenon under the labels of the “Second Great Migration” and the “Great Black Migration.” Whatever the designation, the voluntary movement of five to six million blacks from what had been their traditional home to the uncertainty of the North and West between the First World War and the Vietnam conflict stands as both a condemnation of regional oppression of the human spirit and aspirations of millions, and a demonstration of group courage in taking on new challenges in new settings. Although Chicago would prove to be “no crystal stair,” it was on many occasions a land of hope and promise for migrants throughout the past century.
During the 20th century, the black population of the United States transitioned from largely rural to mostly urban. In the early 1900s the majority of African Americans lived in rural, agricultural areas. Depictions of black people in popular culture often focused on pastoral settings, like the cotton fields of the rural South. But a dramatic shift occurred during the Great Migrations (1914–1930 and 1941–1970) when millions of rural black southerners relocated to US cities.
Motivated by economic opportunities in urban industrial areas during World Wars I and II, African Americans opted to move to southern cities as well as to urban centers in the Northeast, Midwest, and West Coast. New communities emerged that contained black social and cultural institutions, and musical and literary expressions flourished. Black migrants who left the South exercised voting rights, sending the first black representatives to Congress in the 20th century. Migrants often referred to themselves as “New Negroes,” pointing to their social, political, and cultural achievements, as well as their use of armed self-defense during violent racial confrontations, as evidence of their new stance on race.
The Immigration Act of 1924 was in large part the result of a deep political and cultural divide in America between heavily immigrant cities and far less diverse small towns and rural areas. The 1924 legislation, together with growing residential segregation, midcentury federal urban policy, and postwar suburbanization, undermined scores of ethnic enclaves in American cities between 1925 and the 1960s. The deportation of Mexicans and their American children during the Great Depression, the incarceration of West Coast Japanese Americans during World War II, and the wartime and postwar shift of so many jobs to suburban and Sunbelt areas also reshaped many US cities in these years. The Immigration Act of 1965, which enabled the immigration of large numbers of people from Asia, Latin America, and, eventually, Africa, helped to revitalize many depressed urban areas and inner-ring suburbs. In cities and suburbs across the country, the response to the new immigration since 1965 has ranged from welcoming to hostile. The national debate over immigration in the early 21st century reflects both familiar and newer cultural, linguistic, religious, racial, and regional rifts. However, urban areas with a history of immigrant incorporation remain the most politically supportive of such people, just as they were a century ago.
Post-1945 immigration to the United States differed fairly dramatically from America’s earlier 20th- and 19th-century immigration patterns, most notably in the dramatic rise in numbers of immigrants from Asia. Beginning in the late 19th century, the U.S. government took steps to bar immigration from Asia. The establishment of the national origins quota system in the 1924 Immigration Act narrowed the entryway for southern and eastern Europeans, making western Europe the dominant source of immigrants. These policies shaped the racial and ethnic profile of the American population before 1945. Signs of change began to occur during and after World War II. The recruitment of temporary agricultural workers from Mexico led to an influx of Mexicans, and the repeal of Asian exclusion laws opened the door for Asian immigrants. Responding to complex international politics during the Cold War, the United States also formulated a series of refugee policies, admitting refugees from Europe, the western hemisphere, and later Southeast Asia. The movement of people to the United States increased drastically after 1965, when immigration reform ended the national origins quota system. The intricate and intriguing history of U.S. immigration after 1945 thus demonstrates how the United States related to a fast-changing world, its less restrictive immigration policies increasing the fluidity of the American population, with a substantial impact on American identity and domestic policy.
The eighty years from 1790 to 1870 were marked by dramatic economic and demographic changes in the United States. Cities in this period grew faster than the country as a whole, drawing migrants from the countryside and immigrants from overseas. This dynamism stemmed from cities’ roles as spearheads of commercial change and sites of new forms of production. Internal improvements such as canals and railroads expanded urban hinterlands in the early republic, while urban institutions such as banks facilitated market exchange. Both of these worked to the advantage of urban manufacturers. By paying low wages to workers performing repetitive tasks, manufacturers enlarged the market for their products but also engendered opposition from a workforce internally divided along lines of sex and race, and at times slavery and freedom. The Civil War affirmed the legitimacy of wage labor and enhanced the power of corporations, setting the stage for the postwar growth of large-scale, mechanized industry.
Mass transit has been part of the urban scene in the United States since the early 19th century. Regular steam ferry service began in New York City in the early 1810s and horse-drawn omnibuses plied city streets starting in the late 1820s. Expanding networks of horse railways emerged by the mid-19th century. The electric streetcar became the dominant mass transit vehicle a half century later. During this era, mass transit had a significant impact on American urban development. Mass transit’s importance in the lives of most Americans started to decline with the growth of automobile ownership in the 1920s, except for a temporary rise in transit ridership during World War II. In the 1960s, congressional subsidies began to reinvigorate mass transit and heavy-rail systems opened in several cities, followed by light rail systems in several others in the next decades. Today concerns about environmental sustainability and urban revitalization have stimulated renewed interest in the benefits of mass transit.
By serving travelers and commerce, roads and streets unite people and foster economic growth. But as they develop, roads and streets also disrupt old patterns, upset balances of power, and isolate some as they serve others. The consequent disagreements leave historical records documenting social struggles that might otherwise be overlooked. For long-distance travel in America before the middle of the 20th century, roads were generally poor alternatives, resorted to when superior means of travel, such as river and coastal vessels, canal boats, or railroads, were unavailable. Most roads were unpaved, unmarked, and vulnerable to the effects of weather. Before the railroads, the rare turnpikes and plank roads could be much better for travelers willing to pay the toll. Even in towns, unpaved streets were common until the late 19th century, and persisted into the 20th. In the late 19th century, rapid urban growth, rural free delivery of the mails, and finally the proliferation of electric railways and bicycling contributed to growing pressure for better roads and streets. After 1910, the spread of the automobile accelerated the trend, but only with great controversy, especially in cities. Partly in response to the controversy, advocates of the automobile organized to promote state and county motor highways funded substantially by gasoline taxes; such roads were intended primarily for motor vehicles. In the 1950s, massive federal funds accelerated the trend; by then, motor vehicles were the primary transportation mode for both long and short distances. The consequences have been controversial, and alternatives have been attracting growing interest.
Joel A. Tarr
Urban water supply and sewage disposal facilities are critical parts of the urban infrastructure. They have enabled cities and their metropolitan areas to function as centers of commerce, industry, entertainment, and human habitation. The evolution of water supply and sewage disposal systems in American cities from 1800 to 2015 is examined, with a focus on major turning points especially in regard to technological decisions, public policy, and environmental and public health issues.
Racism and xenophobia, but also resilience and community building, characterize the return of thousands of Japanese Americans, or Nikkei, to the West Coast after World War II. Although the specific histories of different regions shaped the resettlement experiences for Japanese Americans, Los Angeles provides an instructive case study. For generations, the City of Angels has been home to one of the nation’s largest and most diverse Nikkei communities and the ways in which Japanese Americans rebuilt their lives and institutions resonate with the resettlement experience elsewhere.
Before World War II, greater Los Angeles was home to a vibrant Japanese American population. First generation immigrants, or Issei, and their American-born children, the Nisei, forged dynamic social, economic, cultural, and spiritual institutions out of various racial exclusions. World War II uprooted the community as Japanese Americans left behind their farms, businesses, and homes. In the best instances, they were able to entrust their property to neighbors or other sympathetic individuals. More often, the uncertainty of their future led Japanese Americans to sell off their property, far below the market price. Upon the war’s end, thousands of Japanese Americans returned to Los Angeles, often to financial ruin.
Upon their arrival in the Los Angeles area, Japanese Americans continued to face deep-seated prejudice, all the more accentuated by an overall dearth of housing. Without a place to live, they sought refuge in communal hostels set up in pre-war institutions that survived the war such as a variety of Christian and Buddhist churches. Meanwhile, others found housing in temporary trailer camps set up by the War Relocation Authority (WRA), and later administered by the Federal Public Housing Authority (FPHA), in areas such as Burbank, Sun Valley, Hawthorne, Santa Monica, and Long Beach. Although some local religious groups and others welcomed the returnees, white homeowners, who viewed the settlement of Japanese Americans as a threat to their property values, often mobilized to protest the construction of these camps. The last of these camps closed in 1956, demonstrating the hardship some Japanese Americans still faced in integrating back into society. Even when the returnees were able to leave the camps, they still faced racially restrictive housing covenants and, when those practices were ruled unconstitutional, exclusionary lending. Although new suburban enclaves of Japanese Americans eventually developed in areas such as Gardena, West Los Angeles, and Pacoima by the 1960s, the pathway to those destinations was far from easy. Ultimately, the resettlement of Japanese Americans in Los Angeles after their mass incarceration during World War II took place within the intertwined contexts of lingering anti-Japanese racism, Cold War politics, and the suburbanization of Southern California.
Chrissy Yee Lau
Gambling was a central facet of life for Japanese male laborers in early 20th-century California. From the late 19th to the early 20th century, labor contractors and Chinese gambling dens offered gambling to Japanese laborers to maintain a consistent cheap labor force and large consumer pool. Many laborers approached gambling as a form of leisure, an opportunity for getting rich quickly and building a sense of community. After the Gentlemen’s Agreement was concluded in 1907–1908, Japanese elites led anti-gambling campaigns aimed at Chinese gambling dens as part of their larger project to build the empire abroad and acquire domestic civil rights. By the 1920s, Japanese-run gambling dens became more established, but the hardships of Japanese immigrant wives prompted collaboration with the Japanese Associations of America to address gambling among married men. The larger community memory around gambling is often told from the wife’s or children’s perspective, recounted with pain and suffering over how gambling tore families asunder.
Many Asian American neighborhoods faced displacement after World War II because of urban renewal or redevelopment under the 1949 Housing Act. In the name of blight removal and slum clearance this Act allowed local elites to procure federal money to seize land designated as blighted, clear it of its structures, and sell this land to private developers—in the process displacing thousands of residents, small businesses, and community institutions. San Francisco’s Fillmore District, a multiracial neighborhood that housed the city’s largest Japanese American and African American communities, experienced this postwar redevelopment. Like many Asian American neighborhoods that shared space with other communities of color, the Fillmore formed at the intersection of class inequality and racism, and it was this intersection of structural factors that led to substandard urban conditions. Rather than recognize the root causes of urban decline, San Francisco urban and regional elites argued that the Fillmore was among the city’s most blighted neighborhoods and advocated for the neighborhood’s destruction in the name of the public good. They also targeted the Fillmore because their postwar plans for remaking the city’s political economy envisioned the Fillmore as (1) a space to house white-collar workers in the postwar economy and (2) as an Asian-themed space for tourism that connected the city symbolically and economically to Japan, an important U.S. postwar ally. For over four decades these elite-directed plans for the Fillmore displaced more than 20,000 residents in two phases, severely damaging the community. The Fillmore’s redevelopment, then, provides a window into other cases of redevelopment and aids further investigations of the connection between Asian Americans and urban crisis. It also sheds light on the deeper history of displacement in the Asian American experience and contextualizes contemporary gentrification in Asian American neighborhoods.
In January 1938, Benny Goodman took command of Carnegie Hall on a blustery New York City evening, and for two hours his band tore through the history of jazz in a performance that came to define the entire Swing Era. Goodman played Carnegie Hall at the top of his jazz game, leading his crack band—including Gene Krupa on drums and Harry James on trumpet—through new, original arrangements by Fletcher Henderson. Compounding the historic nature of the highly publicized jazz concert, Goodman welcomed onto the stage members of Duke Ellington’s band to join in on what would be the first major jazz performance by an integrated band. With its spirit of inclusion as well as its emphasis on the historical contours of the first decades of jazz, Goodman’s Carnegie Hall concert represented the apex of jazz music’s acceptance as the most popular form of American musical expression. In addition, Goodman’s concert coincided with the resurgence of the record industry, hit hard by the Great Depression. By the late 1930s, millions of Americans purchased swing records and tuned into jazz radio programs, including Goodman’s own show, which averaged two million listeners during that period.
And yet, only forty years separated this major popular triumph from the very origins of jazz music. Between 1900 and 1945, American musical culture changed dramatically; new sounds via new technologies came to define the national experience. At the same time, there were massive demographic shifts as black southerners moved to the Midwest and North, and urban culture eclipsed rural life as the norm. America in 1900 was mainly a rural and disconnected nation, defined by regional identities where cultural forms were transmitted through live performances. By the end of World War II, however, a definable national musical culture had emerged, as radio came to link Americans across time and space. Regional cultures blurred as a national culture emerged via radio transmissions, motion picture releases, and phonograph records. The turbulent decade of the 1920s sat at the center of this musical and cultural transformation as American life underwent dramatic changes in the first decades of the 20th century.
In the post-1945 period, jazz moved rapidly from one major avant-garde revolution (the birth of bebop) to another (the emergence of free jazz) while developing a profusion of subgenres (hard bop, progressive, modal, Third Stream, soul jazz) and a new idiomatic persona (cool or hip) that originated as a form of African American resistance but soon became a signature of transgression and authenticity across the modern arts and culture. Jazz’s long-standing affiliation with African American urban life and culture intensified through its central role in the Black Arts Movement of the 1960s. By the 1970s, jazz, now fully eclipsed in popular culture by rock ’n’ roll, turned to electric instruments and fractured into a multitude of hyphenated styles (jazz-funk, jazz-rock, fusion, Latin jazz). The move away from acoustic performance and traditional codes of blues and swing musicianship generated a neoclassical reaction in the 1980s that coincided with a mission to establish an orthodox jazz canon and honor the music’s history in elite cultural institutions. Post-1980s jazz has been characterized by tension between tradition and innovation, earnest preservation and intrepid exploration, Americanism and internationalism.
David S. Tanenhaus
Juvenile justice is a technical term that refers to the specific area of law and affiliated institutions, most notably the juvenile court, with jurisdiction over the cases of minors who are accused of being miscreants. Although the idea that the law should treat minors differently from adults predates the American Revolution, juvenile justice itself is a Progressive Era invention. Its institutional legitimacy rests on the power and responsibility of the state to act as a parent (parens patriae) on behalf of those who cannot care for themselves. Since the establishment of the world’s first juvenile court in Chicago in 1899, this American idea of creating separate justice systems for juveniles has spread across the nation and much of the world. For more than a century, American states have used their juvenile justice systems to respond to youth crime and delinquency. Since the 1960s, the US Supreme Court has periodically considered whether juvenile courts must provide the same constitutional due process safeguards as adult criminal courts and whether juveniles prosecuted in the criminal justice system can receive the same sentences as adults, such as the death penalty or life without the possibility of parole.
A. K. Sandoval-Strausz
“Latino urbanism” describes a culturally specific set of spatial forms and practices created by people of Hispanic origin. It includes many different aspects of those forms and practices, including town planning; domestic, religious, and civic architecture; the adaptation of existing residential, commercial, and other structures; and the everyday use of spaces such as yards, sidewalks, storefronts, streets, and parks.
Latino urbanism has developed over both time and space. It is the evolving product of half a millennium of colonization, settlement, international and domestic migration, and globalization. It has spanned a wide geographic range, beginning in the southern half of North America and gradually expanding to much of the hemisphere.
There have been many variations on Latino urbanism, but most include certain key features: shared central places where people show their sense of community, a walking culture that encourages face-to-face interaction with neighbors, and a sense that sociability should take place as much in the public realm as in the privacy of the home. More recently, planners and architects have realized that Latino urbanism offers solutions to problems such as sprawl, social isolation, and environmental unsustainability.
The term “urbanism” connotes city spaces, and Latino urbanism is most concentrated and most apparent at the center of metropolitan areas. At the same time, it has also been manifested in a wide variety of places and at different scales, from small religious altars in private homes; to Spanish-dominant commercial streetscapes in Latino neighborhoods; and ultimately to settlement patterns that reach from the densely packed centers of cities to the diversifying suburbs that surround them, out to the agricultural hinterlands at their far peripheries—and across borders to big cities and small pueblos elsewhere in the Americas.
Emily K. Hobson
Since World War II, the United States has witnessed major changes in lesbian, gay, bisexual, transgender, and queer (LGBTQ) politics. Indeed, because the history of LGBTQ activism is almost entirely concentrated in the postwar years, the LGBTQ movement is typically said to have achieved rapid change in a short period of time. But if popular accounts characterize LGBTQ history as a straightforward narrative of progress, the reality is more complex. Postwar LGBTQ politics has been both diverse and divided, marked by differences of identity and ideology. At the same time, LGBTQ politics has been embedded in the contexts of state-building and the Cold War, the New Left and the New Right, the growth of neoliberalism, and the HIV/AIDS epidemic. As the field of LGBTQ history has grown, scholars have increasingly been able to place analyses of state regulation into conversation with community-based histories. Moving between such outside and inside perspectives helps to reveal how multiple modes of LGBTQ politics have shaped one another and how they have been interwoven with broader social change. Looking from the outside, it is apparent that LGBTQ politics has been catalyzed by exclusions from citizenship; from the inside, we can see that activists have responded to such exclusions in different ways, including both by seeking social inclusion and by rejecting assimilationist terms. Court rulings and the administration of law have run alongside the debates inside activist communities. Competing visions for LGBTQ politics have centered around both leftist and liberal agendas, as well as viewpoints shaped by race, gender, gender expression, and class.
Housing in America has long stood as a symbol of the nation’s political values and a measure of its economic health. In the 18th century, a farmhouse represented Thomas Jefferson’s ideal of a nation of independent property owners; in the mid-20th century, the suburban house was seen as an emblem of an expanding middle class. Alongside those well-known symbols were a host of other housing forms—tenements, slave quarters, row houses, French apartments, loft condos, and public housing towers—that revealed much about American social order and the material conditions of life for many people.
Since the 19th century, housing markets have been fundamental forces driving the nation’s economy and a major focus of government policies. Home construction has provided jobs for skilled and unskilled laborers. Land speculation, housing development, and the home mortgage industry have generated billions of dollars in investment capital, while ups and downs in housing markets have been considered signals of major changes in the economy. Since the New Deal of the 1930s, the federal government has buttressed the home construction industry and offered economic incentives for home buyers, giving the United States the highest home ownership rate in the world. The housing market crash of 2008 slashed property values and sparked a rapid increase in home foreclosures, especially in places like Southern California and the suburbs of the Northeast, where housing prices had ballooned over the previous two decades. The real estate crisis led to government efforts to prop up the mortgage banking industry and to assist struggling homeowners. The crisis led, as well, to a drop in rates of home ownership, an increase in rental housing, and a growth in homelessness.
Home ownership remains a goal for many Americans and an ideal long associated with the American dream. The owner-occupied home—whether single-family or multifamily dwelling—is typically the largest investment made by an American family. Through much of the 18th and 19th centuries, housing designs varied from region to region. In the mid-20th century, mass production techniques and national building codes tended to standardize design, especially in new suburban housing. In the 18th century, the family home was a site of waged and unwaged work; it was the center of a farm, plantation, or craftsman’s workshop. Two and a half centuries later, a house was a consumer good: its size, location, and decor marked the family’s status and wealth.
Urban renewal refers to an interlocking set of national and local policies, programs, and projects, implemented in the vast majority of American cities between 1949 and 1973. These typically entailed major redevelopment of existing urban areas with a view to the modernization of housing, highway infrastructure, commercial and business districts, as well as other large-scale constructions. Reformers from the Progressive Era through the Great Society strove to ameliorate the conditions of poverty and inequality in American cities by focusing primarily on physical transformation of the urban built environment. Citing antecedents such as the reconstruction of Second Empire Paris, imported via the City Beautiful movement, and then updated with midcentury modernism, US urban planners envisioned a radical reorganization of city life. In practice, federal programs and local public authorities targeted the eradication of areas deemed slums or blighted—often as much to socially sanitize neighborhoods inhabited by racial minorities and other marginalized groups as to address deteriorating physical conditions. And while federal funding became available for public works projects in declining central cities under the auspices of improving living conditions for the poor—including providing public housing—urban renewal programs consistently destroyed more affordable housing than they created, over more than three decades. By the end of the 1960s, urban residents and policymakers across the political spectrum concluded that such programs were usually doing more harm than good, and most ended during the Nixon administration. Yet large-scale reminders of urban renewal can still be found in most large US communities, whether in the form of mid-20th-century public housing blocks, transportation projects, stadiums, convention centers, university and hospital expansions, or a variety of public-private redevelopment initiatives. 
But perhaps the most fundamental legacies of all were the institutionalization of the comprehensive zoning and master planning process in cities nationwide, on the one hand, and the countervailing mobilization of defensively oriented (NIMBY) neighborhood politics, on the other.
Nicolas G. Rosenthal
An important relationship has existed between Native Americans and cities from pre-Columbian times to the early 21st century. Long before Europeans arrived in the Americas, indigenous peoples developed societies characterized by dense populations, large-scale agriculture, monumental architecture, and complex social hierarchies. Following European and American conquest and colonization, Native Americans played a crucial role in the development of towns and cities throughout North America, often on the site of former indigenous settlements.
Beginning in the early 20th century, Native Americans began migrating from reservations to U.S. cities in large numbers and formed new intertribal communities. By 1970, the majority of the Native American population lived in cities and the numbers of urban American Indians have been growing ever since. Indian Country in the early 21st century continues to be influenced by the complex and evolving ties between Native Americans and cities.
Wendy L. Wall
The New Deal generally refers to a set of domestic policies implemented by the administration of Franklin Delano Roosevelt in response to the crisis of the Great Depression. Propelled by that economic cataclysm, Roosevelt and his New Dealers pushed through legislation that regulated the banking and securities industries, provided relief for the unemployed, aided farmers, electrified rural areas, promoted conservation, built national infrastructure, regulated wages and hours, and bolstered the power of unions. The Tennessee Valley Authority prevented floods and brought electricity and economic progress to seven states in one of the most impoverished parts of the nation. The Works Progress Administration offered jobs to millions of unemployed Americans and launched an unprecedented federal venture into the arena of culture. By providing social insurance to the elderly and unemployed, the Social Security Act laid the foundation for the U.S. welfare state.
The benefits of the New Deal were not equitably distributed. Many New Deal programs—farm subsidies, work relief projects, social insurance, and labor protection programs—discriminated against racial minorities and women, while benefiting white men disproportionately. Nevertheless, women achieved symbolic breakthroughs, and African Americans benefited more from Roosevelt’s policies than they had from those of any administration since Abraham Lincoln’s. The New Deal did not end the Depression—only World War II did that—but it did spur economic recovery. It also helped to make American capitalism less volatile by extending federal regulation into new areas of the economy.
Although the New Deal most often refers to policies and programs put in place between 1933 and 1938, some scholars have used the term more expansively to encompass later domestic legislation or U.S. actions abroad that seemed animated by the same values and impulses—above all, a desire to make individuals more secure and a belief in institutional solutions to long-standing problems. In order to pass his legislative agenda, Roosevelt drew many Catholic and Jewish immigrants, industrial workers, and African Americans into the Democratic Party. Together with white Southerners, these groups formed what became known as the “New Deal coalition.” This unlikely political alliance endured long after Roosevelt’s death, supporting the Democratic Party and a “liberal” agenda for nearly half a century. When the coalition finally cracked in 1980, historians looked back on this extended epoch as reflecting a “New Deal order.”