Shelley Sang-Hee Lee
Although the 1992 Los Angeles riots have been described as a “race riot” sparked by the acquittals of a group of mostly white police officers charged with using excessive force in the beating of black motorist Rodney King, the widespread targeting and destruction of Asian-owned (mainly Korean) property in and around South Central Los Angeles stands out as one of the most striking aspects of the uprising. For all the commentary generated about the state of black-white relations, African American youths, and the decline of America’s inner cities, the riots also gave many Americans their first awareness of the presence of a Korean immigrant population in Southern California, a large number of Korean shop owners, and the existence of what was commonly framed as the “black-Korean conflict.” For Korean Americans, and Asian Americans more generally, the Los Angeles riots represented a shattered “American dream” and brought into focus their tenuous hold on economic mobility and social inclusion in a society fraught with racial and ethnic tension. The riots furthermore marked a turning point that placed Asian immigrants and Asian Americans at the center of new conversations about social relations in a multiracial America, the place of new immigrants, and the responsibilities of relatively privileged minorities toward the less privileged.
John D. Fairfield
The City Beautiful movement arose in the 1890s in response to the accumulating dirt and disorder in industrial cities, which threatened economic efficiency and social peace. City Beautiful advocates believed that better sanitation, improved circulation of traffic, monumental civic centers, parks, parkways, public spaces, civic art, and the reduction of outdoor advertising would make cities throughout the United States more profitable and harmonious. Engaging architects and planners, businessmen and professionals, and social reformers and journalists, the City Beautiful movement expressed a boosterish desire for landscape beauty and civic grandeur, but also raised aspirations for a more humane and functional city. “Mean streets make mean people,” wrote the movement’s publicist and leading theorist, Charles Mulford Robinson, encapsulating the belief in positive environmentalism that drove the movement. Combining the parks and boulevards of landscape architect Frederick Law Olmsted with the neoclassical architecture of Daniel H. Burnham’s White City at Chicago’s World’s Columbian Exposition in 1893, the City Beautiful movement also encouraged a view of the metropolis as a delicate organism that could be improved by bold, comprehensive planning. Two organizations, the American Park and Outdoor Art Association (founded in 1897) and the American League for Civic Improvement (founded in 1900), provided the movement with a national presence. But the movement also depended on the work of civic-minded women and men in nearly 2,500 municipal improvement associations scattered across the nation. Reaching its zenith in Burnham’s remaking of Washington, D.C., and his coauthored Plan of Chicago (1909), the movement slowly declined in favor of the “City Efficient” and a more technocratic city-planning profession.
Aside from a legacy of still-treasured urban spaces and structures, the City Beautiful movement contributed to a range of urban reforms, from civic education and municipal housekeeping to city planning and regionalism.
Contagious diseases have long posed a public health challenge for cities, going back to the ancient world. Diseases traveled over trade routes from one city to another. Cities were also crowded and often dirty, ideal conditions for the transmission of infectious disease. The Europeans who settled North America quickly established cities, especially seaports, and contagious diseases soon followed. By the late 17th century, ports like Boston, New York, and Philadelphia experienced occasional epidemics, especially smallpox and yellow fever, usually introduced from incoming ships. Public health officials tried to prevent contagious diseases from entering the ports, most often by establishing a quarantine. These quarantines were occasionally effective, but more often the disease escaped into the cities. By the 18th century, city officials recognized an association between dirty cities and epidemic diseases. The appearance of a contagious disease usually occasioned a concerted effort to clean streets and remove garbage. These efforts by the early 19th century gave rise to sanitary reform to prevent infectious diseases. Sanitary reform went beyond cleaning streets and removing garbage, to ensuring clean water supplies and effective sewage removal. By the end of the century, sanitary reform had done much to clean the cities and reduce the incidence of contagious disease. The 20th century added two new tools to the public health arsenal: vaccination and antibiotics. Vaccination was first used against smallpox; scientists subsequently developed vaccines against numerous other infectious viral diseases and reduced their incidence substantially. Finally, the development of antibiotics against bacterial infections in the mid-20th century enabled physicians to cure infected individuals. Contagious disease remains a problem—witness AIDS—and public health authorities still rely on quarantine, sanitary reform, vaccination, and antibiotics to keep urban populations healthy.
Chloe E. Taft
The process of urban deindustrialization has been long and uneven. Even the terms “deindustrial” and “postindustrial” are contested; most cities continue to host manufacturing on some scale. After World War II, however, cities that depended on manufacturing for their lifeblood increasingly diversified their economies in the face of larger global, political, and demographic transformations. Manufacturing centers in New England, the Mid-Atlantic, and the Midwest were soon identified as belonging to “the American Rust Belt.” Steel manufacturers, automakers, and other industrial behemoths that were once mainstays of city life closed their doors as factories and workers followed economic and social incentives to leave urban cores for the suburbs, the South, or foreign countries. Remaining industrial production became increasingly automated, resulting in significant declines in the number of factory jobs. Metropolitan officials faced with declining populations and tax bases responded by adapting their assets—in terms of workforce, location, or culture—to new economies, including warehousing and distribution, finance, health care, tourism, leisure industries like casinos, and privatized enterprises such as prisons. Faced with declining federal funding for renewal, they focused on leveraging private investment for redevelopment. Deindustrializing cities marketed themselves as destinations with convention centers, stadiums, and festival marketplaces, seeking to lure visitors and a “creative class” of new residents. While some postindustrial cities became success stories of reinvention, others struggled. They entertained options to “rightsize” by shrinking their municipal footprints, adapted vacant lots for urban agriculture, or attracted voyeurs to gaze at their industrial ruins.
Whether industrial cities faced a slow transformation or the shock of multiple factory closures within a few years, the impact of these economic shifts and urban planning interventions both amplified old inequalities and created new ones.
The use of illicit drugs in US cities led to the development of important subcultures with shared practices, codes, discourses, and values. From the 19th century onward, American city dwellers have indulged in opiates, cocaine, amphetamines, cannabis, lysergic acid diethylamide (LSD), crack, and 3,4-methylenedioxymethamphetamine (also known as MDMA or ecstasy). The population density of metropolitan America contributed to the spread of substance use and the rise of communities that centered their lives on drug consumption. In the history of urban drug use, opiates have outlasted all the other drugs and have naturally attracted the bulk of scholarly attention.
The nature and identity of these illicit subcultures usually depended on the pharmacology of the drugs and the setting in which they were used. Addictive substances like heroin and amphetamines certainly led to the rise of crime in certain urban areas, but by the same token many urban Americans managed to integrate their addiction into their everyday lives. The more complex pharmacology of psychedelic drugs like LSD in turn gave birth to rich subcultures that resist easy classifications. Most drugs began their careers as medical marvels that were accepted as the product of modernity and often used by the middle class or medical practitioners. Race, age, and class prejudice, and the association of drugs with visible subcultures perceived to pose a threat to the moral fabric of society can partly explain their subsequent bans.
Ethnicity is a concept employed to understand the social, cultural, and political processes whereby immigrants and their children cease to be “foreign” and yet retain practices and networks that connect them, at least imaginatively, with places of origin. From an early juncture in American history, ethnic neighborhoods were an important part of such processes. Magnets for new arrivals, city neighborhoods both emerged from and reinforced connections among people of common origins. Among the first notable immigrant neighborhoods in American cities were those composed of people from the German-speaking states of Europe. In the second half of the 19th century, American cities grew rapidly and millions of immigrants arrived in the country from a wider array of origins; neighborhoods such as New York’s Jewish Lower East Side and San Francisco’s Chinatown supported dense and institutionally complex ethnic networks. In the middle decades of the 20th century, immigration waned as a result of legislative restriction, economic depression, and war. Many former immigrant neighborhoods emptied of residents as cities divided along racial lines and “white ethnics” dispersed to the suburbs. However, some ethnic enclaves endured, while others emerged after the resumption of mass immigration in the 1960s. By the turn of the 21st century ethnic neighborhoods were once again an important facet of American urban life, although they took new forms within the reconfigured geography and economy of a suburbanized nation.
Cindy R. Lobel
Over the course of the 19th century, American cities developed from small seaports and trading posts to large metropolises. Not surprisingly, foodways and other areas of daily life changed accordingly. In 1800, the dietary habits of urban Americans were similar to those of the colonial period. Food provisioning was very local. Farmers, hunters, fishermen, and dairymen from a few miles away brought food by rowboat, ferryboat, and horse cart to centralized public markets within established cities. Dietary options were seasonal as well as regional. Few public dining options existed outside of taverns, which offered lodging as well as food. Most Americans, even in urban areas, ate their meals at home, which in many cases were attached to their workshops, countinghouses, and offices.
These patterns changed significantly over the course of the 19th century, thanks largely to demographic changes and technological developments. By the turn of the 20th century, urban Americans relied on a food-supply system that was highly centralized and in the throes of industrialization. Cities developed complex restaurant sectors, and majority immigrant populations dramatically shaped and reshaped cosmopolitan food cultures. Furthermore, with growing populations, lax regulation, and corrupt political practices in many cities, issues arose periodically concerning the safety of the food supply. In sum, the roots of today’s urban food systems were laid down over the course of the 19th century.
Changing foodways, the consumption and production of food, access to food, and debates over food shaped the nature of American cities in the 20th century. As American cities transformed from centers of industrialization at the start of the century to post-industrial societies at the end of the 20th century, food cultures in urban America shifted in response to the ever-changing urban environment. Cities remained centers of food culture, diversity, and food reform despite these shifts.
Growing populations and waves of immigration changed the nature of food cultures throughout the United States in the 20th century. These changes were significant, all contributing to an evolving sense of American food culture. For urban denizens, however, food choice and availability were dictated and shaped by a variety of powerful social factors, including class, race, ethnicity, gender, and laboring status. While cities possessed an abundance of food and a variety of locations in which to consume it, fresh food often remained difficult for the urban poor to obtain as the 20th century ended.
As markets expanded from 1900 to 1950, regional geography became a less important factor in determining what types of foods were available. In the second half of the 20th century, even global geography became less important to food choices. Citrus fruit from the West Coast was readily available in northeastern markets near the start of the century, and off-season fruits and vegetables from South America filled shelves in grocery stores by the end of the 20th century. Urban Americans became further disconnected from their food sources, but this dislocation spurred counter-movements that embraced ideas of local, seasonal foods and a rethinking of the city’s relationship with its food sources.
While American gambling has a historical association with the lawlessness of the frontier and with the wasteful leisure practices of Southern planters, it was in large cities where American gambling first flourished as a form of mass leisure, and as a commercial enterprise of significant scale. In the urban areas of the Mid-Atlantic, the Northeast, and the upper Midwest, for the better part of two centuries the gambling economy was deeply intertwined with municipal politics and governance, the practices of betting were a prominent feature of social life, and controversies over the presence of gambling, both legal and illegal, were at the center of public debate. In New York and Chicago in particular, but also in Cleveland, Pittsburgh, Detroit, Baltimore, and Philadelphia, gambling channeled money to municipal police forces and sustained machine politics. In the eyes of reformers, gambling corrupted governance and corroded social and economic interactions. Big city gambling has changed over time, often in a manner reflecting important historical processes and transformations in economics, politics, and demographics. Yet irrespective of such change, from the onset of Northern urbanization during the 19th century, through much of the 20th century, gambling held steady as a central feature of city life and politics. From the poolrooms where recently arrived Irish New Yorkers bet on horseracing after the Civil War, to the corner stores where black and Puerto Rican New Yorkers bet on the numbers game in the 1960s, the gambling activity that covered the urban landscape produced argument and controversy, particularly with respect to drawing the line between crime and leisure, and over the question of where and to what ends the money of the gambling public should be directed.
During the 20th century, the black population of the United States transitioned from largely rural to mostly urban. In the early 1900s the majority of African Americans lived in rural, agricultural areas. Depictions of black people in popular culture often focused on pastoral settings, like the cotton fields of the rural South. But a dramatic shift occurred during the Great Migrations (1914–1930 and 1941–1970) when millions of rural black southerners relocated to US cities.
Motivated by economic opportunities in urban industrial areas during World Wars I and II, African Americans opted to move to southern cities as well as to urban centers in the Northeast, Midwest, and West Coast. New communities emerged that contained black social and cultural institutions, and musical and literary expressions flourished. Black migrants who left the South exercised voting rights, sending the first black representatives to Congress in the 20th century. Migrants often referred to themselves as “New Negroes,” pointing to their social, political, and cultural achievements, as well as their use of armed self-defense during violent racial confrontations, as evidence of their new stance on race.
The Immigration Act of 1924 was in large part the result of a deep political and cultural divide in America between heavily immigrant cities and far less diverse small towns and rural areas. The 1924 legislation, together with growing residential segregation, midcentury federal urban policy, and postwar suburbanization, undermined scores of ethnic enclaves in American cities between 1925 and the 1960s. The deportation of Mexicans and their American children during the Great Depression, the incarceration of West Coast Japanese Americans during World War II, and the wartime and postwar shift of so many jobs to suburban and Sunbelt areas also reshaped many US cities in these years. The Immigration Act of 1965, which enabled the immigration of large numbers of people from Asia, Latin America, and, eventually, Africa, helped to revitalize many depressed urban areas and inner-ring suburbs. In cities and suburbs across the country, the response to the new immigration since 1965 has ranged from welcoming to hostile. The national debate over immigration in the early 21st century reflects both familiar and newer cultural, linguistic, religious, racial, and regional rifts. However, urban areas with a history of immigrant incorporation remain the most politically supportive of immigrants, just as they were a century ago.
Between the 1790s and the 1990s, the Irish American population grew from some 500,000 to nearly 40 million. Part of this growth was due to immigration, especially in the years of the Great Irish Famine, though significant emigration from Ireland both preceded and followed the famine decade of 1846–1855. For much of this 200-year period, Irish-born men and women and their descendants were heavily concentrated in working-class occupations and urban communities. Especially in the years around the opening of the 20th century, Irish Catholic immigrants and their descendants put a distinctive stamp on both the American labor movement and urban working-class culture and politics as a whole. Their outsized influence diminished somewhat over the course of the 20th century, but the American Irish continued to occupy key leadership positions in the U.S. labor movement, the Democratic Party, and the American Catholic Church, even as the working-class members or constituents of these institutions became increasingly ethnically diverse. The experience of Irish American working people thus constitutes an important dimension of a larger story—that of the American working class as a whole.
In January 1938, Benny Goodman took command of Carnegie Hall on a blustery New York City evening and for two hours his band tore through the history of jazz in a performance that came to define the entire Swing Era. Goodman played Carnegie Hall at the top of his jazz game leading his crack band—including Gene Krupa on drums and Harry James on trumpet—through new, original arrangements by Fletcher Henderson. Compounding the historic nature of the highly publicized jazz concert, Goodman welcomed onto the stage members of Duke Ellington’s band to join in on what would be the first major jazz performance by an integrated band. With its spirit of inclusion as well as its emphasis on the historical contours of the first decades of jazz, Goodman’s Carnegie Hall concert represented the apex of jazz music’s acceptance as the most popular form of American musical expression. In addition, Goodman’s concert coincided with the resurgence of the record industry, hit hard by the Great Depression. By the late 1930s, millions of Americans purchased swing records and tuned into jazz radio programs, including Goodman’s own show, which averaged two million listeners during that period.
And yet, only forty years separated this major popular triumph and the very origins of jazz music. Between 1900 and 1945, American musical culture changed dramatically; new sounds via new technologies came to define the national experience. At the same time, there were massive demographic shifts as black southerners moved to the Midwest and North, and urban culture eclipsed rural life as the norm. America in 1900 was mainly a rural and disconnected nation, defined by regional identities where cultural forms were transmitted through live performances. By the end of World War II, however, a definable national musical culture had emerged, as radio came to link Americans across time and space. Regional cultures blurred as a national culture emerged via radio transmissions, motion picture releases, and phonograph records. The turbulent decade of the 1920s sat at the center of this musical and cultural transformation as American life underwent dramatic changes in the first decades of the 20th century.
In the post-1945 period, jazz moved rapidly from one major avant-garde revolution (the birth of bebop) to another (the emergence of free jazz) while developing a profusion of subgenres (hard bop, progressive, modal, Third Stream, soul jazz) and a new idiomatic persona (cool or hip) that originated as a form of African American resistance but soon became a signature of transgression and authenticity across the modern arts and culture. Jazz’s long-standing affiliation with African American urban life and culture intensified through its central role in the Black Arts Movement of the 1960s. By the 1970s, jazz, now fully eclipsed in popular culture by rock ’n’ roll, turned to electric instruments and fractured into a multitude of hyphenated styles (jazz-funk, jazz-rock, fusion, Latin jazz). The move away from acoustic performance and traditional codes of blues and swing musicianship generated a neoclassical reaction in the 1980s that coincided with a mission to establish an orthodox jazz canon and honor the music’s history in elite cultural institutions. Post-1980s jazz has been characterized by tension between tradition and innovation, earnest preservation and intrepid exploration, Americanism and internationalism.
Brian D. Behnken
African Americans and Latino/as have had a long history of social interactions that have been strongly affected by the broader sense of race in the United States. Race in the United States has typically been constructed as a binary of black and white. Latino/as do not fit neatly into this binary. Some Latino/as have argued for a white racial identity, which has at times frustrated their relationships with black people. For African Americans and Latino/as, segregation often presented barriers to good working relationships. The two groups were often segregated from each other, making them mutually invisible. This invisibility did not make for good relations.
Latino/as and blacks found new avenues for improving their relationships during the civil rights era, from the 1940s to the 1970s. A number of civil rights protests generated coalitions that brought the two communities together in concerted campaigns. This was especially the case for militant groups such as the Black Panther Party, the Mexican American Brown Berets, and the Puerto Rican Young Lords, as well as in the Poor People’s Campaign. Interactions among African Americans and Mexican Americans, Puerto Ricans, and Cubans/Cuban Americans illustrate the deep and often convoluted sense of race consciousness in American history, especially during the time of the civil rights movement.
Emily K. Hobson
Since World War II, the United States has witnessed major changes in lesbian, gay, bisexual, transgender, and queer (LGBTQ) politics. Indeed, because the history of LGBTQ activism is almost entirely concentrated in the postwar years, the LGBTQ movement is typically said to have achieved rapid change in a short period of time. But if popular accounts characterize LGBTQ history as a straightforward narrative of progress, the reality is more complex. Postwar LGBTQ politics has been both diverse and divided, marked by differences of identity and ideology. At the same time, LGBTQ politics has been embedded in the contexts of state-building and the Cold War, the New Left and the New Right, the growth of neoliberalism, and the HIV/AIDS epidemic. As the field of LGBTQ history has grown, scholars have increasingly been able to place analyses of state regulation into conversation with community-based histories. Moving between such outside and inside perspectives helps to reveal how multiple modes of LGBTQ politics have shaped one another and how they have been interwoven with broader social change. Looking from the outside, it is apparent that LGBTQ politics has been catalyzed by exclusions from citizenship; from the inside, we can see that activists have responded to such exclusions in different ways, including both by seeking social inclusion and by rejecting assimilationist terms. Court rulings and the administration of law have run alongside the debates inside activist communities. Competing visions for LGBTQ politics have centered around both leftist and liberal agendas, as well as viewpoints shaped by race, gender, gender expression, and class.
Wendy L. Wall
The New Deal generally refers to a set of domestic policies implemented by the administration of Franklin Delano Roosevelt in response to the crisis of the Great Depression. Propelled by that economic cataclysm, Roosevelt and his New Dealers pushed through legislation that regulated the banking and securities industries, provided relief for the unemployed, aided farmers, electrified rural areas, promoted conservation, built national infrastructure, regulated wages and hours, and bolstered the power of unions. The Tennessee Valley Authority prevented floods and brought electricity and economic progress to seven states in one of the most impoverished parts of the nation. The Works Progress Administration offered jobs to millions of unemployed Americans and launched an unprecedented federal venture into the arena of culture. By providing social insurance to the elderly and unemployed, the Social Security Act laid the foundation for the U.S. welfare state.
The benefits of the New Deal were not equitably distributed. Many New Deal programs—farm subsidies, work relief projects, social insurance, and labor protection programs—discriminated against racial minorities and women, while disproportionately benefiting white men. Nevertheless, women achieved symbolic breakthroughs, and African Americans benefited more from Roosevelt’s policies than they had from any administration since Abraham Lincoln’s. The New Deal did not end the Depression—only World War II did that—but it did spur economic recovery. It also helped to make American capitalism less volatile by extending federal regulation into new areas of the economy.
Although the New Deal most often refers to policies and programs put in place between 1933 and 1938, some scholars have used the term more expansively to encompass later domestic legislation or U.S. actions abroad that seemed animated by the same values and impulses—above all, a desire to make individuals more secure and a belief in institutional solutions to long-standing problems. In order to pass his legislative agenda, Roosevelt drew many Catholic and Jewish immigrants, industrial workers, and African Americans into the Democratic Party. Together with white Southerners, these groups formed what became known as the “New Deal coalition.” This unlikely political alliance endured long after Roosevelt’s death, supporting the Democratic Party and a “liberal” agenda for nearly half a century. When the coalition finally cracked in 1980, historians looked back on this extended epoch as reflecting a “New Deal order.”
Steven A. Riess
Professional sports teams are athletic organizations comprising talented, expert players hired by club owners whose revenues originally derived from admission fees charged to spectators seeing games in enclosed ballparks or indoor arenas. Teams are usually members of a league that schedules a championship season, although independent teams also can arrange their own contests. The first professional baseball teams emerged in the East and Midwest in the 1860s, most notably the all-salaried undefeated Cincinnati Red Stockings of 1869. The first league was the haphazardly organized National Association of Professional Base Ball Players (1871), supplanted five years later by the more profit-oriented National League (NL) that set up strict rules for franchise locations, financing, and management–employee relations (including a reserve clause in 1879, which bound players to their original employer), and barred African Americans after 1884. Once the NL prospered, rival major leagues also sprang up, notably the American Association in 1882 and the American League in 1901.
Major League Baseball (MLB) became a model for the professionalization of football, basketball, and hockey, which all had short-lived professional leagues around the turn of the century. The National Football League and the National Hockey League of the 1920s were underfinanced regional operations, and their teams often went out of business, while the National Basketball Association was not even organized until 1949.
Professional team sports gained considerable popularity after World War II. The leagues dealt with such problems as franchise relocations and nationwide expansion, conflicts with interlopers, limiting player salaries, and racial integration. The NFL became the most successful operation by securing rich national television contracts, supplanting baseball as the national pastime in the 1970s. All these leagues became lucrative investments. With the rise of “free agency,” professional team athletes became extremely well paid, currently averaging more than $2 million a year.
Maureen A. Flanagan
The decades from the 1890s into the 1920s produced reform movements in the United States that resulted in significant changes to the country’s social, political, cultural, and economic institutions. The impulse for reform emanated from a pervasive sense that the country’s democratic promise was failing. Political corruption seemed endemic at all levels of government. An unregulated capitalist industrial economy exploited workers and threatened to create a serious class divide, especially as the legal system protected the rights of business over labor. Mass urbanization was shifting the country from a rural, agricultural society to an urban, industrial one characterized by poverty, disease, crime, and cultural clash. Rapid technological advancements brought new, and often frightening, changes into daily life that left many people feeling that they had little control over their lives. Movements for socialism, woman suffrage, and rights for African Americans, immigrants, and workers belied the rhetoric of the United States as a just and equal democratic society for all its members.
Responding to the challenges presented by these problems, and fearful that without substantial change the country might experience class upheaval, groups of Americans proposed undertaking significant reforms. Underlying all proposed reforms was a desire to bring more justice and equality into a society that seemed increasingly to lack these ideals. Yet there was no agreement among these groups about the exact threat that confronted the nation, the means to resolve problems, or how to implement reforms. Despite this lack of agreement, all so-called Progressive reformers were modernizers. They sought to make the country’s democratic promise a reality by confronting its flaws and seeking solutions. All Progressivisms were seeking a via media, a middle way between relying on older ideas of 19th-century liberal capitalism and the more radical proposals to reform society through either social democracy or socialism. Whatever the differences among Progressives, the varieties of Progressivism put forth, and the movement’s successes and failures, this reform era raised into national discourse debates over the nature and meaning of democracy, how and for whom a democratic society should work, and what it meant to be a forward-looking society. It also led to the implementation of an activist state.
Commercialized sexuality became a prominent feature of American urban settings in the early 19th century when young men migrated far from the watchful eyes of family as soldiers and laborers. Concentrated in large populations, and unable to afford the comforts of marriage, these men constituted a reliable pool of customers for women who sold sexual access to their bodies. These women turned to prostitution on a casual or steady basis as a survival strategy in a sex-segregated labor market that paid women perilously low wages, or in response to family disruptions such as paternal or spousal abandonment. Prostitution could be profitable, and it provided some women with a path towards economic independence, although it brought risks of venereal disease, addiction, violence, harassment by law enforcement, and unintended pregnancy. By mid-century most American cities tolerated red-light districts where brothels thrived as part of the urban sporting culture. Fears that white women were being coerced into prostitution led to the “white slavery” scare of the 1910s, spurring a concerted attack on brothels by progressive reformers. These reformers used the emergency of World War I to close public brothels, pushing America’s sex markets into clandestine spaces and strengthening pimps’ control over women’s sexual labor. World War II raised concerns about soldiers’ venereal health that prompted the US military to experiment with different schemes for regulating prostitution that had been developed earlier during the Spanish–American War, as well as in the Philippines and Puerto Rico. After the war, the introduction of antibiotics and the celebration of marriage and family nudged prostitution into the margins of society, where women who sold sex were seen as psychologically deviant, yet men who purchased sex were thought to be sexually liberated.
The dawning of second-wave feminism gave birth to the sex workers’ rights movement and a new critique of the criminalization of prostitution. Nevertheless, attitudes about prostitution continue to divide activists, and sex workers still bear the brunt of criminalization.