
Article

The civil rights movement in the urban South transformed the political, economic, and cultural landscape of post–World War II America. Between 1955 and 1968, African Americans and their white allies relied on nonviolent direct action, political lobbying, litigation, and economic boycotts to dismantle the Jim Crow system. Many of the movement’s most decisive political battles occurred in the cities of Montgomery and Birmingham, Alabama; Nashville and Memphis, Tennessee; Greensboro and Durham, North Carolina; and Atlanta, Georgia. In these and other urban centers, civil rights activists launched full-throttle campaigns against white supremacy, economic exploitation, and state-sanctioned violence against African Americans. Their fight for racial justice coincided with monumental changes in the urban South as the upsurge in federal spending in the region created unprecedented levels of economic prosperity in the newly forged “Sunbelt.” A dynamic and multifaceted movement that encompassed a wide range of political organizations and perspectives, the black freedom struggle proved successful in dismantling legal segregation. The passage of the Civil Rights Act of 1964 and the Voting Rights Act of 1965 expanded black southerners’ economic, political, and educational opportunities. And yet, many African Americans continued to struggle as they confronted not just the long-term effects of racial discrimination and exclusion but also the new challenges engendered by deindustrialization and urban renewal as well as entrenched patterns of racial segregation in the public-school system.

Article

America’s Civil War became part of a much larger international crisis as European powers, happy to see the experiment in self-government fail in America’s “Great Republic,” took advantage of the situation to reclaim former colonies in the Caribbean and establish a European monarchy in Mexico. Overseas, in addition to their formal diplomatic appeals to European governments, both sides also experimented with public diplomacy campaigns to influence public opinion. Confederate foreign policy sought to win recognition and aid from Europe by offering free trade in cotton and aligning its cause with that of the aristocratic, anti-democratic governing classes of Europe. The Union, instead, appealed to liberal, republican sentiment abroad by depicting the war as a trial of democratic government and embracing emancipation of the slaves. The Union victory led to the withdrawal of European empires from the New World: Spain from Santo Domingo, France from Mexico, Russia from Alaska, and Britain from Canada. The destruction of slavery in the United States, meanwhile, hastened its end in Puerto Rico, Cuba, and Brazil.

Article

American cities developed under relatively quiescent climatic conditions. A gradual rise in average global temperatures during the 19th and 20th centuries had a negligible impact on how urban Americans experienced the weather. Much more significant were the dramatic changes in urban form and social organization that mediated the relationship between routine weather fluctuations and the lives of city dwellers. Overcoming weather-related impediments to profit, comfort, and good health contributed to many aspects of urbanization, including population migration to Sunbelt locations, increased reliance on fossil fuels, and comprehensive re-engineering of urban hydrological systems. Other structural shifts such as sprawling development, intensification of the built environment, socioeconomic segregation, and the tight coupling of infrastructural networks were less directly responsive to weather conditions but nonetheless profoundly affected the magnitude and social distribution of weather-related risks. Although fatalities resulting from extreme meteorological events declined in the 20th century, the scale of urban disruption and property damage increased. In addition, social impacts became more concentrated among poorer Americans, including many people of color, as Hurricane Katrina tragically demonstrated in 2005. Through the 20th century, cities responded to weather hazards through improved forecasting and systematic planning for relief and recovery rather than alterations in metropolitan design. In recent decades, however, growing awareness and concern about climate change impacts have made volatile weather more central to urban planning.

Article

Clodagh Harrington

The Clinton scandals have settled in the annals of American political history in the context of the era’s recurrent presidential misbehavior. Viewed through a historical lens, the activities, investigation, and impeachment trial of the forty-second president are almost inevitably measured against the weight of Watergate and Iran-Contra. As a result, the actions and consequences of this high-profile moment in the late-20th-century political history of the United States arguably took on a weightier meaning than they might otherwise have had. If Watergate tested the U.S. constitutional system to its limits and Iran-Contra was arguably as grave, the Clinton affair was crisis-light by comparison. Originating with an investigation into a failed 1970s Arkansas land deal by Bill Clinton and his wife, the saga developed to include such meandering subplots as Filegate, Travelgate, Troopergate, the death of White House counsel Vince Foster, and, most infamously, the president’s affair with a White House intern. Unlike with Richard Nixon and Ronald Reagan, even Bill Clinton’s most ardent critics could not find a national security threat among the myriad scandals linked to his name. By the time that Justice Department appointee Robert Fiske was replaced as prosecutor by the infinitely more zealous Kenneth Starr, the case had become synonymous with the culture wars that permeated 1990s American society. As the Whitewater and related tentacles of the investigation failed to result in any meaningful negative impact on the president, it was his marital infidelities that came closest to unseating him. Although the Independent Counsel pursued him with vigor, his supporters remained loyal as his detractors spotted political opportunity in his lapses in judgment. Certain key factors made the Clinton scandal particular to its era. First, in an unprecedented development, the personal indiscretion aspect of the story broke via the Internet. In addition, had the Independent Counsel legislation not been renewed, prosecutor Fiske would likely have wrapped up his investigation in a timely fashion with no intention of pursuing an impeachment path. Finally, the relentless cable news cycle and increasingly febrile partisan atmosphere of the decade ensured that the nation remained as focused as it was divided on the topic.

Article

Communist activists took a strong interest in American trade unions from the 1920s through the 1950s and played an important role in shaping the nature of the American union movement. Initial communist trade union activism drew upon radical labor traditions that preceded the formation of the American Communist Party (CPUSA). Early communist trade unionists experimented with different types of structures to organize unorganized workers. They also struggled with international communist factionalism. Communist trade unionists were most effective during the Great Depression and World War II. In those years, communist activists helped build the Congress of Industrial Organizations (CIO) and bring industrial unionism to previously unorganized workers. Throughout the history of communist involvement in the US labor movement, international communist policy guided general organizing strategies. Shifts in international policy, such as the announcement of a Soviet non-aggression pact with Germany, proved politically difficult to navigate on the local level. Yet Left-led unions proved to be more democratically run and more focused on racial and gender equality than many unions without communist influence. Their leadership supported social justice and militant action. The Cold War years witnessed CIO purges of Left-led unions and federal investigations and arrests of communist trade unionists. Repression from both within and without the labor movement as well as the CPUSA’s own internal policy battles ultimately ended communist trade unionists’ widespread influence on American trade unions.

Article

James R. Barrett

The largest and most important revolutionary socialist organization in US history, the Communist Party USA was always a minority influence. It reached considerable size and influence, however, during the Great Depression and World War II years when it followed the more open line associated with the term “Popular Front.” In these years communists were much more flexible in their strategies and relations with other groups, though the party remained a hierarchical vanguard organization. It grew from a largely isolated sect dominated by unskilled and unemployed immigrant men in the 1920s to a socially diverse movement of nearly 100,000 based heavily on American-born men and women from the working and professional classes by the late 1930s and during World War II, exerting considerable influence in the labor movement and American cultural life. In these years, the Communist Party helped to build the industrial union movement, advanced the cause of African American civil rights, and laid the foundation for the postwar feminist movement. But the party was always prone to abrupt changes in line and vulnerable to attack as a sinister outside force because of its close adherence to Soviet policies and goals. Several factors contributed to its catastrophic decline in the 1950s: the increasingly antagonistic Cold War struggle between the Soviet Union and the United States; an unprecedented attack from employers and government at various levels—criminal cases and imprisonment, deportation, and blacklisting; and within the party itself, a turn back toward a more dogmatic version of Marxism-Leninism and a heightened atmosphere of factional conflict and purges.

Article

Company towns can be defined as communities dominated by a single company, typically focused on one industry. Beyond that very basic definition, company towns varied in their essentials. Some were purpose-built by companies, often in remote areas convenient to needed natural resources. There, workers were often required to live in company-owned housing as a condition of employment. Others began as small towns with privately owned housing, usually expanding alongside a growing hometown corporation. Residences were shoddy in some company towns. In others, company-built housing may have been excellent, with indoor plumbing and central heating, and located close to such amenities as schools, libraries, perhaps even theaters. Company towns played a key role in US economic and social development. Such places can be found across the globe, but America’s vast expanse of undeveloped land, generous stock of natural resources, tradition of social experimentation, and laissez-faire attitude toward business provided singular opportunities for the emergence of such towns, large and small, in many regions of the United States. Historians have identified as many as 2,500 such places. A tour of company towns can serve as a survey of the country’s industrial development, from the first large-scale planned industrial community—the textile town of Lowell, Massachusetts—to Appalachian mining villages, Western lumber towns, and steelmaking principalities such as the mammoth development at Gary, Indiana. More recent office-park and high-tech industrial-park complexes probably do not qualify as company towns, although they have some similar attributes. Nor do such planned towns as Disney Corporation’s Celebration, Florida, qualify, despite close ties to a single corporation, because their residents do not necessarily work for Disney. Company towns have generally tended toward one of two models. First, and perhaps most familiar, are total institutions—communities where one business exerts a Big Brother–ish grip over the population, controlling or even taking the place of government, collecting rent on company-owned housing, dictating buying habits (possibly at the company store), and even directing where people worship and how they may spend their leisure time. A second form consists of model towns—planned, ideal communities backed by companies that promised to share their bounty with workers and families. Several such places were carefully put together by experienced architects and urban planners. Such model company towns were marked by a paternalistic, watchful attitude toward the citizenry on the part of the company overlords.

Article

Lise Namikas

At the dawn of the 20th century, the region that would become the Democratic Republic of Congo fell to the brutal colonialism of Belgium’s King Leopold. Except for a brief moment when anti-imperialists decried the crimes of plantation slavery, the United States paid little attention to Congo before 1960. But after winning its independence from Belgium in June 1960, Congo suddenly became engulfed in a crisis of decolonization and the Cold War, a time when the United States and the Soviet Union competed for resources and influence. The confrontation in Congo was kept limited by a United Nations (UN) peacekeeping force, which ended the secession of the province of Katanga in 1964. At the same time, the CIA (Central Intelligence Agency) intervened to help create a pro-Western government and eliminate the Congo’s first prime minister, Patrice Lumumba. Ironically, the result would be a growing reliance on the dictatorship of Joseph Mobutu throughout the 1980s. In 1997 a rebellion succeeded in toppling Mobutu from power. Since 2001 President Joseph Kabila has ruled Congo. The United States has supported long-term social and economic growth but has kept its distance while watching Kabila fight internal opponents and insurgents in the east. A UN peacekeeping force returned to Congo and helped limit unrest. Despite serving out two full terms that ended in 2016, Kabila was slow to call elections amid rising turmoil.

Article

Foreign relations under the US Constitution start with the paradox, also seen in domestic matters, of relatively scant text providing guidance for the exercise of vast power. Founding understandings, structural inference, and ongoing constitutional custom and precedent have filled in much, though hardly all, of the framework over the course of two hundred years. As a result, two basic questions frame the relationship between the Constitution and US foreign policy: (1) which parts of the US government, alone or in combination, properly exercise authority in the making of foreign policy; and (2) once made, what is the status of the nation’s international legal obligations in the US domestic legal system. The making of American foreign policy is framed by the Constitution’s commitment to separation of powers. Congress, the president, and the courts are all allocated discrete yet significant foreign affairs authority. Determining the exact borders and overlaps in areas such as the use of military force, emergency measures, and treaty termination continues to generate controversy. The status of international law in the US legal system in the first instance turns on whether resulting obligations derive from agreements or custom. The United States enters into international agreements in three ways: treaties, congressional-executive agreements, and sole executive agreements. Complex doctrine deals with the domestic applicability of treaties in particular. US courts primarily apply customary international law in two basic ways. They can exercise a version of their common lawmaking authority to fashion rules of decision based on international custom. They also apply customary international law when incorporated into domestic law by statute.

Article

Contagious diseases have long posed a public health challenge for cities, going back to the ancient world. Diseases traveled over trade routes from one city to another. Cities were also crowded and often dirty, ideal conditions for the transmission of infectious disease. The Europeans who settled North America quickly established cities, especially seaports, and contagious diseases soon followed. By the late 17th century, ports like Boston, New York, and Philadelphia experienced occasional epidemics, especially smallpox and yellow fever, usually introduced from incoming ships. Public health officials tried to prevent contagious diseases from entering the ports, most often by establishing a quarantine. These quarantines were occasionally effective, but more often the disease escaped into the cities. By the 18th century, city officials recognized an association between dirty cities and epidemic diseases. The appearance of a contagious disease usually occasioned a concerted effort to clean streets and remove garbage. These efforts by the early 19th century gave rise to sanitary reform to prevent infectious diseases. Sanitary reform went beyond cleaning streets and removing garbage, to ensuring clean water supplies and effective sewage removal. By the end of the century, sanitary reform had done much to clean the cities and reduce the incidence of contagious disease. In the 20th century, two new tools were added to the public health arsenal: vaccination and antibiotics. First used against smallpox, vaccination was extended to numerous other infectious viral diseases, reducing their incidence substantially. Finally, the development of antibiotics against bacterial infections in the mid-20th century enabled physicians to cure infected individuals. Contagious disease remains a problem—witness AIDS—and public health authorities still rely on quarantine, sanitary reform, vaccination, and antibiotics to keep urban populations healthy.

Article

The United States often views itself as a nation of immigrants. This may in part be why, since the early 20th century, the country has seldom adopted major changes in its immigration policy. Until 1986, only the 1924 National Origins Quota Act, its dismantlement in the 1952 McCarran-Walter Act, and the 1965 Immigration and Nationality Act, also known as the Hart-Celler Act, involved far-reaching reforms. Another large shift occurred with the passage of the 1986 Immigration Reform and Control Act (IRCA) and its derivative sequel, the 1990 Immigration Act. No major immigration legislation has yet won congressional approval in the 21st century. IRCA emerged from and followed in considerable measure the recommendations of the Select Commission on Immigration and Refugee Policy (1979–1981). That body sought to reconcile two competing political constituencies, one favoring the restriction of immigration, or at least unauthorized immigration, and the other an expansion of family-based and work-related migration. The IRCA legislation contained something for each side: the passage of employer sanctions, or serious penalties on employers for hiring unauthorized workers, for the restriction side; and the provision of a legalization program, which outlined a pathway for certain unauthorized entrants to obtain green cards and eventually citizenship, for the reform side. The complete legislative package also included other provisions: criteria allowing the admission of agricultural workers, financial assistance to states for the costs they would incur from migrants legalizing, a requirement that states develop ways to verify that migrants were eligible for welfare benefits, and a substantial boost in funding for border enforcement activities. In the years after the enactment of IRCA, research has revealed that the two major compromise provisions, together with the agricultural workers provision, generated mixed results. Employer sanctions did little to curtail unauthorized migration, in all likelihood because of minimal funding for enforcement, while legalization and the agricultural measures resulted in widespread enrollment, with almost all of the unauthorized migrants who qualified coming forward to take advantage of the opportunity to become lawful permanent residents (LPRs). But when the agricultural workers provisions allowing entry of temporary workers are juxtaposed with the relatively unenforceable employer-sanctions provisions, IRCA entailed contradictory elements that created frustration for some observers. In sociocultural, political, and historical terms, scholars and others can interpret IRCA’s legalization as reflecting the inclusive, pluralistic, and expansionist tendencies characteristic of much of 18th-century U.S. immigration. But some of IRCA’s other elements led to contradictory effects, with restriction efforts being offset by the allowances for more temporary workers. This helped to spawn subsequent political pressures in favor of new restrictive or exclusive immigration controls that created serious hazards for immigrants.

Article

In May 1861, three enslaved men who were determined not to be separated from their families ran to Fort Monroe, Virginia. Their flight led to the phenomenon of Civil War contraband camps. Contraband camps were refugee camps to which between four hundred thousand and five hundred thousand enslaved men, women, and children in the Union-occupied portions of the Confederacy fled to escape their owners by getting themselves to the Union Army. Army personnel had not envisioned overseeing a massive network of refugee camps. Responding to the interplay between the actions of the former slaves who fled to the camps, Republican legislation and policy, military orders, and real conditions on the ground, the army improvised. In the contraband camps, former slaves endured overcrowding, food and clothing shortages, poor sanitary conditions, and constant danger. They also gained the protection of the Union Army and access to the power of the US government as new, though unsteady, allies in the pursuit of their key interests, including education, employment, and the reconstitution of family, kin, and social life. The camps brought together actors who had previously had little to no contact with each other, exposed everyone involved to massive structural forces that were much larger than the human ability to control them, and led to unexpected outcomes. They produced a refugee crisis on US soil, affected the course and outcome of the Civil War, influenced the progress of wartime emancipation, and altered the relationship between the individual and the national government. Contraband camps were simultaneously humanitarian crises and incubators for a new relationship between African Americans and the US government.

Article

American history is replete with instances of counterinsurgency. This is unsurprising, considering that the United States has always participated in empire building and has therefore needed to pacify resistance to expansion. For much of its existence, the U.S. has relied on its Army to pacify insurgents. While the U.S. Army relied on traditional military formations and technology to battle peer enemies, the same strategy did not succeed against opponents who relied on speed and surprise. Indeed, in several instances, insurgents sought to fight the U.S. Army on terms that rendered superior manpower and technology irrelevant. By introducing counterinsurgency as a strategy, the U.S. Army attempted to identify and neutralize insurgents and the infrastructure that supported them. Discussions of counterinsurgency involve complex terms, so readers are provided with simplified yet accurate definitions and explanations. Moreover, understanding the relevant terms provides continuity between conflicts. While certain counterinsurgency measures worked during the American Civil War, the Indian Wars, and in the Philippines, the concept failed during the Vietnam War. The complexities of counterinsurgency require readers to familiarize themselves with its history, relevant scholarship, and terminology—in particular, counterinsurgency, pacification, and infrastructure.

Article

Andrew Frank

The Creek Confederacy was a loose coalition of ethnically and linguistically diverse Native American towns that slowly coalesced as a political entity in the 18th and early 19th centuries. Its towns existed in Georgia, Alabama, and northern Florida, and for most of its preremoval history, these towns operated as autonomous entities. Several Creek leaders tried to consolidate power and create a more centralized polity, but these attempts at nation building largely failed. Instead, a fragile and informal confederacy connected the towns together for various cultural rituals as well as for purposes of diplomacy and trade. Disputes over centralization, as well as a host of other connected issues, ultimately led to the Creek War of 1813–1814. In the 1830s, the United States forced most members of the Creek Confederacy to vacate their eastern lands and relocate their nation to Indian Territory. Today, their western descendants are known as the Muskogee (Creek) Nation. Those who remained in the east include members of the federally recognized Seminole Tribe of Florida and the Poarch Band of Creek Indians who live in Alabama.

Article

Michael J. Bustamante

The Cuban Revolution transformed the largest island nation of the Caribbean into a flashpoint of the Cold War. After overthrowing US-backed ruler Fulgencio Batista in early 1959, Fidel Castro established a socialist, anti-imperialist government that defied the island’s history as a dependent and dependable ally of the United States. But the Cuban Revolution is not only significant for its challenge to US interests and foreign policy prerogatives. For Cubans, it fundamentally reordered their lives, inspiring multitudes yet also driving thousands of others to migrate to Miami and other points north. Sixty years later, Fidel Castro may be dead and the Soviet Union may be long gone. Cuban socialism has become more hybrid in economic structure, and in 2014 the Cuban and US governments moved to restore diplomatic ties. But Cuba’s leaders continue to insist that “the Revolution,” far from a terminal political event, is still alive. Today, as the founding generation of Cuban leaders passes from the scene, “the Revolution” faces another important crossroads of uncertainty and reform.

Article

Distinctive patterns of daily life defined the Jim Crow South. Contrary to many observers’ emphasis on de jure segregation—meaning racial separation demanded by law—neither law nor the physical separation of blacks and whites was at the center of the early 20th-century South’s social system. Instead, separation, whether by law or custom, was one of multiple tools whites used to subordinate and exclude blacks and to maintain notions of white racial purity. In turn, these notions themselves varied over time and across jurisdictions, at least in their details, as elites tried repeatedly to establish who was “white,” who was “black,” and how the legal fictions they created would apply to Native Americans and others who fit neither category. Within this complex multiracial world of the South, whites’ fundamental commitment to keeping blacks “in their place” manifested most routinely in day-to-day social dramas, often described in terms of racial “etiquette.” The black “place” in question was socially but not always physically distant from whites, and the increasing number of separate, racially marked spaces and actual Jim Crow laws was a development over time that became most pronounced in urban areas. It was a development that reveals blacks’ determination to resist racial oppression and whites’ perceived need to shore up a supposedly natural order that had, in fact, always been enforced by violence as well as political and economic power. Black resistance took many forms, from individual, covert acts of defiance to organized political movements. Whether in response to African Americans’ continued efforts to vote or their early 20th-century boycotts of segregated streetcars or World War I-era patterns of migration that threatened to deplete the agricultural labor force, whites found ways to counter blacks’ demands for equal citizenship and economic opportunity whenever and wherever they appeared. In the rural South, where the majority of black Southerners remained economically dependent on white landowners, a “culture of personalism” characterized daily life within a paternalistic model of white supremacy that was markedly different from urban—and largely national, not merely southern—racial patterns. Thus, distinctions between rural and urban areas and issues of age and gender are critical to understanding the Jim Crow South. Although schools were rigorously segregated, preadolescent children could be allowed greater interracial intimacy in less official settings. Puberty became a break point after which close contact, especially between black males and white females, was prohibited. All told, Jim Crow was an inconsistent and uneven system of racial distinction and separation whose great reach shaped the South’s landscape and the lives of all Southerners, including those who were neither black nor white.

Article

Frederick Rowe Davis

The history of DDT and pesticides in America is overshadowed by four broad myths. The first myth suggests that DDT was the first insecticide deployed widely by American farmers. The second indicates that DDT was the most toxic pesticide to wildlife and humans alike. The third myth assumes that Rachel Carson’s Silent Spring (1962) was an exposé of the problems of DDT rather than a broad indictment of American dependency on chemical insecticides. The fourth and final myth reassures Americans that the ban on DDT late in 1972 resolved the pesticide paradox in America. Over the course of the 20th century, agricultural chemists developed insecticides from plants with phytotoxic properties (“botanical” insecticides) and from a range of chemicals, including heavy metals such as lead and arsenic, chlorinated hydrocarbons like DDT, and organophosphates like parathion. All of the synthetic insecticides carried profound unintended consequences for landscapes and wildlife alike. More recently, chemists have returned to nature and developed chemical analogs of the botanical insecticides, first with the synthetic pyrethroids and now with the neonicotinoids. Despite their recent introduction, neonics have become widely used in agriculture, and there are suspicions that these chemicals contribute to declines in bees and grassland birds.

Article

Death is universal yet is experienced in culturally specific ways. Because of this, when individuals in colonial North America encountered others from different cultural backgrounds, they were curious about how unfamiliar mortuary practices resembled and differed from their own. This curiosity spawned communication across cultural boundaries. The resulting knowledge sometimes facilitated peaceful relations between groups, while at other times it helped one group dominate another. Colonial North Americans endured disastrously high mortality rates caused by disease, warfare, and labor exploitation. At the same time, death was central to the religions of all residents: Indians, Africans, and Europeans. Deathways thus offer an unmatched way to understand the colonial encounter from the participants’ perspectives.

Article

The decolonization of the European overseas empires had its intellectual roots early in the modern era, but its culmination occurred during the Cold War that loomed large in post-1945 international history. This culmination thus coincided with the American rise to superpower status and presented the United States with a dilemma. While the United States was philosophically sympathetic to the aspirations of anticolonial nationalist movements abroad, its vastly greater postwar global security burdens made it averse to the instability that decolonization might bring and that communists might exploit. This fear, and the need to share those burdens with European allies who were themselves still colonial landlords, led Washington to proceed cautiously. The three “waves” of the decolonization process—medium-sized in the late 1940s, large in the half-decade around 1960, and small in the mid-1970s—prompted the American use of a variety of tools and techniques to influence how it unfolded. Prior to independence, this influence was usually channeled through the metropolitan authority then winding down. After independence, Washington continued and often expanded the use of these tools, in most cases on a bilateral basis. In some theaters, such as Korea, Vietnam, and the Congo, the use of certain of these tools, notably covert espionage or overt military operations, meant that Cold War dynamics enveloped, intensified, and repossessed local decolonization struggles. In most theaters, other tools, such as traditional or public diplomacy or economic or technical development aid, kept the Cold War in the background as a local transition unfolded. In all cases, the overriding American imperative was to minimize instability and neutralize actors on the ground who could invite communist gains.

Article

The process of urban deindustrialization has been long and uneven. Even the terms “deindustrial” and “postindustrial” are contested; most cities continue to host manufacturing on some scale. After World War II, however, cities that depended on manufacturing for their lifeblood increasingly diversified their economies in the face of larger global, political, and demographic transformations. Manufacturing centers in the New England, Mid-Atlantic, and Midwestern United States were soon identified as belonging to “the American Rust Belt.” Steel manufacturers, automakers, and other industrial behemoths that were once mainstays of city life closed their doors as factories and workers followed economic and social incentives to leave urban cores for the suburbs, the South, or foreign countries. Remaining industrial production became increasingly automated, resulting in significant declines in the number of factory jobs. Metropolitan officials faced with declining populations and tax bases responded by adapting their assets—in terms of workforce, location, or culture—to new economies, including warehousing and distribution, finance, health care, tourism, leisure industries like casinos, and privatized enterprises such as prisons. Faced with declining federal funding for renewal, they focused on leveraging private investment for redevelopment. Deindustrializing cities marketed themselves as destinations with convention centers, stadiums, and festival marketplaces, seeking to lure visitors and a “creative class” of new residents. While some postindustrial cities became success stories of reinvention, others struggled. They entertained options to “rightsize” by shrinking their municipal footprints, adapted vacant lots for urban agriculture, or attracted voyeurs to gaze at their industrial ruins. Whether industrial cities faced a slow transformation or the shock of multiple factory closures within a few years, the impact of these economic shifts and urban planning interventions both amplified old inequalities and created new ones.